Published on July 16, 2025 7:08 PM GMT
(Any feedback from the mods would be greatly appreciated!)
This is both my first post here and a question I've been mulling over for quite a while. I came across some discussion adjacent to this topic here on LessWrong, though I couldn't find much more. Here is what I found:
- https://www.lesswrong.com/posts/N7KYWJPmyzB6bJSYT/the-next-ai-winter-will-be-due-to-energy-costs-1
- https://www.lesswrong.com/posts/xrxh3usuoYMckkKom/preserving-and-continuing-alignment-research-through-a
- https://www.lesswrong.com/w/civilization-scale-energy
It appears to me that many predictions of AI development ignore energy (specifically oil). Energy is the lifeblood of AI development: data centers require energy to operate (whether by being plugged into the grid, into SMRs, into a hydroelectric plant, or into a billion very motivated hamsters on wheels). Furthermore, the very construction of data centers requires inputs from oil, particularly in mining the required materials, as well as shipping and processing them.
There are those who argue that we are running out of energetically cheap, high-quality oil. Some relevant literature is below:
- Energy Return on Investment, Peak Oil, and the End of Economic Growth (DOI: 10.1016/j.ecolecon.2011.01.021)
- Assessing the feasibility of the energy transition with the MEDEAS model (DOI: 10.1016/j.enpol.2020.111247)
- Comparative net energy analysis of renewable transition pathways (DOI: 10.1016/j.energy.2016.03.089)
- Resource constraints for clean energy technologies (DOI: 10.1016/j.resourpol.2020.101529)
There appear to be multiple camps on this issue:
1) " When we run out of high EROI ( Energy return on investment ) oil, our civilization will inevitably regress back to a pre-industrial standard of living"
2) " There is no need to worry, oil companies don't care about EROI, and peak oilers have been wrong before"
3) "Running out of oil may force us to simplify certain aspects of our civilization, while building up renewables and nuclear early"
Taking all that into consideration, I'm curious about the probability that future AI development will be seriously delayed or ended by energy decline. Camp 1 would argue that we should be learning how to farm right now; I wouldn't assign a 0% probability to their view, though I don't know what probability to assign to it either.