Fueling (literally) the AI Boom
Last Updated on June 3, 2024 by Editorial Team

Author(s): Aneesh Patil

Originally published on Towards AI.

Photo by NASA on Unsplash

Let's take a moment to step back in time to our 5th-grade selves, a nostalgic #Throwback____ (insert today's date) if you will. Picture ourselves in science class, perhaps doodling at our desks, daydreaming, or diligently listening to the teacher. Regardless of our activities, chances are we've encountered the term 'greenhouse gas' in one form or another. To jog our memories: greenhouse gases trap heat within our atmosphere, leading to rising temperatures, global warming, and various environmental repercussions. Among the array of greenhouse gases, carbon dioxide stands out as particularly impactful. Its prevalence in our atmosphere is largely attributed to its release during the combustion of fossil fuels for energy production. In fact, approximately 80% of the energy consumed in the United States is derived from petroleum, natural gas, and coal, all of which emit carbon dioxide when burned. This perpetuates a cycle in which the combustion process not only generates energy but also contributes significantly to the accumulation of greenhouse gases in our atmosphere.

While the information and communications technology (ICT) sector's contribution to global CO₂ emissions may seem like a drop in the ocean, it accounts for approximately 2%, as highlighted in a 2020 study by the International Telecommunication Union. That statistic, however, may be just the tip of the iceberg. Since the study's publication, significant shifts have occurred within the sector, particularly the rise of artificial intelligence (AI), which has captured widespread attention and could alter this emission estimate. Yet despite AI's burgeoning prominence, companies holding AI models have been notably reluctant to disclose data on their models' energy consumption, often citing 'competition' as a reason for their silence. Nevertheless, numerous researchers have endeavored to provide rough approximations. For instance, a report from the New Yorker estimated that ChatGPT alone handles approximately 200 million queries daily, translating to an electricity consumption of around 500,000 kilowatt-hours. Doing the math:

500,000 kWh / 200,000,000 requests = 0.0025 kWh/request = 2.5 Wh/request

To provide some context, consider a standard household lightbulb, typically rated at 60 W. Leaving this lightbulb on for an hour consumes approximately 0.06 kWh of electricity. By that measure, each query made to ChatGPT is akin to leaving a lightbulb on for roughly 2.5 minutes (0.042 hours). It's essential to note that this calculation is an estimate specific to OpenAI's ChatGPT.

Photo by Nikola Johnny Mirkovic on Unsplash

Among the major Large Language Models (LLMs) on the market are Google's Gemini, Meta's LLaMA 3, and Anthropic's Claude 3. Combined, their daily query volume could reach approximately 600 million, translating to 1,500 megawatt-hours (MWh) of electricity consumption. To put this in perspective, according to the U.S. Energy Information Administration, the average household in the U.S. consumes 11 MWh of electricity annually. The energy used by these LLMs in just one day would therefore be comparable to the electricity consumed by approximately 140 households over the course of a year.
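For readers who want to sanity-check these figures, here is a minimal back-of-the-envelope sketch in Python. Every input is an estimate quoted above (the New Yorker's ChatGPT figures, the assumed 600 million combined daily queries at the same per-query rate, and the EIA household average); none of them are measured values.

```python
# Sanity-checking the estimates above. All inputs are rough figures
# from the article, not measured values.

DAILY_ENERGY_KWH = 500_000       # estimated daily ChatGPT consumption (New Yorker)
DAILY_QUERIES = 200_000_000      # estimated daily ChatGPT queries

energy_per_query_kwh = DAILY_ENERGY_KWH / DAILY_QUERIES
print(f"Per query: {energy_per_query_kwh} kWh = {energy_per_query_kwh * 1000} Wh")
# -> Per query: 0.0025 kWh = 2.5 Wh

# How long a 60 W bulb runs on one query's worth of energy
BULB_KW = 0.060
bulb_minutes = energy_per_query_kwh / BULB_KW * 60
print(f"Equivalent 60 W bulb runtime: {bulb_minutes:.1f} minutes")
# -> Equivalent 60 W bulb runtime: 2.5 minutes

# Scaling to ~600 million daily queries across the major LLMs,
# assuming the same per-query cost holds for all of them
TOTAL_DAILY_QUERIES = 600_000_000
total_mwh = TOTAL_DAILY_QUERIES * energy_per_query_kwh / 1_000
HOUSEHOLD_MWH_PER_YEAR = 11      # EIA average U.S. household
print(f"Combined daily use: {total_mwh:.0f} MWh, "
      f"about {total_mwh / HOUSEHOLD_MWH_PER_YEAR:.0f} household-years")
# -> Combined daily use: 1500 MWh, about 136 household-years
```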
If this does not seem baffling enough already, Dutch researcher Alex de Vries performed an in-depth study in 2023 exploring trends in the energy appetite of AI models, and the results are shocking. On the current trajectory, NVIDIA will have an estimated 1.5 million server units running AI workloads by 2027, consuming 85.4–134 TWh of electricity annually. And this is just NVIDIA's servers! With competition in the AI hardware space proliferating, NVIDIA's market share is anticipated to be lower in 2027 than it is today, so based on de Vries' study we can expect more than 2 million server units running AI workloads at a minimum (a rough scaling sketch appears at the end of this section). For a yearly comparison, 134.0 TWh is equivalent to the amount of electricity consumed by Argentina, a country with over 45 million people.

So AI is only scratching the surface when it comes to tech's growing appetite for energy. Think about it: cryptocurrency, edge computing, virtual/augmented reality, robotics, and the Internet of Things are just a handful of technologies poised to demand vast amounts of computational power once they hit the commercial stage at the scale AI enjoys today. And with ongoing research in each of these areas, it's not far-fetched to imagine them revolutionizing our daily lives in ways we haven't even dreamed of yet. But here's the kicker: all this innovation comes with a hefty energy bill, and given the limitations of resources and the environmental impact of electricity generation, keeping up seems a daunting challenge. Yet amidst these concerns, there's reason for optimism: promising, sustainable alternatives are emerging to challenge our current energy consumption practices.

Photo by Alexandre Debiève on Unsplash

What could serve as a solid starting point? Well, if you're wondering whether companies can work towards optimizing for a more power-efficient hardware architecture, you might be onto something. A study published by Harvard University states that "a large portion of their energy use is spent simply passing data back-and-forth between chips". In other words, a computer chip might need to fetch data from an external memory bank, which it does via signals; the larger the distance between the chip and the memory bank, the more energy is consumed. Gage Hills, an Electrical Engineering professor at Harvard, is researching ways to keep a chip from sending signals over large distances in order to save energy. One approach is to stack multiple layers of an integrated circuit in three dimensions, cutting out the need to pass any data outside the chip. The drawback to this approach is that when building these layers, wires in one of the lower layers may melt due to high temperatures. Engineers at the University of Virginia are exploring a way to cut out the intermediary by establishing direct contact between the chip and the memory bank […]
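And here is the rough scaling sketch promised earlier for de Vries' projection. The 1.5-million-server count and the 85.4–134 TWh range are the figures quoted above; the implied per-server draw and the 2-million-unit extrapolation are illustrative assumptions of this sketch, not numbers from the study.

```python
# Rough scaling of de Vries' 2027 projection. The server count and TWh
# range are the article's figures; everything derived below is an
# illustrative extrapolation, not a number from the study.

SERVERS_2027 = 1_500_000        # projected NVIDIA AI server units
ANNUAL_TWH = (85.4, 134.0)      # projected annual consumption range
HOURS_PER_YEAR = 8_760

# Implied continuous electrical draw per server
for twh in ANNUAL_TWH:
    kw_per_server = twh * 1e9 / SERVERS_2027 / HOURS_PER_YEAR  # TWh -> kWh, then / h
    print(f"{twh} TWh/yr over {SERVERS_2027:,} servers "
          f"= {kw_per_server:.1f} kW per server, running continuously")
# -> roughly 6.5 kW to 10.2 kW per server

# If the installed base tops 2 million units at the same per-server rate
scaled_twh = max(ANNUAL_TWH) * 2_000_000 / SERVERS_2027
print(f"At 2,000,000 units: ~{scaled_twh:.0f} TWh/yr")
# -> At 2,000,000 units: ~179 TWh/yr
```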
