TechCrunch News · February 12
ChatGPT may not be as power-hungry as once assumed

A new study suggests that OpenAI's ChatGPT platform may consume less energy than previously assumed, though actual consumption depends on how the platform is used and which AI model answers the query. Epoch AI's analysis found that with the GPT-4o model, an average ChatGPT query consumes about 0.3 watt-hours, less than many household appliances. Even so, AI's overall energy use and its environmental impact remain a concern, especially as AI companies rapidly expand their infrastructure. The report also notes that energy consumption is likely to rise as AI advances and adoption widens, with reasoning models and large-scale deployment both driving higher demand.

💡 Epoch AI's analysis finds that with OpenAI's latest GPT-4o model, the average ChatGPT query consumes about 0.3 watt-hours, far below the commonly cited estimate of 3 watt-hours (ten times a Google search) and closer to the energy use of ordinary household appliances.

🌱 AI's energy footprint is drawing growing scrutiny: more than 100 organizations have called on the AI industry and regulators to ensure that new AI data centers do not deplete natural resources or force utilities to rely on non-renewable energy. AI data centers may soon need close to all of California's 2022 power capacity, and by 2030 training a frontier model could require the output of eight nuclear reactors.

🧠 OpenAI and other companies are shifting toward reasoning models, which are more capable but require more computing and therefore more power. Although OpenAI has released more efficient reasoning models such as o3-mini, those efficiency gains may not offset the extra electricity demanded by reasoning models' "thinking" process and growing global AI usage.

✅ Users concerned about AI's energy footprint can use apps like ChatGPT less often, choose models with lower computing demands such as OpenAI's GPT-4o-mini, and avoid tasks that require processing or generating large amounts of data.

ChatGPT, OpenAI’s chatbot platform, may not be as power-hungry as once assumed. But its appetite largely depends on how ChatGPT is being used, and the AI models that are answering the queries, according to a new study.

A recent analysis by Epoch AI, a nonprofit AI research institute, attempted to calculate how much energy a typical ChatGPT query consumes. A commonly cited stat is that ChatGPT requires around 3 watt-hours of power to answer a single question, or 10 times as much as a Google search.

Epoch believes that’s an overestimate.

Using OpenAI’s latest default model for ChatGPT, GPT-4o, as a reference, Epoch found the average ChatGPT query consumes around 0.3 watt-hours — less than many household appliances.
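The comparison above is easy to sanity-check with back-of-envelope arithmetic. A minimal sketch, using the article's 0.3 Wh and 3 Wh figures; the 10 W LED bulb is an illustrative reference point, not a number from the article:

```python
# Back-of-envelope comparison of per-query energy estimates.
WH_PER_QUERY = 0.3      # Epoch AI's estimate for a GPT-4o query
OLD_ESTIMATE_WH = 3.0   # the commonly cited older figure

def queries_per_appliance_hour(watts: float) -> float:
    """How many queries equal one hour of running a device drawing `watts`."""
    return watts / WH_PER_QUERY

# One hour of a 10 W LED bulb covers roughly 33 queries at the new estimate.
print(round(queries_per_appliance_hour(10), 1))

# The older estimate was exactly 10x higher.
print(OLD_ESTIMATE_WH / WH_PER_QUERY)  # 10.0
```

At 0.3 Wh per query, even heavy daily use adds up to a small fraction of ordinary household consumption, which is the point You makes below.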

“The energy use is really not a big deal compared to using normal appliances or heating or cooling your home, or driving a car,” Joshua You, the data analyst at Epoch who conducted the analysis, told TechCrunch.

AI’s energy usage — and its environmental impact, broadly speaking — is the subject of contentious debate as AI companies look to rapidly expand their infrastructure footprints. Just last week, a group of over 100 organizations published an open letter calling on the AI industry and regulators to ensure that new AI data centers don’t deplete natural resources and force utilities to rely on non-renewable sources of energy.

You told TechCrunch his analysis was spurred by what he characterized as outdated previous research. You pointed out, for example, that the author of the report that arrived at the 3-watt-hours estimate assumed OpenAI used older, less efficient chips to run its models.

Image Credits: Epoch AI

“I’ve seen a lot of public discourse that correctly recognized that AI was going to consume a lot of energy in the coming years, but didn’t really accurately describe the energy that was going to AI today,” You said. “Also, some of my colleagues noticed that the most widely reported estimate of 3 watt-hours per query was based on fairly old research, and based on some napkin math seemed to be too high.”

Granted, Epoch’s 0.3 watt-hours figure is an approximation, as well; OpenAI hasn’t published the details needed to make a precise calculation.

The analysis also doesn’t consider the additional energy costs incurred by ChatGPT features like image generation or input processing. You acknowledged that “long input” ChatGPT queries — queries with long files attached, for instance — likely consume more electricity upfront than a typical question.

You said he does expect baseline ChatGPT power consumption to rise, however.

“[The] AI will get more advanced, training this AI will probably require much more energy, and this future AI may be used much more intensely — handling much more tasks, and more complex tasks, than how people use ChatGPT today,” You said.

While there have been remarkable breakthroughs in AI efficiency in recent months, the scale at which AI is being deployed is expected to drive enormous, power-hungry infrastructure expansion. In the next two years, AI data centers may need close to all of California’s 2022 power capacity (68 GW), according to a Rand report. By 2030, training a frontier model could demand power output equivalent to that of eight nuclear reactors (8 GW), the report predicted.
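The Rand figures can be put in proportion with simple arithmetic. A quick sketch; the ~1 GW-per-reactor conversion is a common rule of thumb implied by the report's "eight reactors ≈ 8 GW" framing, not an exact figure:

```python
# Scale of the Rand report's projections, in gigawatts.
CA_2022_CAPACITY_GW = 68    # near-term AI data center demand
FRONTIER_TRAINING_GW = 8    # one frontier training run by 2030
GW_PER_REACTOR = 1          # rule-of-thumb output of a large nuclear reactor

reactors_needed = FRONTIER_TRAINING_GW / GW_PER_REACTOR
print(reactors_needed)  # 8.0

# A single training run would draw roughly 12% of what the report
# projects for all AI data centers combined.
share = FRONTIER_TRAINING_GW / CA_2022_CAPACITY_GW
print(round(share, 3))
```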

ChatGPT alone reaches an enormous — and expanding — number of people, making its server demands similarly massive. OpenAI, along with several investment partners, plans to spend billions of dollars on new AI data center projects over the next few years.

OpenAI’s attention — along with the rest of the AI industry’s — is also shifting to so-called reasoning models, which are generally more capable in terms of the tasks they can accomplish, but require more computing to run. As opposed to models like GPT-4o, which respond to queries nearly instantaneously, reasoning models “think” for seconds to minutes before answering, a process that sucks up more computing — and thus power.

“Reasoning models will increasingly take on tasks that older models can’t, and generate more [data] to do so, and both require more data centers,” You said.

OpenAI has begun to release more power-efficient reasoning models like o3-mini. But it seems unlikely, at least at this juncture, the efficiency gains will offset the increased power demands from reasoning models’ “thinking” process and growing AI usage around the world.

You suggested that people worried about their AI energy footprint use apps such as ChatGPT infrequently, or select models that minimize the computing necessary — to the extent that’s realistic.

“You could try using smaller AI models like [OpenAI’s] GPT-4o-mini,” You said, “and sparingly use them in a way that requires processing or generating a ton of data.”
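You's suggestions can be sketched in code. A minimal illustration, assuming the OpenAI Chat Completions request shape; the helper only builds request parameters (it sends nothing), and the prompt-trimming rule and token cap are illustrative choices, not recommendations from the article:

```python
# Sketch of "use a smaller model, sparingly": prefer GPT-4o-mini and
# keep both the prompt and the generated output short.
def build_lightweight_request(prompt: str, max_prompt_chars: int = 2000) -> dict:
    """Build chat-request parameters that favor lower compute per query."""
    return {
        "model": "gpt-4o-mini",  # smaller model, lower compute per query
        "messages": [{"role": "user", "content": prompt[:max_prompt_chars]}],
        "max_tokens": 256,       # cap generation to limit compute
    }

req = build_lightweight_request("Summarize this paragraph in one sentence: ...")
print(req["model"])  # gpt-4o-mini
```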
