Mashable · May 22, 07:59
Report: Creating a 5-second AI video is like running a microwave for an hour

MIT Technology Review has published a report detailing the AI industry's energy consumption. According to the report, each response from a large language model such as ChatGPT consumes 114 to 6,706 joules, the equivalent of running a microwave for 0.1 to 8 seconds. AI-generated video is far more energy-intensive: a five-second clip consumes about 3.4 million joules, equivalent to running a microwave for an hour. Data center energy use has doubled because of AI, and by 2028 half of data center electricity is expected to go toward AI tools. As AI adoption spreads, its energy footprint is becoming an increasingly pressing issue.

💡Depending on model size and accuracy, each response from a large language model such as ChatGPT consumes 114 to 6,706 joules, equivalent to running a microwave for 0.1 to 8 seconds.

🎬Generating video consumes far more energy than generating text. A five-second video takes about 3.4 million joules, more than 700 times the energy needed for a high-quality image, equivalent to running a microwave for over an hour.

📈Driven by AI workloads, the energy consumption of U.S. data centers has doubled since 2017. Government data projects that by 2028, half of data center electricity will go toward powering AI tools.

🤔If one person asked an AI chatbot 15 questions and requested 10 images and three five-second videos, the total would come to roughly 2.9 kilowatt-hours of electricity, equivalent to running a microwave for more than 3.5 hours.

You've probably heard the statistic that every ChatGPT query uses the equivalent of a bottle of water. And while that's technically true, it misses some of the nuance.

The MIT Technology Review dropped a massive report that reveals how the artificial intelligence industry uses energy — and exactly how much energy it costs to use a service like ChatGPT.

The report determined that large language models like ChatGPT cost anywhere from 114 joules per response to 6,706 joules per response — that's the difference between running a microwave for one-tenth of a second and running it for eight seconds. The lower-energy models, according to the report, use less energy because they use fewer parameters, which also means their answers tend to be less accurate.
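The microwave comparison is just arithmetic: run time equals energy divided by power. A minimal sketch of the conversion, assuming a typical microwave draws about 800 watts (the report's comparison implies roughly this wattage, but the exact figure is an assumption):

```python
# Convert per-response energy (joules) into equivalent microwave run time.
# Assumption: a typical microwave draws about 800 W (not stated in the report).
MICROWAVE_WATTS = 800

def microwave_seconds(joules: float) -> float:
    """Seconds a microwave would need to run to consume the same energy."""
    return joules / MICROWAVE_WATTS

low = microwave_seconds(114)     # roughly the "one-tenth of a second"
high = microwave_seconds(6706)   # roughly the "eight seconds"
print(f"{low:.2f} s to {high:.2f} s")
```

Plugging in the report's low and high per-response figures lands close to the 0.1-to-8-second range quoted above.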

It makes sense, then, that AI-produced video takes a whole lot more energy. According to the MIT Technology Review's investigation, to create a five-second video, a newer AI model uses "about 3.4 million joules, more than 700 times the energy required to generate a high-quality image." That's the equivalent of running a microwave for over an hour.

The researchers tallied up the amount of energy it would cost if someone, hypothetically, asked an AI chatbot 15 questions, requested 10 images, and generated three five-second videos. The answer? Roughly 2.9 kilowatt-hours of electricity, which is the equivalent of running a microwave for over 3.5 hours.
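That tally can be sanity-checked from the per-item figures above. A rough sketch, assuming each question costs the high-end ~6,706 joules, each image costs 3.4 MJ divided by 700 (implied by the "700 times" comparison), and each video costs 3.4 MJ:

```python
# Sanity-check the report's hypothetical daily tally from its per-item figures.
# Assumptions: questions at the high-end 6,706 J each; images at 3.4 MJ / 700
# each (implied by the "700 times" comparison); videos at 3.4 MJ each.
JOULES_PER_KWH = 3.6e6  # 1 kWh = 3.6 million joules

questions = 15 * 6706          # 15 chatbot questions
images = 10 * (3.4e6 / 700)    # 10 generated images
videos = 3 * 3.4e6             # 3 five-second videos

total_kwh = (questions + images + videos) / JOULES_PER_KWH
print(f"{total_kwh:.2f} kWh")
```

The videos dominate the total, which comes out near the report's roughly 2.9 kWh figure.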

The investigation also examined the rising energy costs of the data centers that power the AI industry.

The report found that prior to the advent of AI, the electricity usage of data centers was largely flat thanks to increased efficiency. However, due to energy-intensive AI technology, the energy consumed by data centers in the United States has doubled since 2017. And according to government data, half the electricity used by data centers will go toward powering AI tools by 2028.

This report arrives at a time in which people are using generative AI for absolutely everything. Google announced at its annual I/O event that it's leaning into AI with fervor. Google Search, Gmail, Docs, and Meet are all seeing AI integrations. People are using AI to lead job interviews, create deepfakes of OnlyFans models, and cheat in college. And all of that, according to this in-depth new report, comes at a pretty high cost.


Disclosure: Ziff Davis, Mashable’s parent company, in April filed a lawsuit against OpenAI, alleging it infringed Ziff Davis copyrights in training and operating its AI systems.
