TechCrunch News · March 4
No part of Amazon is ‘unaffected’ by AI, says its head of AGI

Amazon VP Vishal Sharma says every part of the company is affected by AI, with Amazon deploying AI across AWS, warehouse robotics, Alexa, and more. He also addressed a range of AI-related questions, such as the impact of open-source models on compute demand and AI applications across different scenarios.

🎯 Amazon is deploying AI across multiple areas, including AWS, warehouse robotics, and Alexa

💡 Sharma believes open-source models are unlikely to reduce demand for compute resources

🛠️ Amazon has launched Bedrock, which lets users switch between different foundation models within AWS

🤔 His views on whether geopolitical shifts will change European companies' GenAI strategies

“There’s scarcely a part of the company that is unaffected by AI,” said Vishal Sharma, Amazon’s VP of Artificial General Intelligence, today at Mobile World Congress in Barcelona, Spain. He also dismissed the idea that open-source models might reduce compute needs and demurred over the question of whether European companies would change their GenAI strategies in light of geopolitical tensions with the US.

Interviewed by TechCrunch’s Mike Butcher on stage at the 4YFN startup conference, Sharma — a former AI entrepreneur turned AI head — said Amazon was now deploying AI in the form of its own foundation models across AWS, the robotics in its warehouses, and the Alexa consumer product, among many other incarnations.

“We have something like three-quarters of a million robots now, and they are doing everything from picking things to running themselves within the warehouse. The Alexa product is probably the most widely deployed home AI product in existence… There’s no part of Amazon that’s untouched by generative AI.”

Last December, Amazon Web Services (AWS), Amazon’s cloud computing division, announced Nova, a new family of multimodal generative AI models that includes four text-generating models.

Sharma said these are all tested against public benchmarks: “It became pretty clear there’s a huge diversity of use cases. There’s not a one-size-fits-all. There are some places where you need video generation… and other places, like Alexa, where you ask it to do specific things, and the response needs to be very, very quick, and it needs to be highly predictable. You can’t hallucinate ‘unlock the back door’.”

However, he said the scenario in which smaller, open-source models reduce the amount of compute resources needed was unlikely to happen: “As you begin to implement it in different scenarios, you just need more and more and more intelligence,” he said.

Amazon has also launched Bedrock, a service within Amazon Web Services aimed at companies and startups that want to mix and match various foundation models — even China’s DeepSeek — and one where “you can switch from one model to another,” he said.
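
For readers curious what that switching looks like in practice, here is a minimal sketch using the boto3 `bedrock-runtime` client and its Converse API. The model identifiers below (an Amazon Nova ID and a DeepSeek ID) are illustrative assumptions rather than details from the article, and availability varies by region and account.

```python
import boto3

# Minimal sketch, not Amazon's reference code: the same prompt sent to two
# different Bedrock-hosted models through the provider-agnostic Converse API.
client = boto3.client("bedrock-runtime", region_name="us-east-1")

def ask(model_id: str, prompt: str) -> str:
    """Send a prompt to a Bedrock-hosted model and return its text reply."""
    response = client.converse(
        modelId=model_id,
        messages=[{"role": "user", "content": [{"text": prompt}]}],
    )
    return response["output"]["message"]["content"][0]["text"]

prompt = "In two sentences, what are the trade-offs of smaller open-weight models?"

# Switching models is a change of identifier; the calling code stays the same.
# Model IDs are illustrative and depend on region and account access.
for model_id in ("amazon.nova-lite-v1:0", "deepseek.r1-v1:0"):
    print(model_id, ":", ask(model_id, prompt))
```

Because the Converse API normalizes request and response shapes across providers, swapping one hosted model for another is largely a matter of changing the model ID, which is roughly the "switch from one model to another" behavior Sharma describes.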

Amazon is also building a huge AI compute cluster on its Trainium 2 chips in partnership with Anthropic (in which it has invested $8 billion). Meanwhile, Elon Musk’s xAI recently released its latest flagship AI model, Grok 3, which was trained in an enormous data center in Memphis containing around 200,000 GPUs.

Asked to comment on this level of compute resources, Sharma said: “My personal opinion is that compute will be a part of the conversation for a very long time to come.”

Mike Butcher (TechCrunch) and Vishal Sharma (Amazon). Image credits: Mobile World Congress

He did not think Amazon was under pressure from the blizzard of open-source models that have recently emerged from China: “I wouldn’t describe it like that,” he said. Indeed, Amazon is relaxed about deploying DeepSeek and other models on AWS: “We’re a company that believes in choice… We are open to adopting whatever trends and technologies are good from a customer perspective,” said Sharma.

When OpenAI appeared in late 2022 with ChatGPT, did he think Amazon was caught napping?

“No, I think I would disagree with that line of thought,” he said. “Amazon has been working on AI for about 25 years. If you look at something like Alexa, there’s something like 20 different AI models that are running at Alexa… We had billions of parameters that existed already for language. We’ve been looking at this for quite some time.”

On the issue of the recent controversy surrounding Trump and Zelensky, and the subsequent cooling of relations between the current US Administration and many European nations, did he think European companies might look elsewhere for GenAI resources in the future? 

Sharma admitted this issue was “outside” of his “zone of expertise” and the consequences are “very hard for me to predict…” But he did, somewhat diplomatically, hint that some companies might adjust their strategy: “What I will say is that it is the case that technical innovation responds to incentives,” he said.
