TechCrunch News · March 22, 04:51
Meta has revenue sharing agreements with Llama AI model hosts, filing reveals

According to a newly unredacted court filing, Meta earns money from its Llama AI models through revenue-sharing agreements with the companies that host them, which is at odds with CEO Mark Zuckerberg's earlier statement that selling access to Llama isn't Meta's business model. The filing shows that Meta receives a share of revenue from hosts such as AWS, Nvidia, and Databricks. Even so, Meta maintains that Llama's main value lies in the improvements the AI research community makes to the models and in powering products such as Meta AI. Meta also plans to increase its investment in AI and may launch a subscription service for Meta AI.

💰 Meta earns revenue through revenue-sharing agreements with companies that host its Llama AI models, even though Meta's CEO has said the company would not sell access to Llama. Court filings show that Meta shares in the revenue these hosts generate from the models.

💻 Meta lists a number of Llama host partners, including AWS, Nvidia, Databricks, Groq, Dell, Azure, Google Cloud, and Snowflake. These partners provide additional services and tooling that simplify deploying and running Llama models.

💡 Meta sees Llama's main value in the improvements the AI research community makes to the models, which it then uses in products such as Meta AI. Zuckerberg has said that releasing Llama openly makes Meta's products better than building a model in isolation.

💸 Meta plans to sharply increase its capital expenditures on AI, projecting $60 billion to $80 billion in 2025, mostly for data centers and for building out its AI development teams. To offset part of the cost, Meta is considering launching a subscription service for Meta AI.

In a blog post last July, Meta CEO Mark Zuckerberg said that “selling access” to Meta’s openly available Llama AI models “isn’t [Meta’s] business model.” Yet Meta does make at least some money from Llama through revenue-sharing agreements, according to a newly unredacted court filing.

The filing, submitted by attorneys for the plaintiffs in the copyright lawsuit Kadrey v. Meta, in which Meta stands accused of training its Llama models on hundreds of terabytes of pirated ebooks, reveals that Meta “shares a percentage of the revenue” that companies hosting its Llama models generate from users of those models.

The filing doesn’t reveal which specific hosts pay Meta. But Meta lists a number of Llama host partners in various blog posts, including AWS, Nvidia, Databricks, Groq, Dell, Azure, Google Cloud, and Snowflake.

Developers aren’t required to use a Llama model through a host partner. The models can be downloaded, fine-tuned, and run on a range of different hardware. But many hosts provide additional services and tooling that make it simpler to get Llama models up and running.
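
As a rough illustration of the self-hosted path, here is a minimal Python sketch of downloading and prompting a Llama checkpoint with the Hugging Face transformers library. The specific model ID, dtype, and generation settings are illustrative assumptions, not anything prescribed in the article, and gated Llama repositories require accepting Meta's license on Hugging Face before the weights can be downloaded.

    # Minimal sketch: loading and prompting a Llama model without a host partner.
    # Assumes the transformers, accelerate, and torch packages are installed and that
    # access to the gated "meta-llama/Llama-3.1-8B-Instruct" repo has been granted.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "meta-llama/Llama-3.1-8B-Instruct"  # illustrative choice of checkpoint

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype=torch.bfloat16,  # smaller memory footprint on supported hardware
        device_map="auto",           # spread weights across available GPUs/CPU
    )

    prompt = "Summarize Meta's stated rationale for releasing Llama openly."
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=128)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))

Fine-tuning, scaling, and serving the model in production add considerably more machinery, which is largely what the host partners' services and tooling package up.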

Zuckerberg mentioned the possibility of licensing access to Llama models during an earnings call last April, when he also floated monetizing Llama in other ways, like through business messaging services and ads in “AI interactions.” But he didn’t outline specifics.

More recently, Zuckerberg asserted that most of the value Meta derives from Llama comes in the form of improvements to the models from the AI research community. Meta uses Llama models to power a number of products across its platforms and properties, including Meta’s AI assistant, Meta AI.

“I think it’s good business for us to do this in an open way,” Zuckerberg said during Meta’s Q3 2024 earnings call. “[I]t makes our products better rather than if we were just on an island building a model that no one was kind of standardizing around in the industry.”

The fact that Meta may generate revenue in a rather direct way from Llama is significant because plaintiffs in Kadrey v. Meta claim that Meta not only used pirated works to develop Llama, but facilitated infringement by “seeding,” or uploading, these works. Plaintiffs allege that Meta used surreptitious torrenting methods to obtain ebooks for training, and in the process — due to the way torrenting works — shared the ebooks with other torrenters.

Meta plans to significantly up its capital expenditures this year, largely thanks to its increasing investments in AI. In January, the company said it would spend $60 billion-$80 billion on CapEx in 2025 — roughly double Meta’s CapEx in 2024 — primarily on data centers and growing the company’s AI development teams.

Likely to offset a portion of the costs, Meta is reportedly considering launching a subscription service for Meta AI that’ll add unspecified capabilities to the assistant.

Meta didn’t immediately respond to a request for comment. We’ll update this piece if we hear back.
