MarkTechPost@AI — October 9, 2024
Anthropic AI Introduces the Message Batches API: A Powerful and Cost-Effective Way to Process Large Volumes of Queries Asynchronously


🚀 **High throughput:** Send and process large volumes of requests without running into rate limits.

💰 **Cost-effective:** API costs are reduced by 50% for batch operations.

📈 **Scalability:** Handle large-scale data tasks, from content moderation to data analysis, without worrying about infrastructure limits.

📦 **Batch processing:** Submit up to 10,000 requests per batch, with results typically ready within 24 hours.

⚠️ **Batch limits:** While Anthropic's Message Batches API offers impressive scalability, it has some limitations: a maximum batch size of 10,000 requests or 32 MB; processing time of up to 24 hours; batch expiry after 29 days; and rate limits that apply to API requests rather than to the number of requests within a batch.

🤖 **Supported models:** The Message Batches API currently works with several Claude models: Claude 3.5 Sonnet, Claude 3 Haiku, and Claude 3 Opus.

💡 **How it works:** Developers can send large sets of requests for asynchronous processing, which is ideal for tasks such as analyzing large datasets or content moderation. A batch is created from the requests you provide; each request is processed independently, but results become available only once all tasks have completed. The workflow suits tasks that don't need immediate results.

Anthropic AI recently launched a new Message Batches API, which is a useful solution for developers handling large datasets. It allows the submission of up to 10,000 queries at once, offering efficient, asynchronous processing. The API is designed for tasks where speed isn’t crucial, but handling bulk operations effectively matters. It’s especially helpful for non-urgent queries, with results processed within 24 hours and a 50% cost reduction compared to traditional API calls.

What is the Message Batches API?

Anthropic's Message Batches API is a service that allows developers to process large amounts of data asynchronously: tasks are queued and processed in bulk.

This makes it well suited to large-scale operations where real-time responses aren't necessary. Once a Message Batch is created, it begins processing immediately, and developers can use it to submit multiple Messages API requests at once.

Main Features and Benefits

Here's a breakdown of the key features that make the Anthropic Message Batches API stand out:

- **High throughput:** send and process large volumes of requests without hitting rate limits.
- **Cost savings:** batch operations cost 50% less than standard API calls.
- **Scalability:** handle large-scale data tasks, from content moderation to data analysis, without infrastructure concerns.
- **Batch size:** submit up to 10,000 requests per batch, with results typically ready within 24 hours.

Batch Limitations

While Anthropic's Message Batches API offers impressive scalability, it comes with some limitations:

- Maximum batch size: 10,000 requests or 32 MB.
- Processing time: up to 24 hours.
- Batches expire after 29 days.
- Rate limits apply to API requests, not to the number of requests within a batch.
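As a rough illustration, these limits can be checked client-side before submitting a batch. This is a hypothetical helper — `validate_batch`, `MAX_REQUESTS`, and `MAX_BYTES` are not part of Anthropic's SDK, and the server may measure the 32 MB payload slightly differently:

```python
import json

# Documented limits (illustrative constants, not SDK values)
MAX_REQUESTS = 10_000
MAX_BYTES = 32 * 1024 * 1024  # 32 MB

def validate_batch(requests):
    """Raise ValueError if the batch exceeds the documented limits;
    otherwise return the approximate serialized payload size in bytes."""
    if len(requests) > MAX_REQUESTS:
        raise ValueError(
            f"batch has {len(requests)} requests; limit is {MAX_REQUESTS}"
        )
    size = len(json.dumps({"requests": requests}).encode("utf-8"))
    if size > MAX_BYTES:
        raise ValueError(f"batch payload is {size} bytes; limit is {MAX_BYTES}")
    return size

# A small batch of identical requests easily passes both checks
reqs = [
    {
        "custom_id": f"req-{i}",
        "params": {
            "model": "claude-3-5-sonnet-20240620",
            "max_tokens": 1024,
            "messages": [{"role": "user", "content": "Hello"}],
        },
    }
    for i in range(100)
]
payload_size = validate_batch(reqs)
```

Validating locally like this avoids wasting an API call on a batch the service would reject outright.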

Supported Models

The Message Batches API currently works with several Claude models:

- Claude 3.5 Sonnet
- Claude 3 Haiku
- Claude 3 Opus

According to Anthropic, Amazon Bedrock customers can already access batch inference, and Google Cloud’s Vertex AI support is coming. Developers can batch requests for vision, system messages, multi-turn conversations, and more. Each request within a batch is handled independently, allowing flexibility in combining different types of operations.
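As a sketch of that flexibility, a single batch might combine a request that uses a system prompt with a multi-turn conversation. The `custom_id` values and prompt text below are illustrative, not from Anthropic's documentation:

```python
# Two differently shaped requests in one batch: one with a system
# prompt, one carrying a multi-turn conversation. The API processes
# each entry independently.
mixed_requests = [
    {
        "custom_id": "summarize-with-system-prompt",
        "params": {
            "model": "claude-3-5-sonnet-20240620",
            "max_tokens": 1024,
            "system": "You are a concise technical summarizer.",
            "messages": [{"role": "user", "content": "Summarize this report."}],
        },
    },
    {
        "custom_id": "multi-turn-followup",
        "params": {
            "model": "claude-3-5-sonnet-20240620",
            "max_tokens": 1024,
            "messages": [
                {"role": "user", "content": "What is batching?"},
                {"role": "assistant", "content": "Batching groups many requests into one job."},
                {"role": "user", "content": "Why is it cheaper?"},
            ],
        },
    },
]
```

Because each entry is self-contained, heterogeneous workloads can share one batch submission.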

How Does the Message Batches API Work?

When using Anthropic's API, developers can send large batches of requests to be processed asynchronously, which is ideal for tasks like analyzing massive datasets or conducting content moderation:

- A batch is created from the requests you provide.
- Each request is processed independently, but results become available only once every task in the batch has completed.
- The process suits tasks that don't require immediate results.

Here's Python code showing how to interact with Anthropic's Message Batches API by sending a batch of requests to one of its models, Claude 3.5 Sonnet.

```python
import anthropic

client = anthropic.Anthropic()

client.beta.messages.batches.create(
    requests=[
        {
            "custom_id": "my-first-request",
            "params": {
                "model": "claude-3-5-sonnet-20240620",
                "max_tokens": 1024,
                "messages": [
                    {"role": "user", "content": "Hello, world"}
                ]
            }
        },
        {
            "custom_id": "my-second-request",
            "params": {
                "model": "claude-3-5-sonnet-20240620",
                "max_tokens": 1024,
                "messages": [
                    {"role": "user", "content": "Hi again, friend"}
                ]
            }
        },
    ]
)
```

For cURL and JavaScript examples, see Anthropic's API reference.
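Because results become available only after the whole batch finishes, client code typically polls the batch's status. The sketch below keeps the polling loop as plain Python by injecting the status check as a callable; with the real SDK, that callable might wrap something like `client.beta.messages.batches.retrieve(batch_id).processing_status` — check Anthropic's API reference for the exact method and field names:

```python
import time

def wait_for_batch(fetch_status, poll_interval=60.0, timeout=24 * 3600):
    """Poll fetch_status() until it reports 'ended' or the timeout expires.

    fetch_status: zero-argument callable returning the batch's current
    processing status string (e.g. 'in_progress' or 'ended').
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = fetch_status()
        if status == "ended":
            return status
        time.sleep(poll_interval)  # avoid hammering the API
    raise TimeoutError("batch did not finish within the timeout")
```

The 24-hour default timeout mirrors the API's stated processing window; once the status is `ended`, individual results can be fetched and matched back to requests by their `custom_id`.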

Conclusion

Anthropic's Message Batches API is a game-changer for developers handling large-scale data operations. It provides an efficient, cost-effective way to process bulk requests and takes the stress out of managing big-data tasks. Whether you're analyzing large datasets or moderating content, the API simplifies bulk operations, giving you the flexibility and scale you need.




