Anthropic launches ‘Message Batches API’ to streamline large-scale data tasks

New Message Batches API revolutionizes large-scale data processing: Anthropic has introduced a powerful and cost-effective solution for processing high volumes of queries asynchronously, offering significant benefits for developers and businesses.

Key features and advantages: The Message Batches API allows developers to send up to 10,000 queries per batch, with processing completed within 24 hours at half the cost of standard API calls.

  • The API is currently available in public beta, supporting Claude 3.5 Sonnet, Claude 3 Opus, and Claude 3 Haiku on the Anthropic API.
  • Amazon Bedrock customers can utilize batch inference with Claude, while support for Google Cloud’s Vertex AI is forthcoming.
  • Batch requests have their own, higher rate limits and do not count against standard API rate limits, enabling greater overall throughput.
  • It provides scalability for big data tasks such as dataset analysis, classification, and extensive model evaluations.
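To make the request format concrete, here is a minimal sketch of assembling a batch payload. The per-request envelope (a `custom_id` plus the usual Messages API `params`) follows Anthropic's documentation at launch; the helper name and model string are illustrative, not part of the announcement.

```python
# Assemble a list of batch requests in the shape the Message Batches API
# expects: each entry pairs a caller-chosen custom_id with standard
# Messages API parameters. (Shape per Anthropic's docs; helper is ours.)

def build_batch(prompts, model="claude-3-5-sonnet-20240620", max_tokens=1024):
    """Wrap each prompt in a per-request envelope for batch submission."""
    if len(prompts) > 10_000:
        raise ValueError("A single batch is limited to 10,000 requests")
    return [
        {
            "custom_id": f"request-{i}",  # used to match results back to inputs
            "params": {
                "model": model,
                "max_tokens": max_tokens,
                "messages": [{"role": "user", "content": prompt}],
            },
        }
        for i, prompt in enumerate(prompts)
    ]

batch = build_batch(["Summarize document A", "Classify document B"])
```

The resulting list is what gets posted to the batches endpoint (in the Python SDK, via the beta `messages.batches.create(requests=...)` method); results arrive asynchronously, keyed by `custom_id`.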

Cost-effective solution for non-time-sensitive tasks: The Batches API addresses the need for efficient processing of large volumes of data where real-time responses are not critical.

  • Developers can submit groups of up to 10,000 queries, leaving the processing to Anthropic at a 50% discount.
  • This approach eliminates the need for complex queuing systems and concerns about rate limits.
  • The service makes large-scale data processing more economically viable, opening up new possibilities for tasks like analyzing entire corporate document repositories.
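For jobs larger than the 10,000-query ceiling, the natural pattern is to split the workload into multiple batches. A small illustrative helper (not from the announcement) under that assumption:

```python
# Split an arbitrarily large list of queries into batches of at most
# 10,000 items, the per-batch limit of the Message Batches API.

def chunk_queries(queries, batch_size=10_000):
    """Yield successive slices of at most batch_size queries."""
    for start in range(0, len(queries), batch_size):
        yield queries[start:start + batch_size]

jobs = [f"doc-{i}" for i in range(25_000)]
batches = list(chunk_queries(jobs))
# 25,000 queries -> three batches: 10,000 + 10,000 + 5,000
```

Each chunk would then be submitted as its own batch, with Anthropic handling the queuing that would otherwise require custom infrastructure.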

Pricing structure: The Batches API offers a 50% discount on both input and output tokens compared to standard API calls.

  • Claude 3.5 Sonnet: Batch Input $1.50 / MTok, Batch Output $7.50 / MTok
  • Claude 3 Opus: Batch Input $7.50 / MTok, Batch Output $37.50 / MTok
  • Claude 3 Haiku: Batch Input $0.125 / MTok, Batch Output $0.625 / MTok
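The savings are easy to work out from the table above. A short Python sketch using the listed batch rates (standard rates are derived from the stated 50% discount, i.e. twice the batch rate):

```python
# Batch prices in USD per million tokens (input, output), from the table above.
BATCH_RATES = {
    "claude-3-5-sonnet": (1.50, 7.50),
    "claude-3-opus": (7.50, 37.50),
    "claude-3-haiku": (0.125, 0.625),
}

def batch_cost(model, input_mtok, output_mtok):
    """Cost in USD for a batch job; token counts given in millions."""
    in_rate, out_rate = BATCH_RATES[model]
    return input_mtok * in_rate + output_mtok * out_rate

# Example: 10 MTok input / 2 MTok output on Claude 3.5 Sonnet.
cost = batch_cost("claude-3-5-sonnet", 10, 2)  # 10*1.50 + 2*7.50 = 30.0
standard_cost = 2 * cost  # 50% discount means the standard API costs twice as much
```

So a job that would cost $60 at standard rates runs for $30 through the Batches API.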

Real-world application: Quora, a popular question-and-answer platform, has already implemented the Batches API to enhance its services.

  • The company uses the API for summarization and highlight extraction to create new end-user features.
  • Andy Edmonds, Product Manager at Quora, praised the API for its cost savings and reduced complexity in processing large numbers of non-real-time queries.
  • The convenience of submitting batches and downloading results within 24 hours allows Quora’s engineers to focus on more challenging problems.

Availability and implementation: Developers interested in utilizing the Batches API can access it through Anthropic’s documentation and pricing page.

  • The service is currently in public beta on the Anthropic API.
  • Detailed information and guidelines for implementation are available in the official documentation.

Potential impact on AI-driven data processing: The introduction of the Message Batches API could significantly alter the landscape of large-scale data analysis and AI-powered applications.

  • By making it more cost-effective and efficient to process vast amounts of data, the API may enable new use cases and innovations across various industries.
  • The reduced complexity and infrastructure concerns could lower barriers to entry for smaller companies and researchers looking to leverage AI for data-intensive tasks.
  • As more businesses adopt this technology, we may see an acceleration in AI-driven insights and decision-making processes across sectors.
