AI21 Debuts Jamba 1.5 With An Eye on Agentic AI

AI21 launches Jamba 1.5: AI21 has unveiled new versions of its Jamba model, combining transformer and structured state space model (SSM) approaches to enhance AI capabilities.

  • The Jamba 1.5 series includes Mini and Large versions, building upon the innovations introduced in Jamba 1.0, released in March.
  • Jamba pairs transformer attention with an SSM approach known as Mamba, aiming to leverage the strengths of both architectures for improved performance and accuracy.
  • The name Jamba is an acronym for Joint Attention and Mamba architecture, reflecting its hybrid nature; a conceptual sketch of the interleaving follows this list.
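
To make the hybrid idea concrete, the following is a conceptual sketch, in PyTorch, of a decoder stack that interleaves attention layers with a simplified SSM-style mixer. It is not AI21's implementation: the layer ratio, dimensions, and the toy recurrence are assumptions made purely for illustration.

```python
# Conceptual sketch only (not AI21's implementation): a decoder stack that
# interleaves attention layers with a simplified SSM-style mixer, illustrating
# the hybrid idea behind Jamba. Layer ratio, sizes, and the toy recurrence are
# assumptions for illustration.
import torch
import torch.nn as nn

class ToySSMBlock(nn.Module):
    """Simplified state-space mixer: a gated, per-channel linear recurrence."""
    def __init__(self, dim: int):
        super().__init__()
        self.in_proj = nn.Linear(dim, dim)
        self.gate = nn.Linear(dim, dim)
        self.out_proj = nn.Linear(dim, dim)
        self.decay = nn.Parameter(torch.full((dim,), 0.9))  # diagonal state transition

    def forward(self, x):                        # x: (batch, seq, dim)
        u = self.in_proj(x)
        h = torch.zeros_like(u[:, 0])
        states = []
        for t in range(u.size(1)):               # sequential scan over time steps
            h = self.decay * h + u[:, t]
            states.append(h)
        s = torch.stack(states, dim=1)
        return self.out_proj(s * torch.sigmoid(self.gate(x)))

class HybridLayer(nn.Module):
    """One decoder layer: attention or SSM token mixing, followed by an MLP."""
    def __init__(self, dim: int, use_attention: bool):
        super().__init__()
        self.use_attention = use_attention
        self.norm1, self.norm2 = nn.LayerNorm(dim), nn.LayerNorm(dim)
        self.mixer = (nn.MultiheadAttention(dim, num_heads=4, batch_first=True)
                      if use_attention else ToySSMBlock(dim))
        self.mlp = nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(),
                                 nn.Linear(4 * dim, dim))

    def forward(self, x):
        h = self.norm1(x)
        mixed = (self.mixer(h, h, h, need_weights=False)[0]
                 if self.use_attention else self.mixer(h))
        x = x + mixed
        return x + self.mlp(self.norm2(x))

# Mostly SSM layers with a periodic attention layer (the exact ratio is assumed).
dim, n_layers, attention_every = 64, 8, 4
stack = nn.Sequential(*[HybridLayer(dim, use_attention=(i % attention_every == 0))
                        for i in range(n_layers)])
tokens = torch.randn(2, 16, dim)                 # (batch, seq, dim) dummy activations
print(stack(tokens).shape)                       # torch.Size([2, 16, 64])
```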

Key features and enhancements: Jamba 1.5 introduces several new capabilities designed to facilitate the development of agentic AI systems.

  • Function calling, JSON mode, structured document objects, and citation mode have been added to both the Jamba 1.5 Mini and Large models (a request sketch follows this list).
  • Both models feature a 256K-token context window and use a Mixture-of-Experts (MoE) architecture.
  • Jamba 1.5 Mini has 52 billion total and 12 billion active parameters, while Jamba 1.5 Large has 398 billion total and 94 billion active parameters.
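
As a rough illustration of how such features are typically exposed, the sketch below assumes an HTTP chat-completions-style endpoint; the URL path, parameter names (response_format, tools), and response shape are assumptions modeled on common conventions rather than AI21's documented schema.

```python
# Illustrative sketch only: the endpoint path, parameter names, and response
# shape are assumptions modeled on common chat-completion APIs, not AI21's
# confirmed schema.
import os
import requests

API_URL = "https://api.ai21.com/studio/v1/chat/completions"   # assumed path
headers = {"Authorization": f"Bearer {os.environ['AI21_API_KEY']}"}

payload = {
    "model": "jamba-1.5-mini",
    "messages": [
        {"role": "user",
         "content": "Extract the company and product from: 'AI21 debuts Jamba 1.5.'"},
    ],
    # JSON mode: constrain the reply to valid JSON for downstream parsing.
    "response_format": {"type": "json_object"},
    # Function calling: describe a tool the model may choose to invoke.
    "tools": [{
        "type": "function",
        "function": {
            "name": "record_product_launch",    # hypothetical tool name
            "parameters": {
                "type": "object",
                "properties": {
                    "company": {"type": "string"},
                    "product": {"type": "string"},
                },
                "required": ["company", "product"],
            },
        },
    }],
}

response = requests.post(API_URL, headers=headers, json=payload, timeout=30)
print(response.json())
```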

Availability and partnerships: AI21 has made Jamba 1.5 models accessible through various channels and collaborations.

  • Both Jamba 1.5 models are available under an open license, with AI21 offering commercial support and services.
  • AI21 has established partnerships with major cloud providers and tech companies, including AWS, Google Cloud, Microsoft Azure, Snowflake, Databricks, and Nvidia.

Advancing agentic AI development: The new features in Jamba 1.5 are particularly significant for developers working on agentic AI systems.

  • JSON mode enables structured data handling, facilitating the creation of complex AI systems with structured input/output relationships.
  • The citation feature, working in conjunction with the new document API, allows the model to attribute generated content to the relevant input documents (see the sketch after this list).
  • These additions aim to support more sophisticated AI workflows that go beyond simple language model applications.
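
A hedged sketch of the document-plus-citation flow: structured document objects are sent alongside the question so the model can ground and attribute its answer. The documents field and the citation structure noted in the final comment are illustrative assumptions, not the confirmed API.

```python
# Illustrative sketch: the "documents" field and the citation structure noted
# in the final comment are assumptions for illustration, not the confirmed schema.
import os
import requests

API_URL = "https://api.ai21.com/studio/v1/chat/completions"   # assumed path
headers = {"Authorization": f"Bearer {os.environ['AI21_API_KEY']}"}

payload = {
    "model": "jamba-1.5-large",
    "messages": [{"role": "user",
                  "content": "What did AI21 release in March, and what sizes does Jamba 1.5 come in?"}],
    # Structured document objects the model can draw on and cite.
    "documents": [
        {"id": "jamba-1.0-launch",
         "content": "AI21 released Jamba 1.0 in March, combining transformer "
                    "and Mamba (SSM) layers in a single model."},
        {"id": "jamba-1.5-launch",
         "content": "Jamba 1.5 ships in Mini and Large sizes with a 256K-token "
                    "context window and Mixture-of-Experts routing."},
    ],
}

reply = requests.post(API_URL, headers=headers, json=payload, timeout=30).json()
# A citation-aware reply would tie spans of the answer back to document ids,
# e.g. {"text": "...", "citations": [{"document_id": "jamba-1.0-launch"}]}.
print(reply)
```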

Citation mode vs. RAG: Jamba 1.5’s citation mode offers a more integrated approach compared to traditional Retrieval Augmented Generation (RAG) techniques.

  • Unlike RAG, which typically connects language models to external vector databases, Jamba 1.5’s citation mode is tightly integrated with the model itself (the contrast is sketched after this list).
  • The model is trained to retrieve, incorporate, and explicitly cite relevant information sources, providing greater transparency and traceability in its outputs.
  • AI21 also offers a separate end-to-end RAG solution as a managed service for those who prefer traditional RAG workflows.
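
Roughly, the difference in call patterns looks like the sketch below, where the retriever, llm, and jamba objects are hypothetical stand-ins: in traditional RAG the application owns retrieval and prompt assembly, while the integrated approach hands structured documents straight to the model.

```python
# Rough contrast in call patterns; "retriever", "llm", and "jamba" are
# hypothetical stand-ins, and the "documents" parameter follows the assumed
# interface sketched earlier.

def traditional_rag(question: str, retriever, llm) -> str:
    """Application code owns retrieval: query an external vector store, paste
    the hits into the prompt, and reconstruct any citations afterwards."""
    hits = retriever.search(question, top_k=5)        # external vector DB lookup
    context = "\n\n".join(hit.text for hit in hits)
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
    return llm.complete(prompt)

def integrated_citation(question: str, documents: list[dict], jamba) -> dict:
    """The model receives structured documents directly and is trained to
    retrieve from, incorporate, and explicitly cite them in its output."""
    return jamba.chat(
        messages=[{"role": "user", "content": question}],
        documents=documents,                          # assumed parameter
    )
```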

Future developments: AI21 plans to continue advancing its models and focusing on enabling agentic AI systems.

  • The company aims to push the boundaries of agentic AI, particularly in areas of planning and execution.
  • Ongoing efforts will be directed towards serving customer needs and expanding the capabilities of AI systems.

Implications for AI development: The release of Jamba 1.5 represents a significant step forward in hybrid AI architectures and agentic AI capabilities.

  • By combining transformer and SSM approaches, AI21 is exploring new avenues for improving AI performance and versatility.
  • The focus on agentic AI and structured data handling could lead to more sophisticated and transparent AI systems, potentially addressing some of the current limitations in AI applications.
  • As the field continues to evolve, innovations like Jamba 1.5 may play a crucial role in shaping the future of AI architecture and capabilities.