
AI21 launches Jamba 1.5: AI21 has unveiled new versions of its Jamba model, combining transformer and structured state space model (SSM) approaches to enhance AI capabilities.

  • The Jamba 1.5 series includes mini and large versions, building upon the innovations introduced in Jamba 1.0 released in March.
  • Jamba utilizes an SSM approach known as Mamba, aiming to leverage the strengths of both transformers and SSMs for improved performance and accuracy.
  • The name Jamba is an acronym for Joint Attention and Mamba architecture, reflecting its hybrid nature.

Key features and enhancements: Jamba 1.5 introduces several new capabilities designed to facilitate the development of agentic AI systems.

  • Function calling, JSON mode, structured document objects, and citation mode have been added to both the Jamba 1.5 mini and large models (see the request sketch after this list).
  • Both models feature a large 256K-token context window and use a Mixture-of-Experts (MoE) architecture.
  • Jamba 1.5 mini boasts 52 billion total and 12 billion active parameters, while Jamba 1.5 large has 398 billion total and 94 billion active parameters.
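
To make these feature names concrete, here is a minimal request sketch showing JSON mode alongside a declared function; the endpoint path, model identifier, and payload fields are assumptions modeled on common chat-completion APIs, not AI21's documented contract.

```python
import os
import requests

# Hypothetical endpoint and payload shape for illustration only; consult
# AI21's documentation for the actual Jamba 1.5 API contract.
API_URL = "https://api.ai21.com/studio/v1/chat/completions"  # assumed path
headers = {"Authorization": f"Bearer {os.environ['AI21_API_KEY']}"}

payload = {
    "model": "jamba-1.5-mini",  # assumed model identifier
    "messages": [
        {"role": "user", "content": "Summarize today's AI news as JSON."}
    ],
    # JSON mode: ask the model to emit well-formed JSON only.
    "response_format": {"type": "json_object"},
    # Function calling: declare a tool the model may choose to invoke.
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_headlines",
                "description": "Fetch the latest AI headlines",
                "parameters": {
                    "type": "object",
                    "properties": {"topic": {"type": "string"}},
                    "required": ["topic"],
                },
            },
        }
    ],
}

response = requests.post(API_URL, headers=headers, json=payload, timeout=60)
response.raise_for_status()
print(response.json())
```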

Availability and partnerships: AI21 has made Jamba 1.5 models accessible through various channels and collaborations.

  • Both Jamba 1.5 models are available under an open license, with AI21 offering commercial support and services.
  • AI21 has established partnerships with major cloud providers and tech companies, including AWS, Google Cloud, Microsoft Azure, Snowflake, Databricks, and Nvidia.

Advancing agentic AI development: The new features in Jamba 1.5 are particularly significant for developers working on agentic AI systems.

  • JSON mode enables structured data handling, facilitating the creation of complex AI systems with structured input/output relationships.
  • The citation feature, working in conjunction with the new document API, allows the model to attribute generated content to relevant input documents (illustrated in the sketch after this list).
  • These additions aim to support more sophisticated AI workflows that go beyond simple language model applications.
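
As a rough illustration of how document objects and citations could fit together, the sketch below passes source documents alongside the prompt so the answer can be grounded in, and attributed to, specific inputs; the "documents" field, model name, and endpoint are assumptions rather than the published schema.

```python
import os
import requests

# Illustrative only: the "documents" field and the citation behavior described
# in the comments are assumptions about how a document API plus citation mode
# could be exercised; the real Jamba 1.5 request/response schema may differ.
API_URL = "https://api.ai21.com/studio/v1/chat/completions"  # assumed path
headers = {"Authorization": f"Bearer {os.environ['AI21_API_KEY']}"}

payload = {
    "model": "jamba-1.5-large",  # assumed model identifier
    "messages": [
        {"role": "user", "content": "What did Q2 revenue look like?"}
    ],
    # Structured document objects passed alongside the prompt so the model
    # can ground its answer in (and cite) specific sources.
    "documents": [
        {"id": "doc-1", "content": "Q2 revenue grew 18% year over year..."},
        {"id": "doc-2", "content": "Operating margin held at 21% in Q2..."},
    ],
}

response = requests.post(API_URL, headers=headers, json=payload, timeout=60)
response.raise_for_status()
answer = response.json()
# With citation mode, each generated claim would be traceable back to a
# document id such as "doc-1" rather than returned as unattributed text.
print(answer)
```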

Citation mode vs. RAG: Jamba 1.5’s citation mode offers a more integrated approach compared to traditional Retrieval Augmented Generation (RAG) techniques.

  • Unlike RAG, which typically connects language models to external vector databases, Jamba 1.5’s citation mode is tightly integrated with the model itself (a bare-bones external-retrieval sketch follows this list for contrast).
  • The model is trained to retrieve, incorporate, and explicitly cite relevant information sources, providing greater transparency and traceability in its outputs.
  • AI21 also offers a separate end-to-end RAG solution as a managed service for those who prefer traditional RAG workflows.
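
For contrast, here is a deliberately minimal sketch of the external-retrieval pattern the article describes as traditional RAG: retrieval happens outside the model and the retrieved text is pasted into the prompt. The keyword-overlap lookup stands in for a vector-database query and is purely illustrative.

```python
# Naive keyword-overlap retrieval standing in for a vector-database lookup.
corpus = {
    "doc-1": "Q2 revenue grew 18% year over year.",
    "doc-2": "Headcount increased by 5% in the second quarter.",
}

def retrieve(query: str) -> str:
    """Return the document sharing the most words with the query."""
    q_words = set(query.lower().split())
    return max(corpus.values(),
               key=lambda d: len(q_words & set(d.lower().split())))

query = "How fast did revenue grow?"
context = retrieve(query)
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {query}"
print(prompt)  # this assembled prompt would then be sent to any chat model
```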

Future developments: AI21 plans to continue advancing its models and focusing on enabling agentic AI systems.

  • The company aims to push the boundaries of agentic AI, particularly in areas of planning and execution.
  • Ongoing efforts will be directed towards serving customer needs and expanding the capabilities of AI systems.

Implications for AI development: The release of Jamba 1.5 represents a significant step forward in hybrid AI architectures and agentic AI capabilities.

  • By combining transformer and SSM approaches, AI21 is exploring new avenues for improving AI performance and versatility.
  • The focus on agentic AI and structured data handling could lead to more sophisticated and transparent AI systems, potentially addressing some of the current limitations in AI applications.
  • As the field continues to evolve, innovations like Jamba 1.5 may play a crucial role in shaping the future of AI architecture and capabilities.
