DeepMind’s Gemma Scope Helps Demystify How LLMs Work

DeepMind introduces Gemma Scope, a new toolset for understanding the inner workings of large language models and addressing interpretability challenges, with the potential to enable more robust and transparent AI systems.

Interpreting LLM activations is crucial but challenging: Understanding the decision-making process of large language models (LLMs) is essential for their safe and transparent deployment in critical applications. However, interpreting the billions of neuron activations generated during LLM inference is a major challenge.

  • LLMs process inputs through a complex network of artificial neurons, and the values these neurons emit, known as “activations,” guide the model’s response and represent its understanding of the input (a sketch of capturing such activations follows this list).
  • A single concept can trigger millions of activations across different LLM layers, and a single neuron can fire for many different concepts, making interpretation difficult.
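
To make this concrete, here is a minimal sketch of capturing a layer's activations in PyTorch with a forward hook. The module and dimensions are toy stand-ins, not Gemma 2's actual architecture.

```python
import torch
import torch.nn as nn

captured = {}

def save_activations(module, inputs, output):
    # Stash the layer's output activations for offline analysis.
    captured["acts"] = output.detach()

# Toy stand-in for one transformer sublayer.
layer = nn.Linear(16, 16)
handle = layer.register_forward_hook(save_activations)

x = torch.randn(1, 4, 16)      # (batch, tokens, hidden_dim)
_ = layer(x)                   # the forward pass triggers the hook
handle.remove()

print(captured["acts"].shape)  # torch.Size([1, 4, 16])
```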

Sparse autoencoders (SAEs) help interpret LLM activations: SAEs are models that can compress the dense activations of LLMs into a more interpretable form, making it easier to understand which input features activate different parts of the model.

  • SAEs are trained on the activations of a layer in a deep learning model, learning to represent each activation with a small number of active features drawn from a larger learned dictionary, and then to reconstruct the original activation from those features (see the sketch after this list).
  • Previous research on SAEs mostly focused on studying tiny language models or a single layer in larger models, limiting their effectiveness in providing a comprehensive understanding of LLM decision-making.
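
As a rough illustration of that encode-and-reconstruct loop, here is a minimal sparse autoencoder in PyTorch. The ReLU encoder, L1 sparsity penalty, and the sizes shown are common choices in the SAE literature, not a description of Gemma Scope's exact training recipe.

```python
import torch
import torch.nn as nn

class SparseAutoencoder(nn.Module):
    def __init__(self, d_model: int, d_dict: int):
        super().__init__()
        # Encoder maps a dense activation vector to a wider, mostly-zero code.
        self.encoder = nn.Linear(d_model, d_dict)
        # Decoder reconstructs the original activation from the sparse code.
        self.decoder = nn.Linear(d_dict, d_model)

    def forward(self, acts: torch.Tensor):
        features = torch.relu(self.encoder(acts))
        recon = self.decoder(features)
        return recon, features

# Illustrative sizes: Gemma 2 2B's residual stream is 2304-dimensional,
# and the dictionary is deliberately much wider than the input.
sae = SparseAutoencoder(d_model=2304, d_dict=16384)
acts = torch.randn(8, 2304)    # a batch of captured activations
recon, features = sae(acts)

# Reconstruction error plus an L1 penalty that drives most features to zero.
loss = ((recon - acts) ** 2).mean() + 1e-3 * features.abs().mean()
loss.backward()
```

Training minimizes reconstruction error while the L1 term keeps only a handful of features active per input, which is what makes the learned features inspectable.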

Gemma Scope takes a comprehensive approach to LLM interpretability: DeepMind’s Gemma Scope provides SAEs for every layer and sublayer of its Gemma 2 2B and 9B models, enabling researchers to study how different features evolve and interact across the entire LLM.

  • Gemma Scope comprises more than 400 SAEs, collectively representing over 30 million learned features from the Gemma 2 models.
  • The toolset uses DeepMind’s new JumpReLU SAE architecture, which lets the SAE learn a separate activation threshold for each feature, making it easier to detect features and estimate their strength while improving reconstruction fidelity at a given level of sparsity.
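
Here is a hedged sketch of the JumpReLU idea as described in DeepMind's JumpReLU SAE paper: a pre-activation passes through unchanged only if it clears its feature's learned threshold and is zeroed otherwise. The straight-through gradient estimation needed to actually train the thresholds is omitted.

```python
import torch
import torch.nn as nn

class JumpReLU(nn.Module):
    """JumpReLU(z) = z * H(z - theta), with a learned threshold per feature."""

    def __init__(self, d_dict: int):
        super().__init__()
        # One learnable (log-)threshold per dictionary feature.
        self.log_threshold = nn.Parameter(torch.zeros(d_dict))

    def forward(self, pre_acts: torch.Tensor) -> torch.Tensor:
        threshold = self.log_threshold.exp()
        # Keep the raw value where it exceeds its threshold, zero elsewhere;
        # unlike ReLU, weak positive values are cut to exactly zero.
        return pre_acts * (pre_acts > threshold).to(pre_acts.dtype)
```

Compared with a plain ReLU, the per-feature threshold suppresses weak, noisy activations without shrinking the strong ones, which is how the architecture improves fidelity at a fixed sparsity level.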

Broader implications for AI transparency and robustness: By making Gemma Scope publicly available on Hugging Face, DeepMind is encouraging researchers to further explore and develop techniques for understanding the inner workings of LLMs, which could lead to more transparent and robust AI systems (a sketch of downloading the released weights follows the list below).

  • Improved interpretability of LLMs is crucial for their safe deployment in critical applications that have a low tolerance for mistakes and require transparency.
  • Tools like Gemma Scope can help researchers gain insights into how LLMs process information and make decisions, enabling the development of more reliable and explainable AI systems.
  • As LLMs continue to advance and find applications in various domains, the ability to understand and interpret their decision-making processes will be essential for fostering trust and accountability in AI-driven systems.
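
As a starting point, the released SAE parameters can be fetched with the huggingface_hub client. The repository ID and file path below follow the naming used for the Gemma Scope release, but treat them as assumptions and confirm the exact paths on the Hugging Face model page.

```python
import numpy as np
from huggingface_hub import hf_hub_download

# Assumed repo ID and file layout for the Gemma Scope release; verify
# the exact names on the Hugging Face page before relying on them.
path = hf_hub_download(
    repo_id="google/gemma-scope-2b-pt-res",
    filename="layer_20/width_16k/average_l0_71/params.npz",
)

params = np.load(path)
print(params.files)  # e.g. encoder/decoder weights and thresholds
```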
