Meta smart glasses now offer live AI, language translations and Shazam

Smart glasses are rapidly evolving from simple camera-enabled eyewear into sophisticated AI-powered devices, with Meta leading the charge through significant feature updates to its Ray-Ban smart glasses.

Major feature rollout: Meta has introduced three new capabilities to its Ray-Ban smart glasses: live AI interactions, real-time language translations, and music recognition via Shazam.

  • Early Access Program members can now access live AI and translation features
  • Shazam integration is available to all users in the US and Canada
  • Users must ensure their glasses run v11 software and Meta View app v196 for compatibility

AI capabilities and functionality: The new live AI feature enables natural conversations with Meta’s AI assistant while it observes the user’s environment in real time.

  • Users can interact with the AI for approximately 30 minutes on a full battery charge
  • The AI can provide contextual suggestions based on visual input, such as recipe recommendations while grocery shopping
  • Live translation supports conversations between English and Spanish, French, or Italian
  • Users can choose between audio translations through the glasses or text transcripts on their phones

Technical implementation: Live translation requires some setup before it will work (a hypothetical sketch of the flow follows this list).

  • Language pairs must be downloaded in advance
  • Users need to specify both their language and their conversation partner’s language
  • Shazam integration works through voice commands to Meta AI for song identification
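To make those setup steps concrete, here is a minimal, purely illustrative Python sketch of the flow the bullets describe: confirming the language pair was downloaded in advance, recording both speakers’ languages, and choosing between audio through the glasses or a phone transcript. The names TranslationSession and prepare_session are invented for this example and are not part of Meta’s software or any public API.

```python
# Hypothetical sketch of the live-translation setup flow described above.
# None of these class or function names come from Meta's software; they are
# invented purely to illustrate the configuration steps the article lists.

from dataclasses import dataclass

# Per the article, translation pairs English with Spanish, French, or Italian.
SUPPORTED_TARGETS = {"Spanish", "French", "Italian"}


@dataclass
class TranslationSession:
    my_language: str
    partner_language: str
    output_mode: str  # "audio" (through the glasses) or "transcript" (on the phone)

    def validate(self) -> None:
        # Check the English-plus-supported-language pairing and the output choice.
        languages = {self.my_language, self.partner_language}
        if "English" not in languages or not languages & SUPPORTED_TARGETS:
            raise ValueError("Supported pairs are English plus Spanish, French, or Italian")
        if self.output_mode not in {"audio", "transcript"}:
            raise ValueError("Output must be audio through the glasses or a phone transcript")


def prepare_session(my_language: str, partner_language: str, output_mode: str,
                    downloaded_pairs: set[frozenset[str]]) -> TranslationSession:
    """Require the language pair to be downloaded in advance, then configure the session."""
    pair = frozenset({my_language, partner_language})
    if pair not in downloaded_pairs:
        raise RuntimeError(f"Language pair {set(pair)} must be downloaded before starting")
    session = TranslationSession(my_language, partner_language, output_mode)
    session.validate()
    return session


if __name__ == "__main__":
    downloaded = {frozenset({"English", "Spanish"})}
    session = prepare_session("English", "Spanish", "audio", downloaded)
    print(f"Translating {session.my_language} <-> {session.partner_language} via {session.output_mode}")
```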

Industry context: The smart glasses market is experiencing increased competition and innovation in AI integration.

  • Google recently announced Android XR, a new operating system specifically designed for smart glasses
  • Google positions its Gemini AI assistant as a key differentiating feature
  • Meta’s CTO Andrew Bosworth suggests 2024 marked a turning point for AI glasses adoption
  • Industry leaders view smart glasses as potentially the first truly AI-native device category

Future implications: The integration of AI capabilities into smart glasses represents a significant shift in how we might interact with artificial intelligence in daily life, though questions remain about battery life limitations and practical everyday utility.

Source: Meta rolls out live AI, live translations, and Shazam to its smart glasses
