Meta smart glasses now offer live AI, language translations and Shazam

Smart glasses are rapidly evolving from simple camera-enabled eyewear into sophisticated AI-powered devices, with Meta leading the charge through significant feature updates to its Ray-Ban smart glasses.

Major feature rollout: Meta has introduced three new capabilities to its Ray-Ban smart glasses: live AI interactions, real-time language translations, and music recognition via Shazam.

  • Early Access Program members can now access live AI and translation features
  • Shazam integration is available to all users in the US and Canada
  • Users must ensure their glasses run v11 software and Meta View app v196 for compatibility

AI capabilities and functionality: The new live AI feature enables natural conversations with Meta’s AI assistant while it observes the user’s environment in real-time.

  • Users can interact with the AI for approximately 30 minutes on a full battery charge
  • The AI can provide contextual suggestions based on visual input, such as recipe recommendations while grocery shopping
  • Live translation supports conversations between English and Spanish, French, or Italian
  • Users can choose between audio translations through the glasses or text transcripts on their phones

Technical implementation: The translation feature requires some setup before it can be used in conversation.

  • Language pairs must be downloaded in advance
  • Users need to specify both their language and their conversation partner’s language
  • Shazam integration works through voice commands to Meta AI for song identification

Industry context: The smart glasses market is experiencing increased competition and innovation in AI integration.

  • Google recently announced Android XR, a new operating system specifically designed for smart glasses
  • Google positions its Gemini AI assistant as a key differentiating feature
  • Meta’s CTO Andrew Bosworth suggests 2024 marked a turning point for AI glasses adoption
  • Industry leaders view smart glasses as potentially the first truly AI-native device category

Future implications: The integration of AI capabilities into smart glasses represents a significant shift in how we might interact with artificial intelligence in daily life, though questions remain about battery life limitations and practical everyday utility.

