Meta smart glasses now offer live AI, language translations and Shazam

Smart glasses are rapidly evolving from simple camera-enabled eyewear into sophisticated AI-powered devices, with Meta leading the charge through significant feature updates to its Ray-Ban smart glasses.

Major feature rollout: Meta has introduced three new capabilities to its Ray-Ban smart glasses: live AI interactions, real-time language translations, and music recognition via Shazam.

  • Early Access Program members can now access live AI and translation features
  • Shazam integration is available to all users in the US and Canada
  • Users must ensure their glasses run v11 software and Meta View app v196 for compatibility

AI capabilities and functionality: The new live AI feature enables natural conversations with Meta’s AI assistant while it observes the user’s surroundings in real time.

  • Users can interact with the AI for approximately 30 minutes on a full battery charge
  • The AI can provide contextual suggestions based on visual input, such as recipe recommendations while grocery shopping
  • Live translation supports conversations between English and Spanish, French, or Italian
  • Users can choose between audio translations through the glasses or text transcripts on their phones

Technical implementation: The translation system requires some setup before first use.

  • Language pairs must be downloaded in advance
  • Users need to specify both their language and their conversation partner’s language
  • Shazam integration works through voice commands to Meta AI for song identification

Industry context: The smart glasses market is experiencing increased competition and innovation in AI integration.

  • Google recently announced Android XR, a new operating system specifically designed for smart glasses
  • Google positions its Gemini AI assistant as a key differentiating feature
  • Meta’s CTO Andrew Bosworth suggests 2024 marked a turning point for AI glasses adoption
  • Industry leaders view smart glasses as potentially the first truly AI-native device category

Future implications: The integration of AI capabilities into smart glasses represents a significant shift in how we might interact with artificial intelligence in daily life, though questions remain about battery life limitations and practical everyday utility.

