Meta’s Aria Gen 2 smart glasses track biometrics to train AI robots

Meta has unveiled its experimental Aria Gen 2 smart glasses, research-grade wearables packed with advanced sensors that can track users’ gaze, movement, and heart rate to understand both their environment and emotional responses. The glasses are currently being used to train robots and develop AI systems that could eventually power consumer smart glasses, representing a significant step toward machines that can interpret the world like humans do.

What you should know: The Aria Gen 2 glasses are a major upgrade from Meta’s 2020 research glasses, featuring lighter weight, higher accuracy, and more natural appearance while packing sophisticated sensory equipment.

  • Four computer vision cameras provide an 80° field of view and can measure depth and distance, determining spatial relationships between objects.
  • Eye-tracking technology monitors where users look, when they blink, how their pupils change, and where they focus.
  • Additional sensors include an ambient light sensor with UV mode, a contact microphone for voice pickup in noisy environments, and a pulse detector in the nose pad for heart rate monitoring.

How it works: The glasses combine multiple data streams to create a comprehensive understanding of user behavior and emotional state.

  • Hand tracking measures joint movement, which could help train robots or interpret gestures.
  • The system can determine what users are looking at, how they’re holding objects, and whether their heart rate rises with an emotional reaction (a simplified sketch of this kind of fusion follows the list).
  • Meta offers the example of someone holding an egg who spots their “sworn enemy”: the AI might infer that they want to throw it and help them aim accurately.
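
To make the fusion idea concrete, here is a minimal, hypothetical Python sketch of how single gaze, hand, and heart-rate samples might be combined into a contextual summary. None of these names (GazeSample, infer_context, the thresholds) come from Meta; a real research pipeline would run learned models over continuous sensor streams rather than thresholding individual readings.

```python
from dataclasses import dataclass
from typing import Optional

# All types and thresholds below are illustrative assumptions,
# not Meta's actual APIs or models.

@dataclass
class GazeSample:
    target_object: str    # label of the object the wearer is fixating on
    fixation_ms: int      # how long the current fixation has lasted

@dataclass
class HandSample:
    held_object: Optional[str]  # label of the grasped object, if any
    grip_strength: float        # 0.0-1.0 estimate from joint tracking

@dataclass
class HeartRateSample:
    bpm: float            # e.g. from a nose-pad pulse detector

def infer_context(gaze: GazeSample,
                  hand: HandSample,
                  hr: HeartRateSample,
                  resting_bpm: float = 65.0) -> dict:
    """Fuse three sensor streams into a simple contextual summary."""
    aroused = hr.bpm > resting_bpm * 1.25   # crude arousal heuristic
    attending = gaze.fixation_ms > 300      # sustained fixation
    return {
        "looking_at": gaze.target_object if attending else None,
        "holding": hand.held_object,
        "elevated_heart_rate": aroused,
        # e.g. holding an egg, fixating on someone, heart rate spiking
        "emotionally_charged_interaction": (
            aroused and attending and hand.held_object is not None
        ),
    }

if __name__ == "__main__":
    summary = infer_context(
        GazeSample(target_object="person_A", fixation_ms=850),
        HandSample(held_object="egg", grip_strength=0.4),
        HeartRateSample(bpm=92.0),
    )
    print(summary)
```

The point of the sketch is the architecture, not the heuristics: each sensor contributes one weak signal, and only their combination supports an inference about intent or emotional state.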

The bigger picture: These research tools are designed to bridge human interaction with the physical world and machine learning, teaching robots to look, listen, and interpret environments the way humans do.

  • Project Aria’s research-level tools are used by developers working on computer vision, robotics, and contextual AI applications.
  • The goal is to help machines navigate, contextualize, and interact with the world more effectively.
  • Meta’s vision extends far beyond simple message checking to creating AI systems that can remember where users left their keys and send robots to retrieve them.

What’s next: The Aria Gen 2 glasses are not available for consumer purchase and remain research-only tools for now.

  • Researchers must apply for access, with Meta expected to begin accepting applications later this year.
  • While Meta hasn’t confirmed consumer availability, the company suggests it’s “probably only a matter of time” before some version reaches the general public.
  • The technology could eventually enable powerful AI assistants that sit on users’ faces, providing seamless integration between human needs and robotic assistance.