Nvidia’s new AI platform aims to simplify sign language learning

Artificial intelligence tools are increasingly being deployed to help people with disabilities navigate everyday life, with major tech companies developing new assistive technologies. Nvidia’s latest contribution to this space is Signs, an AI-powered platform designed to teach American Sign Language (ASL), developed in partnership with the American Society for Deaf Children.

Project Overview: Signs is a free AI-powered language learning platform that uses a 3D avatar to teach American Sign Language, the third most prevalent language in the United States.

  • The platform incorporates real-time video feedback to assess users’ signing accuracy as they practice
  • At launch, Signs features 100 distinct signs, with plans to expand to 1,000 signs
  • ASL signers can contribute videos of additional signs to help grow the platform’s vocabulary

Technical Features and Development Plans: Nvidia is actively working to enhance the platform’s capabilities and make it more comprehensive for users.

  • The development team is exploring ways to incorporate non-manual signals crucial to ASL, including facial expressions and head movements
  • Future updates will aim to include slang and regional variations in the language
  • The platform’s collected data will be made publicly available for other developers to build upon

Industry Context: This launch represents Nvidia’s strategic expansion beyond its core business of manufacturing AI chips.

  • Nvidia has achieved a market valuation exceeding $3.4 trillion, with its stock rising over 100% in the past year
  • The company is diversifying into AI software platforms and models while maintaining its position as a leading chip supplier
  • Some investors have expressed concerns about tech companies’ substantial investments in AI infrastructure and the timeline for returns

Broader Impact: Signs addresses a crucial communication gap in families with deaf children, while also contributing to the growing ecosystem of AI-powered assistive technologies.

  • Meta, Google, and OpenAI have developed AI features for blind or low-vision users
  • Apple has implemented AI-enabled eye tracking for physically disabled iPhone users
  • The platform enables hearing parents to begin communicating with deaf children as young as 6-8 months old

Future Applications: The data gathered through Signs could enable broader technological advances in accessibility technology.

  • Potential applications include improved sign recognition in video-conferencing software
  • The technology could be adapted for gesture control in vehicles
  • The open data repository could foster innovation in ASL-related technologies across the industry
