Nvidia launching AI platform to make learning sign language easier

Artificial intelligence tools are increasingly being deployed to help people with disabilities navigate everyday life, with major tech companies developing new assistive technologies. Nvidia’s latest contribution to this space is Signs, an AI-powered platform designed to teach American Sign Language (ASL), developed in partnership with the American Society for Deaf Children.
Project Overview: Signs is a free, AI-powered language-learning platform that uses a 3D avatar to teach American Sign Language, the third most prevalent language in the United States.
- The platform incorporates real-time video feedback to assess users’ signing accuracy as they practice (a rough sketch of this kind of feedback loop follows this list)
- At launch, Signs features 100 distinct signs, with plans to expand to 1,000 signs
- Fluent ASL signers can contribute videos of additional signs to help grow the platform’s vocabulary
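Nvidia has not published how Signs actually scores a learner’s attempt. Purely as a hedged illustration of the kind of real-time webcam feedback described above, the sketch below compares live hand landmarks (via the open-source MediaPipe library) against a stored reference pose; the reference file, threshold, and scoring rule are assumptions for the example, not Signs’ method.

```python
# Hypothetical sketch: webcam feedback on a static sign by comparing live
# MediaPipe hand landmarks to a prerecorded reference pose.
import json
import numpy as np
import cv2
import mediapipe as mp

def to_array(landmarks):
    """Convert MediaPipe landmark objects to a (21, 3) array."""
    return np.array([[p.x, p.y, p.z] for p in landmarks])

def normalize(pts):
    """Center on the wrist (landmark 0) and scale to unit size so the score
    tolerates hand position and distance from the camera."""
    pts = pts - pts[0]
    scale = np.linalg.norm(pts, axis=1).max() or 1.0
    return pts / scale

# Reference landmarks for one sign, captured ahead of time (hypothetical file).
reference = normalize(np.array(json.load(open("reference_sign.json"))))

hands = mp.solutions.hands.Hands(max_num_hands=1)
cap = cv2.VideoCapture(0)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if result.multi_hand_landmarks:
        live = normalize(to_array(result.multi_hand_landmarks[0].landmark))
        # Mean landmark distance as a crude accuracy score: lower is closer.
        error = np.linalg.norm(live - reference, axis=1).mean()
        label = "close!" if error < 0.15 else "keep adjusting"
        cv2.putText(frame, f"{label} ({error:.2f})", (10, 30),
                    cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 255, 0), 2)
    cv2.imshow("signing practice", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
```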
Technical Features and Development Plans: Nvidia is actively working to enhance the platform’s capabilities and make it more comprehensive for users.
- The development team is exploring ways to incorporate non-manual signals crucial to ASL, including facial expressions and head movements
- Future updates will aim to include slang and regional variations in the language
- The platform’s collected data will be made publicly available for other developers to build upon (a hedged example of how such a release might be used appears below)
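The format of the public dataset has not been announced. Strictly as a sketch of how developers might build on such a release, the snippet below assumes a hypothetical directory of JSON files, each pairing a sign label with a 21x3 hand-landmark array, and fits a simple nearest-neighbor classifier over the flattened landmarks.

```python
# Hypothetical sketch of training a basic sign classifier on an open corpus.
# The directory layout and record schema are assumptions, not Nvidia's format.
import json
from pathlib import Path
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

samples, labels = [], []
for path in Path("signs_dataset").glob("*.json"):   # hypothetical layout
    record = json.loads(path.read_text())
    samples.append(np.array(record["landmarks"]).ravel())  # 63-dim vector
    labels.append(record["label"])

X_train, X_test, y_train, y_test = train_test_split(
    np.array(samples), np.array(labels), test_size=0.2, random_state=0)

clf = KNeighborsClassifier(n_neighbors=3).fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```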
Industry Context: This launch represents Nvidia’s strategic expansion beyond its core business of manufacturing AI chips.
- Nvidia has achieved a market valuation exceeding $3.4 trillion, with its stock rising over 100% in the past year
- The company is diversifying into AI software platforms and models while maintaining its position as a leading chip supplier
- Some investors have expressed concerns about tech companies’ substantial investments in AI infrastructure and the timeline for returns
Broader Impact: Signs addresses a crucial communication gap in families with deaf children, while also contributing to the growing ecosystem of AI-powered assistive technologies.
- Meta, Google, and OpenAI have developed AI features for blind or low-vision users
- Apple has implemented AI-enabled eye tracking for physically disabled iPhone users
- The platform enables hearing parents to begin communicating with deaf children as young as 6-8 months old
Future Applications: The data gathered through Signs could enable broader advances in accessibility technology.
- Potential applications include improved sign recognition in video-conferencing software
- The technology could be adapted for gesture control in vehicles
- The open data repository could foster innovation in ASL-related technologies across the industry