Oops: Apple’s AI transcription blunder misinterprets ‘racist’ as ‘Trump’

Two sentences for context: Users have discovered that Apple’s speech-to-text Dictation service was incorrectly transcribing the word “racist” as “Trump” on iPhones. The glitch surfaced amid broader scrutiny of the accuracy and reliability of AI-driven speech recognition.

The technical issue: Apple acknowledged a problem with its speech recognition model and announced it was rolling out a fix to address the transcription error.

  • Users reported that when speaking the word “racist” into their iPhones, the Dictation tool would sometimes transcribe it as “Trump” before correcting itself
  • The BBC was unable to replicate the error, suggesting Apple’s fix was already being implemented
  • Apple initially attributed the issue to difficulties distinguishing between words containing the letter “r”

Expert analysis: Speech recognition expert Professor Peter Bell from the University of Edinburgh challenged Apple’s explanation, suggesting deliberate manipulation rather than a technical glitch.

  • Bell explained that the words “racist” and “Trump” are not phonetically similar enough to cause AI confusion (see the rough phoneme comparison after this list)
  • Speech-to-text models are trained on hundreds of thousands of hours of speech data and learn to understand words in context
  • A former Apple Siri employee told the New York Times the issue “smells like a serious prank”
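
To make Bell’s point concrete, the two words can be compared at the phoneme level. The snippet below is only an illustration, not anything from Apple’s pipeline: it assumes hand-transcribed, CMU-dictionary-style ARPAbet pronunciations for the two words and counts how many phonemes would have to change using a plain Levenshtein edit distance.

```python
# Illustrative only: this is not Apple's model. It compares hand-transcribed
# ARPAbet phoneme sequences (assumed CMU-dictionary pronunciations) with a
# plain Levenshtein edit distance to show how little the two words share.

def edit_distance(a, b):
    """Levenshtein distance between two sequences, via a full DP table."""
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i          # delete all of a[:i]
    for j in range(n + 1):
        dp[0][j] = j          # insert all of b[:j]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,         # deletion
                           dp[i][j - 1] + 1,         # insertion
                           dp[i - 1][j - 1] + cost)  # substitution / match
    return dp[m][n]

racist = ["R", "EY", "S", "IH", "S", "T"]   # "racist"
trump  = ["T", "R", "AH", "M", "P"]         # "Trump"

d = edit_distance(racist, trump)
print(f"phoneme edit distance: {d} (word lengths: {len(racist)} and {len(trump)})")
# -> phoneme edit distance: 6 (word lengths: 6 and 5)
```

With those pronunciations, essentially every phoneme has to change for one word to become the other, which is why Bell argues that a purely acoustic mix-up is an unconvincing explanation.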

Recent AI challenges: This incident follows other AI-related setbacks for Apple in recent months.

  • Apple recently suspended its AI news summary feature after generating false notifications, including an incorrect story about Rafael Nadal
  • The company has announced a $500 billion investment in U.S. operations, including a Texas facility that will manufacture servers for Apple Intelligence
  • CEO Tim Cook has indicated potential changes to diversity, equity and inclusion policies in response to political pressure

Looking beyond the surface: The transcription error raises questions about the vulnerability of AI systems to manipulation and the challenges of maintaining accuracy in speech recognition technology, particularly as these tools become increasingly integrated into daily life. The incident also highlights the intersection of technology, politics, and corporate responsibility in the AI era.
