AI subtitles for the hard of hearing increase workload for humans while driving down wages

Human subtitlers who create captions for deaf and hard-of-hearing viewers are facing an existential threat as artificial intelligence increasingly automates their specialized craft. Despite industry assumptions that AI will streamline subtitle production, professional subtitlers report that current AI tools actually increase their workload while driving down wages to unsustainable levels.

What you should know: Subtitles for the deaf and hard of hearing (SDH) require far more nuanced decision-making than simple transcription, involving creative interpretation of sounds, emotions, and narrative context.

  • Max Deryagin, chair of Subtle, a non-profit association of freelance subtitlers, emphasizes that “SDH is an art, and people in the industry have no idea. They think it’s just a transcription.”
  • The deaf and hard-of-hearing community has diverse needs, with some viewers preferring song titles while others find them useless, and some wanting emotional cues while others consider them intrusive.

The AI problem: Current artificial intelligence tools are creating more work for human subtitlers rather than reducing it, while simultaneously devaluing their expertise.

  • Meredith Cannella, a Subtle committee member with 14-15 years of experience, notes: “There’s an assumption that we now have to do less work because of AI tools. But…there hasn’t been much of a difference in how long it takes me to complete projects over the last five or six years.”
  • Automatic transcription has improved, but the extensive corrections still required mean it offers no net time savings compared with older software.
  • Many subtitlers are now assigned AI-generated work as “quality control” tasks, receiving minimal payment for what amounts to complete rewrites.

Why human expertise matters: Professional subtitlers make complex creative decisions that current AI cannot replicate, requiring deep understanding of context, emotion, and narrative structure.

  • Rachel Jones, an audiovisual translator, explains her process: “When I first watch a show, I write down how the sounds make me feel, then work out how to transfer my reactions into words.”
  • Subtitlers must determine which sounds are essential and which would overwhelm viewers, balancing sound description against the priorities of visual storytelling.
  • Context across entire films is crucial—Deryagin cites Blow Out (1981) as an example where “SDH must instantly connect” recurring mysterious sounds to reveal plot points without spoiling suspense.

What they’re saying: Industry professionals stress that AI cannot capture the interpretive complexity required for quality subtitling.

  • “The same sound can mean a million different things. As humans, we interpret what it means and how it’s supposed to feel,” Deryagin explains.
  • “You can’t just give an algorithm a soundtrack and say, ‘here are the sounds, figure it out’. Even if you give it metadata, it can’t get anywhere near the level of professional work.”
  • Jones notes the broader frustration: “In every industry, AI is being used to replace all the creative things that bring us joy instead of the boring, tedious tasks we hate doing.”

Industry impact: Major platforms remain tight-lipped about their AI subtitle practices, while rates have dropped so severely that many professionals can no longer earn living wages.

  • Netflix declined to comment on its AI subtitling use, despite viral attention to creative captions like “[Eleven pants]” or “[Tentacles squelching wetly]” from Stranger Things.
  • The BBC, the UK’s public broadcaster, says it does not use AI for TV subtitles, though much of its subtitling work is outsourced to Red Bee Media, which has published statements about using AI for the Australian broadcaster Network 10.
  • According to Subtle, many members can no longer make a living wage due to dramatically reduced rates for AI-assisted work.

Why this matters: Teri Devine, associate director of inclusion at the Royal National Institute for Deaf People, emphasizes that “for people who are deaf or have hearing loss, subtitles are an essential service – allowing them to enjoy film and TV with loved ones and stay connected to popular culture.”

Source: ‘Tentacles squelching wetly’: the human subtitle writers under threat from AI
