
Olivia Williams is calling for “nudity rider”-type protections for AI body scans, arguing that actors need the same level of control over their digital likenesses as they have over intimate scenes. The Dune: Prophecy star says performers are regularly pressured into body scans on set with minimal guarantees about how the data will be used, potentially allowing studios to train AI models on their physical appearances and eventually replace human actors.

What you should know: Williams and other actors report being “ambushed” into body scans during filming, with contracts containing vague clauses that grant studios sweeping rights over performers’ likenesses.

  • Current contract language often gives studios rights over actor data “on all platforms now existing or yet to be devised throughout the universe in perpetuity.”
  • Williams has unsuccessfully tried to remove these clauses from her contracts and explored owning her own body scan data, but legal fees proved prohibitive and the law remains unclear.
  • The issue affects performers across the industry, including supporting actors, stunt performers, and dancers who have little time to negotiate how their scan data will be treated.

Why this matters: The controversy highlights growing fears that AI technology could eventually eliminate acting jobs by creating digital doubles without ongoing consent or compensation.

  • Recent condemnation of the AI actor “Tilly Norwood” has intensified debates about artificial intelligence’s impact on the entertainment industry.
  • Actors worry that body scan data could be used to train AI models on their likenesses, poses, and movements, paving the way for technology to replace human performers.

What they’re saying: Williams emphasizes her concerns extend beyond financial compensation to fundamental control over her digital identity.

  • “I don’t necessarily want to be paid any more money for the use of my likeness,” she said. “I just don’t want my likeness to appear in places where I haven’t been, doing things I haven’t done, saying things I haven’t said.”
  • “They make up the law as they go along and no one is stopping them – creating a precedent, reinforcing the precedent. I sign it, because if I don’t, I lose the job.”
  • Williams described a particularly concerning case: “I have known a 17-year-old who was persuaded into a scanner – and like the child-catcher scene in Chitty Chitty Bang Bang, she obliged. She was a minor, so her chaperone had to give consent. Her chaperone was her grandmother, unaware of the law.”

The proposed solution: Williams advocates for body scan protections modeled after nudity riders, which strictly limit how intimate footage can be used.

  • “This footage can only be used in the action of that scene. It cannot be used in any other context at all, and when the scene has been edited it must be deleted on all formats,” she explained.
  • The approach would give actors explicit control over their digital data similar to existing protections for sensitive scenes.

Industry response: UK trade organizations are currently negotiating potential protections for performers facing AI-related scanning.

  • Paul W Fleming, general secretary of Equity (the UK performing arts union), said: “We’re demanding that AI protections are mainstreamed in the major film and TV agreements to put consent and transparency at the heart of scanning on set.”
  • Pact, the UK screen sector’s trade body, acknowledged ongoing discussions but declined detailed comment due to active negotiations.