Former CNN anchor’s AI interview with Parkland victim sparks outrage

Former CNN anchor Jim Acosta faced widespread backlash for conducting what he called a “one of a kind interview” with an AI avatar of Joaquin Oliver, a 17-year-old victim of the 2018 Parkland school shooting. The controversial segment, which aired on Monday and was created by Oliver’s parents to send a “powerful message on gun violence,” instead sparked outrage over its tone-deaf use of AI technology to recreate a deceased shooting victim.

What happened: Acosta interviewed an AI recreation of Oliver, one of 17 people killed at Marjory Stoneman Douglas High School in Parkland, Florida, asking the avatar what had happened to him.

  • “I was taken from this world too soon due to gun violence while at school,” the AI responded in a robotic tone, its face appearing “jerky and misshapen” according to reports.
  • The interview was facilitated by Oliver’s father, Manuel Oliver, as part of ongoing efforts to advocate for gun control.

The backlash: Critics immediately condemned the interview as insensitive and ethically questionable, with many pointing out the availability of living survivors who could share authentic experiences.

  • Tech journalist Kelsey Atherton called the AI avatar “a cursed golem and a full delusion,” arguing that “gun control deserves better.”
  • Social media users described the interview as “absolutely deranged” and compared it to “having a conversation with your microwave.”
  • One user noted: “There are living survivors of school shootings you could interview, and it would really be their words and thoughts instead of completely made up.”

Broader AI concerns: The incident reflects growing public wariness about inappropriate uses of AI technology, particularly in sensitive contexts involving death and grief.

  • Users expressed discomfort with AI’s potential to exploit grief, with one person sharing: “Just passed two years without my Mom, and I can’t imagine using AI to make a video or photo that never happened.”
  • Another commented: “It’s a really dangerous precedent to set for people who aren’t dealing with their grief and giving it more power over them than it should.”

Previous precedent: This isn’t the first time AI has been used to recreate Parkland victims or deceased individuals for advocacy purposes.

  • Last year, parents of Parkland victims used deepfaked voices of six deceased students and staff in a robocalling campaign pressing members of Congress to act on gun reform.
  • Similar AI resurrection projects have emerged, including families using AI to revive road rage victims and startups offering conversations with deceased loved ones.

What Acosta said: Despite the criticism, the former CNN anchor defended the interview, telling Oliver’s father: “I really felt like I was speaking with Joaquin. It’s just a beautiful thing.”

Why this matters: The controversy highlights the ethical challenges surrounding AI’s use in sensitive contexts, particularly when dealing with tragedy and grief, as gun violence remains a leading cause of death for children and teens in the United States.

