TV Doctors Deepfaked, Exploited to Peddle Health Scams on Social Media

TV doctors “deepfaked” to promote health scams on social media: An investigation by the British Medical Journal has revealed that several well-known TV doctors, including Michael Mosley and Hilary Jones, have been “deepfaked” on social media platforms to promote fraudulent health products and scams.

  • “Deepfaking” uses artificial intelligence to create convincing videos of individuals by mapping their digital likeness onto someone else’s body, making it appear that they are promoting or endorsing products they have no actual connection to.
  • The investigation found videos on platforms like Facebook and YouTube, with one example showing a deepfaked Dr. Hilary Jones promoting a supposed cure for high blood pressure during a segment on the Lorraine programme.
  • While the exact number of such deepfake videos was not specified, the report suggests that their volume has been growing, indicating some level of success for the “bad actors” creating them.

Challenges in combating deepfake fraud: The doctors targeted by these deepfake scams and the social media platforms hosting the videos face significant hurdles in effectively countering this form of fraud:

  • Despite efforts by Dr. Jones and others to employ specialists to identify and remove misrepresentative videos, new ones often emerge quickly under different names, making it a constant battle.
  • As deepfake technology has advanced, spotting these fraudulent videos has become increasingly difficult, even for experts.
  • Meta, the parent company of Facebook and Instagram, has pledged to investigate and take action against content violating its policies, but the scale and persistence of the problem pose ongoing challenges.

Broader implications: The rise of deepfake fraud targeting respected medical professionals highlights the evolving landscape of misinformation and the potential risks it poses to public health:

  • By exploiting the trust and credibility associated with well-known doctors, these scams can mislead people into purchasing ineffective or even dangerous products, while also undermining the reputations of the targeted professionals.
  • The relative ease and low cost of creating deepfake videos compared to conventional product research and development may incentivize more bad actors to engage in this form of fraud.
  • As deepfake technology continues to advance, it is likely that such scams will become increasingly sophisticated and difficult to detect, underscoring the need for robust countermeasures and public awareness.

The use of deepfakes to promote health scams is a concerning development in the evolving landscape of online misinformation. While the doctors targeted by these scams and the platforms hosting the videos are working to combat the fraud, its sophistication and persistence highlight the need for ongoing vigilance and innovative solutions. As deepfake technology matures, it is crucial that the public remains critical of health claims made online and that regulators, platforms, and other stakeholders work together to counter this growing threat.

