Ukraine Deploys AI to Expose Russian Disinformation, Safeguarding Democracy

The AI-powered War of Words tool, launched by Ukraine’s former minister of culture Volodymyr Borodiansky, analyzes vast amounts of Russian media content to expose pro-Kremlin propaganda and fake news narratives that can influence public opinion and democratic processes.

Key details of the AI tool: War of Words harnesses artificial intelligence to scrutinize thousands of hours of video content across Russian TV and Telegram channels dating back to 2012, providing users with a searchable archive updated daily:

  • The tool tracks and analyzes content aired on Russian media since 2012, outlets that collectively generate hundreds of units of propaganda every second.
  • It aims to counter disinformation campaigns that have preceded Russian aggression in Ukraine, Syria, and Georgia, as well as attempts to destabilize some EU and NATO member countries.
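
For readers curious about the mechanics, the pipeline the article describes (transcribe daily broadcasts, then make the transcripts searchable) can be sketched in a few dozen lines. The snippet below is purely illustrative and assumes nothing about how War of Words is actually built: the `transcribe` stub stands in for a real speech-to-text model, and the inverted index is a deliberately simple stand-in for a production search backend.

```python
"""Illustrative sketch of a transcribe-and-index pipeline for broadcast monitoring.

This is NOT the War of Words implementation; it only shows the general shape
of such a system: transcribe each day's recordings, then build a searchable
index over the transcripts.
"""
from collections import defaultdict
from dataclasses import dataclass
from datetime import date


@dataclass
class Segment:
    """One transcribed clip from a monitored channel."""
    channel: str
    aired_on: date
    text: str


def transcribe(video_path: str) -> str:
    """Placeholder for a real speech-to-text step (e.g. an ASR model).

    In a real pipeline this would return the spoken text of the clip;
    here it simply raises to make clear it must be swapped out.
    """
    raise NotImplementedError("plug in an ASR model here")


class TranscriptIndex:
    """A minimal inverted index: word -> list of segments containing it."""

    def __init__(self) -> None:
        self._index: dict[str, list[Segment]] = defaultdict(list)

    def add(self, segment: Segment) -> None:
        # Index each unique word in the transcript so it can be looked up later.
        for word in set(segment.text.lower().split()):
            self._index[word].append(segment)

    def search(self, term: str) -> list[Segment]:
        """Return all indexed segments whose transcript contains the term."""
        return self._index.get(term.lower(), [])


if __name__ == "__main__":
    # Demo with hand-written transcripts instead of real ASR output.
    index = TranscriptIndex()
    index.add(Segment("Channel One", date(2024, 6, 1),
                      "Tonight's broadcast repeats a familiar narrative about NATO"))
    index.add(Segment("Telegram: example_channel", date(2024, 6, 1),
                      "A forwarded post amplifying the same claim"))

    for hit in index.search("narrative"):
        print(f"{hit.aired_on} {hit.channel}: {hit.text}")
```

A production system would replace the stub with an actual speech-recognition model and the dictionary with a proper search engine, but the daily loop of transcribe, index, and search is the same basic shape the tool's creators describe.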

Broader context of Russian disinformation: The launch of War of Words comes amid reports of Russian disinformation targeting the EU, with fake news narratives designed to disrupt elections and smear political leaders:

  • AI-generated deepfake videos of political figures like Donald Trump and Joe Biden have been circulated by pro-Kremlin accounts to spread misinformation.
  • The European Council on Foreign Relations (ECFR) recorded a surge in Russian disinformation ahead of the European Parliament elections, aimed at fueling political instability and eroding public trust in democratic institutions.

Countering the threat of disinformation: Experts emphasize the need for tools like War of Words to debunk fake information and complement broader research into how disinformation spreads and influences opinions and behavior:

  • Elena Simperl, professor of computer science at King’s College London, notes that while AI has been blamed for current disinformation campaigns, it can also be used effectively to expose false information.
  • The Russian government invests heavily in spreading pro-Kremlin narratives, with an estimated $1.6 billion allocated to propaganda efforts according to disinformation analysis center Debunk.

Broader implications: The development of AI tools to combat state-sponsored disinformation campaigns highlights the increasingly complex landscape of information warfare and the critical need to safeguard democratic processes:

  • As Borodiansky states, “To effectively respond to Russian aggression, we have created a tool that will help policymakers, diplomats, the media, and researchers identify how Russia threatens the peaceful existence of the world.”
  • However, experts caution that debunking disinformation alone is not enough; deep research into how false narratives spread and influence behavior is crucial to mitigating the erosion of public trust in institutions and collective decision-making processes.
Source: Ukraine deploys AI in fight against Putin's disinformation
