Ukraine deploys AI in fight against Putin's disinformation
The AI-powered War of Words tool, launched by Ukraine’s former minister of culture Volodymyr Borodiansky, analyzes vast amounts of Russian media content to expose pro-Kremlin propaganda and fake news narratives that can influence public opinion and democratic processes.
Key details of the AI tool: War of Words uses artificial intelligence to scrutinize thousands of hours of video content from Russian TV and Telegram channels dating back to 2012, giving users a searchable archive that is updated daily (a minimal illustrative sketch of such a pipeline follows the list below):
- The tool tracks and analyzes all content aired on Russian media since 2012, an output that amounts to hundreds of units of propaganda every second.
- It aims to counter disinformation campaigns that have preceded Russian aggression in Ukraine, Syria, and Georgia, as well as attempts to destabilize some EU and NATO member countries.
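The article does not describe how War of Words is built, but the capabilities it lists (ingesting transcripts from TV and Telegram, tagging propaganda narratives, and exposing them through a daily-updated, searchable archive) map onto a familiar monitoring-pipeline shape. The sketch below is a hypothetical illustration of that shape using only the Python standard library; the channel names, narrative tags, and keyword matcher are invented placeholders, not details of the actual tool.

```python
# Hypothetical sketch of a propaganda-monitoring archive (not the War of Words codebase).
# Steps: take a transcript, tag suspected narratives, index it in a searchable store.
import sqlite3
from dataclasses import dataclass
from datetime import date

# Illustrative narrative tags and keywords a real classifier would replace.
NARRATIVE_KEYWORDS = {
    "nato_aggression": ["nato expansion", "encirclement"],
    "historical_revisionism": ["artificial state", "denazify"],
}

@dataclass
class Transcript:
    channel: str     # e.g. a Russian TV or Telegram channel name (placeholder)
    aired_on: date
    text: str        # transcript produced by an upstream speech-to-text step (not shown)

def tag_narratives(text: str) -> list[str]:
    """Crude keyword matcher standing in for a trained narrative classifier."""
    lowered = text.lower()
    return [tag for tag, words in NARRATIVE_KEYWORDS.items()
            if any(w in lowered for w in words)]

def build_index(db_path: str) -> sqlite3.Connection:
    """Create a full-text-searchable archive table using SQLite FTS5."""
    conn = sqlite3.connect(db_path)
    conn.execute("CREATE VIRTUAL TABLE IF NOT EXISTS archive "
                 "USING fts5(channel, aired_on, narratives, text)")
    return conn

def ingest(conn: sqlite3.Connection, t: Transcript) -> None:
    """Tag a transcript and store it so researchers can search it later."""
    conn.execute("INSERT INTO archive VALUES (?, ?, ?, ?)",
                 (t.channel, t.aired_on.isoformat(),
                  ",".join(tag_narratives(t.text)), t.text))
    conn.commit()

if __name__ == "__main__":
    conn = build_index(":memory:")
    ingest(conn, Transcript("ExampleChannel", date(2024, 5, 1),
                            "Commentary framing NATO expansion as encirclement."))
    for row in conn.execute("SELECT channel, narratives FROM archive "
                            "WHERE archive MATCH 'nato'"):
        print(row)
```

In a daily-updated system of the kind the article describes, the ingest step would run on each day's new broadcasts and posts, while the full-text index is what makes the archive searchable for policymakers and researchers.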
Broader context of Russian disinformation: The launch of War of Words comes amid reports of Russian disinformation targeting the EU, with fake news narratives designed to disrupt elections and smear political leaders:
- AI-generated deepfake videos of political figures like Donald Trump and Joe Biden have been circulated by pro-Kremlin accounts to spread misinformation.
- The European Council on Foreign Relations (ECFR) recorded a surge in Russian disinformation ahead of the European Parliament elections that sought to fuel political instability and erode public trust in democratic institutions.
Countering the threat of disinformation: Experts emphasize the need for tools like War of Words to debunk fake information and complement broader research into how disinformation spreads and influences opinions and behavior:
- Elena Simperl, professor of computer science at King’s College London, notes that while AI has been blamed for current disinformation campaigns, it can also be used effectively to expose false information.
- The Russian government invests heavily in spreading pro-Kremlin narratives, with an estimated $1.6 billion allocated to propaganda efforts according to disinformation analysis center Debunk.
Broader implications: The development of AI tools to combat state-sponsored disinformation campaigns highlights the increasingly complex landscape of information warfare and the critical need to safeguard democratic processes:
- As Borodiansky states, “To effectively respond to Russian aggression, we have created a tool that will help policymakers, diplomats, the media, and researchers identify how Russia threatens the peaceful existence of the world.”
- However, experts caution that debunking disinformation alone is not enough; deep research into how false narratives spread and influence behavior is crucial to mitigating the erosion of public trust in institutions and collective decision-making processes.