Experts warn stolen AGI could enable global cyberattacks and destabilize the balance of power

AI experts are increasingly concerned about the potential theft of artificial general intelligence (AGI) once it’s achieved, warning that stolen AGI could be weaponized by bad actors or hostile nations. This security challenge represents one of the most significant risks facing the AI industry, as AGI theft could enable everything from global cyberattacks to geopolitical domination.

The big picture: The race to achieve AGI has created a new category of high-stakes cybercrime, where the first successful AGI system becomes an irresistible target for competitors, governments, and criminals alike.

  • Whichever lab or nation reaches AGI first will, at least initially, hold the only copy, making that breakthrough system extraordinarily valuable and a prime target for theft.
  • Potential thieves range from competing AI companies seeking shortcuts to hostile nations wanting geopolitical advantages.
  • The digital nature of AGI means it could theoretically be copied like any other software, though massive computational resources would be needed to run it.

Key security challenges: Protecting AGI from theft involves complex technical and logistical hurdles that go beyond traditional cybersecurity measures.

  • AGI systems will likely require thousands of servers and massive computational resources, making complete theft difficult but not impossible.
  • Encrypting model weights at rest could provide additional protection, but only if the decryption keys are stored and managed separately from the weights themselves (a minimal sketch of this idea follows the list).
  • The sheer size of AGI systems means theft would likely occur in smaller chunks over time, increasing the risk of detection.
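
To make the encryption point concrete, here is a minimal, hypothetical sketch of encrypting weight shards at rest using Python's `cryptography` package. The file names, function names, and key handling are illustrative assumptions, not details from the article.

```python
# Hypothetical sketch: weight shards encrypted at rest with AES-256-GCM,
# so copied files are useless without the separately held key.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_shard(path: str, key: bytes) -> None:
    """Encrypt one weights file; writes <path>.enc with the nonce prepended."""
    aesgcm = AESGCM(key)
    nonce = os.urandom(12)  # must be unique for every encryption
    with open(path, "rb") as f:
        plaintext = f.read()
    with open(path + ".enc", "wb") as f:
        f.write(nonce + aesgcm.encrypt(nonce, plaintext, None))

def decrypt_shard(path: str, key: bytes) -> bytes:
    """Decrypt a shard; raises InvalidTag if the key or ciphertext is wrong."""
    aesgcm = AESGCM(key)
    with open(path, "rb") as f:
        blob = f.read()
    return aesgcm.decrypt(blob[:12], blob[12:], None)

# Demo with a stand-in shard file. In practice the key would live in an
# HSM or key-management service, never on the same machines as the weights.
key = AESGCM.generate_key(bit_length=256)
with open("shard0.bin", "wb") as f:
    f.write(b"fake weights")
encrypt_shard("shard0.bin", key)
assert decrypt_shard("shard0.bin.enc", key) == b"fake weights"
```

The point of the design is separation: exfiltrating the `.enc` files alone yields nothing, which is why the article notes that thieves would need to obtain the decryption keys separately.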

Why this matters: A stolen AGI system could fundamentally alter global power dynamics and pose existential risks to humanity.

  • Criminals could bypass safety measures and use AGI for massive financial crimes or world domination schemes.
  • Countries could steal AGI to gain overnight geopolitical advantages, potentially triggering international conflicts.
  • The theft could create “AGI haves and have-nots” on a global scale, destabilizing international relations.

Potential countermeasures: Several approaches are being considered to prevent or mitigate AGI theft, though each has significant limitations.

  • Kill switches could remotely disable stolen AGI, but thieves might discover and remove them, or simply block the disable signal from ever arriving (one alternative, a dead-man's switch, is sketched after this list).
  • Global treaties could establish international cooperation against AGI theft, similar to nuclear non-proliferation agreements.
  • The original AGI system could potentially detect and respond to unauthorized copies, though this assumes AGI develops autonomous capabilities.
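
The kill-switch limitation suggests inverting the design: instead of waiting for a disable signal that a thief could block, the runtime halts by default unless it keeps receiving fresh, authenticated "keep running" leases. The sketch below is a hypothetical illustration; the shared secret, lease format, and TTL are all assumptions made for brevity.

```python
# Hypothetical dead-man's-switch sketch: the runtime serves only while it
# holds a fresh, authentic lease; silence from the operator means shutdown.
import hmac
import hashlib
import time

SECRET = b"operator-held-secret"  # illustrative shared secret
LEASE_TTL = 300                   # seconds a lease stays valid

def issue_lease(now: float) -> tuple[float, bytes]:
    """Operator side: sign the current time as a short-lived lease."""
    return now, hmac.new(SECRET, str(now).encode(), hashlib.sha256).digest()

def lease_is_valid(ts: float, sig: bytes) -> bool:
    """Model side: run only while the latest lease is fresh and authentic."""
    if time.time() - ts > LEASE_TTL:
        return False  # operator went silent: halt by default
    expected = hmac.new(SECRET, str(ts).encode(), hashlib.sha256).digest()
    return hmac.compare_digest(expected, sig)

ts, sig = issue_lease(time.time())
assert lease_is_valid(ts, sig)                      # fresh lease: keep serving
assert not lease_is_valid(ts - 2 * LEASE_TTL, sig)  # stale lease: shut down
```

A real deployment would use public-key signatures so the verifying copy holds no signing secret a thief could extract; HMAC is used here only to keep the sketch short.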

The plot twist: Some experts argue that stealing AGI might be justified if the original developer has malicious intent.

  • If an “evildoer” achieves AGI first, theft by “good guys” could level the playing field.
  • This scenario could lead to AGI-versus-AGI conflicts that determine humanity’s future.
  • The moral complexity highlights the need for international governance frameworks before AGI arrives.

What they’re saying: “Steal a little and they throw you in jail. Steal a lot and they make you king,” writes Lance Eliot, a Forbes contributor covering AI developments, quoting Bob Dylan’s lyric about how the scale of a crime changes its consequences.

Bottom line: The potential for AGI theft represents a security challenge unlike anything the world has faced, requiring unprecedented international cooperation and technical safeguards to prevent catastrophic outcomes.

Source: Stealing Of AGI And AI Superintelligence Is An Entirely Enticing Option (Forbes)
