AI energy use remains tiny fraction of global consumption

New analysis reveals that artificial intelligence’s global energy consumption, while rapidly growing, remains a surprisingly small fraction of worldwide electricity use—roughly equivalent to what televisions consumed in the 1980s. The findings challenge common narratives about AI’s environmental impact by placing its resource consumption in the broader context of global energy usage, though concentrated local impacts on communities hosting data centers remain a significant concern.

The big picture: AI accounts for only 10% to 20% of the 415 terawatt-hours consumed annually by data centers worldwide, which itself represents a tiny fraction of global electricity use.

  • Enterprise and government services consume over 50% of data center power, while streaming video takes about 15% and cloud photo storage accounts for just 0.2%.
  • Data center power use is expected to double by the end of the decade, with AI driving a significant portion of that growth.
  • Even if AI becomes the majority of data center electricity use, it would still represent only a small share of worldwide consumption, as the rough arithmetic after this list illustrates.
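
A quick back-of-the-envelope check of the figures above. The 415 TWh total and the 10–20% AI share come from the article; the ~29,000 TWh global electricity figure is an outside assumption (an approximate recent-year total), so treat the result as an order-of-magnitude sketch rather than a precise estimate.

```python
# Rough sanity check of the "tiny fraction" claim.
# 415 TWh and the 10-20% AI share are from the article;
# the ~29,000 TWh/year global electricity total is an assumption.

DATA_CENTER_TWH = 415                       # annual data center consumption (article)
AI_SHARE_LOW, AI_SHARE_HIGH = 0.10, 0.20    # AI's share of that total (article)
GLOBAL_ELECTRICITY_TWH = 29_000             # assumed global total, approximate

ai_low = DATA_CENTER_TWH * AI_SHARE_LOW     # ~41.5 TWh
ai_high = DATA_CENTER_TWH * AI_SHARE_HIGH   # ~83 TWh

print(f"AI electricity use: {ai_low:.0f}-{ai_high:.0f} TWh/year")
print(f"Share of global electricity: "
      f"{ai_low / GLOBAL_ELECTRICITY_TWH:.2%}-{ai_high / GLOBAL_ELECTRICITY_TWH:.2%}")
# -> roughly 0.14%-0.29% of worldwide electricity consumption
```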

How AI compares to other technologies: Individual AI prompts consume remarkably little power compared to everyday activities and established technologies.

  • Wi-Fi networks globally consume more electricity than AI systems, and if Wi-Fi were a country, its electricity use would rank it in the top 50 globally.
  • Electric vehicles are growing much faster as electricity consumers and will likely use 100 times more power than current AI systems.
  • Power-hungry 5G cellular networks may soon match AI’s consumption levels.

Water and carbon footprint breakdown: AI’s environmental impact varies dramatically depending on the measurement scale used.

  • The water in a single toilet flush could cool 10 prompts a day for almost five years, while a 500ml disposable water bottle covers the cooling for about 2,000 prompts (the two figures line up, as the sketch after this list shows).
  • The 100 billion liters of water used annually for AI data center cooling roughly equals what’s spent watering golf courses during rainfall.
  • Each AI prompt generates about 0.03 grams of CO2 equivalent—the same amount a person exhales in one breath.
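
The per-prompt water figures are internally consistent; here is a minimal check. The 500 ml per 2,000 prompts ratio comes from the article, while the ~4.5-liter low-flow flush volume is an assumption, not a figure from the piece.

```python
# Check that the bottle and toilet-flush comparisons line up.
# 500 ml per 2,000 prompts is from the article; the ~4.5 L
# flush volume is an assumed low-flow toilet.

BOTTLE_LITERS = 0.5
PROMPTS_PER_BOTTLE = 2_000
FLUSH_LITERS = 4.5             # assumed low-flow flush
PROMPTS_PER_DAY = 10

liters_per_prompt = BOTTLE_LITERS / PROMPTS_PER_BOTTLE   # 0.00025 L per prompt
liters_per_day = liters_per_prompt * PROMPTS_PER_DAY     # 0.0025 L per day

days = FLUSH_LITERS / liters_per_day
print(f"{liters_per_prompt * 1000:.2f} ml of cooling water per prompt")
print(f"One flush covers {days:.0f} days (~{days / 365:.1f} years) at 10 prompts/day")
# -> about 0.25 ml per prompt and roughly 4.9 years per flush,
#    matching the article's "almost five years"
```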

Important stats: The carbon emissions data reveals AI’s relatively modest global impact while highlighting its absolute scale.

  • Ten daily prompts equal the CO2 emissions from a birthday candle or idling a car for less than a second.
  • AI’s yearly emissions represent approximately 0.07% of global totals, comparable to the entire ridesharing industry or the country of Denmark (a rough cross-check follows this list).
  • The author calculated “100 million prompts per Zamboni year, with a quarter of that from Canada” as a playful measurement unit.
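
As an order-of-magnitude cross-check of the Denmark comparison: the 0.07% share is from the article, but the global CO2 total and Denmark's annual emissions used below are rough outside assumptions.

```python
# Cross-check the 0.07%-of-global-emissions claim against Denmark.
# The 0.07% share is from the article; the ~37 Gt global CO2 total
# and Denmark's ~30 Mt annual CO2 are approximate outside assumptions.

GLOBAL_CO2_TONNES = 37e9       # assumed global energy-related CO2 per year
AI_SHARE = 0.0007              # 0.07% (article)
DENMARK_CO2_TONNES = 30e6      # assumed, roughly Denmark's annual CO2

ai_emissions = GLOBAL_CO2_TONNES * AI_SHARE
print(f"AI emissions at 0.07% of global: ~{ai_emissions / 1e6:.0f} Mt CO2/year")
print(f"Ratio to Denmark: {ai_emissions / DENMARK_CO2_TONNES:.1f}x")
# -> roughly 26 Mt CO2/year, the same order of magnitude as Denmark
```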

Why this matters: The real environmental story extends beyond individual usage to concentrated local impacts on specific communities and ecosystems where data centers are built.

  • While individual AI usage remains minimal, the massive scale of deployment—with OpenAI alone handling over 2.5 billion prompts daily—creates substantial aggregate consumption.
  • The rapid growth pattern mirrors earlier waves of technology adoption: televisions scaled through the 1980s to reach electricity consumption comparable to today’s AI systems.
Source: “Red flag or red herring? Here’s how AI’s power, water and carbon footprints stack up on a global scale”
