AI supercomputers are a US-first, China-second phenomenon. And growing rapidly.

AI supercomputers are scaling at an exponential rate, with performance doubling every nine months while power requirements and hardware costs double annually. This unprecedented growth, detailed in a comprehensive study of 500 AI systems spanning 2019 to 2025, reveals a dramatic shift toward private ownership of computing resources, with industry now controlling 80% of global AI compute. Understanding these trends is crucial as the field approaches a point where leading AI systems could require power equivalent to multiple cities and hardware investments in the hundreds of billions of dollars.

The big picture: AI supercomputers have experienced explosive growth in computational performance, increasing 2.5x annually through deploying more numerous and powerful specialized chips.

  • Leading systems that once contained fewer than 10,000 chips now regularly feature more than 100,000, exemplified by xAI’s Colossus with its 200,000 AI chips.
  • This growth is driven by a yearly 1.6x increase in chip quantity compounded with a 1.6x annual improvement in performance per chip, as the quick arithmetic check below shows.
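
Those two multipliers compound to the headline figure: 1.6 × 1.6 = 2.56, or roughly 2.5x per year, and a 2.5x annual rate implies performance doubles about every nine months. A minimal Python sketch of that arithmetic (the 1.6x and 2.5x figures are the study's, as reported above; the rest is algebra):

```python
import math

# Reported annual multipliers (from the study summarized above).
chip_count_growth = 1.6   # yearly increase in number of chips
per_chip_growth = 1.6     # yearly increase in performance per chip

# Compounded annual performance growth: 1.6 * 1.6 = 2.56x,
# consistent with the reported ~2.5x per year.
annual_perf_growth = chip_count_growth * per_chip_growth
print(f"annual performance growth: {annual_perf_growth:.2f}x")

# Doubling time implied by 2.5x/year growth:
# solve 2.5^t = 2  =>  t = ln(2) / ln(2.5) years.
doubling_years = math.log(2) / math.log(2.5)
print(f"doubling time: {doubling_years * 12:.1f} months")  # ~9.1 months
```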

Behind the numbers: The expansion of AI supercomputers has created massive energy and financial demands, with power requirements and hardware costs doubling every year.

  • xAI’s Colossus, the most powerful AI supercomputer as of March 2025, requires approximately 300 megawatts of power, equivalent to 250,000 households, and cost an estimated $7 billion in hardware alone.
  • Despite growing power demands, computational performance per watt has increased by 1.34x annually, primarily through the adoption of more energy-efficient chips; the short calculation below makes the link between these rates explicit.
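
Those two figures together pin down the power trend: if total performance rises 2.5x per year while performance per watt rises only 1.34x per year, power draw must rise by about 2.5 / 1.34 ≈ 1.9x per year, consistent with the reported annual doubling. A minimal sketch, where the ~1.2 kW average household draw is the implicit assumption behind the 250,000-household comparison:

```python
# Annual multipliers reported above.
perf_growth = 2.5          # total performance, per year
efficiency_growth = 1.34   # performance per watt, per year

# Power must grow by performance growth divided by efficiency growth.
power_growth = perf_growth / efficiency_growth
print(f"implied annual power growth: {power_growth:.2f}x")  # ~1.87x, i.e. ~doubling

# Sanity check on the household comparison for Colossus.
colossus_mw = 300
households = 250_000
kw_per_household = colossus_mw * 1000 / households
print(f"implied average household draw: {kw_per_household:.1f} kW")  # ~1.2 kW
```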

Key shift: The landscape of AI supercomputing has transformed from primarily academic and public research to industry dominance in just six years.

  • Industry’s share of global AI compute jumped from 40% in 2019 to 80% in 2025, as private companies rapidly scaled their systems to conduct larger training runs.
  • Leading industry systems grew by 2.7x annually, significantly outpacing the 1.9x annual growth of public sector systems; a rough consistency check on these rates follows below.
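
As a rough consistency check, compounding those growth rates from the 2019 split lands near the reported share, under the simplifying assumption that the leading-system rates stand in for each sector's aggregate compute:

```python
# Rough consistency check: does industry growing 2.7x/year and the public
# sector 1.9x/year turn a 40% industry share in 2019 into ~80% by 2025?
# Simplifying assumption: leading-system growth rates stand in for
# each sector's total compute.
industry, public = 40.0, 60.0   # 2019 shares (percent)
for year in range(6):           # 2019 -> 2025
    industry *= 2.7
    public *= 1.9

share = industry / (industry + public)
print(f"implied 2025 industry share: {share:.0%}")  # ~85%
```

The check lands at roughly 85%, close to but slightly above the reported 80%, which is expected given that the 2.7x and 1.9x figures describe leading systems rather than sector totals.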

Zoom out: The global distribution of AI supercomputing power shows overwhelming American dominance, with significant implications for technological leadership.

  • The United States controls approximately 75% of global AI supercomputer performance in the study’s dataset, with China a distant second at 15%.
  • Traditional supercomputing powers like the UK, Germany, and Japan now play marginal roles in the AI supercomputing landscape.

Implications: If current growth trajectories continue, the scale of future AI systems will test physical and economic boundaries.

  • Projections suggest that by 2030, the largest AI supercomputer could require 9 gigawatts of power and cost hundreds of billions of dollars to build, as the back-of-envelope extrapolation below illustrates.
  • These unprecedented requirements raise serious questions about the sustainability of AI scaling and who will be able to participate in frontier AI development.
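
The headline numbers follow from straight extrapolation of the doubling trends above. A back-of-envelope sketch, assuming power and hardware cost keep doubling annually from the March 2025 Colossus baseline of roughly 300 MW and $7 billion:

```python
# Back-of-envelope extrapolation from the March 2025 baseline (Colossus),
# assuming power and hardware cost keep doubling annually.
base_power_gw = 0.3   # ~300 MW
base_cost_bn = 7.0    # ~$7B in hardware
years = 5             # 2025 -> 2030

power_2030 = base_power_gw * 2 ** years
cost_2030 = base_cost_bn * 2 ** years
print(f"projected power: {power_2030:.1f} GW")  # ~9.6 GW, matching the ~9 GW figure
print(f"projected cost:  ${cost_2030:.0f}B")    # ~$224B, i.e. hundreds of billions
```
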
Source: Trends in AI Supercomputers
