xAI Powers Up Massive Memphis Supercomputer, Sparking Local Excitement and Environmental Concerns

Elon Musk’s xAI startup has powered up a massive supercomputing facility in Memphis, Tennessee, to train the next version of its Grok AI model on more than 100,000 Nvidia H100 GPUs, sparking both excitement and concern among locals.

Key details about xAI’s Memphis supercomputing cluster:

  • The facility, dubbed the “Gigafactory of Compute,” is now operational and began training models at 4:20 a.m. local time on Monday, according to Musk.
  • It leverages 100,000 liquid-cooled Nvidia H100 GPUs on a single RDMA fabric, which Musk touts as “the most powerful AI training cluster in the world.”
  • The H100 GPUs are designed specifically for training large AI models and require immense amounts of energy and computing power.

Local reactions and concerns:

  • While the Greater Memphis Chamber economic development group has praised xAI’s decision to open the facility in the area, some locals have expressed worries about its potential environmental impact.
  • The Memphis Community Against Pollution group estimates the cluster may consume over one million gallons of water per day for cooling and draw up to 150 megawatts of electricity, enough to power 100,000 homes.
  • Memphis City Council member Pearl Walker noted that people are “afraid of what’s possibly going to happen with the water and they are afraid about the energy supply.”
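The 150-megawatt figure cited above is roughly consistent with a back-of-the-envelope estimate from the GPU count alone. The sketch below is illustrative only: the 700 W board power is Nvidia's published spec for the H100 SXM, but the overhead factor, the PUE, and the average household draw are assumptions, not figures from xAI or the advocacy group.

```python
# Rough sanity check of the reported power figures for a 100,000-GPU cluster.
# Only the GPU TDP is a published spec; everything else is an assumption.

NUM_GPUS = 100_000
GPU_TDP_WATTS = 700        # Nvidia H100 SXM published board power
NON_GPU_OVERHEAD = 1.5     # assumed factor for CPUs, networking, storage
PUE = 1.3                  # assumed power usage effectiveness of the facility

it_power_mw = NUM_GPUS * GPU_TDP_WATTS * NON_GPU_OVERHEAD / 1e6  # 105 MW
facility_power_mw = it_power_mw * PUE                            # 136.5 MW

AVG_HOME_KW = 1.3          # assumed average US household draw (~900 kWh/month)
homes_powered = facility_power_mw * 1000 / AVG_HOME_KW

print(f"Estimated facility draw: ~{facility_power_mw:.1f} MW")
print(f"Roughly equivalent to {homes_powered:,.0f} average homes")
```

Under these assumptions the estimate lands at roughly 135 to 140 MW, in the same ballpark as the 150 MW and 100,000-home figures in the article.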

Putting xAI’s cluster in context:

  • While certainly massive, xAI’s Memphis facility may not be the largest computing cluster in the world, as tech giants like Microsoft, Google, and Meta also operate huge data centers to train their AI models.
  • For comparison, Meta CEO Mark Zuckerberg has said the company will acquire 350,000 Nvidia H100 GPUs this year alone.
  • The race to build bigger and more powerful AI supercomputing clusters highlights the immense computing resources required to develop advanced AI systems and the challenges around their energy consumption.

Musk has previously stated that xAI plans to release Grok 2 in August, though it’s unclear if that model will utilize this new supercomputing cluster. However, Grok 3, slated for release by the end of 2024, will train on the 100,000 H100 GPUs in Memphis. As xAI ramps up its computing capabilities, questions remain about how the startup will address the environmental concerns its facilities raise and how it will stack up against tech giants in the race to achieve more powerful AI systems.
