Elon Musk's xAI Powers Up 100K Nvidia GPUs to Train Grok

Elon Musk’s xAI startup has powered up a massive supercomputing facility in Memphis, Tennessee, equipped with 100,000 Nvidia H100 GPUs, to train the next version of its AI model, Grok, sparking both excitement and concern among locals.

Key details about xAI’s Memphis supercomputing cluster:

  • The facility, dubbed the “Gigafactory of Compute,” is now operational and began training models at 4:20am local time on Monday, according to Musk.
  • It connects 100,000 liquid-cooled Nvidia H100 GPUs on a single RDMA (remote direct memory access) fabric, which Musk touts as “the most powerful AI training cluster in the world.”
  • The H100 GPUs are designed specifically for training large AI models and draw immense amounts of power.

Local reactions and concerns:

  • While the Greater Memphis Chamber economic development group has praised xAI’s decision to open the facility in the area, some locals have expressed worries about its potential environmental impact.
  • The Memphis Community Against Pollution group estimates the cluster may consume over one million gallons of water per day for cooling and up to 150 megawatts of electricity, equivalent to powering 100,000 homes.
  • Memphis City Council member Pearl Walker noted that people are “afraid of what’s possibly going to happen with the water and they are afraid about the energy supply.”
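The consumption estimates above can be sanity-checked with quick arithmetic, using only the figures quoted in the article (the typical-household comparison is an added assumption, not from the source):

```python
# Sanity-check the estimates cited by Memphis Community Against Pollution.
MEGAWATTS = 150              # estimated electricity draw of the cluster
HOMES = 100_000              # homes the same power could supply
GALLONS_PER_DAY = 1_000_000  # estimated daily cooling-water use

# Implied average draw per home: 150 MW / 100,000 homes = 1.5 kW,
# roughly in line with average US household consumption.
kw_per_home = MEGAWATTS * 1000 / HOMES
print(f"{kw_per_home:.1f} kW per home")  # → 1.5 kW per home

# One million US gallons is about 3.8 million liters of water per day.
liters_per_day = GALLONS_PER_DAY * 3.785
print(f"~{liters_per_day / 1e6:.1f} million liters/day")  # → ~3.8 million liters/day
```

The implied per-home figure being a plausible household average suggests the group's "100,000 homes" comparison is internally consistent with its 150-megawatt estimate.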

Putting xAI’s cluster in context:

  • While certainly massive, xAI’s Memphis facility may not be the largest computing cluster in the world, as tech giants like Microsoft, Google, and Meta also operate huge data centers to train their AI models.
  • For comparison, Meta CEO Mark Zuckerberg has vowed to acquire 350,000 Nvidia H100 GPUs this year alone.
  • The race to build bigger and more powerful AI supercomputing clusters highlights the immense computing resources required to develop advanced AI systems and the challenges around their energy consumption.
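A rough scale comparison follows from the GPU counts above; the ~700 W per-GPU figure is an assumption based on the commonly cited TDP of the H100 SXM variant, not a number from the article:

```python
# Rough scale comparison between xAI's cluster and Meta's stated target.
XAI_GPUS = 100_000
META_GPUS = 350_000   # Zuckerberg's stated acquisition target for the year
H100_TDP_W = 700      # assumed per-GPU power draw at full load (H100 SXM TDP)

ratio = META_GPUS / XAI_GPUS
gpu_power_mw = XAI_GPUS * H100_TDP_W / 1e6

print(f"Meta's target is {ratio:.1f}x the size of xAI's cluster")  # → 3.5x
print(f"xAI's GPUs alone would draw ~{gpu_power_mw:.0f} MW")       # → ~70 MW
```

Under these assumptions, the GPUs by themselves would account for roughly half of the 150-megawatt facility estimate, with cooling and supporting infrastructure plausibly making up the rest.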

Musk has previously stated that xAI plans to release Grok 2 in August, though it’s unclear if that model will utilize this new supercomputing cluster. However, Grok 3, slated for release by the end of 2024, will train on the 100,000 H100 GPUs in Memphis. As xAI ramps up its computing capabilities, questions remain about how the startup will address the environmental concerns its facilities raise and how it will stack up against tech giants in the race to achieve more powerful AI systems.
