Meta is training its next Llama AI model on a record-breaking GPU cluster

Meta’s AI ambitions accelerate: Meta is developing Llama 4, its next-generation AI model, on a massive GPU cluster that it says exceeds the computing power its competitors have disclosed.

  • CEO Mark Zuckerberg announced that Llama 4 is being trained on a cluster of more than 100,000 NVIDIA H100 GPUs, which he claims is “bigger than anything” reported by other companies.
  • The initial launch of Llama 4 is expected in early 2025, with smaller models likely to be ready first.
  • Zuckerberg hinted at potential advanced capabilities for Llama 4, including “new modalities,” “stronger reasoning,” and improved speed.

The race for AI dominance: Meta’s approach to AI development sets it apart from competitors and presents unique challenges and opportunities.

  • Unlike OpenAI, Google, and other major players, Meta makes its Llama models available for free download, attracting startups and researchers seeking complete control over their AI systems.
  • While Meta refers to Llama as “open source,” the license does impose some restrictions on commercial use, and the company does not disclose training details.
  • The massive scale of Meta’s GPU cluster for Llama 4 development presents significant engineering and energy consumption challenges.

Financial implications: Meta’s AI investments are substantial but supported by strong revenue growth and profit margins.

  • The company expects to spend up to $40 billion on capital expenditures in 2024, a 42% increase from 2023, largely for data centers and infrastructure.
  • Despite the increased spending, Meta’s overall sales have grown by more than 22%, outpacing cost growth and lifting profit margins.
  • In contrast, OpenAI, while considered a leader in AI development, is reportedly burning through cash despite charging for access to its models.

Competitive landscape: Other major tech companies are also pushing forward with their AI development efforts.

  • OpenAI is working on GPT-5, which CEO Sam Altman claims will be “a significant leap forward” compared to its predecessor.
  • Google CEO Sundar Pichai announced that the company is developing a new version of its Gemini AI model family.
  • Elon Musk’s xAI venture has reportedly set up a cluster of 100,000 H100 GPUs in collaboration with X and NVIDIA.

Ethical considerations and controversies: Meta’s open approach to AI has raised concerns among some experts.

  • Critics worry that making powerful AI models freely available could enable malicious actors to launch cyberattacks or develop dangerous weapons.
  • While Llama models are fine-tuned to restrict harmful behavior, these safeguards can be relatively easily removed.
  • Zuckerberg remains committed to the open-source strategy, arguing that it offers cost-effectiveness, customization, trustworthiness, and ease of use for developers.

Future applications and monetization: Meta plans to leverage Llama 4’s capabilities across its services and explore new revenue streams.

  • The company’s ChatGPT-like chatbot, Meta AI, is already available in Facebook, Instagram, WhatsApp, and other apps, with over 500 million monthly users.
  • Meta expects to generate revenue through ads integrated into AI-powered features.
  • CFO Susan Li suggested that as users broaden their queries, new monetization opportunities will emerge over time.

Balancing innovation and sustainability: Meta’s ambitious AI development raises questions about energy consumption and environmental impact.

  • A cluster of 100,000 H100 chips is estimated to require about 150 megawatts of power, significantly more than the largest national lab supercomputer in the United States (a rough back-of-envelope sketch of that figure follows this list).
  • Meta executives did not directly address concerns about energy access constraints that have affected AI development efforts in some parts of the US.
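To put the 150-megawatt estimate in context, here is a rough back-of-envelope sketch in Python. The ~700 W per-GPU figure is NVIDIA’s published TDP for the H100 SXM part; the overhead multiplier covering host servers, networking, storage, and cooling is an assumption chosen for illustration, not a number Meta has reported.

```python
# Back-of-envelope power estimate for a 100,000-GPU H100 training cluster.
# GPU_TDP_WATTS is the approximate H100 SXM board TDP; OVERHEAD_MULTIPLIER
# is an assumed factor for host CPUs, networking, storage, and cooling.

GPU_COUNT = 100_000
GPU_TDP_WATTS = 700          # approximate H100 SXM thermal design power
OVERHEAD_MULTIPLIER = 2.1    # assumed facility overhead, not a reported figure

gpu_power_mw = GPU_COUNT * GPU_TDP_WATTS / 1e6      # GPUs alone, in megawatts
total_power_mw = gpu_power_mw * OVERHEAD_MULTIPLIER  # whole-facility estimate

print(f"GPUs alone:    {gpu_power_mw:.0f} MW")   # ~70 MW
print(f"With overhead: {total_power_mw:.0f} MW") # lands in the ~150 MW range
```

Under these assumptions the GPUs alone draw roughly 70 MW, and a plausible facility overhead brings the total near the 150 MW cited above; the exact figure depends heavily on the real overhead ratio.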

Looking ahead: Meta’s massive investment in AI infrastructure and its open-source approach could reshape the AI landscape.

  • If successful, Meta’s strategy of subsidizing Llama development through ad revenue could provide a sustainable model for making advanced AI accessible to a wider range of developers and researchers.
  • The company’s ability to balance innovation, ethical concerns, and environmental impact will be crucial in determining the long-term success of its AI initiatives.
