DeepMind Just Made a Breakthrough in AI Training

Google DeepMind’s JEST AI training method promises significant speed and efficiency gains over traditional techniques, potentially addressing concerns about AI’s growing power demands.

Key Takeaways:

DeepMind’s JEST (joint example selection training) method breaks from traditional AI training by selecting entire batches of data instead of individual data points:

  • A smaller reference model, bootstrapped on a curated high-quality dataset, first scores candidate data and ranks whole batches by quality.
  • The small model then selects the batches most useful for training a larger model, yielding up to 13 times faster training with 10 times less computation.
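The batch-ranking idea above can be sketched in a few lines. This is a hypothetical simplification, not DeepMind's implementation: `select_batches` and `toy_score` are illustrative names, and the toy score simply measures the fraction of "clean" examples per batch, whereas the actual JEST method scores batches with a small reference model (the paper describes a learnability signal comparing learner and reference model losses).

```python
import random

def select_batches(batches, score_fn, keep_ratio=0.25):
    """Rank candidate batches with a quality score and keep only the
    top fraction for training the larger model (a hypothetical sketch
    of batch-level, rather than per-example, data selection)."""
    ranked = sorted(batches, key=score_fn, reverse=True)
    n_keep = max(1, int(len(ranked) * keep_ratio))
    return ranked[:n_keep]

# Toy stand-in for the small model's quality score (assumption):
# the fraction of non-negative ("clean") examples in the batch.
def toy_score(batch):
    return sum(1 for x in batch if x >= 0) / len(batch)

random.seed(0)
batches = [[random.uniform(-1, 1) for _ in range(8)] for _ in range(16)]
selected = select_batches(batches, toy_score, keep_ratio=0.25)
print(len(selected))  # keeps 4 of the 16 candidate batches
```

The efficiency gain comes from the asymmetry: scoring a batch with a small model is far cheaper than a training step on the large model, so discarding low-value batches up front saves most of the expensive computation.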

Addressing AI’s Power Demands: The JEST research comes at a crucial time as discussions about AI’s extreme power demands intensify:

  • AI workloads drew an estimated 4.3 GW of power in 2023, nearly matching the annual power consumption of Cyprus.
  • A single ChatGPT request uses roughly 10 times as much energy as a Google search.
  • Arm’s CEO estimates AI may consume a quarter of the U.S. power grid by 2030.

Reliance on High-Quality Data: The success of the JEST method heavily depends on the quality of its initial training data:

  • The system relies on a human-curated dataset of the highest possible quality for its bootstrapping technique.
  • This makes the method more challenging for hobbyists or amateur AI developers to replicate, as expert-level research skills are likely required to curate the initial data.

Industry Adoption and Implications: Whether and how major AI players will adopt JEST remains uncertain:

  • Large language models like GPT-4 can cost hundreds of millions to train, so firms are likely seeking ways to reduce costs.
  • However, the competitive pressure to scale AI output may lead companies to use JEST to maintain maximum power draw for faster training rather than prioritizing energy savings.

Broader Implications:

While the JEST method promises significant efficiency gains, it remains to be seen whether the AI industry will prioritize cost savings and sustainability or use the technique to further accelerate the already rapid pace of AI development. As the costs of training cutting-edge AI models soar into the billions, the choices made by key players like Google could have profound implications for the future trajectory of artificial intelligence and its societal and environmental impacts.

Google claims new AI training tech is 13 times faster and 10 times more power efficient — DeepMind's new JEST optimizes training data for massive gains
