Apple Paper Suggests Apple Intelligence Was Trained with Google Hardware

Key revelations from Apple’s research paper: A quote buried in Apple’s “Apple Intelligence Foundation Language Models” paper suggests that the company initially relied on Google’s hardware for training its AI models:

  • The paper states that Apple’s Foundation Model (AFM) and its underlying server technology were initially trained on “v4 and v5p Cloud TPU clusters” using Apple software.
  • While a CNBC report suggested that Apple rented time on Google-hosted clusters, the research paper does not explicitly mention Google or Nvidia, implying that Apple likely purchased the hardware directly and used it within its own data centers.

Apple’s evolving AI infrastructure: As Apple increases its investment in the AI sector, the company is focusing on developing its own hardware and infrastructure for processing Apple Intelligence queries:

  • Project ACDC, Apple’s ongoing initiative, centers on using hardware derived from Apple Silicon in its data centers to handle Apple Intelligence queries.
  • The company plans to allocate $5 billion to server enhancements alone over the next two years, signaling its commitment to expanding its AI capabilities.

Controversy surrounding Apple’s AI training data: Questions have been raised about the data sources used to train Apple’s early-generation AI models.

Apple’s position in the AI market: While Apple may not have been an early influencer in the AI market, the company is making significant strides to become a major player:

  • Apple is investing heavily in AI, with plans to allocate substantial resources to server enhancements and infrastructure development.
  • The company aims to match the technological capabilities of competitors like Microsoft and Meta in the near future.
  • Apple is adopting a more open approach to its AI development, including the release of true open-source large language models (LLMs).

Analyzing deeper: Apple’s reliance on Google hardware for the early development of Apple Intelligence highlights the complex dynamics of competition and collaboration in the AI industry. As the company continues to invest in its own AI infrastructure and hardware, it remains to be seen how Apple will differentiate itself from its competitors and address concerns surrounding the ethical sourcing of training data. Moreover, Apple’s commitment to releasing open-source LLMs may signal a shift towards greater transparency in the AI development process, potentially setting a new standard for the industry as a whole.

