Apple Paper Suggests Apple Intelligence Was Trained with Google Hardware

Key revelations from Apple’s research paper: A passage buried in the company’s “Apple Intelligence Foundation Language Models” paper suggests that it initially relied on Google’s hardware to train its AI models:

  • The paper states that the Apple Foundation Model (AFM) and its underlying server technology were initially built on “v4 and v5p Cloud TPU clusters” using Apple software.
  • While a CNBC report suggested that Apple rented time on Google-hosted clusters, the research paper does not explicitly mention Google or Nvidia, implying that Apple likely purchased the hardware directly and used it within its own data centers.

Apple’s evolving AI infrastructure: As Apple increases its investment in the AI sector, the company is focusing on developing its own hardware and infrastructure for processing Apple Intelligence queries:

  • Project ACDC, Apple’s ongoing initiative, centers on using hardware derived from Apple Silicon in its data centers to handle Apple Intelligence queries.
  • The company plans to allocate $5 billion to server enhancements alone over the next two years, signaling its commitment to expanding its AI capabilities.

Controversy surrounding Apple’s AI training data: Questions have been raised about the data sources used to train Apple’s early-generation AI models.

Apple’s position in the AI market: While Apple may not have been an early influencer in the AI market, the company is making significant strides to become a major player:

  • Apple is investing heavily in AI, with plans to allocate substantial resources to server enhancements and infrastructure development.
  • The company aims to match the technological capabilities of competitors like Microsoft and Meta in the near future.
  • Apple is adopting a more open approach to its AI development, including the release of true open-source large language models (LLMs).

Analyzing deeper: Apple’s reliance on Google hardware for the early development of Apple Intelligence highlights the complex dynamics of competition and collaboration in the AI industry. As the company continues to invest in its own AI infrastructure and hardware, it remains to be seen how Apple will differentiate itself from its competitors and address concerns surrounding the ethical sourcing of training data. Moreover, Apple’s commitment to releasing open-source LLMs may signal a shift towards greater transparency in the AI development process, potentially setting a new standard for the industry as a whole.
