Apple Paper Suggests Apple Intelligence Was Trained with Google Hardware

Key revelations from Apple’s research paper: A quote buried in Apple’s “Apple Intelligence Foundation Language Models” paper suggests that the company initially relied on Google’s hardware for training its AI models:

  • The paper states that Apple’s Foundation Model (AFM) and its underlying server technology were initially built on “v4 and v5p Cloud TPU clusters” using Apple software (see the sketch after this list for what training on such TPU clusters typically looks like).
  • While a CNBC report suggested that Apple rented time on Google-hosted clusters, the research paper does not explicitly mention Google or Nvidia, implying that Apple likely purchased the hardware directly and used it within its own data centers.
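For readers curious what “training on Cloud TPU clusters” means in practice, below is a minimal, hypothetical sketch of a JAX training step replicated across the chips of a single TPU host (Apple’s published AXLearn training framework is JAX-based). The toy linear model, batch shapes, and learning rate are illustrative assumptions, not anything taken from Apple’s paper; real frontier-model training spreads the same idea across thousands of chips with additional model-parallel sharding.

```python
# A minimal, hypothetical sketch (not Apple's code) of a JAX training step
# replicated across the local chips of a Cloud TPU v4/v5p host. On a machine
# without TPUs it simply runs on whatever devices JAX finds (e.g. one CPU).
import functools
import jax
import jax.numpy as jnp

n_dev = jax.local_device_count()
print(jax.devices())  # on a TPU host this lists the attached TPU chips

def loss_fn(params, x, y):
    pred = x @ params["w"] + params["b"]  # toy linear model as a stand-in
    return jnp.mean((pred - y) ** 2)

@functools.partial(jax.pmap, axis_name="devices")  # one replica per chip (data parallel)
def train_step(params, x, y):
    grads = jax.grad(loss_fn)(params, x, y)
    grads = jax.lax.pmean(grads, axis_name="devices")  # all-reduce gradients across chips
    return jax.tree_util.tree_map(lambda p, g: p - 1e-3 * g, params, grads)

# Replicate parameters and shard an (assumed) toy batch across the local devices.
params = {"w": jnp.ones((4, 1)), "b": jnp.zeros((1,))}
params = jax.device_put_replicated(params, jax.local_devices())
x = jnp.ones((n_dev, 8, 4))   # leading axis = one shard per device
y = jnp.zeros((n_dev, 8, 1))
params = train_step(params, x, y)
```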

Apple’s evolving AI infrastructure: As Apple increases its investment in the AI sector, the company is focusing on developing its own hardware and infrastructure for processing Apple Intelligence queries:

  • Project ACDC, Apple’s ongoing initiative, centers on using hardware derived from Apple Silicon in its data centers to handle Apple Intelligence queries.
  • The company plans to allocate $5 billion to server enhancements alone over the next two years, signaling its commitment to expanding its AI capabilities.

Controversy surrounding Apple’s AI training data: Questions have also been raised about the data sources used to train Apple’s early-generation AI models.

Apple’s position in the AI market: While Apple may not have been an early influencer in the AI market, the company is making significant strides to become a major player:

  • Apple is investing heavily in AI, with plans to allocate substantial resources to server enhancements and infrastructure development.
  • The company aims to match the technological capabilities of competitors like Microsoft and Meta in the near future.
  • Apple is adopting a more open approach to its AI development, including the release of truly open-source large language models (LLMs).

Analyzing deeper: Apple’s reliance on Google hardware for the early development of Apple Intelligence highlights the complex dynamics of competition and collaboration in the AI industry. As the company continues to invest in its own AI infrastructure and hardware, it remains to be seen how Apple will differentiate itself from its competitors and address concerns surrounding the ethical sourcing of training data. Moreover, Apple’s commitment to releasing open-source LLMs may signal a shift towards greater transparency in the AI development process, potentially setting a new standard for the industry as a whole.

Source: Apple admits it relied on Google to train Apple Intelligence
