Thanks to Exo Labs, you can run powerful open-source LLMs on a Mac

Local AI computing has taken a significant leap forward: Exo Labs has enabled powerful open-source AI models to run on Apple’s new M4-powered Mac computers.

Breaking new ground: Exo Labs has successfully demonstrated running advanced large language models (LLMs) locally on Apple M4 devices, marking a significant shift away from cloud-based AI computing.

  • A cluster of four Mac Mini M4 devices and one MacBook Pro with an M4 Max, totaling around $5,000, can now run sophisticated AI models like Qwen 2.5 Coder-32B
  • This setup provides a cost-effective alternative to traditional GPU solutions, as a single Nvidia H100 GPU costs between $25,000 and $30,000
  • The cluster achieves impressive performance metrics, processing 18 tokens per second with Qwen 2.5 Coder-32B and 8 tokens per second with Nemotron-70B (a rough estimate of what those rates mean in practice follows this list)
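
To put those throughput figures in context, here is a rough back-of-the-envelope estimate of how long a single reply would take at the reported rates; the 500-token response length is purely an illustrative assumption, not a figure from Exo Labs.

```python
# Back-of-the-envelope wait-time estimate for the reported generation rates.
# The 500-token response length is an illustrative assumption, not a benchmark figure.
response_tokens = 500

for model, tokens_per_second in [("Qwen 2.5 Coder-32B", 18), ("Nemotron-70B", 8)]:
    seconds = response_tokens / tokens_per_second
    print(f"{model}: ~{seconds:.0f} s for a {response_tokens}-token reply")

# Prints roughly:
#   Qwen 2.5 Coder-32B: ~28 s for a 500-token reply
#   Nemotron-70B: ~62 s for a 500-token reply
```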

Technical innovation and accessibility: Exo Labs’ open-source software enables distributed AI computing across multiple devices, democratizing access to powerful AI capabilities.

  • The software distributes AI workloads across connected devices, allowing users to run models without requiring expensive specialized hardware
  • Anyone with coding experience can access and implement the solution through Exo’s GitHub repository; a sketch of what client-side usage might look like follows this list
  • The system works particularly well with Apple’s M4 chip, which offers what Apple calls “the world’s fastest CPU core” and superior single-threaded performance
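
For a sense of what using such a cluster could look like from a developer’s seat, here is a minimal sketch. It assumes the Exo software exposes an OpenAI-compatible chat endpoint on the host machine, as its repository describes; the port number and model identifier below are illustrative placeholders, not confirmed values.

```python
# Minimal sketch: querying a local Exo cluster through an OpenAI-style chat endpoint.
# The port (52415) and model identifier are illustrative assumptions; check Exo's
# GitHub repository for the values your installation actually uses.
import json
import urllib.request

ENDPOINT = "http://localhost:52415/v1/chat/completions"  # assumed local endpoint

payload = {
    "model": "qwen-2.5-coder-32b",  # hypothetical identifier for Qwen 2.5 Coder-32B
    "messages": [
        {"role": "user", "content": "Write a Python function that merges two sorted lists."}
    ],
    "temperature": 0.2,
}

request = urllib.request.Request(
    ENDPOINT,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

# The request is served entirely by the local Mac cluster; no prompt data leaves the network.
with urllib.request.urlopen(request) as response:
    reply = json.loads(response.read())
    print(reply["choices"][0]["message"]["content"])
```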

Privacy and control benefits: Local AI computing offers significant advantages over web-based solutions in terms of data security and user autonomy.

  • Users can process sensitive information locally without exposing data to cloud services
  • The system allows for complete control over AI model behavior and data handling
  • Enterprises in regulated industries can leverage powerful AI capabilities while maintaining data sovereignty

Future developments: Exo Labs is positioning itself for broader adoption and enterprise implementation.

  • A benchmarking website launch is planned to help users compare different hardware configurations
  • The company is developing enterprise-grade software offerings and providing bespoke services
  • Exo Labs has secured private funding to support its expansion and development efforts

Market implications: The ability to run powerful AI models locally on relatively affordable hardware could reshape the AI computing landscape and challenge the current cloud-centric paradigm.

