AMD launches open-source app Gaia that runs AI models locally on Windows PCs

AMD is entering the local AI market with Gaia, an open-source application for running large language models (LLMs) on Windows PCs. As more users look to run AI models on their own hardware for improved privacy and performance, AMD’s offering adds optimizations for its Ryzen AI processors while still working on any Windows machine. The application uses retrieval-augmented generation (RAG) to ground model responses in external context, positioning it as a noteworthy entrant in the growing space of local AI tools.

The big picture: AMD has developed Gaia as an open-source project that enables various LLM models to run locally on Windows PCs, with special optimizations for systems using Ryzen AI processors.

  • The application uses the open-source Lemonade SDK from the ONNX TurnkeyML project for LLM inference, allowing models to be adapted for different purposes, including summarization and complex reasoning.
  • Gaia works through Retrieval-Augmented Generation (RAG), combining an LLM with a knowledge base to provide more accurate and contextually aware responses.
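The RAG pattern described above can be sketched in a few lines: retrieve the knowledge-base snippets most relevant to a query, then prepend them to the prompt so the model answers with that context. This is an illustrative toy, not Gaia's code; the knowledge base, the word-overlap scoring, and the prompt format are all placeholder assumptions standing in for real embeddings and a real LLM call.

```python
def retrieve(query: str, knowledge_base: list[str], top_k: int = 2) -> list[str]:
    """Rank snippets by naive word overlap with the query (toy stand-in
    for embedding-based similarity search)."""
    query_words = set(query.lower().split())
    scored = sorted(
        knowledge_base,
        key=lambda doc: len(query_words & set(doc.lower().split())),
        reverse=True,
    )
    return scored[:top_k]

def build_prompt(query: str, knowledge_base: list[str]) -> str:
    """Augment the user's query with retrieved context before it
    reaches the LLM -- the core RAG step."""
    context = "\n".join(retrieve(query, knowledge_base))
    return f"Context:\n{context}\n\nQuestion: {query}"

# Hypothetical knowledge base for illustration only.
kb = [
    "Gaia runs large language models locally on Windows PCs.",
    "Ryzen AI processors include a dedicated NPU.",
    "The Lemonade SDK handles LLM inference across runtimes.",
]
print(build_prompt("What handles inference in Gaia?", kb))
```

A production RAG system replaces the word-overlap scoring with vector embeddings and sends the augmented prompt to an actual model, but the flow (retrieve, augment, generate) is the same.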

Key features: Gaia incorporates four specialized agent types that enable different AI-powered interactions for users.

  • Simple Prompt Completion serves as a direct model interaction tool for testing and evaluation purposes.
  • Chaty functions as the core chatbot interface for user interactions.
  • Clip provides YouTube search and Q&A functionality, expanding the system’s media capabilities.
  • Joker generates humor content, adding personality to the chatbot experience.

How it works: The application enhances user queries by processing and vectorizing external content before the LLM handles them.

  • Gaia runs LLM inference through the Lemonade SDK, which can serve models across multiple runtimes.
  • The system vectorizes external content from sources such as GitHub, YouTube, and text files, storing the results in a local vector index.
  • This pre-processing step is intended to improve response accuracy and relevance compared with standard LLM interactions.
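The pipeline above, vectorize content, store it in a local index, and query by similarity, can be illustrated with a minimal in-memory sketch. Gaia's real pipeline uses proper embedding models and a persistent local vector store; the bag-of-words vectors, cosine scoring, and `LocalVectorIndex` class here are assumptions made purely to show the shape of the idea.

```python
import math
from collections import Counter

def vectorize(text: str) -> Counter:
    """Bag-of-words term counts as a stand-in for a real embedding."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

class LocalVectorIndex:
    """Minimal local index: vectorize on insert, rank by similarity on query."""
    def __init__(self) -> None:
        self.entries: list[tuple[Counter, str]] = []

    def add(self, text: str) -> None:
        self.entries.append((vectorize(text), text))

    def query(self, text: str, top_k: int = 1) -> list[str]:
        qv = vectorize(text)
        ranked = sorted(self.entries, key=lambda e: cosine(qv, e[0]), reverse=True)
        return [t for _, t in ranked[:top_k]]

# Hypothetical content drawn from the kinds of sources the article mentions.
index = LocalVectorIndex()
index.add("transcript of a youtube video about npu inference")
index.add("readme from a github repository about chess engines")
print(index.query("how does npu inference work"))
```

The retrieved snippets would then be fed into the prompt, which is what lets the LLM answer questions about content it was never trained on.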

Technical implementation: AMD offers two different installation options to accommodate various hardware configurations.

  • A mainstream installer works on any Windows PC regardless of hardware manufacturer.
  • A “Hybrid” installer optimized for Ryzen AI PCs enables Gaia to leverage both the neural processing unit (NPU) and integrated graphics for improved performance.

Why this matters: Local LLM applications offer significant advantages over cloud-based alternatives for users concerned with privacy and performance.

  • Running AI models locally provides greater security by keeping sensitive data on-device.
  • Local operation reduces latency and can deliver better performance depending on the system hardware.
  • Perhaps most importantly, local LLMs function offline without requiring an internet connection.