Run local LLMs in your browser with this free AI extension

A new Firefox extension called Page Assist lets users interact with Ollama, a tool for running Large Language Models (LLMs) locally, through a browser-based interface rather than the command line.

What is Ollama: Ollama is a tool that allows users to run AI language models locally on their own computers, providing an alternative to cloud-based AI services and addressing privacy concerns.

  • Ollama can be installed on macOS, Linux, and Windows operating systems
  • The software runs AI models locally, meaning all data processing happens on the user’s computer rather than in the cloud
  • Local processing offers enhanced privacy compared to remote AI services (a minimal example of a purely local request follows this list)
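To make the "local" part concrete, here is a minimal sketch that sends a prompt to Ollama's default local API address (http://localhost:11434) using only Python's standard library. The model name "llama3" is just a placeholder for whichever model is installed; the request never leaves the machine.

```python
# Minimal sketch: send a prompt to a locally running Ollama instance.
# Assumes Ollama is serving on its default port (11434) and that a model
# such as "llama3" has already been downloaded.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # local endpoint; no cloud service involved

payload = json.dumps({
    "model": "llama3",   # placeholder; use any model you have installed
    "prompt": "In one sentence, what does it mean to run an LLM locally?",
    "stream": False,     # return the complete answer as a single JSON object
}).encode("utf-8")

request = urllib.request.Request(
    OLLAMA_URL,
    data=payload,
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(request) as response:
    result = json.loads(response.read())

print(result["response"])
```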

Key Features of Page Assist: The Firefox extension adds a graphical interface to Ollama, making it more accessible to users who prefer not to use command-line tools.

  • Provides easy access to model selection and management
  • Includes image upload capabilities
  • Offers a toggle to enable or disable internet search
  • Features customizable settings through a user-friendly interface

Installation Process: Setting up Page Assist requires both Ollama and Firefox to be installed on the system.

  • Users can find Page Assist in the Firefox Add-ons store
  • The extension’s source code is available on GitHub for security verification
  • While not actively monitored by Mozilla for security, the extension has positive user reviews and transparent development

Using the Extension: Page Assist streamlines interaction with Ollama through a simple workflow.

  • Users must ensure Ollama is running before using the extension (a quick check is sketched after this list)
  • The extension can be pinned to the Firefox toolbar for easy access
  • Queries are submitted through a clean, intuitive interface
  • Multiple AI models can be selected and managed through the extension
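Because the extension only talks to a local Ollama server, a quick way to confirm that server is up is to ping its default address before opening Page Assist. The sketch below is not part of the extension; it simply assumes Ollama's standard port 11434.

```python
# Minimal sketch: verify the local Ollama server is reachable before using Page Assist.
# Assumes Ollama's default address (http://localhost:11434).
import urllib.error
import urllib.request

OLLAMA_BASE = "http://localhost:11434"

try:
    with urllib.request.urlopen(OLLAMA_BASE, timeout=3) as response:
        # The root endpoint replies with a short plain-text status message.
        print(response.read().decode("utf-8"))  # typically "Ollama is running"
except urllib.error.URLError:
    print("Ollama does not appear to be running; start it before opening Page Assist.")
```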

Model Management: Page Assist includes built-in tools for managing different AI models.

  • Users can add new models directly through the extension interface
  • The system displays all installed models in a dropdown menu
  • Model installation requires consideration of file sizes, as some models run to several gigabytes (the sketch below shows how installed models and their sizes can be listed)
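For reference, the same information Page Assist shows in its model dropdown can also be read from Ollama's local API. This minimal sketch lists installed models and their on-disk sizes, again assuming only the default local address.

```python
# Minimal sketch: list locally installed Ollama models and their on-disk sizes.
# Assumes Ollama's default address (http://localhost:11434).
import json
import urllib.request

with urllib.request.urlopen("http://localhost:11434/api/tags") as response:
    models = json.loads(response.read())["models"]

for model in models:
    size_gb = model["size"] / 1e9  # the API reports sizes in bytes
    print(f"{model['name']}: {size_gb:.1f} GB")
```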

Looking Forward: While Page Assist significantly improves the accessibility of local LLMs, users should remain mindful of security considerations when installing browser extensions, even those with available source code and positive reviews.

