Run local LLMs in your browser with this free AI extension

A new Firefox extension called Page Assist lets users interact with Ollama, a tool for running large language models (LLMs) locally, through a browser-based interface instead of the command line.

What is Ollama: Ollama is a tool that allows users to run AI language models locally on their own computers, providing an alternative to cloud-based AI services and addressing privacy concerns.

  • Ollama can be installed on macOS, Linux, and Windows
  • The software runs AI models locally, meaning all data processing happens on the user’s computer rather than in the cloud
  • Local processing offers enhanced privacy compared to remote AI services
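Ollama exposes its local HTTP API on port 11434 by default, which is what makes "all processing on the user's computer" concrete: every request goes to localhost. A minimal Python sketch, assuming Ollama's documented `/api/generate` endpoint and a locally pulled model named `llama3` (substitute whatever model you have installed):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default local address


def build_generate_request(model: str, prompt: str) -> dict:
    """Build the JSON payload for Ollama's /api/generate endpoint."""
    return {"model": model, "prompt": prompt, "stream": False}


def ask_local_model(model: str, prompt: str) -> str:
    """Send a prompt to the locally running Ollama server and return its reply.

    All traffic stays on localhost -- nothing leaves the machine.
    """
    payload = json.dumps(build_generate_request(model, prompt)).encode()
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


# Example (requires Ollama to be running and the model already pulled):
# print(ask_local_model("llama3", "Summarize local LLMs in one sentence."))
```

Page Assist talks to this same local endpoint, so the privacy argument holds whether you use the extension or a script.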

Key Features of Page Assist: The Firefox extension adds a graphical interface to Ollama, making it more accessible to users who prefer not to use command-line tools.

  • Provides easy access to model selection and management
  • Includes image upload capabilities
  • Offers an option to enable or disable internet search
  • Features customizable settings through a user-friendly interface

Installation Process: Setting up Page Assist requires both Ollama and Firefox to be installed on the system.

  • Users can find Page Assist in the Firefox Add-Ons store
  • The extension’s source code is available on GitHub for security verification
  • While not actively monitored by Mozilla for security, the extension has positive user reviews and transparent development

Using the Extension: Page Assist streamlines interaction with Ollama through a simple workflow.

  • Users must ensure Ollama is running before using the extension
  • The extension can be pinned to the Firefox toolbar for easy access
  • Queries are submitted through a clean, intuitive interface
  • Multiple AI models can be selected and managed through the extension
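The first step in that workflow, confirming Ollama is running, can be automated: the server answers plain HTTP on its default port when it is up. A small Python check, assuming the default address (this mirrors what the extension has to verify before it can submit queries):

```python
import urllib.error
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default local address


def ollama_is_running(url: str = OLLAMA_URL, timeout: float = 2.0) -> bool:
    """Return True if an Ollama server answers at the given address."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False


if ollama_is_running():
    print("Ollama is up -- the extension can connect.")
else:
    print("Start Ollama first (e.g. run `ollama serve`).")
```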

Model Management: Page Assist includes built-in tools for managing different AI models.

  • Users can add new models directly through the extension interface
  • The system displays all installed models in a dropdown menu
  • Model installation requires consideration of file sizes, as some models can be quite large
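The size consideration is easy to surface programmatically: Ollama's `/api/tags` endpoint lists installed models along with their sizes in bytes. A hedged sketch of turning that response into dropdown-style labels (the `sample` data below is illustrative, shaped like the endpoint's output, not real measurements):

```python
def format_model_menu(tags_response: dict) -> list[str]:
    """Turn an Ollama /api/tags-style response into dropdown labels,
    showing each model's size so large downloads are obvious."""
    labels = []
    for model in tags_response.get("models", []):
        size_gb = model["size"] / 1e9  # bytes -> gigabytes
        labels.append(f'{model["name"]} ({size_gb:.1f} GB)')
    return labels


# Illustrative sample shaped like Ollama's /api/tags output (sizes in bytes):
sample = {
    "models": [
        {"name": "llama3:latest", "size": 4_700_000_000},
        {"name": "phi3:mini", "size": 2_300_000_000},
    ]
}
print(format_model_menu(sample))
# ['llama3:latest (4.7 GB)', 'phi3:mini (2.3 GB)']
```

In a live setup you would fetch the real list with a GET request to `http://localhost:11434/api/tags` and feed the parsed JSON to the same function.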

Looking Forward: While Page Assist significantly improves the accessibility of local LLMs, users should remain mindful of security considerations when installing browser extensions, even those with available source code and positive reviews.

