Run local LLMs in your browser with this free AI extension

A new Firefox extension called Page Assist enables users to interact with Ollama, a tool for running large language models (LLMs) locally, through a browser-based interface rather than the command line.

What is Ollama: Ollama is a tool that allows users to run AI language models locally on their own computers, providing an alternative to cloud-based AI services and addressing privacy concerns.

  • Ollama can be installed on macOS, Linux, and Windows operating systems
  • The software runs AI models locally, meaning all data processing happens on the user’s computer rather than in the cloud
  • Local processing offers enhanced privacy compared to remote AI services
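Under the hood, Ollama exposes a local HTTP API (port 11434 by default) that graphical clients such as Page Assist talk to. A minimal sketch of querying it directly, assuming Ollama is installed and running and a model such as `llama3` has already been pulled (the model name here is just an example):

```python
import json
import urllib.request


def build_generate_request(model: str, prompt: str) -> dict:
    """Build the JSON payload for Ollama's /api/generate endpoint."""
    # stream=False asks the server for a single JSON reply
    # instead of a stream of partial responses.
    return {"model": model, "prompt": prompt, "stream": False}


def ask_ollama(model: str, prompt: str,
               base_url: str = "http://localhost:11434") -> str:
    """Send a prompt to a locally running Ollama server and return the reply."""
    payload = json.dumps(build_generate_request(model, prompt)).encode()
    req = urllib.request.Request(
        f"{base_url}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


# Example (requires a running Ollama instance with the model pulled):
# print(ask_ollama("llama3", "Why run an LLM locally?"))
```

Because everything goes to `localhost`, no prompt or response ever leaves the machine, which is the privacy property the article highlights.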

Key Features of Page Assist: The Firefox extension adds a graphical interface to Ollama, making it more accessible to users who prefer not to use command-line tools.

  • Provides easy access to model selection and management
  • Includes image upload capabilities
  • Lets users enable or disable internet search
  • Features customizable settings through a user-friendly interface
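The model dropdown that Page Assist shows is populated from the same data Ollama itself exposes via its `/api/tags` endpoint, which returns the installed models as JSON. A hedged sketch of extracting the names (the sample payload below is illustrative, shaped like a real `/api/tags` response):

```python
import json


def model_names(tags_payload: str) -> list[str]:
    """Extract installed model names from Ollama's /api/tags JSON response."""
    return [m["name"] for m in json.loads(tags_payload)["models"]]


# Illustrative payload in the shape /api/tags returns:
sample = '{"models": [{"name": "llama3:latest"}, {"name": "mistral:latest"}]}'
print(model_names(sample))  # ['llama3:latest', 'mistral:latest']
```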

Installation Process: Setting up Page Assist requires both Ollama and Firefox to be installed on the system.

  • Users can find Page Assist in the Firefox Add-Ons store
  • The extension’s source code is available on GitHub for security verification
  • Although Mozilla does not actively monitor the extension for security issues, it has positive user reviews and a transparent development process

Using the Extension: Page Assist streamlines interaction with Ollama through a simple workflow.

  • Users must ensure Ollama is running before using the extension
  • The extension can be pinned to the Firefox toolbar for easy access
  • Queries are submitted through a clean, intuitive interface
  • Multiple AI models can be selected and managed through the extension
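The "make sure Ollama is running" step can be verified programmatically: the Ollama server answers plain GET requests at its base URL. A minimal reachability sketch (port 11434 is Ollama's default; the helper name is my own):

```python
import urllib.error
import urllib.request


def ollama_reachable(base_url: str = "http://localhost:11434",
                     timeout: float = 2.0) -> bool:
    """Return True if an Ollama server responds at base_url."""
    try:
        with urllib.request.urlopen(base_url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        # Connection refused or timed out: no server is listening.
        return False


# An unreachable port yields False; a running Ollama instance yields True.
print(ollama_reachable("http://127.0.0.1:9", timeout=1.0))
```

If this check fails, starting Ollama (or letting its desktop app run in the background) resolves the extension's connection errors.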

Model Management: Page Assist includes built-in tools for managing different AI models.

  • Users can add new models directly through the extension interface
  • The system displays all installed models in a dropdown menu
  • Installing models requires attention to file size, as many models occupy several gigabytes of disk space

Looking Forward: While Page Assist significantly improves the accessibility of local LLMs, users should remain mindful of security considerations when installing browser extensions, even those with available source code and positive reviews.

