
Meaningful Model for Exploring Expert Views

Transform your web browsing with AI-driven contextual understanding, right in your browser and powered by local LLMs.


🔒 Privacy First

All processing happens locally on your machine. No data is sent to external servers, ensuring complete privacy of your browsing and queries.

🤖 Local LLM Power

Leverages Ollama for local LLM capabilities, providing intelligent responses without cloud dependencies.
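For a feel of what that looks like in code, here is a minimal sketch of a generation call with the Ollama Python client. The model tag matches the one pulled in the Quick Start below; the prompt wiring is illustrative, not MMXXV's actual code:

# Minimal sketch: one generation call through the Ollama Python client.
# Prompt content and context handling here are illustrative only.
import ollama

response = ollama.chat(
    model="hf.co/bartowski/Llama-3.2-1B-Instruct-GGUF:Q4_0",
    messages=[
        {"role": "system", "content": "Answer using the supplied page context."},
        {"role": "user", "content": "What does this page say about pricing?"},
    ],
)
print(response["message"]["content"])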

🔍 Smart Context

Uses ChromaDB for efficient vector storage and retrieval, enabling contextually aware responses across multiple pages.
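A rough sketch of that retrieval pattern, assuming a simple one-chunk-per-page scheme (the collection name, IDs, and chunking are illustrative, not MMXXV's actual schema):

# Minimal sketch: store page text in ChromaDB with nomic-embed-text
# embeddings, then retrieve the best-matching chunk for a question.
import chromadb
import ollama

client = chromadb.Client()  # in-memory; a real server might persist to disk
collection = client.get_or_create_collection("pages")

def embed(text):
    # nomic-embed-text is the embedding model pulled in the Quick Start
    return ollama.embeddings(model="nomic-embed-text", prompt=text)["embedding"]

chunk = "Example page text captured from the current tab."
collection.add(ids=["page-1#0"], embeddings=[embed(chunk)], documents=[chunk])

question = "What is this page about?"
results = collection.query(query_embeddings=[embed(question)], n_results=1)
print(results["documents"][0][0])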

How It Works

(MMXXV demo animation)
  1. Install the Chrome extension and start the local server
  2. Click the extension icon to open the chat interface
  3. Add URLs or use the current tab for context
  4. Ask questions and get contextually enhanced responses (see the sketch below)
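Under the hood, steps 3 and 4 boil down to fetching a page, stripping it to text, and handing that context to the model. A rough end-to-end sketch, where the URL, question, and prompt format are placeholders rather than MMXXV's actual pipeline:

# Rough end-to-end sketch of steps 3-4: fetch a page, reduce it to text,
# and answer a question against that text. Error handling is omitted.
import requests
from bs4 import BeautifulSoup
import ollama

url = "https://example.com"  # placeholder URL added as context
html = requests.get(url, timeout=10).text
text = BeautifulSoup(html, "html.parser").get_text(separator=" ", strip=True)

answer = ollama.chat(
    model="hf.co/bartowski/Llama-3.2-1B-Instruct-GGUF:Q4_0",
    messages=[{
        "role": "user",
        "content": f"Context:\n{text[:4000]}\n\nQuestion: What is this page about?",
    }],
)
print(answer["message"]["content"])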

Quick Start

# Clone the repository
git clone https://github.com/vanzway/mmxxv.git
cd mmxxv

# Set up Python environment
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
pip install websockets chromadb beautifulsoup4 requests ollama

# Install required models
ollama pull nomic-embed-text
ollama pull hf.co/bartowski/Llama-3.2-1B-Instruct-GGUF:Q4_0

# Start the server
python mmxxv.py --server --config mmxxv.json
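Once the server is up, a tiny websocket client makes a handy smoke test. Note that the port and message shape below are pure assumptions; check mmxxv.json for the real endpoint and protocol:

# Hypothetical smoke test for the local server. The port (8765) and the
# plain-text message format are assumptions; consult mmxxv.json for the
# actual endpoint and protocol.
import asyncio
import websockets

async def ping():
    async with websockets.connect("ws://localhost:8765") as ws:
        await ws.send("What does the current tab say?")
        print(await ws.recv())

asyncio.run(ping())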