Take Control of AI: Choose Any Model and Keep Your Data Private

Thunderbolt, an open‑source AI client from Mozilla’s Thunderbird team, lets developers pick any OpenAI‑compatible model, run it on‑premises via Docker or Kubernetes, and keep all conversation data on their own servers, eliminating vendor lock‑in and enhancing privacy.

Purpose and Origin

Thunderbolt is an open‑source AI client created by Mozilla’s Thunderbird team, the developers of a 20 million‑user mail client. It provides a unified entry point that lets users choose any model that implements the OpenAI API format, host the service on‑premises, and keep all conversation data under their own control.

Architecture

The codebase is written primarily in TypeScript. The front‑end uses a modern web stack, while the back‑end can be run in Docker containers or on Kubernetes. The design follows an “offline‑first” philosophy, aiming for full functionality even without internet connectivity.

Model integration is provider‑agnostic: any service exposing an OpenAI‑compatible endpoint can be added. The documentation recommends Ollama or llama.cpp for free local inference, and commercial models such as OpenAI’s GPT‑4 or Anthropic’s Claude are supported as well.
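Because every provider speaks the same OpenAI wire format, switching between a local Ollama server and a cloud API amounts to changing a base URL and model name. A minimal TypeScript sketch of that idea (the buildChatRequest helper and the provider values are illustrative assumptions, not Thunderbolt’s actual API):

```typescript
// Build a chat-completion request for any OpenAI-compatible server.
// Only the base URL, model name, and (optional) API key differ
// between a local Ollama endpoint and a commercial cloud provider.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

interface ProviderConfig {
  baseUrl: string; // e.g. http://localhost:11434/v1 for Ollama
  model: string;   // e.g. "llama3" locally, "gpt-4" in the cloud
  apiKey?: string; // local servers usually need none
}

function buildChatRequest(provider: ProviderConfig, messages: ChatMessage[]) {
  return {
    url: `${provider.baseUrl}/chat/completions`,
    headers: {
      "Content-Type": "application/json",
      // Attach a bearer token only when the provider requires one.
      ...(provider.apiKey
        ? { Authorization: `Bearer ${provider.apiKey}` }
        : {}),
    },
    body: JSON.stringify({ model: provider.model, messages }),
  };
}

// The same request shape works for a local and a cloud model:
const local = buildChatRequest(
  { baseUrl: "http://localhost:11434/v1", model: "llama3" },
  [{ role: "user", content: "Hello" }],
);
console.log(local.url); // http://localhost:11434/v1/chat/completions
```

Keeping the provider a plain config object is what makes an “any model” client possible: adding a provider is data, not code.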

Engineering artifacts include detailed system and component diagrams, a Storybook instance for UI component testing, and Vite Bundle Analyzer for front‑end performance profiling.

Quick Start (≈5 minutes)

Clone the repository and change to the deploy directory.

Run docker-compose up to launch the back‑end services.

In the client settings, add a model provider – for example a local Ollama endpoint (http://localhost:11434) or an OpenAI API key.

Open the web UI; all conversations are stored on the local server.

For a zero‑cost experiment, install Ollama first and point Thunderbolt at the local Ollama service.
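The steps above boil down to one compose file wiring the client to a local inference server. A rough sketch of what such a file could look like (service names, images, ports, and the OPENAI_BASE_URL variable are illustrative assumptions, not the project’s actual deploy config):

```yaml
# Illustrative on-prem layout: the Thunderbolt back-end and a local
# Ollama inference server on one Docker network, so no conversation
# data ever leaves the host.
services:
  ollama:
    image: ollama/ollama
    volumes:
      - ollama-models:/root/.ollama   # persist downloaded models
    ports:
      - "11434:11434"
  thunderbolt:
    build: .
    environment:
      # Point the client at the local OpenAI-compatible endpoint
      - OPENAI_BASE_URL=http://ollama:11434/v1
    ports:
      - "8080:8080"
    depends_on:
      - ollama
volumes:
  ollama-models:
```

Inside the compose network the client reaches Ollama by service name (http://ollama:11434), while the browser reaches the web UI on the published host port.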

Target Scenarios

Enterprise deployments: On‑prem AI gateway for privacy‑sensitive sectors (finance, healthcare, legal) where employee queries must remain inside the corporate network.

Individual developers: A replacement for the ChatGPT web UI that can connect simultaneously to cloud models (e.g., GPT‑4) and local Llama models, enabling side‑by‑side comparison of answers while keeping conversation history local.

Open‑source contributors: Clear contribution guidelines, consistent code style, and an ongoing security audit make the project suitable for community involvement.
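The side‑by‑side scenario above is essentially fanning one prompt out to several configured providers and pairing each answer with its source. A TypeScript sketch of that fan‑out (the compareAnswers helper, provider names, and endpoints are illustrative assumptions; the transport is injectable so the logic can run without a live model server):

```typescript
// Fan one prompt out to several OpenAI-compatible providers and
// collect the answers for side-by-side display.
interface Provider {
  name: string;    // label shown next to the answer
  baseUrl: string; // e.g. http://localhost:11434/v1
  model: string;   // e.g. "llama3" or "gpt-4"
}

type SendFn = (provider: Provider, prompt: string) => Promise<string>;

// Real transport: POST to the provider's /chat/completions endpoint.
const sendViaHttp: SendFn = async (p, prompt) => {
  const res = await fetch(`${p.baseUrl}/chat/completions`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: p.model,
      messages: [{ role: "user", content: prompt }],
    }),
  });
  const data = await res.json();
  return data.choices[0].message.content;
};

// Query all providers in parallel; each result keeps its provider name.
async function compareAnswers(
  prompt: string,
  providers: Provider[],
  send: SendFn = sendViaHttp,
) {
  return Promise.all(
    providers.map(async (p) => ({
      provider: p.name,
      answer: await send(p, prompt),
    })),
  );
}
```

Injecting the send function keeps the comparison logic testable offline, and Promise.all means a slow cloud model never blocks the local one.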

Status

Thunderbolt is in early development. The team is expanding documentation, engaging the community, and maintaining a public roadmap. Bugs and feature requests are handled via GitHub Issues; security vulnerabilities should be reported through the official security channel. The project is released under the Mozilla Public License 2.0.

Interface Overview

Thunderbolt main interface: unified AI chat panel with model selection and session management
Tags: Docker, Kubernetes, open-source, model selection, data privacy, AI client
Written by AI Explorer