Welcome to ConfiChat: a multi-platform, privacy-focused LLM chat interface with optional encryption of chat history and assets.
ConfiChat offers the flexibility to operate fully offline or to blend offline and online capabilities.
We provide [pre-built binaries/executables]() for various platforms, making it easy to get started quickly.
Note for macOS and iOS users: Binaries are not provided due to platform restrictions. Please see the Compiling on your own section.
Note for Windows users: You may encounter a SmartScreen warning because the binaries aren't signed. When downloaded directly from the Releases section, they are safe: they are built via GitHub CI, and you can view the full build logs. And of course, you can build from source.
❤️ If you find this app useful, consider sponsoring us through GitHub Sponsors to help us secure the certificates and accounts needed for future binary distributions.
If your company needs a bespoke version with robust enterprise features, Contact Us.
If you're completely new to offline LLMs, check out this easy three-step guide to getting started (ConfiChat included) - a no-coding, no-dependencies approach.
You can also get started quickly with ConfiChat by following one of our quick start guides depending on whether you want to use local models, online models, or both.
ConfiChat is a lightweight, multi-platform chat interface designed with privacy and flexibility in mind. It supports both local and online providers.
Unlike other solutions that rely on Docker and a suite of heavy tools, ConfiChat is a standalone app that lets you focus on the models themselves rather than maintaining the UI. This makes it an ideal choice for users who prefer a streamlined, efficient interface.
All chat sessions are managed locally by the app as individual JSON files, with optional encryption available for added security.
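ConfiChat's actual encryption scheme and session schema aren't spelled out here; as a rough sketch of the concept, here is how a chat session could be serialized to its own JSON file and encrypted at rest with a symmetric key. The session fields and file name are hypothetical, and the example uses Python with the third-party `cryptography` package rather than the app's own (Flutter/Dart) code:

```python
import json
import tempfile
from pathlib import Path

from cryptography.fernet import Fernet  # third-party: pip install cryptography

# Hypothetical session structure; ConfiChat's real schema may differ.
session = {
    "title": "Trip planning",
    "messages": [
        {"role": "user", "content": "Suggest a weekend itinerary."},
        {"role": "assistant", "content": "Day 1: visit the old town..."},
    ],
}

# In a real app the key would be derived from the user's passphrase,
# not generated fresh each run.
cipher = Fernet(Fernet.generate_key())

# Each session lives in its own file; encrypt the serialized JSON before writing.
path = Path(tempfile.mkdtemp()) / "session.json.enc"
path.write_bytes(cipher.encrypt(json.dumps(session).encode("utf-8")))

# Reading it back: decrypt, then parse the JSON.
restored = json.loads(cipher.decrypt(path.read_bytes()))
```

The on-disk bytes are opaque ciphertext, so the chat content is unreadable without the key, while the decrypted round-trip restores the original session exactly.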
Local LLMs are particularly beneficial for applications requiring offline access, low-latency responses, or the handling of sensitive data that must remain on your device. They also provide more customization and privacy for niche tasks, such as journaling or private counseling.
In a nutshell, ConfiChat caters to users who value transparent control over their AI experience.
Cross-Platform Compatibility: Developed in Flutter, ConfiChat runs on Windows, Linux, Android, macOS, and iOS.
Local Model Support (Ollama and LlamaCpp): Ollama and LlamaCpp both offer a range of lightweight, open-source local models, such as Llama by Meta, Gemma by Google, and LLaVA for multimodal/image support. These models are designed to run efficiently even on machines with limited resources.
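To give a feel for what "local model support" means under the hood, here is a minimal sketch of how a chat client might talk to an Ollama server running on its default local port. The `/api/chat` endpoint and port 11434 are Ollama's documented defaults; the `build_chat_request` helper and the `llama3` model name are illustrative, not ConfiChat's actual code:

```python
import json
from urllib import request

OLLAMA_URL = "http://localhost:11434/api/chat"  # Ollama's default local endpoint


def build_chat_request(model: str, messages: list) -> request.Request:
    """Build a non-streaming chat request for a local Ollama server."""
    payload = {"model": model, "messages": messages, "stream": False}
    return request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


req = build_chat_request("llama3", [{"role": "user", "content": "Hello!"}])

# Actually sending the request requires a running Ollama instance:
#   with request.urlopen(req) as resp:
#       print(json.load(resp)["message"]["content"])
```

Because everything goes to `localhost`, the prompt and response never leave your machine - which is the core privacy benefit of the local-model path.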
OpenAI and Anthropic Support: Seamlessly integrates with OpenAI and Anthropic to provide advanced language model capabilities using your own API key. Please note that while the API does not store conversations the way ChatGPT does, OpenAI retains input data for abuse monitoring purposes. You can review their latest data retention and security policies; in particular, see "How does OpenAI handle data retention and monitoring for API usage?" in their FAQ (https://openai.com/enterprise-privacy/).
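"Using your own API key" boils down to attaching that key to requests against each provider's public REST endpoint. The endpoints and auth headers below are the providers' documented ones (OpenAI uses a `Bearer` token, Anthropic an `x-api-key` plus an `anthropic-version` header); the `provider_config` helper itself is a hypothetical sketch, not ConfiChat's implementation:

```python
import os


def provider_config(provider: str, api_key: str) -> dict:
    """Return the endpoint URL and auth headers for a given online provider."""
    if provider == "openai":
        return {
            "url": "https://api.openai.com/v1/chat/completions",
            "headers": {
                "Authorization": f"Bearer {api_key}",
                "Content-Type": "application/json",
            },
        }
    if provider == "anthropic":
        return {
            "url": "https://api.anthropic.com/v1/messages",
            "headers": {
                "x-api-key": api_key,
                "anthropic-version": "2023-06-01",
                "Content-Type": "application/json",
            },
        }
    raise ValueError(f"unknown provider: {provider}")


# The key stays on your machine (e.g. in an environment variable);
# only the requests you send carry it to the provider.
cfg = provider_config("openai", os.environ.get("OPENAI_API_KEY", "sk-placeholder"))
```

Keeping the key client-side like this is what distinguishes a bring-your-own-key app from a hosted chat service: there is no intermediary server between you and the provider.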
Privacy-Focused: Privacy is at the core of ConfiChat's development. The app is designed to prioritize user confidentiality, with optional chat history encryption ensuring that your data remains secure.
Lightweight Design: Optimized for performance with minimal resource usage.
For those who prefer to compile ConfiChat themselves, or for macOS and iOS users, we provide detailed instructions in the Compiling on your own section.
We welcome contributions from the community! Whether you're interested in adding new features, fixing bugs, or improving documentation, your help is appreciated. Please see our Contributing Guide for more details.
Your support helps us maintain and improve ConfiChat. Sponsorships are encouraged for the following items:
If you're interested in supporting ConfiChat, please visit our Sponsorship Page or if your company needs a bespoke version with robust enterprise features, Contact Us.