1runeberg / confichat

Lightweight, standalone, multi-platform, and privacy focused local LLM chat interface with optional encryption
http://confichat.ai
Apache License 2.0

ConfiChat Logo





ConfiChat Sizzle Reel



Welcome to ConfiChat – a multi-platform, privacy-focused LLM chat interface with optional encryption of chat history and assets.

ConfiChat offers the flexibility to operate either fully offline or to blend offline and online capabilities.


📦 1. Downloads

We provide pre-built binaries/executables for various platforms, making it easy to get started quickly.

Note for macOS and iOS users: Binaries are not provided due to platform restrictions. Please see the Compiling on your own section.

Note for Windows users: You may encounter a SmartScreen warning since the binaries aren't signed. The binaries are built via GitHub CI, so they are safe when downloaded directly from the Releases section; you can also view the full build logs, or build from source yourself.

❤️ If you find this app useful, consider sponsoring us through GitHub Sponsors to help us secure the certificates and accounts needed for future binary distributions.

💼 If your company needs a bespoke version with robust enterprise features, Contact Us.


📖 2. Quick Start Guides

If you're completely new to offline LLMs, check out our easy three-step guide to getting started (which includes ConfiChat) - a no-code, no-dependencies approach.

You can also get started quickly with ConfiChat by following one of our quick start guides depending on whether you want to use local models, online models, or both.
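For the local-model path, the general pattern looks like the sketch below. This is an illustrative example only, not ConfiChat's internal code: it assumes an Ollama server (one of the local providers named in this repo's topics) is running on its default port 11434 with a model such as "llama3" already pulled, and it builds a request for Ollama's /api/chat endpoint without sending it.

```python
import json
import urllib.request

# Assumption: Ollama's default local endpoint. Adjust host/port to your setup.
OLLAMA_URL = "http://localhost:11434/api/chat"

def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Construct (but do not send) a chat request for a local model."""
    payload = {
        "model": model,  # e.g. "llama3"; any model you have pulled locally
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # request a single JSON reply instead of a stream
    }
    return urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("llama3", "Hello, ConfiChat!")
print(req.full_url)  # prints http://localhost:11434/api/chat
```

To actually get a reply you would pass the request to `urllib.request.urlopen` while the Ollama server is running; ConfiChat handles this exchange for you through its UI.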


💬 3. About ConfiChat

ConfiChat is a lightweight, multi-platform chat interface designed with privacy and flexibility in mind. It supports both local and online providers.

Unlike other solutions that rely on Docker and a suite of heavy tools, ConfiChat is a standalone app that lets you focus on the models themselves rather than maintaining the UI. This makes it an ideal choice for users who prefer a streamlined, efficient interface.

All chat sessions are managed locally by the app as individual JSON files, with optional encryption available for added security.
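The one-file-per-session idea can be sketched as follows. Note this is a minimal illustration, not ConfiChat's actual schema: the field names (`id`, `messages`) are hypothetical, and the optional encryption layer is omitted here.

```python
import json
import tempfile
from pathlib import Path

# Hypothetical session layout: each chat session is one JSON file on disk.
def save_session(directory: Path, session_id: str, messages: list) -> Path:
    """Write one chat session to its own JSON file and return the path."""
    path = directory / f"{session_id}.json"
    path.write_text(json.dumps({"id": session_id, "messages": messages}, indent=2))
    return path

def load_session(path: Path) -> dict:
    """Read a chat session back from its JSON file."""
    return json.loads(path.read_text())

with tempfile.TemporaryDirectory() as tmp:
    p = save_session(Path(tmp), "session-001",
                     [{"role": "user", "content": "Hello"}])
    session = load_session(p)
    print(session["messages"][0]["content"])  # prints Hello
```

Because each session is a plain local file, you can back up, delete, or (with the optional encryption enabled) protect individual conversations without any external database.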

Local LLMs are particularly beneficial for applications requiring offline access, low-latency responses, or the handling of sensitive data that must remain on your device. They also provide more customization and privacy for niche tasks, such as journaling or private counseling.

In a nutshell, ConfiChat caters to users who value transparent control over their AI experience.


✨ 4. Key Features


🛠️ 5. Compiling your own build

For those who prefer to compile ConfiChat themselves, or for macOS and iOS users, we provide detailed instructions in the Compiling on your own section.


🤝 6. Contributing

We welcome contributions from the community! Whether you're interested in adding new features, fixing bugs, or improving documentation, your help is appreciated. Please see our Contributing Guide for more details.


💖 7. Sponsorship

Your support helps us maintain and improve ConfiChat, including securing the code-signing certificates and developer accounts needed for binary distribution.

If you're interested in supporting ConfiChat, please visit our Sponsorship Page, or, if your company needs a bespoke version with robust enterprise features, Contact Us.