

🔥 OpenChat


OpenChat is a chatbot console for everyday users that simplifies working with large language models. As AI advances, installing and using these models has become overwhelming; OpenChat addresses this with a two-step setup that produces a comprehensive chatbot console and serves as a central hub for managing multiple customized chatbots.

Currently, OpenChat supports GPT models, and we are actively working on incorporating various open-source drivers that can be activated with a single click.

Try it out:

You can try it out on openchat.so

https://github.com/openchatai/OpenChat/assets/32633162/112a72a7-4314-474b-b7b5-91228558370c

Chinese video tutorial: https://www.bilibili.com/video/BV1YX4y1H7oN

🏁 Current Features

🛣️ Roadmap:

We love hearing from you! Got any cool ideas or requests? We're all ears! So, if you have something in mind, give us a shout!

🚀 Getting Started

git clone git@github.com:openchatai/OpenChat.git
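
Then change into the cloned repository before running the setup steps below:

cd OpenChat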

Setting Up Your Environment

Note: As of July, Qdrant is our preferred open-source vector store 🚀, so no initial Pinecone registration is required. To get started, see the comprehensive Using Qdrant guide in the section below.

Before you begin, make sure to update the common.env file with the necessary keys:

OPENAI_API_KEY=# Retrieve from your [openai.com](https://www.openai.com) account
PINECONE_API_KEY=# Obtain from the "API Keys" tab in [pinecone](https://www.pinecone.io)
PINECONE_ENVIRONMENT=# Obtain after creating your index in [pinecone](https://www.pinecone.io)
VECTOR_STORE_INDEX_NAME=# Obtain after creating your index in [pinecone](https://www.pinecone.io)
STORE=pinecone

Using Azure OpenAI

Using Qdrant

If you want to switch from Pinecone to Qdrant, you can set the following environment variables:
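
A minimal sketch, assuming the variable names mirror the Pinecone block above; check common.env for the exact names used by your version:

STORE=qdrant # Value may be case-sensitive; use the spelling your common.env documents
QDRANT_URL=# URL of your Qdrant instance, e.g. http://localhost:6333
QDRANT_API_KEY=# Optional, only if your instance requires authentication (assumed name)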

Optional [To modify the chat behaviour]:

CHAIN_TYPE=# The type of chain to use: conversation_retrieval | retrieval_qa
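
For example, to keep chat history between turns you would set the conversational chain in common.env; these values presumably map to LangChain's ConversationalRetrievalChain and RetrievalQA chains, so treat that mapping as an assumption:

CHAIN_TYPE=conversation_retrieval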

Using Prebuilt Images

If you're experiencing slow internet speeds or if Docker builds are taking a long time, consider using the prebuilt images for your respective architecture. Simply comment out the unnecessary image line in the docker-compose.yml file and uncomment the appropriate prebuilt image line.

Example:

# Mac environment
image: codebanesr/openchat_llm_server:edge_amd64

# Or, for Linux environment
image: codebanesr/openchat_llm_server:edge
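
In context, the relevant service entry in docker-compose.yml would look roughly like the sketch below; the service name llm-server and the exact tag layout are assumptions, so match them against your local file:

llm-server:
    # image: openchat_llm_server:local                   # locally built image, comment this out
    image: codebanesr/openchat_llm_server:edge           # prebuilt image for Linux, uncomment the line for your architecture
    # image: codebanesr/openchat_llm_server:edge_amd64   # prebuilt image for Mac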

Note: for the Pinecone index, make sure that the dimension is set to 1536.

To build and start the containers, run make install, or in case you are using Windows:

make.bat


Getting Started with the OpenChat Django App

OpenChat has been remade in Python (Django). To start using and contributing to the new backend, follow the instructions in the OpenChat Python Guide.

Please note that the transition to the Python backend includes a breaking change related to the Qdrant vector store.

Once the installation is complete, you can access the OpenChat console at: http://localhost:8000
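
A quick way to confirm the console is reachable once the containers are up (assuming the default port mapping):

curl -I http://localhost:8000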

🚀 Unleash the Power of Native LLM

Discover the latest addition: Llama 2 support. Dive into this guide to harness Llama 2 by Meta 📖🔮


Full documentation available here

🚀 Upgrade guide:

We do our best not to introduce breaking changes. So far, whenever there is a new update you only need to run git pull followed by make install.
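
In practice, an upgrade boils down to:

git pull
make install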

❤️ Thanks:

License

This project is licensed under the MIT License.

Contributors ✨

Thanks goes to these wonderful people (emoji key):

Ikko Eltociear Ashimine 🤔 💻
Joshua Sindy 🐛
Erjan Kalybek 📖
WoahAI 🐛 💻
Tommy in Tongji 📖
codebane 💻 📖
lvalics 💻 📖

This project follows the all-contributors specification. Contributions of any kind welcome!