This is a simple Matrix bot that supports the OpenAI API and Langchain to generate responses from user input. The bot responds to the commands !gpt, !chat, !v, !pic, !new, !lc, !agent and !help, depending on the first word of the prompt.
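As a rough illustration only (not the bot's actual code), dispatching on the first word of a message could look like the minimal Python sketch below; the command set simply mirrors the list above:

KNOWN_COMMANDS = {"!gpt", "!chat", "!v", "!pic", "!new", "!lc", "!agent", "!help"}

def route(message: str):
    # Return (command, prompt) when the message starts with a known command, else None.
    first, _, rest = message.partition(" ")
    if first in KNOWN_COMMANDS:
        return first, rest.strip()
    return None

print(route("!gpt What is the meaning of life?"))  # ('!gpt', 'What is the meaning of life?')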
Docker method (Recommended):
Edit config.json or .env with proper values.
For explanations and the complete parameter list, see: https://github.com/hibobmaster/matrix_chatgpt_bot/wiki
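If you use the .env route instead of config.json, a minimal sketch could look like the lines below; the variable names are assumed here to mirror the config.json keys in upper case, so verify them against the wiki and the example env file in the repository before use:

# Assumed variable names; check against the wiki.
HOMESERVER="https://matrix.example.org"
USER_ID="@your_bot:example.org"
PASSWORD="YOUR_PASSWORD"
DEVICE_ID="YOUR_DEVICE_ID"
ROOM_ID="YOUR_ROOM_ID"
OPENAI_API_KEY="YOUR_API_KEY"
GPT_API_ENDPOINT="xxxxxxxxx"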
Create the empty database files (used for persistence only):
touch sync_db context.db manage_db
manage_db (can be ignored) is for the langchain agent, sync_db is the Matrix sync database, and context.db stores the bot's chat context.
Start the container:
sudo docker compose up -d
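If your directory does not already contain a compose file, the following minimal compose.yaml is only a sketch; the image name, container paths and mounted files are assumptions, so compare it with the compose file shipped in the repository:

# Sketch only: image name and /app paths are assumptions.
services:
  app:
    image: hibobmaster/matrixchatgptbot:latest
    container_name: matrix_chatgpt_bot
    restart: unless-stopped
    volumes:
      - ./config.json:/app/config.json
      - ./sync_db:/app/sync_db
      - ./context.db:/app/context.db
      - ./manage_db:/app/manage_db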
Normal Method:
System dependency: libolm-dev
git clone https://github.com/hibobmaster/matrix_chatgpt_bot.git
cd matrix_chatgpt_bot
python -m venv venv
source venv/bin/activate
pip install -U pip setuptools wheel
pip install -r requirements.txt
Create config.json with proper values (room_id: the bot will work in the room it is in):
{
"homeserver": "YOUR_HOMESERVER",
"user_id": "YOUR_USER_ID",
"password": "YOUR_PASSWORD",
"device_id": "YOUR_DEVICE_ID",
"room_id": "YOUR_ROOM_ID",
"openai_api_key": "YOUR_API_KEY",
"gpt_api_endpoint": "xxxxxxxxx"
}
python src/main.py
To interact with the bot, simply send a message in the Matrix room beginning with one of the following commands:
!help
Show the help message.
!gpt
To generate a one-time response:
!gpt What is the meaning of life?
!chat
To chat using the official API with context conversation:
!chat Can you tell me a joke?
!v
To chat about an image (you can refer to the screenshot):
Room level: quote an image and @bot + {prompt}
Thread level: quote an image with a {prompt}
!lc
To chat using the langchain API endpoint:
!lc All the world is a stage
!pic
To generate an image using OpenAI DALL·E or LocalAI:
!pic A bridal bouquet made of succulents
!agent
Display or set the langchain agent:
!agent list
!agent use {agent_name}
!new + {chat}
Start a new conversation.
LangChain (flowise) admin: https://github.com/hibobmaster/matrix_chatgpt_bot/wiki/Langchain-(flowise)
See the wiki for more details: https://github.com/hibobmaster/matrix_chatgpt_bot/wiki/
Room level: mention the bot with a prompt and it will reply in a thread.
Thread level: to keep context, just send the prompt in the thread directly without mentioning the bot.