Note: This is unmaintained. You should use Baibot instead of this project.
Talk to ChatGPT via any Matrix client!
A Matrix bot that uses waylaidwanderer/node-chatgpt-api to access the official ChatGPT API.
OpenAI released the official API for ChatGPT. Thus, we no longer have to use any older models or models that kept being turned off by OpenAI. This means the bot is now much more stable and much faster. However, please note: usage of the API is no longer free. If you use this bot, your OpenAI account will be charged! You might want to limit your budget in your account via the OpenAI website.
If you changed its value, you need to remove the CHATGPT_MODEL variable from your environment.
Create a copy of the example .env file:
cp .env.example .env
Adjust all required settings in the .env file before running. Optional settings can also be adjusted later:
- Set OPENAI_API_KEY in your .env file.
- Set CHATGPT_API_MODEL in your .env file. ChatGPT is the gpt-3.5-turbo model, which is the default. Please note that your OpenAI account will be charged depending on the model you use.
- If you want to route API requests through a reverse proxy, set CHATGPT_REVERSE_PROXY in the .env file.
- Optionally restrict who can use the bot with MATRIX_BLACKLIST or MATRIX_WHITELIST, and which rooms it responds in with MATRIX_ROOM_BLACKLIST or MATRIX_ROOM_WHITELIST, e.g. MATRIX_WHITELIST=:yourhomeserver.example
- Set MATRIX_BOT_USERNAME and MATRIX_BOT_PASSWORD (you can remove the password later if you want).
- Start the bot once and copy the MATRIX_ACCESS_TOKEN from the output. Once you have MATRIX_ACCESS_TOKEN, you can remove MATRIX_BOT_PASSWORD.
Note: Some actions, such as using an access token extracted via another client like Element, can cause issues with encryption later on; see the encryption troubleshooting section below.
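As a rough sketch only (all values are placeholders, and .env.example may contain additional settings that are not repeated here), a filled-in .env could look like this:
# your OpenAI API key; usage is billed to your account
OPENAI_API_KEY=sk-xxxxxxxxxxxxxxxx
# the default model
CHATGPT_API_MODEL=gpt-3.5-turbo
# Matrix bot credentials; the password can be removed once MATRIX_ACCESS_TOKEN is set
MATRIX_BOT_USERNAME=@chatgpt-bot:yourhomeserver.example
MATRIX_BOT_PASSWORD=change-me
# MATRIX_ACCESS_TOKEN=   (copy this from the bot's console output on first start)
# optional: restrict usage to users from your homeserver
MATRIX_WHITELIST=:yourhomeserver.example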
There are multiple ways to run this bot. The easiest way is to run it within Docker.
This is the recommended way to run this project. It will use the latest stable release.
docker run -it -v storage:/storage --env-file=./.env --name matrix-chatgpt-bot ghcr.io/matrixgpt/matrix-chatgpt-bot:latest
or to build locally from the latest unstable release (only do this if you have a good reason):
docker build . -t matrix-chatgpt-bot
docker run -it -v storage:/storage --env-file=./.env --name matrix-chatgpt-bot matrix-chatgpt-bot
Note: Without the -it flags in the command above, you won't be able to stop the container using Ctrl-C
Note: To see the bot's console output, you need to run docker logs matrix-chatgpt-bot
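Once the container has been created, the usual Docker commands cover day-to-day operation, for example:
docker stop matrix-chatgpt-bot      ## stop the bot
docker start matrix-chatgpt-bot     ## start it again with the existing settings
docker logs -f matrix-chatgpt-bot   ## follow the console output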
If you prefer, you can use a docker-compose file instead. Copy the content below and save it in a file named docker-compose.yml. You can use either a self-built image (run docker build . -t matrix-chatgpt-bot from your local git repo location) or the latest stable pre-built release from this repo (the recommended way).
The compose file will look for the .env file in the same folder as the docker-compose.yml. The key storage folder storage will be created in the same folder as well. Adjust the locations to your needs.
version: '3.7'
services:
  matrix-chatgpt-bot:
    container_name: matrix-chatgpt-bot
    image: ghcr.io/matrixgpt/matrix-chatgpt-bot:latest ## change to "matrix-chatgpt-bot" if you want to use your self-built image
    volumes:
      - ./storage:/storage
    env_file:
      - ./.env
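With the compose file and the .env file in place, a typical workflow looks like this (run the commands from the folder containing docker-compose.yml):
docker-compose up -d                        ## start the bot in the background
docker-compose logs -f matrix-chatgpt-bot   ## follow the console output
docker-compose down                         ## stop and remove the container (the storage folder on the host is kept)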
Important: It is strongly recommended to run this package under Docker so that you don't need to install various dependencies manually.
Nevertheless, you can also run it using the yarn package manager (get it via apt install -y yarn). You might also need a newer version of Node.js and other missing packages.
yarn
yarn build
yarn start
You only need to do this if you want to contribute code to this package.
yarn
yarn build
Encryption works great with this package but can sometimes be a bit sensitive. The following steps can help to solve the "encryption" error.
Don't use a MATRIX_ACCESS_TOKEN extracted via the Element app; use the token generated by the bot based on the MATRIX_BOT_USERNAME and MATRIX_BOT_PASSWORD set in the env file. It will be visible in the console at start-up if MATRIX_ACCESS_TOKEN is not already set:
1) Remove the MATRIX_ACCESS_TOKEN from the env file and make sure MATRIX_BOT_USERNAME & MATRIX_BOT_PASSWORD are set
2) Re-run the bot
3) Copy the token from the console output to your env file
4) Restart the bot again
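With plain Docker, that flow could look roughly like this. Because --env-file is only read when a container is created, the container has to be recreated (not just restarted) after each .env change:
docker stop matrix-chatgpt-bot && docker rm matrix-chatgpt-bot
## edit .env: remove MATRIX_ACCESS_TOKEN, make sure MATRIX_BOT_USERNAME and MATRIX_BOT_PASSWORD are set
docker run -it -v storage:/storage --env-file=./.env --name matrix-chatgpt-bot ghcr.io/matrixgpt/matrix-chatgpt-bot:latest
## copy the printed access token into .env as MATRIX_ACCESS_TOKEN, then recreate the container once more:
docker stop matrix-chatgpt-bot && docker rm matrix-chatgpt-bot
docker run -it -v storage:/storage --env-file=./.env --name matrix-chatgpt-bot ghcr.io/matrixgpt/matrix-chatgpt-bot:latest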
If all else fails, you can always reset your key storage. It is important to carry out all of the following steps, because any remaining data could lead to the next encryption error. Once everything is working, make sure not to touch the "storage" folder anymore:
1) Stop the bot
2) Delete the "storage" folder
3) Delete all user data for the matrix bot account (e.g. use Synapse-Admin) or create a fresh bot user account (you can then skip step 4)
4) Log into your bot account (e.g. via Element) and log out of all sessions
5) Verify the correctness of your env file and then run the bot setup again (e.g. via docker-compose up if you use docker-compose).
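For a docker-compose setup, the reset could look roughly like this (paths assume the layout from the compose example above):
docker-compose down      ## step 1: stop the bot
rm -rf ./storage         ## step 2: delete the key storage folder
## steps 3 and 4: clean up the bot's user data server-side or log out of all sessions, as described above
docker-compose up -d     ## step 5: start the bot again with a verified .env file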
Alternatively, you can set MATRIX_ENCRYPTION=false in your env file and restart the bot. If it was previously running with encryption switched on, you need to create a new room with the bot, as encryption can't be switched off once it has been activated.
Once the bot has started successfully, it will output the following information to your console:
[INFO] [index] Starting bot...
[INFO] [MatrixClientLite] End-to-end encryption enabled   ## this depends on your setup
[INFO] [index] Bot started!
If you run the bot with Docker, you most likely need to view this output by running docker logs matrix-chatgpt-bot
Set the temperature by using CHATGPT_TEMPERATURE in your .env file. The default is 0.8.
Here are some guidelines for setting the temperature:
| Temperature Values | Appropriate Tasks | Examples |
| --- | --- | --- |
| Below 0.5 (low) | Tasks requiring a single correct answer or predictable output | Programming |
| 0.5-0.9 (medium) | Tasks needing somewhat varied and creative content grounded in reality | E-mail response |
| Above 0.9 (high) | Tasks requiring more creative and unpredictable output | Story writing |
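For example, to get more predictable answers for programming questions, you could set a low value in your .env file (0.4 is just an illustrative choice):
CHATGPT_TEMPERATURE=0.4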
You can simply write continue to prompt the bot to continue from its previous answer.
Please report issues via GitHub. The chat room is for discussion.
Please use the search on GitHub and Matrix before asking for support.
Join #matrix-chatgpt-bot:matrix.org with any Matrix chat client or on the web!
If you've never set up a Matrix client before you can follow the prompts to get started.
GNU AGPLv3. See LICENSE