
Matrix ChatGPT Bot

Note: This is unmaintained. You should use Baibot instead of this project.

Talk to ChatGPT via any Matrix client!

Screenshot of Element iOS app showing conversation with bot

A Matrix bot that uses waylaidwanderer/node-chatgpt-api to access the official ChatGPT API.

Warning for users upgrading from version 2.x

OpenAI has released the official API for ChatGPT, so the bot no longer has to rely on older models or on models that OpenAI kept turning off. As a result, the bot is now much more stable and much faster. However, usage of the official API is no longer free: if you use this bot, your OpenAI account will be charged! You may want to set a budget limit for your account on the OpenAI website. If you changed the CHATGPT_MODEL variable, you need to remove it from your environment.

Usage

  1. Create a room
  2. Add the bot
  3. Start chatting

Features

Configure

Create a copy of the example .env file

cp .env.example .env

Adjust all required settings in the .env file before running. Optional settings can also be adjusted later.
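For illustration, a filled-in .env might look like the sketch below. Only CHATGPT_MODEL and CHATGPT_TEMPERATURE are referenced elsewhere in this README; the other variable names are placeholders, so check .env.example for the actual keys your version expects.

```shell
# Illustrative .env sketch -- all variable names except CHATGPT_TEMPERATURE
# are placeholders; see .env.example for the real keys.

# Matrix connection (placeholder names)
MATRIX_HOMESERVER_URL=https://matrix.example.org
MATRIX_ACCESS_TOKEN=your_bot_access_token

# OpenAI (placeholder name)
OPENAI_API_KEY=sk-your-key-here

# Optional sampling temperature (documented in the FAQ below)
CHATGPT_TEMPERATURE=0.8
```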

Prerequisites

Matrix

OpenAI / ChatGPT

Setup

Note: Doing any of the following can cause issues with encryption later on:

Run

There are multiple ways to run this bot. The easiest is to run it within Docker.

with Docker

This is the recommended way to run this project. It will use the latest stable release.

docker run -it -v storage:/storage --env-file=./.env --name matrix-chatgpt-bot ghcr.io/matrixgpt/matrix-chatgpt-bot:latest

or to build locally from the latest unstable release (only do this if you have a good reason):

docker build . -t matrix-chatgpt-bot
docker run -it -v storage:/storage --env-file=./.env --name matrix-chatgpt-bot matrix-chatgpt-bot

Note: Without the -it flags in the command above, you won't be able to stop the container using Ctrl-C

Note: To see the bot's console output, you need to run docker logs matrix-chatgpt-bot
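If you prefer to run the container in the background instead of attached with -it, a common pattern (standard Docker flags, nothing specific to this project) is:

```shell
# Run detached; output goes to the Docker log driver instead of your terminal
docker run -d -v storage:/storage --env-file=./.env \
  --name matrix-chatgpt-bot ghcr.io/matrixgpt/matrix-chatgpt-bot:latest

# Follow the bot's output
docker logs -f matrix-chatgpt-bot

# Stop and remove the container when needed
docker stop matrix-chatgpt-bot && docker rm matrix-chatgpt-bot
```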

with Docker Compose

If you prefer, you can use a Docker Compose file. Copy the content below into a file named docker-compose.yml. Use either a self-built image (run docker build . -t matrix-chatgpt-bot from your local git repo location) or the latest stable pre-built release from this repo (the recommended way).

Docker Compose will look for the .env file in the same folder as docker-compose.yml. The key-storage folder storage will be created in that folder as well. Adjust the locations to your needs.

  version: '3.7'
  services:
    matrix-chatgpt-bot:
      container_name: matrix-chatgpt-bot 
      image: ghcr.io/matrixgpt/matrix-chatgpt-bot:latest ## change to "matrix-chatgpt-bot" if you want to use your self-built image
      volumes:
        - ./storage:/storage
      env_file: 
        - ./.env
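Once docker-compose.yml and .env are in place, the usual Compose workflow applies (these are standard Compose commands, not project-specific):

```shell
# Start the bot in the background
docker compose up -d

# Follow its logs
docker compose logs -f matrix-chatgpt-bot

# Stop and remove the container
docker compose down
```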

without Docker

Important: It is strongly recommended to run this package under Docker so that you don't have to install the various dependencies manually. Nevertheless, you can also run it using the yarn package manager (get it via apt install -y yarn). You may also need a newer version of Node.js and other missing packages.
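Running without Docker would follow the typical Node.js/yarn workflow sketched below; the script names (build, start) are assumptions, so check package.json for the scripts your version actually defines.

```shell
# Install dependencies declared in package.json
yarn install

# Script names below are assumptions -- verify them in package.json
yarn build
yarn start
```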

in Development

You only need to do this if you want to contribute code to this package.

Good to know

FAQ

I get "[Error: decryption failed because the room key is missing]"

Encryption works well with this package but can sometimes be a bit sensitive. The following steps can help resolve the "encryption" error:

I want to chat with the bot without dealing with encryption problems

I get "{ errcode: 'M_NOT_FOUND', error: 'Event not found.' }"

How do I know that the bot is running successfully?

Once the bot has started successfully, it will output the following information to your console.

I use Docker but I don't see any console output

You most likely need to view the logs by running docker logs matrix-chatgpt-bot

How to set the temperature

Set the temperature by using CHATGPT_TEMPERATURE in your .env file. The default is 0.8.

Here are some guidelines for setting the temperature:

| Temperature values | Appropriate tasks | Examples |
| --- | --- | --- |
| Below 0.5 (low) | Tasks requiring a single correct answer or predictable output | Programming |
| 0.5-0.9 (medium) | Tasks needing somewhat varied and creative content grounded in reality | E-mail responses |
| Above 0.9 (high) | Tasks requiring more creative and unpredictable output | Story writing |
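For instance, to bias the bot toward predictable output for programming help, you might set in your .env:

```shell
# Low temperature -> more deterministic, repeatable answers
CHATGPT_TEMPERATURE=0.2
```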

The response I receive is excessively long and gets truncated. Unlike ChatGPT, my Matrix client does not have a "Continue generating" button.

You can simply write continue to prompt the bot to continue from its previous answer.

Reporting issues

Please report issues via GitHub. The chat room is for discussion.

Please use the search on GitHub and in Matrix before asking for support.

Discussion

Join #matrix-chatgpt-bot:matrix.org with any Matrix chat client or on the web!

If you've never set up a Matrix client before you can follow the prompts to get started.

License

GNU AGPLv3. See LICENSE