tao-multibot

LLM chat-bot infrastructure service featuring multiple bot instances with custom plugins (code/SQL execution, document processing, etc.) and frontends (Telegram, WhatsApp, web).

Join the project's Telegram group and share your ideas.

Getting started

python3 -m virtualenv .venv
source .venv/bin/activate

pip install -r requirements.txt

pytest

Then:

Create master.json:

{
    "infra": {
        "debug": false,
        "server": {
            "port": 8080
        },
        "postgres": {
            "enabled": false,
            "host": "localhost",
            "port": 5432,
            "user": "postgres",
            "password": "password",
            "schemas": "public"
        },
        "influxdb": {
            "enabled": false,
            "url": "http://localhost:8086",
            "org": "org",
            "bucket": "bucket",
            "token": "token123123token"
        }
    },
    "bots": {
        "<YOUR BOT USERNAME>": {
            "bot_id": "<YOUR BOT USERNAME>",
            "type": "tg_bot",
            "token": "<TELEGRAM BOT TOKEN>",
            "tao_bot": {
                "username": "<YOUR BOT USERNAME>",
                "chats": [],
                "admins": [
                    "<YOUR USERNAME>"
                ],
                "users": [],
                "bot_mention_names": [
                    "tao",
                    "тао"
                ],
                "control_chat_id": "-",
                "messages_per_completion": 20,
                "system_prompt": "./"
            },
            "gpt": {
                "url": "https://api.openai.com/v1/chat/completions",
                "type": "openai",
                "token": "<YOUR OPENAI TOKEN>",
                "model": "gpt-3.5",
                "temperature": 0,
                "max_tokens": 1000,
                "top_p": 1,
                "frequency_penalty": 0,
                "presence_penalty": 0
            }
        }
    }
}

Note that Postgres and InfluxDB are disabled here, so chat messages are kept in memory. You can configure these services later.
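
Since the configuration is plain JSON, a quick way to catch syntax mistakes before launching is to parse the file (an optional convenience check, not part of the project itself):

python -c "import json; json.load(open('master.json')); print('master.json parses')"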

With the config in place, fire up the platform:

python entrypoint.py
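
To confirm the HTTP server is listening on the port configured under infra.server (8080 in the example above), a simple connection check works; which routes it actually serves depends on the configured frontends:

python -c "import socket; socket.create_connection(('localhost', 8080), timeout=2); print('listening on :8080')"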

Further steps

Features

Copilot notebook

The Copilot notebook contains code that builds a system prompt out of all project files. Using this prompt, GPT can help you write code components and tests.
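
The notebook's exact code is not reproduced here; the following is a minimal Python sketch of the same idea, where the file filter and prompt wording are assumptions rather than the notebook's actual implementation:

from pathlib import Path

# Sketch: concatenate the project's text files into a single system prompt.
# The suffix filter and prompt formatting are illustrative assumptions.
INCLUDE_SUFFIXES = {".py", ".md", ".json"}

def build_system_prompt(root: str = ".") -> str:
    parts = ["You are a coding assistant for this project. The project files follow.\n"]
    for path in sorted(Path(root).rglob("*")):
        if path.suffix in INCLUDE_SUFFIXES and ".venv" not in path.parts:
            parts.append(f"\n### {path}\n{path.read_text(encoding='utf-8', errors='ignore')}")
    return "".join(parts)

if __name__ == "__main__":
    prompt = build_system_prompt()
    print(f"{len(prompt)} characters")  # pass `prompt` as the system message to the LLM API
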

Logical diagram

See doc/logical_diagram.png.