hibobmaster / matrix_chatgpt_bot

A simple matrix bot that supports image generation and chatting using ChatGPT, Langchain
https://matrix.to/#/#public:matrix.qqs.tw
MIT License

Support other LocalAI engines #42

Closed 35develr closed 6 months ago

35develr commented 6 months ago

I want to run LocalAI with Mistral-7B, with both matrix_chatgpt_bot and LocalAI running in Docker. But when I configure it in config.json, I get this error:

```
raise NotImplementedError(
NotImplementedError: Engine mistral is not supported. Select from ['gpt-3.5-turbo', 'gpt-3.5-turbo-16k', 'gpt-3.5-turbo-0613', 'gpt-3.5-turbo-16k-0613', 'gpt-4', 'gpt-4-32k', 'gpt-4-0613', 'gpt-4-32k-0613']
```

I added `"mistral"` to the `ENGINES` list in gptbot.py, but it had no effect:

```python
ENGINES = [
    "gpt-3.5-turbo",
    "gpt-3.5-turbo-16k",
    "gpt-3.5-turbo-0613",
    "gpt-3.5-turbo-16k-0613",
    "gpt-4",
    "gpt-4-32k",
    "gpt-4-0613",
    "gpt-4-32k-0613",
    "mistral",
]
```

Please Help ;-)
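For context, the error above comes from a hard-coded whitelist check. The sketch below shows the pattern (the class and parameter names are assumptions; only the `ENGINES` list and the error message come from the traceback):

```python
# Minimal sketch (names assumed) of the whitelist check behind the traceback above.
ENGINES = [
    "gpt-3.5-turbo", "gpt-3.5-turbo-16k", "gpt-3.5-turbo-0613",
    "gpt-3.5-turbo-16k-0613", "gpt-4", "gpt-4-32k",
    "gpt-4-0613", "gpt-4-32k-0613",
]

class Chatbot:
    def __init__(self, engine: str = "gpt-3.5-turbo") -> None:
        # Any engine outside the hard-coded list is rejected up front,
        # regardless of what the backend (e.g. LocalAI) actually serves.
        if engine not in ENGINES:
            raise NotImplementedError(
                f"Engine {engine} is not supported. Select from {ENGINES}"
            )
        self.engine = engine
```

One common reason an edit like the one above "has no effect" in a Docker setup is that the container still runs the unmodified image; the image would need to be rebuilt for a source change to take effect.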

**EDIT:**

I solved this problem by naming the model `gpt-4` in LocalAI and using `gpt-4` in the matrix_chatgpt_bot config. In LocalAI's docker-compose.yaml I set:

```yaml
...
    environment:
      - DEBUG=true
      - MODELS_PATH=/models

      # You can preload different models here as well.
      # See: https://github.com/go-skynet/model-gallery
      - 'PRELOAD_MODELS=[{"url": "github:go-skynet/model-gallery/mistral.yaml", "name": "gpt-4"}]'
...
```
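The matching bot-side config might then look roughly like this (both key names here are assumptions for illustration; check the project's sample config.json for the real ones):

```json
{
  "api_endpoint": "http://localai:8080/v1/chat/completions",
  "gpt_model": "gpt-4"
}
```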


hibobmaster commented 6 months ago

Glad you have solved it.

hibobmaster commented 6 months ago

You can use a custom model name now, e.g. `"gpt_model": "mistral"`:

https://github.com/hibobmaster/matrix_chatgpt_bot/commit/96a83fd8242c69fa8b4912ca17deb18d1064d27d
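With that commit, pointing the bot at a LocalAI model under its real name becomes a config-only change, along these lines (only the `gpt_model` key is confirmed by the comment above; the endpoint key and URL are assumed for illustration):

```json
{
  "api_endpoint": "http://localai:8080/v1/chat/completions",
  "gpt_model": "mistral"
}
```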