I want to run Mistral 7B through LocalAI, with both matrix_chatgpt_bot and LocalAI running in Docker. But when I configure it in config.json I get this error:

raise NotImplementedError(
NotImplementedError: Engine mistral is not supported. Select from ['gpt-3.5-turbo', 'gpt-3.5-turbo-16k', 'gpt-3.5-turbo-0613', 'gpt-3.5-turbo-16k-0613', 'gpt-4', 'gpt-4-32k', 'gpt-4-0613', 'gpt-4-32k-0613']
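For context, the error appears to come from a hard-coded engine whitelist in gptbot.py. The check is presumably along these lines (hypothetical sketch, not the bot's actual code):

```python
# Hypothetical sketch of the whitelist check behind the NotImplementedError
# (the real gptbot.py code may differ in structure and naming).
ENGINES = [
    "gpt-3.5-turbo", "gpt-3.5-turbo-16k", "gpt-3.5-turbo-0613",
    "gpt-3.5-turbo-16k-0613", "gpt-4", "gpt-4-32k",
    "gpt-4-0613", "gpt-4-32k-0613",
]

def validate_engine(engine: str) -> str:
    """Reject any model name that is not in the whitelist."""
    if engine not in ENGINES:
        raise NotImplementedError(
            f"Engine {engine} is not supported. Select from {ENGINES}"
        )
    return engine
```

This is why any model name other than the listed OpenAI ones is rejected, regardless of what LocalAI actually serves.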
I solved this problem by naming the model "gpt-4" in LocalAI and using gpt-4 in the matrix_chatgpt_bot config.
Set in docker-compose.yaml of LocalAI:

...
    environment:
      - DEBUG=true
      - MODELS_PATH=/models
      # You can preload different models here as well.
      # See: https://github.com/go-skynet/model-gallery
      - 'PRELOAD_MODELS=[{"url": "github:go-skynet/model-gallery/mistral.yaml", "name": "gpt-4"}]'
...
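With that preload in place, the alias can be sanity-checked against LocalAI's OpenAI-compatible chat completions endpoint, independent of the bot. A minimal sketch, assuming LocalAI listens on its default port 8080 on localhost:

```python
import json
import urllib.request

# Assumed address of the LocalAI container (default port 8080).
LOCALAI_URL = "http://localhost:8080/v1/chat/completions"

def build_payload(prompt: str, model: str = "gpt-4") -> dict:
    # "gpt-4" is just the alias LocalAI serves Mistral 7B under,
    # as configured via PRELOAD_MODELS above.
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def ask(prompt: str) -> str:
    """Send one chat request to LocalAI and return the reply text."""
    req = urllib.request.Request(
        LOCALAI_URL,
        data=json.dumps(build_payload(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

If this returns a completion, the bot's "gpt-4" requests will hit Mistral 7B as well.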
For reference, I had first tried adding "mistral" to the ENGINES list in gptbot.py, but that had no effect:

ENGINES = [
    "gpt-3.5-turbo",
    "gpt-3.5-turbo-16k",
    "gpt-3.5-turbo-0613",
    "gpt-3.5-turbo-16k-0613",
    "gpt-4",
    "gpt-4-32k",
    "gpt-4-0613",
    "gpt-4-32k-0613",
    "mistral",
]