iuiaoin / wechat-gptbot

A wechat robot based on ChatGPT with no risk, very stable! 🚀

[Bug]: Potential bug in bot init #119

Open Niomax32 opened 6 months ago

Niomax32 commented 6 months ago

Search for answers in existing issues

Python version

python 3.10

Issue description

I am not sure if this is a bug or an intended change, but this line doesn't look right: https://github.com/iuiaoin/wechat-gptbot/blob/main/bot/bot.py#L18

A ChatGPTBot() is created for models that are in the litellm support list, while the following else block creates a LiteLLMChatGPTBot() for models that are not in that list.

This actually causes problems when the model is not supported by the pinned version of litellm. For instance, if you set the model to gpt-4-turbo-preview in the config, the bot breaks completely, because a LiteLLMChatGPTBot() gets created and it doesn't support that model.

It probably makes sense to change the line to `elif model not in litellm.open_ai_chat_completion_models`.
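
For clarity, here is a minimal sketch of the dispatch as I understand it. This is not the verbatim bot/bot.py code: the factory name, signature, and import paths are placeholders, and the real file may have more branches (hence the elif). Only the membership check and the two bot classes come from the line in question.

```python
import litellm

from bot.chatgpt import ChatGPTBot          # assumed import path
from bot.litellm import LiteLLMChatGPTBot   # assumed import path


def create_bot(model: str):
    # Current behavior as reported: a model found in litellm's OpenAI chat
    # list gets ChatGPTBot, everything else falls to LiteLLMChatGPTBot.
    if model in litellm.open_ai_chat_completion_models:
        return ChatGPTBot()
    # A model unknown to the pinned litellm version (e.g. gpt-4-turbo-preview)
    # lands here, and LiteLLMChatGPTBot then rejects it.
    return LiteLLMChatGPTBot()


def create_bot_suggested(model: str):
    # Suggested flip: models litellm does NOT know about go to ChatGPTBot,
    # so LiteLLMChatGPTBot only ever sees models litellm actually supports.
    if model not in litellm.open_ai_chat_completion_models:
        return ChatGPTBot()
    return LiteLLMChatGPTBot()
```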

BTW, the pinned litellm dependency is quite out of date and probably needs either an upgrade or deprecation.

Repro steps

  1. Set the config model to gpt-4-turbo-preview, which should be supported by ChatGPTBot
  2. Start the bot and watch it break inside LiteLLMChatGPTBot; the check below shows why
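
A quick way to confirm why step 2 fails, assuming the same pinned litellm version as the repo (the snippet is only illustrative):

```python
import litellm

# With the old pinned litellm, gpt-4-turbo-preview is missing from the list
# that the dispatch in bot.py relies on, so the else branch picks
# LiteLLMChatGPTBot and the model is rejected.
print("gpt-4-turbo-preview" in litellm.open_ai_chat_completion_models)
# Prints False on an outdated litellm version
```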

Relevant log output

No response