matrixgpt / matrix-chatgpt-bot

Talk to ChatGPT via any Matrix client!

Feature Request: Change model mid-conversation #219

Closed: Jasonthefirst closed this issue 7 months ago

Jasonthefirst commented 10 months ago

Sometimes I realize mid-conversation that I need the bigger context length (gpt-3.5-turbo-16k) or the better understanding (gpt-4) instead of the model I am currently using. It would be nice if it were possible to change the model mid-conversation with a new command. For example, if the conversation needed a longer context, you could write `!changemodel gpt-3.5-turbo-16k` and the conversation would continue with the bigger model.
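
A rough sketch of what the proposed command handling could look like on the bot side (the names `allowedModels`, `currentModel`, and `handleMessage` are purely illustrative, not existing code in this repo):

```typescript
// Hypothetical sketch of the proposed !changemodel command.
const allowedModels = ["gpt-3.5-turbo", "gpt-3.5-turbo-16k", "gpt-4"];
let currentModel = "gpt-3.5-turbo";

function handleMessage(body: string): string | undefined {
  if (body.startsWith("!changemodel ")) {
    const requested = body.slice("!changemodel ".length).trim();
    if (!allowedModels.includes(requested)) {
      return `Unknown model: ${requested}`;
    }
    currentModel = requested;
    return `Switched to ${requested}; the conversation continues with it.`;
  }
  // Any other message would be forwarded to the current model as usual.
  return undefined;
}
```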

max298 commented 7 months ago

After looking into the official API documentation, I don't think that's something OpenAI allows you to do. You set the model before you start your prompts and you cannot change it inside the current context. You could achieve that behaviour by creating a new conversation instance and loading in all of your previous messages; I think this should even work quite well using the Assistants API. However, it would also require accessing your messages unencrypted, which is something we don't want to do. Sorry, but I don't feel like we will ever implement something like this.
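
For what it's worth, here is a minimal sketch of the "new conversation with replayed history" idea against the Chat Completions API. It assumes the official `openai` Node package (v4) and a plain in-memory history, which is not how this bot actually stores or encrypts conversations:

```typescript
// Sketch only: switching models by replaying stored history in a fresh
// request that names a different model; the API itself has no "switch"
// operation on an existing context.
import OpenAI from "openai";
import type { ChatCompletionMessageParam } from "openai/resources/chat/completions";

const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment

// Previous turns of the conversation, kept client-side (illustrative data).
const history: ChatCompletionMessageParam[] = [
  { role: "system", content: "You are a helpful assistant." },
  { role: "user", content: "Summarise this long document..." },
  { role: "assistant", content: "Here is a summary..." },
];

async function continueWithModel(model: string, nextUserMessage: string) {
  const response = await openai.chat.completions.create({
    model, // e.g. "gpt-4" or "gpt-3.5-turbo-16k"
    messages: [...history, { role: "user", content: nextUserMessage }],
  });
  return response.choices[0].message.content;
}
```

The catch, as noted above, is that the bot would need the decrypted message history available in order to rebuild the `messages` array, which conflicts with how encrypted rooms are meant to be handled.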