jackMort / ChatGPT.nvim

ChatGPT Neovim Plugin: Effortless Natural Language Generation with OpenAI's ChatGPT API
Apache License 2.0

Change model on the fly #415

Closed barnii77 closed 1 week ago

barnii77 commented 3 months ago

Earlier today, I raised an issue here: https://github.com/jackMort/ChatGPT.nvim/issues/414. Then I decided to dive into the codebase and add the feature myself, using my proposed change.

What's new:

`openai_params.model` can now be either a constant string (the only option so far) or a function that returns the model name as a string (new). The model is thus computed on the fly when an API request is made.

Motivation for the change

Until now, it has not been possible to change the model you are using on the fly. With the modifications this PR introduces, one can pass a function instead of a fixed model string; the function returns the model to be used and is evaluated on each API invocation.
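With the change described above, a setup call might look like this (a minimal sketch; `vim.g.my_chatgpt_model` is a hypothetical variable chosen here only for illustration):

```lua
-- Sketch of the new option; vim.g.my_chatgpt_model is hypothetical.
require("chatgpt").setup({
  openai_params = {
    -- previously only a fixed string was accepted here; with this PR
    -- a function is also allowed and is called on every API request
    model = function()
      return vim.g.my_chatgpt_model or "gpt-3.5-turbo"
    end,
  },
})
```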

Code changes:

Testing

I have tested every related command (`ChatGPT`, `ChatGPTActAs`, `ChatGPTRun`) and they're all still working perfectly.

Apology letter (lol)

Finally, I want to apologize for the many "debugging" commits. While working on the feature, I found it easiest to commit the changes and let lazy.nvim pull them into my editor so I could test them, then go back to the plugin.

jackMort commented 3 months ago

Great, thanks! Could you please reformat to satisfy stylua and make the PR more readable?

barnii77 commented 3 months ago

> Great, thanks! Could you please reformat to satisfy stylua and make the PR more readable?

I have now refactored the PR to (hopefully) be more readable and have tried to fix the formatting. This has been quite troublesome, as I have been on holiday since the day after I raised the PR and don't have my laptop with me... Could you please approve the workflow so I can check whether the reformatting worked?

HPRIOR commented 2 months ago

Would love to see this feature merged

HPRIOR commented 2 months ago

It would also be great to have some kind of Lua API exposed that allowed model parameters to be set. Something like:

`require('chatgpt').open_chat({ ... model params ... })`

barnii77 commented 2 months ago

> It would also be great to have some kind of Lua API exposed that allowed model parameters to be set. Something like:
>
> `require('chatgpt').open_chat({ ... model params ... })`

Maybe, though I don't think there's much need for that. I don't see much utility in changing any settings other than the model and the endpoint at runtime. If you want to open a chat with a different model, you can already do that by changing what the function returns and then calling `vim.cmd("ChatGPTRun")`.

barnii77 commented 2 months ago

> Would love to see this feature merged

By the way, until it's merged you can already use this feature by specifying my fork instead of the main repo :) I've been using it ever since I made the PR ... it's 0 commits behind, so it shouldn't matter.

HPRIOR commented 2 months ago

> It would also be great to have some kind of Lua API exposed that allowed model parameters to be set. Something like:
>
> `require('chatgpt').open_chat({ ... model params ... })`

> Maybe, though I don't think there's much need for that. I don't see much utility in changing any settings other than the model and the endpoint at runtime. If you want to open a chat with a different model, you can already do that by changing what the function returns and then calling `vim.cmd("ChatGPTRun")`.

For my use case the function is unnecessarily complicated, when you could pass a simple argument to the API.

barnii77 commented 2 months ago

> It would also be great to have some kind of Lua API exposed that allowed model parameters to be set. Something like:
>
> `require('chatgpt').open_chat({ ... model params ... })`

> Maybe, though I don't think there's much need for that. I don't see much utility in changing any settings other than the model and the endpoint at runtime. If you want to open a chat with a different model, you can already do that by changing what the function returns and then calling `vim.cmd("ChatGPTRun")`.

> For my use case the function is unnecessarily complicated, when you could pass a simple argument to the API.

I encourage you to look at the source and try implementing your feature. If you can, great: make a PR. If you can't, use my solution. I found my approach to be the most general and the simplest way to both implement it and get the job done.

barnii77 commented 2 months ago

> It would also be great to have some kind of Lua API exposed that allowed model parameters to be set. Something like:
>
> `require('chatgpt').open_chat({ ... model params ... })`

> Maybe, though I don't think there's much need for that. I don't see much utility in changing any settings other than the model and the endpoint at runtime. If you want to open a chat with a different model, you can already do that by changing what the function returns and then calling `vim.cmd("ChatGPTRun")`.

> For my use case the function is unnecessarily complicated, when you could pass a simple argument to the API.

Also, a function is not complicated at all; it is supposed to be a one-liner: `function() return chatgpt_config.model end`. Then, when you want to change models, you don't have to touch the function, you just change `chatgpt_config.model`.
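Concretely, the one-liner pattern described here might look like this in a config (a sketch; `chatgpt_config` is a table the user owns, not part of the plugin):

```lua
-- Sketch of the suggested pattern; chatgpt_config is a table you own.
local chatgpt_config = { model = "gpt-3.5-turbo" }

require("chatgpt").setup({
  openai_params = {
    -- the one-liner: re-evaluated on every API request
    model = function() return chatgpt_config.model end,
  },
})

-- later, switching models is just a table mutation:
-- chatgpt_config.model = "gpt-4"
```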

HPRIOR commented 2 months ago

> It would also be great to have some kind of Lua API exposed that allowed model parameters to be set. Something like:
>
> `require('chatgpt').open_chat({ ... model params ... })`

> Maybe, though I don't think there's much need for that. I don't see much utility in changing any settings other than the model and the endpoint at runtime. If you want to open a chat with a different model, you can already do that by changing what the function returns and then calling `vim.cmd("ChatGPTRun")`.

> For my use case the function is unnecessarily complicated, when you could pass a simple argument to the API.

> Also, a function is not complicated at all; it is supposed to be a one-liner: `function() return chatgpt_config.model end`. Then, when you want to change models, you don't have to touch the function, you just change `chatgpt_config.model`.

In my case, I am developing a Telescope extension that offers interactive and chat modes through a picker. I'd rather not have users of the extension declare a function in their ChatGPT.nvim config. It would be much nicer if the Telescope extension could use the API I suggested above.

thiswillbeyourgithub commented 1 week ago

Just a question: we can set the model via a runtime function, but not the API key, right? As far as I understand, when a shell command is given as the API key, it is only executed at startup and the key can't be changed afterwards. Being able to change it would be great, as it would allow switching, for example, from gpt-4o to openrouter.ai's Claude 3.5 Sonnet. If you don't know openrouter.ai: it is a website offering a common OpenAI-style API for pretty much any LLM, whereas, for example, Anthropic's own API is not compatible.

Any idea how I can change the API key at runtime?

barnii77 commented 1 week ago

> Just a question: we can set the model via a runtime function, but not the API key, right? As far as I understand, when a shell command is given as the API key, it is only executed at startup and the key can't be changed afterwards. Being able to change it would be great, as it would allow switching, for example, from gpt-4o to openrouter.ai's Claude 3.5 Sonnet. If you don't know openrouter.ai: it is a website offering a common OpenAI-style API for pretty much any LLM, whereas, for example, Anthropic's own API is not compatible.
>
> Any idea how I can change the API key at runtime?

You can make your Neovim config write the key to a file, and have the command read the file and echo its contents.
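As a sketch of that workaround, assuming the plugin's `api_key_cmd` option and an example file path (note that, as the next replies point out, the command is only run at startup):

```lua
-- Sketch: the key lives in a file your config can rewrite; the path
-- below is just an example.
local key_file = vim.fn.expand("~/.config/chatgpt/api_key")

require("chatgpt").setup({
  -- the shell command is executed once at startup to obtain the key
  api_key_cmd = "cat " .. key_file,
})

-- elsewhere in your config, rewrite the file before (re)starting Neovim:
-- vim.fn.writefile({ "sk-..." }, key_file)
```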

thiswillbeyourgithub commented 1 week ago

Yes, but that's just at startup, not at runtime, right?

barnii77 commented 1 week ago

> Yes, but that's just at startup, not at runtime, right?

Oh, right, I forgot. I guess you could try changing internal variables of the plugin. That didn't work for the model, but maybe it does for the API key.

That's the thing I tried for the model before adding the feature to the plugin.