jackMort / ChatGPT.nvim

ChatGPT Neovim Plugin: Effortless Natural Language Generation with OpenAI's ChatGPT API
Apache License 2.0

would like to specify model used by the default ChatGPT command #403

Open l-bowman opened 9 months ago

l-bowman commented 9 months ago

Is it not possible to change the model used by the default command, or am I missing something? I have no problem specifying the model for custom actions, but I can't figure out what needs to be changed to set a different default ChatGPT model. Thanks!

ilan-schemoul commented 8 months ago

It is set in the config object you pass to setup:

openai_params = {
    model = "gpt-3.5-turbo",
}

As explained in the README, the default settings are defined here: https://github.com/jackMort/ChatGPT.nvim/blob/f1453f588eb47e49e57fa34ac1776b795d71e2f1/lua/chatgpt/config.lua#L10-L182
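
To make that concrete, here is a minimal sketch of a complete setup call; "gpt-4" is only an example, substitute any chat model your API key can access:

-- Minimal sketch: override the model used by the default chat via openai_params.
-- "gpt-4" is an example model name, not a recommendation.
require("chatgpt").setup({
  openai_params = {
    model = "gpt-4",
  },
})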

l-bowman commented 7 months ago

I find that my model choices are not respected. It always defaults to GPT-3.5. I have confirmed that I have API access to GPT-4.

l-bowman commented 7 months ago

Also, I can write a custom action that uses GPT-4, but the default chat doesn't seem to respect the model parameter.

l-bowman commented 7 months ago

Never mind, I found the issue: a spelling mistake on my end. Thank you, and apologies.

JeancarloBarrios commented 7 months ago

I have the same issue, and I think there are no spelling mistakes. Do you have any recommendations?

aklt commented 2 months ago

Same issue here.

Calling require('chatgpt').setup() with a config that includes openai_edit_params still falls back to gpt-3.5-turbo, as set here: https://github.com/jackMort/ChatGPT.nvim/blob/main/lua/chatgpt/config.lua#L170

It does not seem possible to override the default.
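
For reference, a minimal sketch of the kind of call being described; the keys come from this thread and the linked config.lua, and the model names are examples only:

-- Sketch of a setup() call that overrides both the chat and the edit models.
-- openai_params covers the default chat; openai_edit_params covers the
-- edit-with-instructions flow, which this comment reports as still falling
-- back to gpt-3.5-turbo.
require("chatgpt").setup({
  openai_params = {
    model = "gpt-4",
  },
  openai_edit_params = {
    model = "gpt-4",
  },
})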

EmVee381 commented 2 months ago

Same issue here.


I just found that it uses the model set in lua/chatgpt/flows/actions/actions.json; I changed it there and it works.
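
If editing files inside the plugin checkout is undesirable, the shipped config also appears to expose an actions_paths option for loading your own actions file; this is an assumption based on the default config.lua, and the path below is hypothetical. Its JSON should mirror the entries in lua/chatgpt/flows/actions/actions.json:

-- Assumption: actions_paths accepts a list of user-owned actions JSON files,
-- so an action's model can be changed without editing the plugin source.
-- The path is hypothetical; mirror the structure of
-- lua/chatgpt/flows/actions/actions.json in that file.
require("chatgpt").setup({
  actions_paths = { vim.fn.stdpath("config") .. "/chatgpt_actions.json" },
})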

l-bowman commented 1 month ago

Something valuable to know: you can press Ctrl-o in the chat window to see the current model settings in use.