Open l-bowman opened 9 months ago
It is in the config object you pass to setup():
openai_params = {
  model = "gpt-3.5-turbo",
}
As explained in the README, the defaults are defined here: https://github.com/jackMort/ChatGPT.nvim/blob/f1453f588eb47e49e57fa34ac1776b795d71e2f1/lua/chatgpt/config.lua#L10-L182
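For reference, a minimal sketch of what that override looks like when passed to setup(); "gpt-4" is only a placeholder, use whichever model your API key can access:

```lua
-- Minimal config sketch: override the default chat model via openai_params.
-- "gpt-4" is an example value, not a recommendation.
require("chatgpt").setup({
  openai_params = {
    model = "gpt-4",
  },
})
```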
I find that my model choice is not respected; it always defaults to GPT-3.5. I have confirmed that I have API access to GPT-4. I can write a custom action that uses GPT-4, but the default Chat doesn't seem to respect the model parameter.
Never mind, I found the issue: a spelling mistake! Thank you and apologies.
I have the same issue, and I think there are no spelling mistakes. Do you have any recommendations?
Same issue here.
Calling require('chatgpt').setup() with a config object including openai_edit_params still defaults to gpt-3.5-turbo, as set here: https://github.com/jackMort/ChatGPT.nvim/blob/main/lua/chatgpt/config.lua#L170. It does not seem possible to override the default.
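For reference, this is roughly the call being described (a sketch; "gpt-4" stands in for whatever model the override targets):

```lua
-- Sketch of the attempted override: openai_edit_params passed to setup(),
-- which still appears to fall back to the gpt-3.5-turbo default.
require("chatgpt").setup({
  openai_edit_params = {
    model = "gpt-4",
  },
})
```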
I just found it: the model is set in lua/chatgpt/flows/actions/actions.json. I changed it there and it works.
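Note that edits made inside the plugin directory will likely be lost on the next plugin update. A possibly more durable sketch, assuming the actions_paths option accepts user-supplied JSON files in the same format as the bundled actions.json:

```lua
-- Sketch: point the plugin at a user-maintained actions file instead of
-- editing the copy shipped in lua/chatgpt/flows/actions/actions.json.
-- actions_paths and its behaviour are assumptions based on the README.
require("chatgpt").setup({
  actions_paths = { vim.fn.stdpath("config") .. "/chatgpt-actions.json" },
})
```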
Something valuable to know is that you can press Ctrl-o in the chat window to see the current model settings in use.
Is it not possible to change the model used by the default command, or am I missing something? I have no problem specifying the model for custom actions, but I can't figure out what I need to change to alter the default ChatGPT model. Thanks!