jackMort / ChatGPT.nvim

ChatGPT Neovim Plugin: Effortless Natural Language Generation with OpenAI's ChatGPT API
Apache License 2.0

Updating gpt model in config not working #320

Closed: ben-wall closed this issue 7 months ago

ben-wall commented 8 months ago

I updated the model in my config like so:

        openai_params = {
          model = "gpt-4",
          frequency_penalty = 0,
          presence_penalty = 0,
          max_tokens = 800,
          temperature = 0,
          top_p = 1,
          n = 1,
        },
        openai_edit_params = {
          model = "gpt-4",
          temperature = 0,
          top_p = 1,
          n = 1,
        },

...which I have verified is being read properly, by changing another option and seeing that change take effect.

When I press Ctrl-o to open the options, it says gpt-4 (which, by the way, I now cannot change back to gpt-3.5-turbo), but when I ask 'what model of ChatGPT am I using' it always reports GPT-3.

I've even tried to break it by changing the model to 'gpt-5', and it still seems to be using GPT-3 and doesn't break.
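
For context, here is roughly how a snippet like the one above sits inside a full setup() call. This is only a sketch assuming lazy.nvim as the plugin manager; the spec details (event, dependencies) are typical ChatGPT.nvim install boilerplate rather than anything from this report:

    -- Minimal lazy.nvim spec sketch; only openai_params/openai_edit_params mirror the report above
    {
      "jackMort/ChatGPT.nvim",
      event = "VeryLazy",
      dependencies = { "MunifTanjim/nui.nvim", "nvim-lua/plenary.nvim", "nvim-telescope/telescope.nvim" },
      config = function()
        require("chatgpt").setup({
          openai_params = {
            model = "gpt-4",
            frequency_penalty = 0,
            presence_penalty = 0,
            max_tokens = 800,
            temperature = 0,
            top_p = 1,
            n = 1,
          },
          openai_edit_params = {
            model = "gpt-4",
            temperature = 0,
            top_p = 1,
            n = 1,
          },
        })
      end,
    },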

amplicity commented 8 months ago

I have the same issue.

ysqander commented 7 months ago

I have the same issue. I think the default actions specify 3.5 Turbo and override the params in the config.

When I wrote a simple custom function with the model gpt-4-1106-preview, I received a reply from gpt-4 with a cutoff of early 2023.

david-strejc commented 7 months ago

Same here - is there a way to change the model to gpt-4-1106-preview?

winslowb commented 7 months ago

I've been sitting with the same issue for months and months.

ben-wall commented 7 months ago

> I have the same issue. I think the default actions specify 3.5 Turbo and override the params in the config.
>
> When I wrote a simple custom function with the model gpt-4-1106-preview, I received a reply from gpt-4 with a cutoff of early 2023.

How did you even get it to gpt-4? That's all I want.

ysqander commented 7 months ago

@ben-wall

I changed the config to this:

    config = function()
      require("chatgpt").setup({
        api_key_cmd = "dcli note opk",
        openai_params = {
          model = "gpt-4",
          frequency_penalty = 0,
          presence_penalty = 0,
          max_tokens = 600,
          temperature = 0,
          top_p = 1,
          n = 1,
        },
        openai_edit_params = {
          model = "gpt-4",
          frequency_penalty = 0,
          presence_penalty = 0,
          temperature = 0,
          top_p = 1,
          n = 1,
        },
        actions_paths = { vim.fn.expand("~/GPTnvimActions/actions.json") },

Then in my custom actions.json I have the same default action, but with the model changed from 3.5 to gpt-4-1106-preview. For example:

"complete_code_gpt4Turbo": { "type": "chat", "opts": { "template": "Complete the following code written in {{lang}} by pasting the existing code and continuing it.\n\nExisting code:\n{{filetype}}\n{{input}}\n\n\n{{filetype}}\n", "strategy": "replace", "params": { "model": "gpt-4-1106-preview", "stop": [ "" ] } }

I tested it with an action that asked which model it is and what its cutoff is, and it said GPT-4 with an early 2023 cutoff. So this works only for actions; the main chat window is still hooked to 3.5 despite the config changes.
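
For anyone reproducing this workaround: an action defined in actions.json is run by name through the plugin's :ChatGPTRun command. A minimal keymap sketch, assuming the action name defined above (the <leader>cc binding itself is an arbitrary example, not something from this thread):

    -- Run the custom gpt-4-1106-preview action on the current visual selection
    vim.keymap.set("v", "<leader>cc", ":ChatGPTRun complete_code_gpt4Turbo<CR>",
      { desc = "Complete code with gpt-4-1106-preview" })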

e2r2fx commented 7 months ago

This is because the plugin tries to read the parameters from a file in your home directory, depending on the type of settings. Look for .chatgpt-chat_completions-params.json and remove it from your home directory; then the configuration should kick in.
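
To apply this from inside Neovim, something like the following should work. It is only a sketch using the file name given above; if other .chatgpt-*-params.json files exist on your machine, they may need the same treatment:

    -- Delete the persisted chat-completions params so the values from setup() are used again
    local params_file = vim.fn.expand("~/.chatgpt-chat_completions-params.json")
    if vim.fn.filereadable(params_file) == 1 then
      os.remove(params_file)
      print("Removed " .. params_file .. "; restart Neovim so the configured params apply")
    else
      print("No persisted params file found at " .. params_file)
    end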