jackMort / ChatGPT.nvim

ChatGPT Neovim Plugin: Effortless Natural Language Generation with OpenAI's ChatGPT API
Apache License 2.0
3.56k stars 307 forks

Change the default `max_tokens` configuration value #419

Open jfmainville opened 3 months ago

jfmainville commented 3 months ago

Currently, the max_tokens value is set to 300 in the default configuration file (config.lua), which creates a high risk of answers being cut off when interacting with a ChatGPT model. Given that, I was wondering if we could increase the max_tokens value to 4096 to reduce this risk?

Also, as the default model is currently gpt-3.5-turbo, which supports up to 4096 tokens by default (reference), this would make the process more convenient for new users. The same change could be applied to the other available actions, such as code_readability_analysis and code_completion. We could standardize the definition of the max_tokens attribute across all available actions and models.
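
In the meantime, users who hit this limit can override the default in their own configuration rather than waiting for the plugin default to change. A minimal sketch, assuming a lazy.nvim-style plugin spec and that the plugin's setup accepts an `openai_params` table with a `max_tokens` key (check the plugin README for the exact option names):

```lua
-- Hypothetical user config: raise max_tokens so chat completions
-- are not truncated at the 300-token default.
{
  "jackMort/ChatGPT.nvim",
  config = function()
    require("chatgpt").setup({
      openai_params = {
        model = "gpt-3.5-turbo",
        max_tokens = 4096, -- up from the default of 300
      },
    })
  end,
}
```

Note that max_tokens caps only the completion, so on models with a 4096-token context window the prompt plus completion must still fit within that total.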

thiswillbeyourgithub commented 2 months ago

I agree. I was frequently very annoyed to see my chat completions abruptly stop until I figured out that I just needed to increase max_tokens.

ser commented 2 months ago

My very first interaction was cut off, and it took me a while to understand why.