jackMort / ChatGPT.nvim

ChatGPT Neovim Plugin: Effortless Natural Language Generation with OpenAI's ChatGPT API
Apache License 2.0

any query gives response `This model's maximum context length is 16385 tokens. [...]` #427

Open sh-cau opened 2 months ago

sh-cau commented 2 months ago

The full error reads: `This model's maximum context length is 16385 tokens. However, your messages resulted in 16680 tokens. Please reduce the length of the messages.`

This happens for any request, including ones that previously executed successfully.

I tried setting `max_tokens` to 20000 and other arbitrary values, but to no avail. Using the chatgpt CLI I have no problems with the same queries.
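For reference, here is a minimal sketch of how `max_tokens` is typically set in a ChatGPT.nvim setup, assuming the plugin's `openai_params` table. One thing worth noting: `max_tokens` only caps the length of the generated *completion*; it does not raise the model's 16,385-token context window, which must also hold the input messages, so increasing it can actually make this error more likely. The model name below is an assumption, not a recommendation.

```lua
-- Sketch of a ChatGPT.nvim config (assumes the openai_params table
-- documented by the plugin; values here are illustrative).
require("chatgpt").setup({
  openai_params = {
    model = "gpt-3.5-turbo-16k",  -- model with a 16,385-token context window
    max_tokens = 300,             -- upper bound on the *reply* length only;
                                  -- raising this does NOT enlarge the context
                                  -- window shared with the input messages
  },
})
```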

What am I missing?

hrllk commented 1 month ago

Would you try pressing Enter in normal mode?

rafaelleru commented 1 month ago

Same here. I wrote a comment, entered normal mode, and tried to run ChatGPTCompleteCode, but I always get the error.

It does not happen when I run it in a new file, so I am guessing CompleteCode is sending too much context in the request.
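If the guess above is right, the fix on the plugin side would be to trim the conversation before sending it. A rough, self-contained sketch of that idea (not the plugin's actual code): drop the oldest non-system messages until the estimated request size plus a reserved completion budget fits the context window. The `CONTEXT_LIMIT` and `COMPLETION_BUDGET` values and the helper names are assumptions, and tokens are approximated at ~4 characters each rather than using the real tokenizer.

```python
# Workaround sketch: trim oldest messages to fit the model's context window.
# All constants and helper names here are illustrative assumptions.

CONTEXT_LIMIT = 16385      # context window from the error message
COMPLETION_BUDGET = 1024   # tokens reserved for the model's reply (assumed)

def estimate_tokens(text: str) -> int:
    """Crude token estimate: roughly 4 characters per token."""
    return max(1, len(text) // 4)

def trim_messages(messages: list[dict]) -> list[dict]:
    """Drop the oldest non-system messages until the estimated prompt
    size plus the completion budget fits inside the context window."""
    system, rest = messages[:1], messages[1:]
    while rest and (
        sum(estimate_tokens(m["content"]) for m in system + rest)
        + COMPLETION_BUDGET
        > CONTEXT_LIMIT
    ):
        rest.pop(0)  # discard the oldest non-system message first
    return system + rest
```

With a toy conversation of ten very long user messages, this keeps the system prompt and only as many recent messages as fit under the budget.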