jackMort / ChatGPT.nvim

ChatGPT Neovim Plugin: Effortless Natural Language Generation with OpenAI's ChatGPT API
Apache License 2.0

ChatGPTCompleteCode exceeds token limit #252

Closed Rouji closed 8 months ago

Rouji commented 1 year ago

:ChatGPTCompleteCode results in

// API ERROR: This model's maximum context length is 4093 tokens, however you requested 5074 tokens (3026 in your prompt and suffix; 2048 for the completion).

...for files longer than about 300–400 lines. I haven't touched max_tokens, so it should still be set to 300.
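For reference, the request size is driven by the `openai_params` table passed to `setup()`. A minimal sketch based on the README (values are examples, not recommendations; exact defaults may differ between versions):

```lua
-- Sketch of the relevant ChatGPT.nvim setup() options.
require("chatgpt").setup({
  openai_params = {
    model = "gpt-3.5-turbo",
    max_tokens = 300,   -- tokens reserved for the completion itself
  },
})
```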

jschoolcraft commented 1 year ago

I'm getting a similar issue with just :ChatGPT and asking a simple question: "Summarize the book Multipliers"

Summarize the book Multipliers
This model's maximum context length is 4097 tokens. However, you requested 4256 tokens (3956 in the messages, 300 in the completion). Please reduce the length of the messages or completion.
collins-lagat commented 1 year ago

> I'm getting a similar issue with just :ChatGPT and asking a simple question: "Summarize the book Multipliers"
>
> Summarize the book Multipliers
> This model's maximum context length is 4097 tokens. However, you requested 4256 tokens (3956 in the messages, 300 in the completion). Please reduce the length of the messages or completion.

Try this guy's solution. https://github.com/jackMort/ChatGPT.nvim/issues/224#issuecomment-1606352967

EduardoNeville commented 1 year ago

I've tried the solution posted in #244 (comment), but even after changing my credit card information the issue persists.

lalitmeeq commented 1 year ago

Facing the same issue with the :ChatGPT command as well.

Seems like it has something to do with the model.

Thanatermesis commented 11 months ago

Same issue here.

Or maybe the plugin should just grab a window of lines around the cursor, say the previous 100 lines plus the next 20, instead of sending the entire file 🤔
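The windowed-context idea could be sketched like this in Neovim Lua. `get_context_window` is a hypothetical helper, not part of ChatGPT.nvim; the `vim.api` calls are standard Neovim API:

```lua
-- Sketch only: collect a window of lines around the cursor instead of the
-- whole buffer.
local function get_context_window(before, after)
  local row = vim.api.nvim_win_get_cursor(0)[1]   -- 1-based cursor line
  local last = vim.api.nvim_buf_line_count(0)
  local first = math.max(row - before, 1)
  local stop = math.min(row + after, last)
  -- nvim_buf_get_lines uses 0-based, end-exclusive indexing
  local lines = vim.api.nvim_buf_get_lines(0, first - 1, stop, false)
  return table.concat(lines, "\n")
end

-- e.g. the previous 100 lines plus the next 20, as suggested above
local prompt_context = get_context_window(100, 20)
```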

MateuszMielniczuk commented 11 months ago

I am also fighting with this issue. Is it using gpt-3.5-turbo by default? I tried changing it to gpt-3.5-turbo-16k, which has a larger context length, and increasing the max_tokens value above 10000, but I am still getting the message about a max context length of 4093.

Is there some additional setting to change the model used for completion?
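If it helps: the README documents separate parameter tables for chat and edit-style actions, so the 16k model may need to be set in more than one place. A hedged sketch (key names taken from the README; verify against your plugin version):

```lua
require("chatgpt").setup({
  openai_params = {       -- used by :ChatGPT
    model = "gpt-3.5-turbo-16k",
    max_tokens = 300,
  },
  openai_edit_params = {  -- used by edit-style actions, e.g. :ChatGPTEditWithInstructions
    model = "gpt-3.5-turbo-16k",
  },
})
```

Note that run actions such as :ChatGPTCompleteCode may read their model from the plugin's actions configuration rather than from these tables, so this alone may not be enough.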


edit: It started working when I opened a different buffer/file in nvim, but it still fails in the previous buffer/file, even after removing the cache and restarting nvim.