Closed dannliu closed 1 year ago
@dannliu I suspect you maxed out your chatgpt.gpt3.maxTokens
setting at 4096 for the completion. You should set it lower or keep the default of 1024. The model's 4096-token limit is the total number of tokens shared between the request and the response.
@gencay Why is my request 4217 tokens when I only input "hello"?
Hi there, if you read the error message, your request hasn't actually consumed 4217 tokens, don't worry! The message says it needs 121 tokens for the messages, and because you set the max tokens for the completion (response) to 4096 in the VS Code settings, OpenAI complains that you requested 4217 tokens, even though the limit is 4096 for prompt (request) + completion (response) combined.
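The arithmetic behind the error can be sketched in a few lines of Python (the numbers come straight from the error message):

```python
# The model's context window is shared between prompt and completion.
context_limit = 4096    # model's maximum context length
prompt_tokens = 121     # tokens in the messages ("hello" plus chat scaffolding)
completion_max = 4096   # the maxTokens setting in VS Code

requested = prompt_tokens + completion_max
print(requested)                   # 4217 -- exactly the number in the error
print(requested > context_limit)   # True, so OpenAI rejects the request
```

So even a one-word prompt trips the error whenever the completion budget alone already equals the whole context window.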
Got it. Thanks!
@gencay Is it possible to have the extension calculate the max tokens based on the current prompt, so that if you put 4096 in max tokens, the actual parameter sent to the API never exceeds the model's maximum?
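A minimal sketch of that calculation (the function name and signature are illustrative, not the extension's actual code; real token counting would need a tokenizer such as tiktoken):

```python
def clamp_max_tokens(prompt_tokens: int, requested_max: int,
                     context_limit: int = 4096) -> int:
    """Cap the completion budget so prompt + completion fits the context window.

    prompt_tokens: tokens already consumed by the messages
    requested_max: the user's maxTokens setting
    context_limit: the model's maximum context length
    """
    # Never send more than the room left after the prompt,
    # and never send less than 1 (the API requires a positive value).
    return max(1, min(requested_max, context_limit - prompt_tokens))

# With the numbers from this thread: 121 prompt tokens, maxTokens 4096.
print(clamp_max_tokens(121, 4096))   # 3975 -- fits within the 4096 limit
```

With this clamp applied, a maxTokens setting of 4096 would simply mean "use all remaining room" instead of triggering the context-length error.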
I only input a

Hello

and it prompts the error (I tried to clear/reset the session/conversation):

This model's maximum context length is 4096 tokens. However, you requested 4217 tokens (121 in the messages, 4096 in the completion). Please reduce the length of the messages or completion.