[Open] TylerLu opened this issue 3 months ago
Good catch @TylerLu, we'll push a fix soon.
@singhk97 Thank you for the response! I appreciate your efforts in fixing this.
We are also facing the same error and traced it to the issue described above. Any ETA on the fix? Please let us know. Thanks.
Language: C#
Version: latest

Description
I set max_input_tokens to 8000 in config.json of the default prompt. I asked the bot a question and the bot returned the following error:
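For context, the setting lives in the completion section of the prompt's config.json. A sketch of the relevant fragment — the field name is from this issue, the surrounding structure is assumed and may differ from the actual schema:

```json
{
  "completion": {
    "max_input_tokens": 8000
  }
}
```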
Below is the request and response captured by Fiddler:
According to OpenAI's documentation, max_tokens is "the maximum number of tokens that can be generated in the chat completion". So max_tokens is actually the maximum number of *output* tokens, and max_input_tokens should not be passed to it. Related code:
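In spirit, the fix is to keep the two budgets separate: max_input_tokens should only trim the prompt on the client side, while the API's max_tokens caps the generated completion. A minimal Python sketch of that separation (all names here are hypothetical, not the library's actual code, which is C#):

```python
def rough_token_count(text: str) -> int:
    # Crude stand-in for a real tokenizer (e.g. tiktoken): ~1 token per word.
    return len(text.split())

def build_chat_request(messages, max_input_tokens=8000, max_output_tokens=1000):
    """Build a chat-completion request body, keeping input/output budgets apart."""
    # Drop the oldest messages until the prompt fits the input budget.
    trimmed = list(messages)
    while (len(trimmed) > 1 and
           sum(rough_token_count(m["content"]) for m in trimmed) > max_input_tokens):
        trimmed.pop(0)
    # Only the output cap is sent to the API as `max_tokens`;
    # `max_input_tokens` is a client-side setting and never leaves the client.
    return {"messages": trimmed, "max_tokens": max_output_tokens}

request = build_chat_request([{"role": "user", "content": "Hello"}])
# The request body carries max_tokens (output cap) but no max_input_tokens.
```

The key point is the last line of the builder: the value forwarded to the API under max_tokens is the output budget, never the input one.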
Reproduction Steps