Ulov888 / chatpdflike

an approximate implementation similar to chatpdf
Apache License 2.0

[bug] Multi-turn conversations cause token overflow #2

Closed ac1982 closed 1 year ago

ac1982 commented 1 year ago

openai.error.InvalidRequestError: This model's maximum context length is 4097 tokens, however you requested 4658 tokens (3158 in your prompt; 1500 for the completion). Please reduce your prompt; or completion length.

Ulov888 commented 1 year ago

The reason for this error is that gpt-3.5-turbo limits the total context (prompt plus completion) to 4096 tokens. You can reduce the value of max_tokens in the response function, but that may limit the length of the generated text.
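As a rough illustration of that suggestion, here is a minimal sketch (not the repo's actual code) that caps max_tokens and also trims the oldest conversation turns so the prompt plus completion stays under the 4096-token limit. It assumes the legacy openai 0.x SDK (matching the openai.error traceback above) and tiktoken for counting; the function and variable names are hypothetical.

```python
import openai
import tiktoken

MODEL = "gpt-3.5-turbo"
CONTEXT_LIMIT = 4096
MAX_COMPLETION_TOKENS = 1000  # smaller than the 1500 that triggered the error

enc = tiktoken.encoding_for_model(MODEL)

def count_tokens(messages):
    # Rough per-message count; good enough for budgeting, not exact.
    return sum(len(enc.encode(m["content"])) + 4 for m in messages)

def trim_history(messages):
    # Drop the oldest non-system turns until the prompt fits the budget.
    budget = CONTEXT_LIMIT - MAX_COMPLETION_TOKENS
    trimmed = list(messages)
    while count_tokens(trimmed) > budget and len(trimmed) > 1:
        # Index 0 is assumed to be the system prompt; remove the oldest turn after it.
        del trimmed[1]
    return trimmed

def response(messages):
    # Hypothetical wrapper standing in for the project's response function.
    messages = trim_history(messages)
    return openai.ChatCompletion.create(
        model=MODEL,
        messages=messages,
        max_tokens=MAX_COMPLETION_TOKENS,
    )
```

Trimming history avoids the error without shrinking the completion too aggressively; lowering max_tokens alone only helps when the prompt itself still fits.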