Closed liushui11 closed 1 year ago
[gpt request error: error, status code: 400, message: This model's maximum context length is 4097 tokens. However, you requested 4112 tokens (16 in the messages, 4096 in the completion). Please reduce the length of the messages or completion.]
How do I resolve this error? It appears as soon as I use the latest v1.1.2 release.
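The error in the log explains itself: the client asked for a 4096-token completion on top of a 16-token prompt, which exceeds the model's 4097-token context window. One common fix is to clamp the requested `max_tokens` to whatever context remains after the prompt. A minimal sketch (the limit constant and function name are illustrative, not from this project's code):

```python
# Context window from the error message ("maximum context length is 4097 tokens").
MODEL_CONTEXT_LIMIT = 4097

def clamp_max_tokens(prompt_tokens: int, requested: int,
                     limit: int = MODEL_CONTEXT_LIMIT) -> int:
    """Cap the completion budget so prompt + completion fits the context window."""
    available = limit - prompt_tokens
    return max(0, min(requested, available))

# The failing request: 16 prompt tokens + 4096 requested = 4112 > 4097.
# Clamping yields 4081, so 16 + 4081 == 4097 fits exactly.
print(clamp_max_tokens(16, 4096))
```

Passing the clamped value as `max_tokens` (or simply configuring a smaller completion limit in the client) avoids the 400 response.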
Has this been solved?