Closed imsprojo2fan closed 1 year ago
Hi, looking at the code, it seems previous answers are continually appended to the next request's parameters. Won't this eventually make the token count too long and cause the request to fail?
error, status code: 400, message: This model's maximum context length is 4097 tokens. However, your messages resulted in 4112 tokens. Please reduce the length of the messages.
Multi-turn conversation is implemented this way; the project currently does not check input length. We'll consider adjusting this later and showing a warning when the limit is exceeded.
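A common fix is to trim the oldest turns before sending the request so the history stays within the model's context budget. Below is a minimal, hypothetical sketch (the names `trim_history`, `estimate_tokens`, and the constants are illustrative, not from this project): it approximates tokens by character count, whereas a real implementation would use a tokenizer such as tiktoken.

```python
# Hypothetical sketch: drop the oldest turns so the request fits the
# model's context window. Token counts are roughly estimated from
# character length; use a real tokenizer (e.g. tiktoken) in practice.

MAX_CONTEXT_TOKENS = 4097   # limit reported in the 400 error above
RESERVED_FOR_REPLY = 1000   # leave room for the model's answer

def estimate_tokens(message: dict) -> int:
    # Crude heuristic: ~1 token per 2 characters for mixed CJK/English,
    # plus a small per-message overhead.
    return len(message["content"]) // 2 + 4

def trim_history(messages: list) -> list:
    """Keep the system prompt (if any) and discard the oldest
    user/assistant turns until the estimate fits the budget."""
    budget = MAX_CONTEXT_TOKENS - RESERVED_FOR_REPLY
    system = [m for m in messages if m["role"] == "system"]
    turns = [m for m in messages if m["role"] != "system"]
    while turns and sum(map(estimate_tokens, system + turns)) > budget:
        turns.pop(0)  # oldest turn goes first
    return system + turns

history = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "x" * 8000},   # an oversized old turn
    {"role": "user", "content": "What did I just say?"},
]
trimmed = trim_history(history)
print(len(trimmed))  # the oversized old turn is dropped
```

Calling `trim_history` right before each API request keeps multi-turn chat working at the cost of the model "forgetting" the oldest turns; alternatives include summarizing old turns or surfacing a warning to the user, as mentioned above.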