869413421 / chatgpt-web

A self-hosted web application built on the ChatGPT 3.5 API
Apache License 2.0

Maximum token limit issue #103

Closed imsprojo2fan closed 1 year ago

imsprojo2fan commented 1 year ago

Hello, it looks like the code keeps appending the previous answers to the parameters of the next request. Won't that eventually make the token count too long and cause the request to fail?

error, status code: 400, message: This model's maximum context length is 4097 tokens. However, your messages resulted in 4112 tokens. Please reduce the length of the messages.

869413421 commented 1 year ago

> Hello, it looks like the code keeps appending the previous answers to the parameters of the next request. Won't that eventually make the token count too long and cause the request to fail?

> error, status code: 400, message: This model's maximum context length is 4097 tokens. However, your messages resulted in 4112 tokens. Please reduce the length of the messages.

Multi-turn conversation is implemented this way. At the moment the project does not validate the input length; we'll consider adjusting this later and showing a warning when the limit is exceeded.
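One common way to handle this, sketched below as a hypothetical Python example (not the project's actual code, which may be in another language): before each request, drop the oldest turns until the remaining history fits within a token budget. The constants, the `trim_history` helper, and the 4-characters-per-token heuristic are all assumptions for illustration; a real implementation would count tokens with a proper tokenizer such as `tiktoken`.

```python
# Hypothetical sketch: trim the oldest conversation turns so the request
# stays under the model's context limit (4097 tokens per the error above).
MAX_CONTEXT_TOKENS = 4097
RESERVED_FOR_REPLY = 1000   # room left for the model's answer (assumption)

def estimate_tokens(text):
    # Very rough heuristic: ~4 characters per token. A real implementation
    # would use an actual tokenizer instead of this estimate.
    return max(1, len(text) // 4)

def trim_history(messages, budget=MAX_CONTEXT_TOKENS - RESERVED_FOR_REPLY):
    # Walk backwards from the newest message, keeping messages until the
    # estimated total would exceed the budget; older turns are dropped.
    kept, total = [], 0
    for msg in reversed(messages):
        cost = estimate_tokens(msg["content"])
        if total + cost > budget:
            break
        kept.append(msg)
        total += cost
    return list(reversed(kept))
```

With this approach the newest messages always survive, so a very long early answer no longer pushes a later short question over the limit.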