Sodapopoo opened this issue 1 year ago
Error message: openai.error.InvalidRequestError: This model's maximum context length is 4097 tokens. However, your messages resulted in 29968 tokens. Please reduce the length of the messages.

I counted: the failing request translated lines 1952 through 7285 of the text, 5333 lines in total. I looked at the code, and each batch is supposedly capped at a length of 1024, but the handling of short texts is still broken: the length is not actually limited at all. Why are short texts batched together for translation instead of being translated line by line?
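A possible fix for the behavior described above is to cap each batch by an estimated token count rather than a character count. The sketch below is not the repo's actual code; the function names and the 4-characters-per-token heuristic are my own assumptions, and a real implementation should use a proper tokenizer (e.g. tiktoken) instead of the estimate:

```python
def estimate_tokens(text: str) -> int:
    # Rough heuristic (assumption, not the library's method):
    # CJK characters count as ~1 token each, other text as ~4
    # characters per token. A real implementation should use a
    # tokenizer such as tiktoken for exact counts.
    cjk = sum(1 for ch in text if "\u4e00" <= ch <= "\u9fff")
    other = len(text) - cjk
    return cjk + other // 4 + 1

def batch_lines(lines, max_tokens=3000):
    """Group lines into batches whose estimated token total stays
    under max_tokens, leaving headroom below the 4097-token context
    window for the prompt and the model's reply."""
    batches, current, current_tokens = [], [], 0
    for line in lines:
        t = estimate_tokens(line)
        # Flush the current batch before this line would overflow it.
        if current and current_tokens + t > max_tokens:
            batches.append(current)
            current, current_tokens = [], 0
        current.append(line)
        current_tokens += t
    if current:
        batches.append(current)
    return batches
```

With batching done this way, 5333 short lines would be split into many small requests instead of one oversized one, so no single request can exceed the model's context window.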