Thanks for your great idea.
I tried calling the OpenAI API and sending the chunks into the conversation context.
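For context, here is roughly what I'm doing (a simplified sketch; it assumes the pre-1.0 `openai` Python package and `gpt-3.5-turbo`, and the API key, chunk contents, and prompt are placeholders):

```python
import openai

openai.api_key = "sk-..."  # placeholder key

# Placeholder chunks split from my document
chunks = ["...first chunk of the document...", "...second chunk..."]

# Build the conversation: every chunk becomes its own user message
messages = [{"role": "system", "content": "Answer questions about the supplied document."}]
for chunk in chunks:
    messages.append({"role": "user", "content": chunk})
messages.append({"role": "user", "content": "Please summarize the document."})

# This call fails once the accumulated messages exceed the model's context window
response = openai.ChatCompletion.create(model="gpt-3.5-turbo", messages=messages)
print(response["choices"][0]["message"]["content"])
```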
Due to the token limit, I got this error:
"type": "InvalidRequestError",
"message": "This model's maximum context length is 4097 tokens. However, your messages resulted in 4415 tokens. Please reduce the length of the messages."
Any suggestions?