hwchase17 / notion-qa

MIT License
2.13k stars 376 forks

It is very easy to trigger the maximum context length error #33

Closed gazedreamily closed 1 year ago

gazedreamily commented 1 year ago

openai.error.InvalidRequestError: This model's maximum context length is 4097 tokens. However, your messages resulted in 4688 tokens. Please reduce the length of the messages.

gazedreamily commented 1 year ago

Oh, I should split the files into smaller pieces!
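
The fix is to split each source document into chunks small enough that the retrieved context plus the question stays under the model's 4097-token limit. The project itself uses LangChain's text splitters for this, but the idea can be sketched in plain Python as a hypothetical character-based chunker with overlap (the function name and parameters here are illustrative, not the repo's actual API):

```python
def split_text(text: str, chunk_size: int = 1000, overlap: int = 100) -> list[str]:
    """Split text into overlapping character chunks.

    Overlap keeps a little shared context between adjacent chunks so a
    sentence cut at a boundary is still fully present in one of them.
    """
    if chunk_size <= overlap:
        raise ValueError("chunk_size must be larger than overlap")
    chunks = []
    step = chunk_size - overlap
    start = 0
    while start < len(text):
        chunks.append(text[start:start + chunk_size])
        start += step
    return chunks

# Example: a 2500-character document becomes three chunks of at most 1000 chars.
doc = "x" * 2500
chunks = split_text(doc, chunk_size=1000, overlap=100)
```

In practice you would embed each chunk separately and only pass the few most relevant chunks to the model, rather than the whole file. Note that chunk size here is in characters, not tokens; a rough rule of thumb is ~4 characters per token for English text, so leave headroom under the 4097-token limit.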