Describe the bug
I'm getting `This model's maximum context length is 8191 tokens, but the given text is 56273 tokens long.`
To Reproduce
Steps to reproduce the behavior.
Load a URL, for example https://vadb.org/scenes/cordoba. When a chunk is larger than the accepted size, the splitter warns and the job then fails (Pinecone store):
Created a chunk of size 210763, which is longer than the specified 1000
[ActiveJob] [Goai::WebsiteProcessorJob] [dd6a9549-dd34-4430-a510-fc2d4a2208f2] This model's maximum context length is 8191 tokens, but the given text is 56273 tokens long.
Expected behavior
The text should be split into chunks that respect the configured size (1000), so each chunk fits within the model's 8191-token context limit.
Terminal commands & output
WARN -- : Created a chunk of size 210763, which is longer than the specified 1000
[ActiveJob] This model's maximum context length is 8191 tokens, but the given text is 56273 tokens long.
RuntimeError (An error occurred: This model's maximum context length is 8191 tokens, but the given text is 56273 tokens long.):
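For context, the warning suggests the splitter emitted an oversized chunk instead of falling back to smaller separators. A minimal sketch of how recursive splitting can guarantee the limit (hypothetical illustration, not the library's actual implementation):

```ruby
# Hypothetical recursive character splitter: when a piece split on the
# current separator is still too large, retry with the next, finer
# separator, so no returned chunk ever exceeds chunk_size.
def split_text(text, chunk_size, separators = ["\n\n", "\n", " ", ""])
  sep = separators.first
  rest = separators[1..]
  pieces = sep.empty? ? text.chars : text.split(sep)

  chunks = []
  buffer = ""
  pieces.each do |piece|
    candidate = buffer.empty? ? piece : buffer + sep + piece
    if candidate.length <= chunk_size
      buffer = candidate # piece still fits into the current chunk
    else
      chunks << buffer unless buffer.empty?
      if piece.length > chunk_size && rest && !rest.empty?
        # Piece alone is too big: recurse with finer separators.
        chunks.concat(split_text(piece, chunk_size, rest))
        buffer = ""
      else
        buffer = piece # start a new chunk with this piece
      end
    end
  end
  chunks << buffer unless buffer.empty?
  chunks
end
```

With this fallback in place, a 210,763-character page would be broken down until every chunk is at or under the specified 1000 characters, rather than warned about and passed through whole.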
Desktop (please complete the following information):