[2024-05-13 02:40:01] production.INFO: [LaraChain] Summarizing Document {"token_count_v2":194788,"token_count_v1":189589}
[2024-05-13 02:40:01] production.INFO: [LaraChain] - SummarizeDocumentPrompt
[2024-05-13 02:40:02] production.ERROR: This model's maximum context length is 128000 tokens. However, your messages resulted in 138295 tokens. Please reduce the length of the messages.
Use the shortener for this
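The "shortener" idea can be sketched as splitting the document into chunks that each fit under the 128k context window before summarizing. A minimal illustration only: the function names, the 10k prompt-overhead reserve, and the 4-characters-per-token heuristic are all assumptions here, not LaraChain's actual tokenizer (which produced the `token_count_v1`/`token_count_v2` figures in the log).

```python
# Hypothetical sketch of a document "shortener": split text into pieces
# whose *estimated* token count fits the model's context limit.
# Assumption: ~4 characters per token, a rough rule of thumb for English.

MAX_CONTEXT_TOKENS = 128_000
PROMPT_OVERHEAD = 10_000  # assumed reserve for the summarize prompt itself


def approx_tokens(text: str) -> int:
    """Crude token estimate: roughly 4 characters per token."""
    return max(1, len(text) // 4)


def shorten(text: str, budget: int = MAX_CONTEXT_TOKENS - PROMPT_OVERHEAD) -> list[str]:
    """Split text into chunks whose estimated token count fits the budget."""
    max_chars = budget * 4
    return [text[i:i + max_chars] for i in range(0, len(text), max_chars)]


# A document far larger than one context window gets split into
# several chunks, each safely under the budget.
doc = "word " * 200_000  # ~1M characters, ~250k estimated tokens
chunks = shorten(doc)
```

In practice each chunk would be summarized separately and the partial summaries combined (map-reduce style); a real implementation should count tokens with the model's own tokenizer rather than a character heuristic.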