LlmLaraHub / larallama

This is version 2 of LaraChain.io; read more at https://LaraLlama.io
MIT License

Summary on Summarize doc after import and on Output Summary really has to be trimmed down, it is just too big #16

Closed: alnutile closed this issue 4 days ago

alnutile commented 1 month ago
[2024-05-13 02:40:01] production.INFO: [LaraChain] Summarizing Document {"token_count_v2":194788,"token_count_v1":189589}
[2024-05-13 02:40:01] production.INFO: [LaraChain] - SummarizeDocumentPrompt
[2024-05-13 02:40:02] production.ERROR: This model's maximum context length is 128000 tokens. However, your messages resulted in 138295 tokens. Please red

Use the shortener for this
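A minimal sketch of what such a shortener could do, assuming a simple character-based token estimate: trim the document text to fit under the model's 128,000-token context window (minus headroom for the prompt and completion) before building the summarize prompt. The names and the ~4-characters-per-token heuristic are illustrative assumptions, not the project's actual API.

```python
# Hypothetical shortener sketch; not the project's real implementation.
MAX_CONTEXT_TOKENS = 128_000
PROMPT_OVERHEAD_TOKENS = 2_000  # headroom for instructions and the completion


def estimate_tokens(text: str) -> int:
    # Rough heuristic: ~4 characters per token for English text.
    return len(text) // 4


def shorten(text: str,
            budget: int = MAX_CONTEXT_TOKENS - PROMPT_OVERHEAD_TOKENS) -> str:
    """Return text unchanged if it fits the token budget, else cut it
    at a whitespace boundary near the estimated character limit."""
    if estimate_tokens(text) <= budget:
        return text
    max_chars = budget * 4
    cut = text.rfind(" ", 0, max_chars)
    return text[: cut if cut > 0 else max_chars]


# Example: a document whose estimated count (~250k tokens) exceeds the
# window, like the 194,788-token document in the log above.
doc = "word " * 200_000
short = shorten(doc)
assert estimate_tokens(short) <= MAX_CONTEXT_TOKENS - PROMPT_OVERHEAD_TOKENS
```

In practice a real tokenizer (e.g. the model vendor's counting library) would replace the character heuristic, and a summarizer might chunk-and-merge rather than truncate, but the budget check is the part the error above is missing.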

alnutile commented 1 month ago

I have a patch for this; will apply shortly.

alnutile commented 4 days ago

done