-
Will it still be able to summarize, or answer questions about, important events in the book?
-
My corpus contains 300 paragraphs, and summarization is slow: it takes more than 30 minutes.
Could you describe sumy's performance characteristics? Which stage becomes the bottleneck when the corpus is large?
Thanks!
-
# Goal
* Many of the new LLMs support long context. For example, Llama 3.1 and Mistral Large 2 support 128k tokens;
* The trend is upwards, e.g. Gemini supports 1M - 10M tokens; Claude supports 200k;
* …
-
-
You will see the problem in the text below. This is with GPT-4o and version 0.5 of Agent Zero, but similar issues occur with other models.
User message ('e' to leave):
> Write a college level …
-
The main idea would be to split the text into windows of tokens so that each window fits into the LLM's context window.
Example: take these answers, group them into chunks of 4000 tokens, summarize…
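A minimal sketch of that chunking step, assuming whitespace-split words as a rough stand-in for model tokens (a real implementation would count tokens with the model's own tokenizer, e.g. tiktoken for OpenAI models; the helper name `chunk_text` and the 4000-token default are illustrative):

```python
def chunk_text(text: str, max_tokens: int = 4000) -> list[str]:
    """Split text into consecutive chunks of at most max_tokens tokens.

    Tokens are approximated by whitespace-separated words; swap in the
    target model's tokenizer for accurate context-window budgeting.
    """
    words = text.split()
    chunks = []
    for start in range(0, len(words), max_tokens):
        chunks.append(" ".join(words[start:start + max_tokens]))
    return chunks

# Each chunk would then be summarized separately, and the partial
# summaries concatenated (and optionally summarized once more).
```

A 9000-word input, for instance, would yield three chunks of 4000, 4000, and 1000 words.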
-
### Is there an existing issue for this?
- [X] I have searched the existing issues
### Feature Description
I want to add a text summarization module using transformer-based models. The objective is…
-
> **For context, see:** [CONTRIBUTING.md](https://github.com/julep-ai/julep/blob/dev/CONTRIBUTING.md) and [cookbooks/README.md](https://github.com/julep-ai/julep/tree/dev/cookbooks)
## Overview
…
-
Can you explain how to run this code?
-
I want to fine-tune on an existing dataset, so I ran the following command, but I ran into some errors. Could you please take a look? Thanks!
python3 pegasus/bin/train.py --params=aeslc_transform…