headchem opened this issue 2 years ago
Good conversation about overcoming the 4096-token limit: https://www.reddit.com/r/GPT3/comments/ukkqfs/work_around_the_limitation_of_4000_tokens_for/
The best approach seems to be AIDungeon's "lore book" feature.
This idea is partially implemented with the "blurbs" feature: the blurbs act as our "lore book". However, I'll keep this issue open to address the idea of expanding the text between two sequences to get more overall length.
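For context, here is a minimal sketch of how a lore-book-style blurb lookup could work: entries are keyed by keywords, and any entry whose keywords appear in the recent story gets injected into the prompt, up to a size budget. The `Blurb` class, `build_context` helper, and budget value are hypothetical illustrations, not the project's actual code.

```python
from dataclasses import dataclass

@dataclass
class Blurb:
    """A lore-book style entry: trigger keywords plus the text to inject into the prompt."""
    keywords: list[str]
    text: str

def build_context(blurbs: list[Blurb], recent_story: str, budget_chars: int = 2000) -> str:
    """Collect blurbs whose keywords appear in the recent story, up to a rough size budget."""
    selected: list[str] = []
    used = 0
    story_lower = recent_story.lower()
    for blurb in blurbs:
        if any(k.lower() in story_lower for k in blurb.keywords):
            if used + len(blurb.text) > budget_chars:
                break
            selected.append(blurb.text)
            used += len(blurb.text)
    return "\n".join(selected)
```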
A new "expansions" tab where the sequence overviews are hard-coded and not editable. We then use the Edit mode (which is not fine-tuned, but we should already have enough context) to fill in the text between two sequences (or more, using a sliding window to include as much context before and after as possible). This way we can get a longer story, and this step can also infuse more dialog into the sequences. Hopefully the Edit-mode bookends will keep the story on track and logical.
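A rough sketch of that fill-in step, assuming the legacy OpenAI Edits endpoint (`openai.Edit.create` from openai-python 0.x, since deprecated); the `expand_between` and `expand_story` helpers are illustrative names under those assumptions, not existing code:

```python
import openai  # openai-python 0.x; assumes OPENAI_API_KEY is set in the environment

def expand_between(prev_sequence: str, next_sequence: str) -> str:
    """Ask the edit model to write connective text between two existing sequences,
    keeping both bookends intact so the plot stays anchored."""
    response = openai.Edit.create(
        model="text-davinci-edit-001",
        input=prev_sequence + "\n\n" + next_sequence,
        instruction=(
            "Expand the story between the first and second passage, adding detail "
            "and dialog, while keeping both passages unchanged and the plot consistent."
        ),
        temperature=0.7,
    )
    return response["choices"][0]["text"]

def expand_story(sequences: list[str]) -> list[str]:
    """Sliding-window pass: expand each adjacent pair of sequences.
    Each result contains its two bookends plus the new middle text, so the
    caller would need to de-duplicate the overlap when stitching chunks together."""
    return [expand_between(a, b) for a, b in zip(sequences, sequences[1:])]
```

The sliding window here is the simplest possible one (pairs of adjacent sequences); including more surrounding sequences per call, as suggested above, would just mean widening what goes into `input` while keeping the instruction the same.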