briansunter / logseq-plugin-gpt3-openai

A plugin for GPT-3 AI assisted note taking in Logseq
https://twitter.com/bsunter
MIT License

Handle Long Text #21

Open briansunter opened 2 years ago

briansunter commented 2 years ago

Limit the length of text the plugin can send.

Short term, give a warning if the text is too long. Long term, send it in batches.
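A minimal sketch of what the warning and the batching could look like; the character limit and the paragraph-based splitting here are assumptions for illustration, not the plugin's actual behavior:

```typescript
const MAX_CHARS = 4000; // hypothetical limit, not an OpenAI or plugin constant

// Long term: split the text into batches no larger than maxChars,
// breaking on paragraph boundaries where possible.
function splitIntoBatches(text: string, maxChars: number = MAX_CHARS): string[] {
  const batches: string[] = [];
  let current = "";
  for (const paragraph of text.split("\n\n")) {
    const candidate = current ? `${current}\n\n${paragraph}` : paragraph;
    if (candidate.length > maxChars && current) {
      batches.push(current);
      current = paragraph;
    } else {
      current = candidate;
    }
  }
  if (current) batches.push(current);
  return batches;
}

// Short term: just warn when the text is over the limit instead of sending it.
function warnIfTooLong(text: string): boolean {
  if (text.length > MAX_CHARS) {
    console.warn(
      `Text is ${text.length} characters (limit ${MAX_CHARS}); it would need ${splitIntoBatches(text).length} batches.`
    );
    return true;
  }
  return false;
}
```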

Maybe use #13 for long text summaries.

alexc123-ui commented 2 years ago

Thank you for the answer. But the context of the text can change if you cut the text in the wrong position. You know what I mean?

briansunter commented 2 years ago

It is true that it loses some context, but I've been surprised at how well it can still get the meaning, even with the chunks.

One other idea is feeding the summary of the first block along with the full text of a second block.

Unfortunately GPT-3 is not the best for text summarization, since OpenAI limits how much text you can send and it's pricey. But if you break it up into chunks and use some workarounds, you can still get decent results. I'm looking into #13 for this too.
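For illustration, the "summary of the first block plus full text of the second" idea could be sketched as a rolling summary like the following, assuming the openai Node client's v3-style createCompletion; the prompts, model name, and token limits are placeholders:

```typescript
import { Configuration, OpenAIApi } from "openai";

const openai = new OpenAIApi(
  new Configuration({ apiKey: process.env.OPENAI_API_KEY })
);

// Summarize a single piece of text with a GPT-3 completion call.
async function summarize(text: string): Promise<string> {
  const response = await openai.createCompletion({
    model: "text-davinci-003",
    prompt: `Summarize the following text:\n\n${text}\n\nSummary:`,
    max_tokens: 256,
  });
  return response.data.choices[0].text?.trim() ?? "";
}

// Carry the running summary forward: summarize chunk 1, then feed that
// summary together with the full text of chunk 2, and so on.
async function rollingSummary(chunks: string[]): Promise<string> {
  let summary = "";
  for (const chunk of chunks) {
    const input = summary
      ? `Summary so far:\n${summary}\n\nNext section:\n${chunk}`
      : chunk;
    summary = await summarize(input);
  }
  return summary;
}
```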

alexc123-ui commented 2 years ago

thank you!

arminta7 commented 1 year ago

What about Readwise Reader's approach? If the text is too long, it basically reduces it to its central sentences.

Ex:

Give me a list of the key points of the article "{{ document.title }}":

"""
{% if (document.content | count_tokens) > 1000 %}
{{ document.content | central_sentences | join('\n\n') }}
{% else %}
{{ document.content }}
{% endif %}
"""