Open ShiganChu opened 1 year ago
Yes, it is possible, and I would also like a fix for this. In theory there is no limit on text length.
Send messages with this template, and after all batches are sent and processed, send a summarization message:
I will give you text in batches, after each message respond "I read it", remember all text, I will give you question about it later: {{BATCH}}
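The batching idea above can be sketched roughly as follows. This is a hypothetical illustration, not the extension's actual code: the chunk size is an arbitrary character budget standing in for a real token count, and the template string is just the prompt quoted above.

```typescript
// Split a long article into fixed-size batches and wrap each batch in the
// prompt template. BATCH_CHARS is an assumed stand-in for a token budget.
const BATCH_CHARS = 8000;

function makeBatchPrompts(article: string): string[] {
  const prompts: string[] = [];
  for (let i = 0; i < article.length; i += BATCH_CHARS) {
    const batch = article.slice(i, i + BATCH_CHARS);
    prompts.push(
      `I will give you text in batches, after each message respond "I read it", ` +
        `remember all text, I will give you question about it later: ${batch}`,
    );
  }
  return prompts;
}
```

Each prompt would then be sent as a separate message, followed by a final summarization request once all batches are acknowledged.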
I tried implementing this myself, but I'm not that proficient in React.
@givebest can you make the improvement? the change should be made in https://github.com/sparticleinc/chatgpt-google-summary-extension/blob/b2b9536ee061284293b15d1db9f0f76a00433c6f/src/content-script/prompt.ts#L17
We have made improvements in two areas:
export const modelMaxToken = {
  'gpt-3.5-turbo': 4096,
  'gpt-3.5-turbo-0301': 4096,
  'gpt-4': 8192,
  'gpt-4-0314': 8192,
  'gpt-4-32k': 32768,
  'gpt-4-32k-0314': 32768,
}
These will be implemented in the new version.
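One way this table could feed into batch sizing is sketched below. The `maxInputChars` helper and the 1 token ≈ 4 characters heuristic are assumptions for illustration; a real implementation would count tokens with an actual tokenizer rather than approximating by characters.

```typescript
// Per-model context-window sizes, as in the table above.
const modelMaxToken: Record<string, number> = {
  'gpt-3.5-turbo': 4096,
  'gpt-3.5-turbo-0301': 4096,
  'gpt-4': 8192,
  'gpt-4-0314': 8192,
  'gpt-4-32k': 32768,
  'gpt-4-32k-0314': 32768,
};

// Hypothetical helper: derive a character budget for one batch of input,
// reserving some tokens for the model's reply. Assumes ~4 chars per token.
function maxInputChars(model: string, reservedForReply = 512): number {
  const maxTokens = modelMaxToken[model] ?? 4096;
  return (maxTokens - reservedForReply) * 4;
}
```

With a larger-context model such as `gpt-4-32k`, fewer batches are needed for the same article.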
Thanks! Both solutions are API-based, and they might be costly depending on usage frequency. Is there a plan to handle long articles via the free web app?
Sending ChatGPT multiple requests in a short period of time can easily lead to restricted access. We haven't found a better way to do this yet.
Currently the web article summarizer only handles the first 14000 tokens of an article, which is insufficient for long articles. Is it possible to split the article into multiple batches and perform summarization for each batch?
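The batch-and-summarize approach requested here is essentially map-reduce summarization: summarize each batch independently, then summarize the partial summaries. A minimal sketch, assuming a hypothetical `summarize` function that sends one request to the model:

```typescript
// Map-reduce summarization sketch. `summarize` is a hypothetical callback
// that sends a single prompt to the model and returns its reply; batchChars
// is an assumed character budget standing in for a token limit.
async function summarizeLong(
  article: string,
  summarize: (text: string) => Promise<string>,
  batchChars = 8000,
): Promise<string> {
  // Map step: summarize each batch on its own.
  const partials: string[] = [];
  for (let i = 0; i < article.length; i += batchChars) {
    partials.push(await summarize(article.slice(i, i + batchChars)));
  }
  if (partials.length === 1) return partials[0];
  // Reduce step: summarize the concatenated partial summaries.
  return summarize(partials.join('\n'));
}
```

For very long articles the reduce step may itself exceed the budget, in which case the same reduction can be applied recursively.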