Closed mangadi3859 closed 7 months ago
Thank you for your suggestion! I'm going to take a look at your code; afterwards, I'll include something like this in the extension and connect it to the Cohere API. Once again, thanks!
As for summarizing in chunks, that might work, but I'm not sure whether Cohere's summarization model can carry context across chunks. I'll investigate a bit; maybe it actually does.
@mangadi3859 It turns out that combining the two functions didn't work out of the box due to promise-array issues, but that's now resolved, and your algorithm works really well! This will also reduce the size of the extension, because it removes the device-type dependency (the client had a different structure on mobile and in legacy chats). At the moment I'm working on integrating it into the C.AI extension to replace the export / summarizer message grabbers.
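For anyone hitting the same wall: the "promise-array issue" mentioned above is most likely the classic pitfall where mapping an async function over an array yields an array of pending Promises rather than values. This is a hypothetical illustration, not the extension's actual code:

```javascript
// Stand-in for a real summarizer call; it just echoes a label here.
async function summarizeChunk(chunk) {
  return `summary of ${chunk.length} messages`;
}

async function summarizeAll(chunks) {
  // Pitfall: chunks.map(summarizeChunk) alone is an array of Promises,
  // so joining it directly would produce "[object Promise],...".
  // Awaiting the whole array with Promise.all resolves every element first.
  const summaries = await Promise.all(chunks.map(summarizeChunk));
  return summaries.join("\n");
}

summarizeAll([["hi", "hello"], ["bye"]]).then(console.log);
// → "summary of 2 messages\nsummary of 1 messages"
```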
This has been successfully implemented for both the Chat Export functions and Automatic Generation, and it will be included in the next stable release, though it is already available in the source code. Thank you for your amazing proposal!
Suggestion
What happened?
When I was using the memory manager to automatically import a memory based on my chat, I noticed that it only summarizes my chats up to a certain point, not the entire history. The current method of scanning messages is to scan the page for the HTML elements that contain the messages. By default, Character AI doesn't load the entire chat; it only loads more messages once we scroll past a certain point. So the feature only captures the newest messages instead of the whole chat.
Solution?
Instead of scanning the page, we could use an HTTP request to get every message, including the ones that are not currently loaded by the page.
How?
I've been looking at the Network tab in Chrome DevTools and managed to find a few API endpoints used by the page. I've also made a function in JS that fetches the entire message history.
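The original function isn't reproduced here, but the idea can be sketched as a paginated fetch loop. Note that the endpoint path, query parameter, and response fields (`messages`, `next_page`) below are placeholders, not the actual C.AI API — check the Network tab for the real names:

```javascript
// Collect the full history page by page, using whatever page-fetching
// function the caller supplies. Keeps requesting until the response
// reports there is no next page.
async function fetchAllMessages(fetchPage) {
  const all = [];
  let page = 0;
  for (;;) {
    const res = await fetchPage(page);
    all.push(...res.messages);
    if (res.next_page == null) break; // no more pages to load
    page = res.next_page;
  }
  return all;
}

// Example fetchPage using the browser's fetch API against a
// hypothetical endpoint (credentials included so the session cookie
// authenticates the request):
// const fetchPage = (p) =>
//   fetch(`/chat/history/msgs/?page=${p}`, { credentials: "include" })
//     .then((r) => r.json());
```

Decoupling the pagination loop from the actual request also makes the loop easy to unit-test with a mocked `fetchPage`.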
Hope this helps.
Also, the full history might be too long to summarize in one go, so you could summarize it in chunks.
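One way to do the chunking is to split the message list so that each chunk stays under a character budget (the 4000-character default below is an arbitrary placeholder, not a known Cohere limit):

```javascript
// Split a flat list of message strings into chunks whose combined
// length stays under maxChars, so each chunk fits the summarizer input.
function chunkMessages(messages, maxChars = 4000) {
  const chunks = [];
  let current = [];
  let size = 0;
  for (const msg of messages) {
    // Start a new chunk when adding this message would exceed the budget.
    if (current.length > 0 && size + msg.length > maxChars) {
      chunks.push(current);
      current = [];
      size = 0;
    }
    current.push(msg);
    size += msg.length;
  }
  if (current.length > 0) chunks.push(current);
  return chunks;
}
```

Each chunk can then be summarized separately, and the partial summaries concatenated (or summarized once more) into a single memory.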