LyubomirT / c.ai-addons

Enhance your Character.ai experience with a set of useful features and tools to make your interactions smoother and more personalized. This extension is a collection of handy add-ons designed to empower your conversations and improve the user interface.
https://lyubomirt.github.io/c.ai-addons/
GNU General Public License v3.0

Suggestion to automatically summarize the entire chat history. #19

Closed · mangadi3859 closed this 7 months ago

mangadi3859 commented 7 months ago

Suggestion

What happened?

When I was using the memory manager to automatically import a memory based on my chat, I noticed that it only summarizes my chats up to a certain point, not the entire history. The current method scans the page for the HTML elements that contain the messages. Character.AI by default doesn't load the entire chat; it only loads more messages once you scroll up to a certain point. As a result, the feature only captures the newest messages instead of the whole conversation.

Solution?

Instead of scanning the page, we could use HTTP requests to fetch every message, including the ones that are not currently loaded by the page.

How?

I've been looking at the Network tab in Chrome DevTools and managed to find a few API endpoints used by the page. I've also written a JavaScript function to fetch the entire message history.

async function getChatHistory() {
    // Get the char ID from url params
    let charID = new URLSearchParams(window.location.search).get("char");
    if (!charID) throw new Error("CharID not found");

    // Get the user token from localStorage to use as the Authorization header
    let stored = localStorage.getItem("char_token");
    if (!stored) throw new Error("Token not found");
    let token = JSON.parse(stored).value;
    let opt = {
        headers: {
            Authorization: `Token ${token}`,
        },
    };

    // Send request to get chat information such as chat_id
    let chatInfo = await (await fetch(`https://neo.character.ai/chats/recent/${charID}`, opt)).json();
    if (!chatInfo?.chats?.length) throw new Error("No chat was found");

    let chatID = chatInfo.chats[0].chat_id;
    // Send a request for the newest chunk of turns; its meta contains the next-page token
    let recentHistory = await (await fetch(`https://neo.character.ai/turns/${chatID}`, opt)).json();
    // Every chunk of turns will be stored here
    let chatsHistory = [recentHistory];

    // Get the nextToken
    let nextToken = chatsHistory[chatsHistory.length - 1].meta.next_token;
    while (nextToken) {
        // Send request to get the next turns until the next token is null.
        let history = await (await fetch(`https://neo.character.ai/turns/${chatID}?next_token=${nextToken}`, opt)).json();
        chatsHistory.push(history);
        nextToken = history.meta.next_token;
    }

    return chatsHistory;
}

// `data` is the return value of the function above
function convertHistory(data) {
    // Reverse the pages so the oldest chunk comes first
    data.reverse();
    // Flatten the pages into a single array of turns (dropping the meta / next_token info)
    let turns = data.reduce((pre, now) => [...pre, ...now.turns], []);
    let chats = [];

    // Convert the data to a string of the author name and the message
    turns.forEach((e) => {
        chats.push(`<< ${e.author.name} >>\n${e.candidates[0].raw_content}`);
    });

    return chats;
}
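For reference, here is what convertHistory produces on mock pages shaped like the responses above. The field names match the function, but the sample values are made up:

```javascript
// A condensed copy of convertHistory, repeated here so the example is self-contained.
function convertHistory(data) {
    // Reverse the pages so the oldest chunk comes first, then flatten the turns
    data.reverse();
    let turns = data.reduce((pre, now) => [...pre, ...now.turns], []);
    return turns.map((e) => `<< ${e.author.name} >>\n${e.candidates[0].raw_content}`);
}

// Two mock pages, newest first, mimicking the /turns responses (values are invented)
const pages = [
    { turns: [{ author: { name: "Bot" }, candidates: [{ raw_content: "Hi!" }] }], meta: { next_token: "abc" } },
    { turns: [{ author: { name: "User" }, candidates: [{ raw_content: "Hello" }] }], meta: { next_token: null } },
];

const chats = convertHistory(pages);
// chats is ["<< User >>\nHello", "<< Bot >>\nHi!"]
```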

Hope this helps!

Also, the full history might be too long to summarize in one request, so you may want to summarize it in chunks.
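A sketch of that chunking idea: the 4000-character default is just an assumed placeholder (pick whatever the summarizer's real input cap is), and summarize() in the usage comment is a hypothetical function.

```javascript
// Group the flattened chat lines into chunks that fit under a character budget.
// maxChars is an assumed placeholder; tune it to the summarizer's actual input limit.
function chunkMessages(chats, maxChars = 4000) {
    const chunks = [];
    let current = "";
    for (const msg of chats) {
        // Start a new chunk if appending this message would exceed the budget
        if (current && current.length + 1 + msg.length > maxChars) {
            chunks.push(current);
            current = "";
        }
        // Note: a single message longer than maxChars still becomes its own oversized chunk
        current = current ? current + "\n" + msg : msg;
    }
    if (current) chunks.push(current);
    return chunks;
}

// Usage sketch (summarize() is hypothetical):
// for (const chunk of chunkMessages(chats)) summaries.push(await summarize(chunk));
```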

LyubomirT commented 7 months ago

Thank you for your suggestion! I'm going to take a look at your code, afterwards I'll include something like this in the extension + connect to the Cohere API. Once again, thanks!

LyubomirT commented 7 months ago

As for summarizing in chunks, that might work, but I'm not sure if Cohere can transfer context for the summarization model. I'll investigate for a bit, maybe it actually does.

LyubomirT commented 7 months ago

@mangadi3859 It appears that combining the two functions didn't work out of the box due to promise-array issues, but now it's resolved and it seems like your algorithm works really well! This will also reduce the size of the extension because it'll remove the device type dependency (as the client had a different structure on mobile and on legacy chats). At the moment I'm working on integrating it with the C.AI extension to replace the export / summarizer function grabbers.

LyubomirT commented 7 months ago

This has been successfully implemented for both Chat Export functions and Automatic Generation and will be included in the next stable release; it's already available in the source code. Thank you for your amazing suggestion!