abi / secret-llama

Fully private LLM chatbot that runs entirely in the browser with no server needed. Supports Mistral and Llama 3.
https://secretllama.com
Apache License 2.0

Optionally save previous chats #4

Open · not-matthias opened this issue 4 months ago

It'd be nice to save previous chats (kinda like ChatGPT) to access them again in the future.

abi commented 4 months ago

Yeah, should be relatively easy to do this with local storage. Will spec this out in case anyone is interested in helping to implement it.

arthurjdam commented 4 months ago

> Yeah, should be relatively easy to do this with local storage. Will spec this out in case anyone is interested in helping to implement it.

I may have some time to help! I started persisting state in localStorage using zustand's persist middleware. I still need to figure out what changes are needed to accommodate an array of conversations, plus some kind of routing.
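For context, the core of what zustand's persist middleware does is a JSON round-trip against localStorage. A minimal sketch of that idea for an array of conversations is below; the types and names (`Conversation`, `StringStore`, `STORAGE_KEY`) are illustrative assumptions, not from the secret-llama codebase.

```typescript
// Illustrative sketch: persist an array of conversations as a single JSON
// string. zustand's persist middleware performs a similar serialize/restore
// cycle under the hood. All names here are hypothetical.

interface Message {
  role: "user" | "assistant";
  content: string;
}

interface Conversation {
  id: string;
  title: string;
  messages: Message[];
}

// Minimal localStorage-like interface so the sketch isn't tied to the DOM.
interface StringStore {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
}

const STORAGE_KEY = "secret-llama-conversations";

// Persist the whole conversations array (web storage is string-only).
function saveConversations(store: StringStore, conversations: Conversation[]): void {
  store.setItem(STORAGE_KEY, JSON.stringify(conversations));
}

// Load it back, tolerating a missing or corrupted entry.
function loadConversations(store: StringStore): Conversation[] {
  try {
    const raw = store.getItem(STORAGE_KEY);
    return raw ? (JSON.parse(raw) as Conversation[]) : [];
  } catch {
    return [];
  }
}
```

In the browser you'd pass `window.localStorage` as the store; keeping the interface narrow also makes the logic trivially testable.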

abi commented 4 months ago

For reference, on Discord, @o-stahl shared a fork where he's added some of this support: https://github.com/o-stahl/secret-llama/tree/main

IndexedDB would be a decent solution to start with, but because the browser can still choose to evict that data, it's not truly persistent. The File System API is also worth exploring for a more persistent and reliable solution: https://developer.mozilla.org/en-US/docs/Web/API/File_System_API and https://mburakerman.github.io/file-system-access-api-demo/
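One way to reduce the eviction risk before reaching for the File System API is the standard StorageManager API (`navigator.storage.persist()`), which asks the browser to exempt the site's data from eviction. A hedged sketch, not secret-llama code; the `StorageManagerLike` interface just mirrors the two browser methods we need so the logic can run outside a browser:

```typescript
// Sketch: ask the browser to mark our storage (IndexedDB included) as
// persistent before relying on it for chat history. In real browser code
// you'd pass navigator.storage here.

interface StorageManagerLike {
  persisted(): Promise<boolean>; // is storage already marked persistent?
  persist(): Promise<boolean>;   // request persistence (may prompt the user)
}

// Returns true if the browser agreed to keep our data out of eviction.
async function ensurePersistence(
  storage: StorageManagerLike | undefined
): Promise<boolean> {
  if (!storage) return false;               // API unavailable (older browsers)
  if (await storage.persisted()) return true;
  return storage.persist();
}

// In the browser: ensurePersistence(navigator.storage).then((ok) => { ... });
```

Even when the request is denied, the app still works; the data is just subject to eviction under storage pressure, which is exactly the caveat above.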

abi commented 4 months ago

Linked to #11