ItsPi3141 / alpaca-electron

The simplest way to run Alpaca (and other LLaMA-based local LLMs) on your own computer
MIT License
1.29k stars 141 forks

[ENHANCEMENT] ability to save or have multiple conversations. #97

Open blenderman94 opened 9 months ago

blenderman94 commented 9 months ago

Good day. I'm wondering whether Alpaca Electron could support multiple conversations, i.e. separate message threads and the ability to save and restore them.

ItsPi3141 commented 5 months ago

Llama.cpp context files are about 1 GB each, which makes saving many of them impractical. Loading a saved context also makes model initialization take longer.
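For a rough sense of where that ~1 GB comes from: most of a saved llama.cpp session is the transformer's KV cache, whose size scales with layer count, context length, and hidden dimension. A back-of-the-envelope sketch (the shapes below are assumed LLaMA-7B-like values, not taken from this repo):

```python
def kv_cache_bytes(n_layers: int, n_ctx: int, d_model: int, bytes_per_elem: int = 2) -> int:
    """Estimate the size of the K and V caches a saved session must persist."""
    # One K vector and one V vector of size d_model per layer per cached token;
    # bytes_per_elem=2 assumes fp16 cache entries.
    return 2 * n_layers * n_ctx * d_model * bytes_per_elem

# Assumed LLaMA-7B-ish shapes: 32 layers, 4096 hidden dim, full 2048-token context.
size = kv_cache_bytes(n_layers=32, n_ctx=2048, d_model=4096)
print(size / 2**30)  # → 1.0 GiB, in line with the ~1 GB figure above
```

Under these assumptions, every saved conversation costs on the order of a gigabyte of disk, which is why keeping many of them around is awkward.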