Alpaca is an Ollama client that lets you manage and chat with multiple models. It provides an easy, beginner-friendly way of interacting with local AI. Everything is open source and powered by Ollama.
> [!WARNING]
> This project is not affiliated with Ollama in any way. I am not responsible for any damage to your device or software caused by running code suggested by an AI model.
| Code highlighting | Chatting with models | Managing models |
|---|---|---|
| ![]() | ![]() | ![]() |
To change the port used by the integrated Ollama instance, go to ~/.var/app/com.jeffser.Alpaca/config/server.json and change the "local_port" value; by default it is 11435.
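If you prefer to script this change, a minimal Python sketch might look like the following. It assumes server.json is plain JSON with a top-level "local_port" key, as described above; the `set_local_port` helper is hypothetical, not part of Alpaca.

```python
import json
from pathlib import Path

# Default location of Alpaca's server config (Flatpak install).
CONFIG = Path.home() / ".var/app/com.jeffser.Alpaca/config/server.json"

def set_local_port(port: int, config: Path = CONFIG) -> None:
    """Change the port the integrated Ollama instance listens on."""
    data = json.loads(config.read_text())
    data["local_port"] = port  # default is 11435
    config.write_text(json.dumps(data, indent=2))
```

Restart Alpaca afterwards so the new port takes effect.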
The chat data is located in ~/.var/app/com.jeffser.Alpaca/data/chats; you can copy that directory wherever you want to back it up.
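Since a backup is just a directory copy, it can be sketched in a few lines of Python. The `backup_chats` helper below is hypothetical; only the chats path comes from this README.

```python
import shutil
from pathlib import Path

# Location of Alpaca's chat data (Flatpak install).
CHATS = Path.home() / ".var/app/com.jeffser.Alpaca/data/chats"

def backup_chats(destination: str, chats: Path = CHATS) -> Path:
    """Copy the whole chats directory to a new backup location."""
    # copytree requires that the destination does not exist yet.
    return Path(shutil.copytree(chats, destination))
```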
To reset the integrated Ollama instance, you just need to delete the file ~/.var/app/com.jeffser.Alpaca/config/server.json; this won't affect your saved chats or models.
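The deletion step above can be sketched as a tiny Python helper. The `reset_server_config` name is hypothetical; only the server.json path comes from this README.

```python
from pathlib import Path

SERVER_CONFIG = Path.home() / ".var/app/com.jeffser.Alpaca/config/server.json"

def reset_server_config(config: Path = SERVER_CONFIG) -> None:
    """Delete server.json so defaults are recreated on next launch.

    Chats and models are stored elsewhere, so they are untouched.
    """
    config.unlink(missing_ok=True)  # no error if already deleted
```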
You can change any override except $HOME and $OLLAMA_HOST. To do this, go to ~/.var/app/com.jeffser.Alpaca/config/server.json and change ollama_overrides accordingly; some overrides can also be changed from the GUI.
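A minimal sketch of editing an override programmatically, assuming ollama_overrides is a JSON object mapping environment variable names to values (check your own server.json); the `set_override` helper and the OLLAMA_DEBUG example are illustrative, not part of Alpaca.

```python
import json
from pathlib import Path

CONFIG = Path.home() / ".var/app/com.jeffser.Alpaca/config/server.json"
PROTECTED = {"HOME", "OLLAMA_HOST"}  # per the README, these can't be changed

def set_override(name: str, value: str, config: Path = CONFIG) -> None:
    """Set an entry under ollama_overrides in server.json."""
    if name in PROTECTED:
        raise ValueError(f"{name} cannot be overridden")
    data = json.loads(config.read_text())
    # Assumes ollama_overrides is a dict of env-var name -> value.
    data.setdefault("ollama_overrides", {})[name] = value
    config.write_text(json.dumps(data, indent=2))
```

For example, `set_override("OLLAMA_DEBUG", "1")` would take effect the next time the integrated instance starts.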
If you want to fork this... I mean, I think it would be better if you started from scratch, since my code isn't well documented at all. But if you really want to, please give me some credit, that's all I ask for... and maybe a donation (joke).