Closed tsilvs closed 2 months ago
Possibly related to #78
Hi, I didn't modify Ollama's behavior at all, except for moving .ollama from home to ~/.var/app/com.jeffser.Alpaca/data/
Alpaca uses the Ollama API to get all the available models, and Alpaca never directly touches that directory, so I'm not sure what could be causing this
I made some modifications to how the instance is stopped when you close the app; maybe that will fix it. I haven't released it on Flathub yet, though
Tested in 0.9.5. Still happens.
Just to be sure, you imported those models into the folder, right? If that's the case, this looks like an Ollama issue, where it doesn't detect them as models and just deletes them
you imported those models into the folder right?
Yes, ~/.var/app/com.jeffser.Alpaca/data/.ollama/models/blobs
That's so strange. Alpaca doesn't actually touch any models directly, so I'm not sure what's happening with this
Since I think this is an Ollama issue I'm just going to close this, sorry
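For context on why Ollama might delete the files (my understanding of Ollama's storage model, not something confirmed in this thread): blobs under models/blobs/ are content-addressed, named "sha256-" plus the hex digest of the file's contents, and each must be referenced by a manifest under models/manifests/. On startup Ollama prunes blob files that no manifest references, so a hard-linked .bin file with an arbitrary name would look like an orphan and be removed. A minimal sketch of the content-addressed naming:

```python
import hashlib
from pathlib import Path

def expected_blob_name(path: Path) -> str:
    """Return the content-addressed file name Ollama would use for
    this file: 'sha256-' plus the SHA-256 hex digest of its bytes."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        # Hash in 1 MiB chunks so large model files don't load into memory.
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return f"sha256-{h.hexdigest()}"
```

Even a file renamed to its correct digest would still lack a manifest entry, which is why copying or linking files into blobs/ by hand isn't enough to register a model.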
Describe the bug
I made hard links of already downloaded .bin models from the GPT4All directory to ~/.var/app/com.jeffser.Alpaca/data/.ollama/models/blobs. They were automatically deleted on launch of Alpaca.

Expected behavior
Alpaca will read & register those files as models & will work with them.
For reference: Bavarder behaves differently. It detected these files and is able to use them to generate answers.
Additional context
Alpaca 0.9.2 flatpak from flathub
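If the goal is to reuse already-downloaded weights, the supported route is to import them through a Modelfile with ollama create, rather than placing files in blobs/ directly (this assumes the weights are in GGUF format; older GPT4All .bin files may not be compatible). A sketch, with a hypothetical file name:

```shell
# Point FROM at the actual weights file; the name below is illustrative.
cat > Modelfile <<'EOF'
FROM ./ggml-model-q4_0.gguf
EOF

# 'ollama create' hashes the file into models/blobs/ under its sha256
# name and writes the manifest that keeps it from being pruned.
# Guarded so the sketch is safe to run where ollama isn't installed.
if command -v ollama >/dev/null 2>&1; then
    ollama create imported-model -f Modelfile
    ollama list
fi
```

After a successful create, the model should survive restarts and show up in Alpaca, since Alpaca lists whatever the Ollama API reports.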