cztomsik / ava

All-in-one desktop app for running LLMs locally.
https://avapls.com

Is it possible to use ollama models? #19

Closed ShravanSunder closed 1 month ago

ShravanSunder commented 7 months ago

I am wondering if it's possible to use the ollama models already on my system? That way we don't have to download models multiple times.

Thanks!

cztomsik commented 7 months ago

Not at the moment, and it's not currently planned. It would certainly be nice, but the primary format is GGUF. When this ticket gets implemented, you might be able to use ollama as an inference engine (with some features missing, because ollama does not support everything we can do with a raw model).

cztomsik commented 6 months ago

Can you clarify a little bit how ollama works? Looking at this, it seems like the Modelfile is actually still referencing a .gguf file, so you should be able to just import that .gguf file in the Models -> Installed screen.
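
For anyone who wants to locate that file: `ollama show <model> --modelfile` prints the Modelfile, and the weights themselves are stored as a content-addressed blob on disk. Below is a minimal sketch of resolving a model name to its GGUF blob path, assuming ollama's default on-disk layout (manifests under `~/.ollama/models/manifests`, blobs under `~/.ollama/models/blobs`); that layout is an ollama implementation detail and may differ between versions and platforms.

```python
import json
from pathlib import Path

# Assumed default ollama storage root; may vary by version/platform.
OLLAMA_MODELS = Path.home() / ".ollama" / "models"

def find_gguf_blob(name: str, tag: str = "latest") -> Path:
    """Resolve an ollama model name/tag to the GGUF weights blob on disk."""
    # Manifests for registry models are assumed to live under
    # manifests/registry.ollama.ai/library/<name>/<tag>.
    manifest_path = (OLLAMA_MODELS / "manifests" / "registry.ollama.ai"
                     / "library" / name / tag)
    manifest = json.loads(manifest_path.read_text())

    # The weights are the layer with the "image.model" media type; its
    # digest names a file in blobs/. Newer ollama versions name blob
    # files "sha256-<hex>" rather than "sha256:<hex>".
    for layer in manifest["layers"]:
        if layer["mediaType"] == "application/vnd.ollama.image.model":
            return OLLAMA_MODELS / "blobs" / layer["digest"].replace(":", "-")
    raise FileNotFoundError(f"no model layer found for {name}:{tag}")

if __name__ == "__main__":
    print(find_gguf_blob("llama3"))
```

If the blob resolves, that path is what you would point the Models -> Installed import at; despite the digest-style filename, the file itself is a plain GGUF.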