Closed: ShravanSunder closed this issue 1 month ago.
I am wondering if it's possible to use ollama models already on my system? That way we wouldn't have to download the same models multiple times.

Not at the moment, and it's not currently planned. It would certainly be nice, but the primary format is GGUF. When this ticket gets implemented, you might be able to use ollama as an inference engine (though with some features missing, because ollama does not support everything we can do with the raw model).

thanks!
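For anyone landing here: since ollama itself stores models as GGUF blobs on disk, one possible workaround is to point GGUF-consuming tooling directly at the blob ollama already downloaded, instead of downloading the model twice. Below is a minimal sketch of resolving that path. It assumes ollama's default store layout under `~/.ollama/models` (manifest location, the `application/vnd.ollama.image.model` media type, and the digest-to-filename convention are assumptions about ollama's layout, not something this project guarantees):

```python
import json
from pathlib import Path

# Assumption: ollama's default model store location.
OLLAMA_MODELS = Path.home() / ".ollama" / "models"


def find_ollama_gguf(name: str, tag: str = "latest") -> Path:
    """Resolve the on-disk GGUF blob for an ollama model, e.g. find_ollama_gguf("llama3")."""
    manifest_path = (
        OLLAMA_MODELS / "manifests" / "registry.ollama.ai" / "library" / name / tag
    )
    manifest = json.loads(manifest_path.read_text())

    # Assumption: the weights layer in ollama manifests carries this media type.
    for layer in manifest["layers"]:
        if layer["mediaType"] == "application/vnd.ollama.image.model":
            # Blob filenames replace the ":" in the digest, e.g. "sha256:abc" -> "sha256-abc".
            blob = OLLAMA_MODELS / "blobs" / layer["digest"].replace(":", "-")
            # Sanity check: GGUF files start with the magic bytes b"GGUF".
            with blob.open("rb") as f:
                if f.read(4) != b"GGUF":
                    raise ValueError(f"{blob} does not look like a GGUF file")
            return blob

    raise FileNotFoundError(f"no model layer found in manifest for {name}:{tag}")


if __name__ == "__main__":
    print(find_ollama_gguf("llama3"))
```

The resolved path can then be passed to any tool that accepts a plain GGUF file, though as noted above this is an unsupported workaround, and ollama's internal layout could change between versions.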