Closed: georgiyozhegov closed this issue 9 hours ago
LocalAI supports running Ollama and GGUF models, but it would be great to be able to use Ollama directly. I know Ollama has its own great CLI, but why not support it as well?
Oh, I just figured out how to use it:
mods --api ollama --model <local-model-name>
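For example, with the Ollama server running locally and a model already pulled (the model name below is a placeholder; use whatever `ollama list` shows on your machine):

ollama pull llama3
mods --api ollama --model llama3 "explain what a GGUF file is"

If the endpoint needs changing, mods reads its API definitions from a settings file you can open with `mods --settings`; as far as I can tell, the ollama entry there points at the local Ollama API (http://localhost:11434 by default).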