charmbracelet / mods

AI on the command line

Add Ollama support #389

Closed georgiyozhegov closed 9 hours ago

georgiyozhegov commented 9 hours ago

LocalAI supports running Ollama and GGUF models, but it would be great to be able to use Ollama directly. I know Ollama has its own great CLI, but why not support it in mods?

georgiyozhegov commented 9 hours ago

Oh, I just figured out how to use it:

```
mods --api ollama --model <local-model-name>
```
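For anyone else landing here, a minimal end-to-end sketch, assuming Ollama is running locally on its default port; `llama3` is just a placeholder model name, use whatever you have pulled:

```
# Pull a local model with Ollama (llama3 is a placeholder)
ollama pull llama3

# Point mods at the local Ollama API and pipe in some input
mods --api ollama --model llama3 "summarize this" < notes.txt
```

If your Ollama endpoint differs from the default, it should also be configurable in the mods settings file (opened with `mods --settings`).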