dustinblackman / oatmeal

Terminal UI to chat with large language models (LLMs) using different model backends, and integrations with your favourite editors!
https://dustinblackman.com/posts/oatmeal/
MIT License
487 stars · 23 forks

Provide ability to specify backend URL #11

Closed — tadq closed this issue 9 months ago

tadq commented 9 months ago

Is there a way to provide the backend Ollama URL, instead of the current default of http://localhost:11434?

dustinblackman commented 9 months ago

I could add it! Curious, how are you running Ollama?

tadq commented 9 months ago

Hi, I am running Ollama on a few local servers. CORS is enabled.
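For anyone landing here with a similar setup: by default Ollama only listens on localhost, so serving other machines on the network usually means setting a couple of environment variables before starting the server. A minimal sketch, assuming a stock Ollama install (the bind address and origins value below are illustrative, not values from this thread):

```shell
# Listen on all interfaces instead of only localhost
# (0.0.0.0 is an example; bind to a specific interface where possible)
export OLLAMA_HOST=0.0.0.0:11434

# Allow cross-origin requests from other hosts
# ("*" is permissive; restrict to known origins in practice)
export OLLAMA_ORIGINS="*"

# Start the Ollama server with the settings above
ollama serve
```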

dustinblackman commented 9 months ago

Nice! That's fun. Fixed in v0.7.0. Thanks!
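Before pointing a client at a remote Ollama instance, it can help to confirm the API is reachable from the client machine. A quick check against Ollama's model-listing endpoint, assuming a server at the hypothetical address `192.168.1.50` (substitute your own host):

```shell
# List the models available on the remote Ollama server;
# a JSON response here confirms the API is reachable
curl http://192.168.1.50:11434/api/tags
```

If this hangs or is refused, check the server's `OLLAMA_HOST` binding and any firewall rules before debugging the client side.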

tadq commented 9 months ago

Love it. Works well. Thanks a lot.