fbgallet / roam-extension-speech-to-roam


Local LLM alternative hostname? #3

Closed · cori closed this 16 hours ago

cori commented 4 months ago

Hey just getting back from vacation and checking out the new version, which looks amazing! Love all the options for alternative LLM providers, particularly the OpenRouter integration.

One thing I'm missing is the ability to set a bespoke hostname (and, ideally, port) for the Ollama connection. While I do run an Ollama instance on my M1 MBP for code autocomplete and other small tasks, I also run an Ollama instance serving ... meatier models on a little "NUC" with a GPU at http://IAMGROOT.local:11434. I'd love to be able to point my "local" queries there through that kind of customizability.

Allowing the port to be customized would also mean I could run a different local provider on a different port if I wanted to.

fbgallet commented 3 months ago

Hi @cori, would allowing you to replace http://localhost:11434 with any URL + port of your choice work for you?
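
For context, here is a minimal sketch of what that could look like inside the extension (the setting name `ollamaBaseUrl` and the normalization logic are assumptions for illustration, not the actual implementation):

```typescript
// Hypothetical setting: a user-supplied base URL such as "http://IAMGROOT.local:11434".
// Falls back to the current default when the settings field is left empty.
const DEFAULT_OLLAMA_BASE_URL = "http://localhost:11434";

function getOllamaChatEndpoint(userBaseUrl?: string): string {
  // Trim whitespace and any trailing slash so "http://host:11434" and
  // "http://host:11434/" both resolve to the same endpoint.
  const base = (userBaseUrl?.trim() || DEFAULT_OLLAMA_BASE_URL).replace(/\/+$/, "");
  return `${base}/api/chat`;
}

// getOllamaChatEndpoint("http://IAMGROOT.local:11434")
//   => "http://IAMGROOT.local:11434/api/chat"
```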

cori commented 3 months ago

Absolutely! I have a little box with a laptop GPU on my network at something like http://ollama.local:11434 so I can run slightly beefier models. Being able to tell Roam where to find my Ollama instance would be ideal.

I could probably put together a PR, and normally I'd love to do so, but I have yet to set up a Roam extension dev environment and have no idea how to test one, so I'm a bit intimidated.

fbgallet commented 3 months ago

No worries, I will add the option in the next update.

fbgallet commented 3 months ago

I've pushed an update for review. You can test it, if you like, with this PR shorthand: fbgallet+roam-extension-speech-to-roam+903

fbgallet commented 3 months ago

The new version is published; you can now customize the hostname for Ollama.

cori commented 3 months ago

I got the new version; thanks! I'm having trouble connecting, though. That could be on my end of things, but I do have the service running and allowing * for origins. When I attempt a connection I get the warning message, but nothing hits my Ollama host (whereas a few other super-annoying apps make HEAD calls to it every second or so, I see no such connection here).

The error mentions that the request is coming from roamresearch.com, which I guess makes sense. I'm out of time to troubleshoot this right now but will return to it a bit later. Would you prefer a new issue?

fbgallet commented 3 months ago

I suppose it's an issue on the server side; try allowing the 'roamresearch.com' origin. I can't test it myself since I don't have your configuration. Once you have customized the path to the Ollama server, the request is sent by the extension to your-host-name:port/api/chat, so you can also test that endpoint directly with Postman.
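
For anyone following along, here is a rough way to reproduce the same request outside the extension. The hostname and model name below are assumptions; the payload follows Ollama's documented /api/chat format:

```typescript
// Quick connectivity check against a remote Ollama instance, roughly mirroring
// the request the extension sends. Run with e.g. `npx tsx check-ollama.ts`
// (Node 18+ has a global fetch). If this works but the extension still fails,
// the remaining difference is usually CORS: the browser request originates
// from https://roamresearch.com, so the Ollama server must allow that origin
// (for example via the OLLAMA_ORIGINS environment variable) or allow "*".
const endpoint = "http://IAMGROOT.local:11434/api/chat"; // your-host-name:port/api/chat

async function main() {
  const response = await fetch(endpoint, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3",   // any model already pulled on that instance
      messages: [{ role: "user", content: "Say hello in five words." }],
      stream: false,     // single JSON response instead of a stream
    }),
  });
  console.log(response.status, await response.json());
}

main().catch(console.error);
```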