Open dictvm opened 2 months ago
I have the exact same setup (local ollama server and tailscale vpn on all my devices) and would also like this feature.
+1 I would appreciate it too if you added this feature! I have the same setup as other users, btw.
Coming soon: https://github.com/brave/brave-core/pull/26475
Platforms
all
Description
Since about July, Brave's Leo AI assistant has supported configuring a self-hosted Ollama server, giving access to a broad selection of open-source models. Currently, users cannot configure Ollama over plain HTTP unless it is running on localhost.
I would appreciate allowing Ollama API access via plaintext HTTP for the following networks:
RFC 1918 (private IPv4): 10.0.0.0/8, 172.16.0.0/12, 192.168.0.0/16
RFC 6598 (shared address space / CGNAT): 100.64.0.0/10
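To illustrate the requested behavior, here is a minimal sketch (in Python, with hypothetical names) of what such an allowlist check could look like; the ranges are the standard RFC 1918 blocks plus the RFC 6598 shared address space, which Tailscale draws its 100.x.y.z node addresses from:

```python
import ipaddress

# Hypothetical allowlist: private ranges where plain HTTP could be permitted.
# RFC 1918 private IPv4 blocks plus RFC 6598 shared address space (CGNAT),
# which covers Tailscale's 100.x.y.z node addresses.
PLAINTEXT_OK = [
    ipaddress.ip_network("10.0.0.0/8"),
    ipaddress.ip_network("172.16.0.0/12"),
    ipaddress.ip_network("192.168.0.0/16"),
    ipaddress.ip_network("100.64.0.0/10"),
]

def http_allowed(host: str) -> bool:
    """Return True if plain-HTTP access to this address could be allowed."""
    try:
        addr = ipaddress.ip_address(host)
    except ValueError:
        return False  # hostnames would need to be resolved to an IP first
    return any(addr in net for net in PLAINTEXT_OK)

print(http_allowed("100.101.102.103"))  # Tailscale-style address -> True
print(http_allowed("8.8.8.8"))          # public address -> False
```

This is only a sketch of the policy being asked for, not how Brave implements its URL checks.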
I’m using Tailscale to access my Ollama server remotely. I don’t want to deal with a publicly available domain just to get a Let’s Encrypt certificate, since Tailscale already sets up an encrypted WireGuard tunnel between my devices.
Even without Tailscale or similar tunneling software, I think a warning about using HTTP over insecure networks should suffice. Anyone running Ollama is most likely at least somewhat technically capable.