EDIT: I just realized I put this in the wrong place. This should go in the OpenInterpreter repo. Will close here and open a new issue in the right place.
Is it possible to connect to a server running Ollama that is accessible over the LAN (e.g. 192.168.77.10:11434) from my desktop at 192.168.77.7? I have confirmed that Ollama is reachable from my desktop.
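For reference, this is roughly how I confirmed reachability, by hitting Ollama's `/api/tags` endpoint (which lists installed models):

```shell
# Query the remote Ollama server's model list from the desktop (192.168.77.7).
# A JSON response listing the installed models confirms the server is reachable.
curl http://192.168.77.10:11434/api/tags
```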
So far I've tried the approach discussed in the "running locally" docs, as well as the solution discussed here.
When using `interpreter --model ollama/dolphin-mixtral:8x7b-v2.6`, does the "ollama" prefix refer to 127.0.0.1:11434? If so, can it be changed to point to the Ollama server's 192.168.x.x IP address instead?
I've tried `--api_base 192.168.77.10:11434`, but that sets the model to `openai/gpt-4`.
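In case it helps, here's the full invocation I would have expected to work. This is only my guess: I'm assuming `--api_base` wants a full URL including the `http://` scheme, and that it needs to be combined with `--model` in the same command (both flags are the ones I've already been using):

```shell
# Assumed invocation: point Open Interpreter at the remote Ollama instance.
# The http:// scheme and the combination of both flags are my guesses at
# what the CLI expects; neither is confirmed to work.
interpreter \
  --model ollama/dolphin-mixtral:8x7b-v2.6 \
  --api_base http://192.168.77.10:11434
```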
Any help/pointers in the right direction greatly appreciated.