OpenInterpreter / 01

The #1 open-source voice interface for desktop, mobile, and ESP32 chips.
https://01.openinterpreter.com/
GNU Affero General Public License v3.0

Connecting from desktop to ollama on local server #158

Closed gibru closed 6 months ago

gibru commented 6 months ago

EDIT: I just realized I put this in the wrong place. This should go in the OpenInterpreter repo. Will close here and open a new issue in the right place.

Is it possible to connect to a server running ollama that is accessible over LAN (e.g. 192.168.77.10:11434) from my desktop at 192.168.77.7? I have confirmed that ollama is reachable from my desktop.

So far I've tried the approach discussed in running locally as well as the solution discussed here.

Using interpreter --model ollama/dolphin-mixtral:8x7b-v2.6, is "ollama" referring to 127.0.0.1:11434? If so, can this be changed to point to ollama's 192.168.x.x IP address?

I've tried --api_base 192.168.77.10:11434 but that sets the model to openai/gpt-4.
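For anyone landing here with the same question, a hedged sketch of what might work (assuming `--api_base` expects a full URL including the `http://` scheme, which would explain the fallback to openai/gpt-4 when given a bare host:port):

```shell
# First, confirm the remote ollama instance is reachable from the desktop.
# /api/tags is ollama's endpoint for listing locally available models.
curl http://192.168.77.10:11434/api/tags

# Then pass the full URL (scheme included) as the API base.
# Note: flag spelling/behavior may vary across Open Interpreter versions;
# this is a sketch, not a confirmed fix.
interpreter --model ollama/dolphin-mixtral:8x7b-v2.6 \
            --api_base http://192.168.77.10:11434
```

If the scheme is the issue, the same idea should apply to any remote host running ollama, not just this LAN address.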

Any help/pointers in the right direction greatly appreciated.

Sotiris-Bekiaris commented 6 months ago

Hello, could you please link the new issue so anyone interested can follow along? Thank you!

gibru commented 6 months ago

> Hello, could you please link the new issue so anyone interested can follow along? Thank you!

Sure: new issue