neuhaus opened this issue 6 days ago
I haven't tried using Ollama inside of Docker yet. People on the Ollama Discord are pretty well informed, though. They might have some ideas!
Following the Discord discussion: llm-x runs in the browser, so it isn't aware of any of Docker's internal network calls.
Instead, you will need to expose the port from Docker to the host system and connect to Ollama (for example) that way.
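In compose terms that means publishing the port on the host and pointing llm-x at `http://localhost:11434` instead, something like this (a minimal sketch; the image name is assumed):

```yaml
services:
  ollama:
    image: ollama/ollama     # assumed image
    ports:
      - "11434:11434"        # publish the container port to the host
```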
I know this isn't exactly the desired behavior, and I'm looking into alternatives.
Maybe Docker/Electron versions of llm-x could use a Node backend to handle the communication.
Oh, right! 😊 OK, can you add support for a Bearer token in the HTTP request, so my browser can make the request by itself? (My Ollama is behind an nginx that checks the Bearer token in the HTTP Authorization header.) That would be great for me!
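For reference, the nginx side of this is roughly the following check (a sketch; the token and upstream address are placeholders):

```nginx
location / {
    # Reject requests that don't carry the expected Bearer token
    if ($http_authorization != "Bearer my-secret-token") {
        return 401;
    }
    proxy_pass http://127.0.0.1:11434;
}
```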
This might be working already (not exactly cleanly, though).
I haven't tested this with the other connections, but for Ollama:

- Edit the Ollama connection
- Add a new parameter: the field name is `headers`
- The value is `{"HEADER_NAME":"header_value"}`
- Select that it should be used as JSON (be sure to save the changes to the connection afterwards)

It should be passing `HEADER_NAME` now 🤞
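For the Bearer-token setup above, the value would presumably be something like this (the token itself is a placeholder):

```json
{"Authorization": "Bearer my-secret-token"}
```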
OK, I thought that because it was about JSON, it would add fields to the JSON body, not to the HTTP headers. I'll give it a try!
I just found out myself, actually. More information on it here: https://github.com/langchain-ai/langchainjs/issues/4465#issuecomment-1955002478
Was this able to work for your use case? If so, we can close the issue :)
I'm running Docker, with LLM-X and Ollama in containers, using the following compose.yml file:
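Roughly the following (a reconstructed sketch; the image references and network name are assumptions, with the service name `ollama` matching the URL below):

```yaml
services:
  ollama:
    image: ollama/ollama   # assumed image
    networks:
      - llm
  llm-x:
    build: .               # placeholder for however llm-x is built or pulled
    networks:
      - llm

networks:
  llm: {}
```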
In the LLM-X configuration I add Ollama and configure it to be at http://ollama:11434/, but it doesn't show me any models (there are quite a few it should show).
I added a test container on the same network, and I can connect from it to Ollama on port 11434 without any problems, so I think it's an issue with LLM-X.
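The check from the test container was along these lines (exact command assumed; `/api/tags` is the Ollama endpoint that lists models):

```sh
curl http://ollama:11434/api/tags
```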
Has anyone gotten this setup to work (Docker containers for ollama + llm-x)?