huornlmj opened 3 weeks ago
Hi @huornlmj, have you tried setting the apiBase for the model? A minimal sketch is below.
If you get this running it could be a nice Tutorial to add to the docs site if you're interested in contributing!
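Here is the sketch I mentioned: a minimal model entry with apiBase pointed at Open WebUI. This assumes Open WebUI proxies the native Ollama API under its /ollama path; the URL, model name, and API key are placeholders, not tested values.

{
  "models": [
    {
      "title": "Ollama via Open WebUI",
      "provider": "ollama",
      "model": "llama3:instruct",
      "apiBase": "https://<your Open WebUI URL>/ollama",
      "apiKey": "<your API key>"
    }
  ]
}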
Hi @huornlmj, here's a working config file for Open WebUI.
{
  "models": [
    {
      "title": "Ollama Remote",
      "provider": "ollama",
      "model": "llama3:instruct",
      "completionOptions": {},
      "contextLength": 8192,
      "apiBase": "https://<your Open WebUI URL>/ollama",
      "apiKey": "<your API key>"
    }
  ],
  "customCommands": [
    {
      "name": "test",
      "prompt": "{{{ input }}}\n\nWrite a comprehensive set of unit tests for the selected code. It should setup, run tests that check for correctness including important edge cases, and teardown. Ensure that the tests are complete and sophisticated. Give the tests just as chat output, don't edit any file.",
      "description": "Write unit tests for highlighted code"
    }
  ],
  "tabAutocompleteModel": {
    "title": "Ollama Remote Autocomplete",
    "provider": "ollama",
    "model": "codellama:code",
    "completionOptions": {},
    "contextLength": 8192,
    "apiBase": "https://<your Open WebUI URL>/ollama",
    "apiKey": "<your API key>"
  },
  "allowAnonymousTelemetry": false,
  "embeddingsProvider": {
    "provider": "transformers.js"
  },
  "docs": []
}
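One note on the config above: the /ollama suffix matters, since (as far as I know) Open WebUI proxies the native Ollama API under that path and authenticates requests with an API key you generate in your Open WebUI account settings, sent as a Bearer token.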
Thanks for sharing, @0xThresh!
I have the same problem, and this config doesn't work for me. The tabAutocompleteModel and some functions like writing a docstring work well, but anything that needs to use the chat panel always gives me an empty response. Here is my config.
"models": [
{
"model": "qwen2.5-coder:7b",
"provider": "ollama",
"completionOptions": {},
"contextLength": 8192,
"apiBase": "https://****/ollama",
"apiKey": "*****",
"title": "Qwen2.5 Coder 7B"
}
],
"tabAutocompleteModel": {
"model": "qwen2.5-coder:7b",
"provider": "ollama",
"completionOptions": {},
"contextLength": 8192,
"apiBase": "https://****/ollama",
"apiKey": "****",
"title": "Qwen2.5 Coder 7B"
}
For example, this happens when I ask it to explain my code.
@xqe2011 this actually just started happening to me as well, and I noticed it right after an Open WebUI upgrade. I went from around 0.3.10 to 0.3.32 and the issue appeared right away. I'll bring the issue up with the Open WebUI folks; it's helpful to know it's not just me.
Can you tell me what version of Open WebUI you're on?
UPDATE: The maintainer just told me someone submitted a PR that should fix this in 0.3.35, so I'll be upgrading to that tomorrow and will report back.
@0xThresh
My Open WebUI version: v0.3.33
Ollama version: 0.3.14
continue.dev version: v0.8.55
UPDATE: I upgraded Open WebUI to v0.3.35, and the problem was solved.
Relevant environment info
Description
I see there is a closed issue/bug report explaining how to connect to an internally hosted Ollama listening on an IP at port 11434. But how do you connect to an Ollama instance that is only served behind the popular Open WebUI (443/tcp, TLS)?
To reproduce
No response
Log output
No response