continuedev / continue

⏩ Continue is the leading open-source AI code assistant. You can connect any models and any context to build custom autocomplete and chat experiences inside VS Code and JetBrains.
https://docs.continue.dev/
Apache License 2.0

No advice on connecting to Ollama behind Open WebUI #2485

huornlmj opened this issue 3 weeks ago

huornlmj commented 3 weeks ago

Relevant environment info

- OS: N/A
- Continue version: v0.8.52
- IDE version: 1.93.1
- Model: N/A
- config.json: N/A

Description

I see there is a closed issue/bug explaining how to connect to an internally hosted Ollama listening on an IP on port 11434. But how do you connect to an Ollama instance that is only served behind the popular Open WebUI (443/tcp, TLS)?

To reproduce

No response

Log output

No response

Patrick-Erichsen commented 3 weeks ago

Hi @huornlmj, have you tried setting the apiBase for the model?

If you get this running, it could make a nice tutorial for the docs site if you're interested in contributing!

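A minimal sketch of the relevant model entry, assuming a hypothetical Open WebUI host at https://webui.example.com that proxies the Ollama API under /ollama (the full working config is in the next comment):

{
  "models": [
    {
      "title": "Ollama via Open WebUI",
      "provider": "ollama",
      "model": "llama3:instruct",
      "apiBase": "https://webui.example.com/ollama",
      "apiKey": "<your Open WebUI API key>"
    }
  ]
}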

0xThresh commented 3 weeks ago

Hi @huornlmj, here's a working config file for Open WebUI.

{
  "models": [
    {
      "title": "Ollama Remote",
      "provider": "ollama",
      "model": "llama3:instruct",
      "completionOptions": {},
      "contextLength": 8192,
      "apiBase": "https://<your Open WebUI URL>/ollama",
      "apiKey": "<your API key>"
    }
  ],
  "customCommands": [
    {
      "name": "test",
      "prompt": "{{{ input }}}\n\nWrite a comprehensive set of unit tests for the selected code. It should setup, run tests that check for correctness including important edge cases, and teardown. Ensure that the tests are complete and sophisticated. Give the tests just as chat output, don't edit any file.",
      "description": "Write unit tests for highlighted code"
    }
  ],
  "tabAutocompleteModel": {
    "title": "Ollama Remote Autocomplete",
    "provider": "ollama",
    "model": "codellama:code",
    "completionOptions": {},
    "contextLength": 8192,
    "apiBase": "https://<your Open WebUI URL>/ollama",
    "apiKey": "<your API key>"
  },
  "allowAnonymousTelemetry": false,
  "embeddingsProvider": {
    "provider": "transformers.js"
  },
  "docs": []
}
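
Two notes on why this works: the /ollama suffix is Open WebUI's proxy route for the native Ollama API (Continue presumably ends up calling paths like /ollama/api/chat through it), and apiKey is presumably an API key generated in Open WebUI itself, since Ollama has no authentication of its own. If a setup doesn't pick up apiKey, Continue's per-model requestOptions can, as far as I understand, send the same credential as an explicit header; a sketch, assuming requestOptions.headers is honored by the ollama provider:

{
  "title": "Ollama Remote",
  "provider": "ollama",
  "model": "llama3:instruct",
  "apiBase": "https://<your Open WebUI URL>/ollama",
  "requestOptions": {
    "headers": {
      "Authorization": "Bearer <your API key>"
    }
  }
}
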
Patrick-Erichsen commented 2 weeks ago

Thanks for sharing, @0xThresh!

xqe2011 commented 5 days ago

Same problem here, but the config doesn't work for me. The tabAutocompleteModel and some functions like writing a docstring work well, but everything that goes through the chat panel always gives me an empty response. Here is my config.

"models": [
    {
      "model": "qwen2.5-coder:7b",
      "provider": "ollama",
      "completionOptions": {},
      "contextLength": 8192,
      "apiBase": "https://****/ollama",
      "apiKey": "*****",
      "title": "Qwen2.5 Coder 7B"
    }
  ],
  "tabAutocompleteModel": {
    "model": "qwen2.5-coder:7b",
    "provider": "ollama",
    "completionOptions": {},
    "contextLength": 8192,
    "apiBase": "https://****/ollama",
    "apiKey": "****",
    "title": "Qwen2.5 Coder 7B"
  }

For example, when I ask it to explain my code:

[screenshot: empty response in the chat panel]
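
(A possible explanation for the split, assuming Continue's standard ollama provider: autocomplete goes through Ollama's /api/generate endpoint while the chat panel uses the streaming /api/chat endpoint, so a proxy regression in Open WebUI could break chat while leaving autocomplete intact.)
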
0xThresh commented 5 days ago

@xqe2011 this actually just started happening to me as well, and I noticed it right after an Open WebUI upgrade. I went from around 0.3.10 to 0.3.32 and the issue appeared right away. I'll bring the issue up with the Open WebUI folks; it's helpful to know it's not just me.

Can you tell me what version of Open WebUI you're on?

UPDATE: The maintainer just told me someone submitted a PR that should fix this in 0.3.35, so I'll be upgrading to that tomorrow and will report back.

xqe2011 commented 4 days ago

@0xThresh My versions:

- Open WebUI: v0.3.33
- Ollama: 0.3.14
- continue.dev: v0.8.55

UPDATE: I upgraded Open WebUI to v0.3.35, and the problem is solved.