langgenius / dify

Dify is an open-source LLM app development platform. Dify's intuitive interface combines AI workflow, RAG pipeline, agent capabilities, model management, observability features and more, letting you quickly go from prototype to production.
https://dify.ai

No 'Add Ollama' option on my local Dify web page #5766

Closed kylezhang closed 4 months ago

kylezhang commented 4 months ago


Dify version

0.6.12

Cloud or Self Hosted

Self Hosted (Docker)

Steps to reproduce

  1. I installed Ollama and Llava on WSL, and Ollama can run Llava for a normal conversation.
  2. I followed this tutorial to complete the deployment of Dify: Docker Compose Deployment | Dify, and I am now able to log in and access it normally.
  3. In the web interface under "Settings" -> "Model Providers", I can't find an entry to add Ollama (see the connectivity check sketched below the list).
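
As a side note, before configuring the provider it can help to confirm that the Ollama server is reachable from wherever Dify runs. Below is a minimal Python sketch using Ollama's `/api/tags` endpoint; the base URL is a placeholder for whatever address the Dify containers can actually reach (`http://host.docker.internal:11434` is a common choice when Ollama runs on the WSL host and Dify runs in Docker).

```python
import json
import sys
from urllib.error import URLError
from urllib.request import urlopen

# Placeholder: use whatever address the Dify containers can reach,
# e.g. http://host.docker.internal:11434 when Ollama runs on the WSL host.
OLLAMA_BASE_URL = "http://localhost:11434"


def list_ollama_models(base_url: str) -> list[str]:
    """Return the names of models the Ollama server has pulled locally."""
    try:
        with urlopen(f"{base_url}/api/tags", timeout=5) as resp:
            data = json.load(resp)
    except URLError as exc:
        sys.exit(f"Ollama is not reachable at {base_url}: {exc}")
    return [model["name"] for model in data.get("models", [])]


if __name__ == "__main__":
    models = list_ollama_models(OLLAMA_BASE_URL)
    print("Ollama is up; available models:", models or "none pulled yet")
```

If this check fails from inside the Dify containers but succeeds on the host, the base URL entered in the provider settings is usually the problem rather than the UI itself.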

✔️ Expected Behavior

I can normally add models that are locally deployed with Ollama.

❌ Actual Behavior

There is no 'Add Ollama' option on my local Dify web page.

takatost commented 4 months ago

So sorry about that! We've fixed the issue with the model provider icon not showing up. You can upgrade to 0.6.12-fix1 now.

kylezhang commented 4 months ago

Thanks, it's working now.
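
As a follow-up, once the Ollama provider is configured and attached to an app, a quick end-to-end smoke test can be run against Dify's app API. This is a minimal sketch, assuming a self-hosted API base URL of `http://localhost/v1` and a placeholder app API key; both values would need to match your deployment.

```python
import json
from urllib.request import Request, urlopen

# Placeholders: substitute your self-hosted API base URL and the API key
# of a Dify app whose model is the Ollama-served llava.
DIFY_API_BASE = "http://localhost/v1"
DIFY_APP_API_KEY = "app-xxxxxxxxxxxxxxxx"


def ask(query: str) -> str:
    """Send a blocking chat request to a Dify app and return the answer text."""
    payload = json.dumps({
        "inputs": {},
        "query": query,
        "response_mode": "blocking",
        "user": "smoke-test",
    }).encode("utf-8")
    req = Request(
        f"{DIFY_API_BASE}/chat-messages",
        data=payload,
        headers={
            "Authorization": f"Bearer {DIFY_APP_API_KEY}",
            "Content-Type": "application/json",
        },
    )
    with urlopen(req, timeout=60) as resp:
        return json.load(resp)["answer"]


if __name__ == "__main__":
    print(ask("Describe what llava can do in one sentence."))
```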