lobehub / lobe-chat

🤯 Lobe Chat - an open-source, modern-design AI chat framework. Supports multiple AI providers (OpenAI / Claude 3 / Gemini / Ollama / Azure / DeepSeek), knowledge base (file upload / knowledge management / RAG), multi-modal features (vision / TTS), and a plugin system. One-click FREE deployment of your private ChatGPT / Claude application.
https://chat-preview.lobehub.com

[Bug] ollama models need to be "refreshed" when first time use #2101

Open cnkang opened 4 months ago

cnkang commented 4 months ago

💻 Operating System

macOS

📦 Environment

Docker

🌐 Browser

Chrome

🐛 Bug Description

My docker-compose file contains the following settings:

    OLLAMA_PROXY_URL=http://172.17.0.1:11434/v1
    OLLAMA_MODEL_LIST=-all,+qwen:32b,+llama3:8b

After logging in for the first time, the configured models are not available (screenshot). The settings page shows the same (screenshot). After clicking "Reset" to refresh, the models load correctly (screenshot) and everything works normally afterwards (screenshot).
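For reference, the settings quoted above correspond to a docker-compose fragment along these lines (a sketch: the service name, image tag, and port mapping are assumptions; only the two OLLAMA_* variables come from the report):

```yaml
services:
  lobe-chat:                  # hypothetical service name
    image: lobehub/lobe-chat  # tag assumed
    ports:
      - "3210:3210"
    environment:
      # values taken from the report above
      - OLLAMA_PROXY_URL=http://172.17.0.1:11434/v1
      - OLLAMA_MODEL_LIST=-all,+qwen:32b,+llama3:8b
```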

🚦 Expected Behavior

The custom Ollama models configured via environment variables should be usable immediately on first visit, without having to refresh them on the settings page first.

📷 Recurrence Steps

Open the site in an incognito/private browsing window to reproduce.

📝 Additional Information

No response

lobehubbot commented 4 months ago

👀 @cnkang

Thank you for raising an issue. We will investigate the matter and get back to you as soon as possible. Please make sure you have given us as much context as possible.

lobehubbot commented 4 months ago

✅ @cnkang

This issue is closed. If you have any questions, you can comment and reply.

lobehubbot commented 4 months ago

:tada: This issue has been resolved in version 0.148.4 :tada:

The release is available on:

Your semantic-release bot :package::rocket:

fgprodigal commented 4 months ago

The problem still exists in v0.155.1.


xiaolinfrank commented 3 months ago

Following up on the comment above: the problem still exists in v0.156.2.


CoreJa commented 3 months ago

Problem still exists in v0.162.17. @arvinxx

It is configured with no other LLM providers at all, just ollama itself.

Docker run command:

    docker run -d --rm -p 3210:3210 \
      -e OLLAMA_PROXY_URL=http://host.docker.internal:11434/v1 \
      -e ENABLED_OPENAI=0 \
      -e OLLAMA_MODEL_LIST='-all,+wangshenzhi/llama3-8b-chinese-chat-ollama-q4:latest=ME-llama3<8192>' \
      lobehub/lobe-chat

(screenshot)

bingoct commented 2 months ago

Problem still exists in docker.io/lobehub/lobe-chat:v0.162.21.

Environment:

    environment:
      - FEATURE_FLAGS=-check-updates,-welcome_suggest
      - DEFAULT_AGENT_CONFIG="model=qwen2:7b;provider=ollama"
      - ENABLED_OPENAI=0
      - OLLAMA_PROXY_URL=http://192.168.3.100:11434
      - OLLAMA_MODEL_LIST=+qwen2:7b,+codellama:7b,+phi3:3.8b,-hhao/openbmb-minicpm-llama3-v-2_5:latest,-llama3:latest
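As an illustration only (this is not lobe-chat's actual parser), the `+`/`-` entries in a model list like the one above compose like this:

```shell
#!/usr/bin/env bash
# Sketch: reading a "+model,-model" list entry by entry (illustration only).
MODEL_LIST='+qwen2:7b,+codellama:7b,-llama3:latest'
IFS=',' read -ra entries <<< "$MODEL_LIST"
for e in "${entries[@]}"; do
  case "$e" in
    -all) echo "hide all built-in models" ;;   # "-all" clears the defaults
    +*)   echo "show ${e#+}" ;;                # "+name" enables a model
    -*)   echo "hide ${e#-}" ;;                # "-name" disables a model
  esac
done
```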

Bug Description

  1. On first load, the OLLAMA_PROXY_URL shown in the UI is not right (screenshot). However, lobe-chat's default agent can still be used (screenshot), and the Ollama server log confirms the chat request goes through:

    [GIN] 2024/06/11 - 13:28:53 | 200 |   14.1375523s |   192.168.3.100 | POST     "/api/chat"
  2. After I click the reset button, OLLAMA_MODEL_LIST takes effect, but the old default configuration remains and OLLAMA_PROXY_URL is still not right. Checking the Ollama server log, the request was in fact received, so I suspect the problem is in the frontend:

    [GIN] 2024/06/11 - 13:15:01 | 200 |      6.2335ms |   192.168.3.100 | GET      "/api/tags"

    (screenshot)

  3. No matter what Ollama URL I set in the frontend, the connectivity health check still fails (screenshot).

Expected Behavior

The Ollama-related environment variables should take effect and the health check should pass.

zidanereal5 commented 2 months ago

(screenshot) Same problem here: the environment variable configuration does not take effect (screenshot). The models only show up after refreshing on the settings page. Version v1.2.3. Startup command:

    sudo docker run -d -p 4000:3210 \
      -e ENABLED_OPENAI=0 \
      -e OPENAI_API_KEY=sk-EN1ertLm86bO0aO170B5F114271343E580B85e5eF8EeC729 \
      -e OPENAI_PROXY_URL=http://10.2.147.71:3001/v1 \
      -e OPENAI_MODEL_LIST="-all" \
      -e ENABLED_OLLAMA=1 \
      -e OLLAMA_PROXY_URL=http://10.2.147.71:11434 \
      -e OLLAMA_MODEL_LIST="-all,+qwen2:7b<32000:fc>" \
      --name lobe-chat \
      lobehub/lobe-chat:v1.2.3
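The `<32000:fc>` suffix in the model list above is lobe-chat's extended entry syntax (a context-window size plus capability flags such as `fc` for function calling). As a rough sketch of how one such entry breaks down (an illustration that mirrors, but is not, lobe-chat's parser):

```shell
#!/usr/bin/env bash
# Sketch: splitting one extended entry "name<context:flags>" (illustration only).
entry='+qwen2:7b<32000:fc>'
name="${entry#+}"          # drop leading "+"
name="${name%%<*}"         # -> qwen2:7b
opts="${entry#*<}"         # -> 32000:fc>
opts="${opts%>}"           # -> 32000:fc
ctx="${opts%%:*}"          # -> 32000
flags="${opts#*:}"         # -> fc
echo "model=$name context=$ctx flags=$flags"
```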
