lobehub / lobe-chat

🤯 Lobe Chat - an open-source, modern-design AI chat framework. Supports Multi AI Providers (OpenAI / Claude 3 / Gemini / Ollama / Azure / DeepSeek), Knowledge Base (file upload / knowledge management / RAG), Multi-Modals (Vision/TTS) and plugin system. One-click FREE deployment of your private ChatGPT / Claude application.
https://chat-preview.lobehub.com

the check passes, but chatting still fails. #2839

Closed fazhang-master closed 3 weeks ago

fazhang-master commented 4 months ago

Ollama is running locally, not in Docker. However, after starting the lobe-chat Docker container with the following command: docker run -d -p 3210:3210 -e OLLAMA_PROXY_URL=http://127.0.0.1:11434 lobehub/lobe-chat, the check passes, but chatting still fails. The command curl http://127.0.0.1:11434 works successfully, and running ollama run llama3:8b also works correctly.

Originally posted by @fazhang-master in https://github.com/lobehub/lobe-chat/issues/1633#issuecomment-2158219327
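
Editor's note: the thread does not confirm a root cause, but one likely explanation is that inside a container on Docker's default bridge network, 127.0.0.1 refers to the container itself, not the host where Ollama is listening, so OLLAMA_PROXY_URL=http://127.0.0.1:11434 cannot reach the host's Ollama even though the same curl succeeds from the host shell. A minimal sketch of a workaround is shown below, assuming Docker 20.10+ (for the host-gateway alias) and a host-side Ollama that must be told to listen beyond 127.0.0.1; this is an illustrative configuration, not a fix confirmed by the maintainers in this issue.

```bash
# On the host: let Ollama accept connections from outside the loopback
# interface (by default it binds to 127.0.0.1 only).
OLLAMA_HOST=0.0.0.0 ollama serve

# Run lobe-chat and point it at the host via host.docker.internal.
# --add-host is needed on Linux; Docker Desktop resolves the name automatically.
docker run -d -p 3210:3210 \
  --add-host=host.docker.internal:host-gateway \
  -e OLLAMA_PROXY_URL=http://host.docker.internal:11434 \
  lobehub/lobe-chat
```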

lobehubbot commented 4 months ago

👀 @fazhang-master

Thank you for raising an issue. We will look into the matter and get back to you as soon as possible. Please make sure you have given us as much context as possible.

arvinxx commented 3 weeks ago

Does it still occur?

lobehubbot commented 3 weeks ago

👋 @{{ author }}
Since this issue was labeled with 🤔 Need Reproduce but received no response in 3 days, it will be closed. If you have any questions, you can comment and reply.

lobehubbot commented 3 weeks ago

✅ @fazhang-master

This issue is closed. If you have any questions, you can comment and reply.