lobehub / lobe-chat

🤯 Lobe Chat - an open-source, modern-design AI chat framework. Supports multiple AI providers (OpenAI / Claude 3 / Gemini / Ollama / Azure / DeepSeek), knowledge base (file upload / knowledge management / RAG), multi-modal (vision / TTS), and a plugin system. One-click FREE deployment of your private ChatGPT / Claude application.
https://chat-preview.lobehub.com

[Bug] Ollama service is unavailable #2337

Closed yanliang84 closed 2 weeks ago

yanliang84 commented 5 months ago

💻 Operating System

macOS

📦 Environment

Docker

🌐 Browser

Chrome

🐛 Bug Description

Question 1: The docs never say which URL to open after `docker run` to access the chat. Is it https://chat-preview.lobehub.com/chat?session=inbox&agent= ?

Question 2: If so, why does it access a remote web page rather than localhost?

Question 3: What is port 3210 used for?

Question 4: After starting Ollama locally, I also ran `ollama run mistral`; the terminal works fine, and `curl localhost:11434` succeeds. I also ran `launchctl setenv OLLAMA_ORIGINS "*"`. But in the "Language Model" settings I still cannot connect to Ollama. The error is: response.OllamaServiceUnavailable

```json
{
  "host": "http://127.0.0.1:11434",
  "message": "please check whether your ollama service is available or set the CORS rules",
  "provider": "ollama"
}
```

What could be the cause?

🚦 Expected Behavior

I expect to use lobe-chat to access the local mistral model served by Ollama.

📷 Recurrence Steps

After starting Ollama locally, I also ran `ollama run mistral`; the terminal works fine, and `curl localhost:11434` succeeds. I also ran `launchctl setenv OLLAMA_ORIGINS "*"`. Startup command:

`docker run -d -p 3210:3210 -e OLLAMA_PROXY_URL=http://host.docker.internal:11434/v1 lobehub/lobe-chat`

In the "Language Model" settings I still cannot connect to Ollama; the error is: response.OllamaServiceUnavailable

```json
{
  "host": "http://127.0.0.1:11434",
  "message": "please check whether your ollama service is available or set the CORS rules",
  "provider": "ollama"
}
```
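For reference, a minimal sketch of the macOS-side setup described above (an assumption about the intended configuration, not a confirmed fix; Ollama must be restarted after the environment change, and note that inside the container 127.0.0.1 refers to the container itself, which is why the `docker run` above points at `host.docker.internal`):

```bash
# Allow cross-origin browser requests to Ollama (macOS).
# launchctl setenv only affects newly launched apps, so restart Ollama afterwards.
launchctl setenv OLLAMA_ORIGINS "*"

# Alternatively, set it just for a foreground server process:
OLLAMA_ORIGINS="*" ollama serve

# Verify the service is reachable from the host before testing in LobeChat:
curl http://localhost:11434
```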

📝 Additional Information

No response

lobehubbot commented 5 months ago

👀 @yanliang84

Thank you for raising an issue. We will investigate the matter and get back to you as soon as possible. Please make sure you have given us as much context as possible.

kampchen commented 5 months ago

Same here. I have a machine on the LAN running Ollama; other applications such as Open WebUI and Dify can call it, but LobeChat cannot. The error message is the same as the original poster's, and configuring OLLAMA_ORIGINS did not help either.

Mrered commented 5 months ago

I have the same problem. Ollama on the LAN is reachable from Open WebUI, but not from LobeChat.

ninjadogz commented 5 months ago

In a Windows environment, I hit the same problem connecting to Ollama from LobeChat over the LAN. (Open WebUI connects fine.)

I checked Ollama's connection log; when connecting over the LAN it showed:

[GIN] 2024/05/04 - 18:35:30 | 403 | 0s | 10.168.10.1 | OPTIONS "/api/tags"

So the request from origin http://10.168.10.1:3210 to http://(ollama):11434/api/tags was likely blocked by Ollama's CORS policy.

I added OLLAMA_ORIGINS=* to the system environment variables, and after restarting Ollama it connects normally: `set OLLAMA_HOST=0.0.0.0; set OLLAMA_ORIGINS=*; ollama serve` On Linux, using export or editing .bashrc should achieve the same effect.
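A compact sketch of that workaround on both platforms (OLLAMA_HOST and OLLAMA_ORIGINS are Ollama's documented settings; the rest is illustrative, and the service must be restarted for the change to take effect):

```bash
# Linux/macOS: listen on all interfaces and accept any browser origin,
# then run the server in the foreground.
export OLLAMA_HOST=0.0.0.0
export OLLAMA_ORIGINS="*"
ollama serve

# Windows (cmd), run each line separately:
#   set OLLAMA_HOST=0.0.0.0
#   set OLLAMA_ORIGINS=*
#   ollama serve
```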

Mrered commented 5 months ago

> In a Windows environment, I hit the same problem connecting to Ollama from LobeChat over the LAN. (Open WebUI connects fine.)
>
> I checked Ollama's connection log; when connecting over the LAN it showed: [GIN] 2024/05/04 - 18:35:30 | 403 | 0s | 10.168.10.1 | OPTIONS "/api/tags" So the request from origin http://10.168.10.1:3210 to http://(ollama):11434/api/tags was likely blocked by Ollama's CORS policy.
>
> I added OLLAMA_ORIGINS=* to the system environment variables, and after restarting Ollama it connects normally: `set OLLAMA_HOST=0.0.0.0; set OLLAMA_ORIGINS=*; ollama serve` On Linux, using export or editing .bashrc should achieve the same effect.

I have already set OLLAMA_ORIGINS=* in my environment variables. [Immersive Translate] can call Ollama just fine; only LobeChat cannot.

yanliang84 commented 5 months ago

No reply so far. If you just want a UI for conveniently debugging LLMs, you can try this project, which is simple and easy to deploy: chatbot-ollama

arvinxx commented 5 months ago

> Question 1: The docs never say which URL to open after docker run to access the chat

For a locally deployed LobeChat Docker image, if you haven't changed the port, just visit localhost:3210.

> In the "Language Model" settings I still cannot connect to Ollama; the error is:

Have you tried just starting a conversation directly?
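To spell out the port mapping (a minimal sketch assuming the default `docker run` from this thread; the `curl -I` line is only an illustrative reachability check):

```bash
# LobeChat listens on 3210 inside the container; -p 3210:3210 publishes it,
# so the UI is served locally rather than from the hosted preview URL.
docker run -d -p 3210:3210 lobehub/lobe-chat

# The chat should now be reachable at:
curl -I http://localhost:3210
```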

holycrypto commented 5 months ago

Same issue

```json
{
  "host": "http://127.0.0.1:11434",
  "message": "please check whether your ollama service is available or set the CORS rules",
  "provider": "ollama"
}
```

btjawa commented 2 months ago

This is the browser's mixed-content blocking policy (blocked:mixed-content).

It only happens when client-side request mode is enabled.
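In other words (an illustrative summary, not a log from this thread): when the LobeChat page is served over HTTPS and the browser itself issues the request to Ollama, the plain-HTTP call is refused before it ever reaches the server:

```bash
# Page origin (HTTPS):   https://<your-lobechat-host>
# Browser-issued fetch:  http://127.0.0.1:11434/api/tags   -> blocked:mixed-content
#
# The same request from a page served over plain HTTP is allowed,
# e.g. a local deployment at http://localhost:3210,
# which is one reason local Docker setups are unaffected.
```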

lobehubbot commented 2 weeks ago

✅ @yanliang84

This issue is closed. If you have any questions, feel free to comment and reply.