Open fdg157 opened 3 days ago
@fdg157 Thank you for raising an issue. We will investigate the matter and get back to you as soon as possible. Please make sure you have given us as much context as possible.
@fdg157 Hi! I'm Dosu, a bot that can help you solve problems, answer questions, and become a contributor. I'll assist you while you wait for a human maintainer.
To address the connectivity check failure in LobeChat when using a locally proxied Ollama service with restricted IP access, you need to configure Ollama for cross-origin access. This involves setting the `OLLAMA_ORIGINS` environment variable to allow access from any origin. Here are the steps based on your operating system:

**macOS** — set the variable with `launchctl`, then restart the Ollama application:

```bash
launchctl setenv OLLAMA_ORIGINS "*"
```

**Windows** — create an `OLLAMA_ORIGINS` environment variable for your user account, setting the value to `*`, then restart Ollama.

**Linux (systemd)** — run `sudo systemctl edit ollama.service` and add the following under `[Service]`:

```ini
[Service]
Environment="OLLAMA_HOST=0.0.0.0"
Environment="OLLAMA_ORIGINS=*"
```

Then reload and restart the service:

```bash
sudo systemctl daemon-reload
sudo systemctl restart ollama
```

**Docker** — pass the variable when starting the container:

```bash
docker run -d --gpus=all -v ollama:/root/.ollama -e OLLAMA_ORIGINS="*" -p 11434:11434 --name ollama ollama/ollama
```
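After restarting the service, you can sanity-check the CORS configuration from the host (a diagnostic sketch, assuming Ollama is reachable on its default port; `/api/tags` is Ollama's model-listing endpoint). Once `OLLAMA_ORIGINS` takes effect, the response headers should include `Access-Control-Allow-Origin`:

```shell
# Send a cross-origin GET to the local Ollama instance and print the
# response headers; look for "Access-Control-Allow-Origin" in the output.
# The Origin value here is only an example.
curl -s -i http://127.0.0.1:11434/api/tags \
  -H "Origin: http://example.com" | head -n 20
```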
Additionally, ensure that the `DEFAULT_BASE_URL` in your LobeChat configuration is set to `http://127.0.0.1:11434`, as this is the default host used by the Ollama service. If you have customized the base URL, make sure it matches the configuration in your Ollama service setup. These steps should help resolve the connectivity issue by allowing cross-origin requests to the Ollama service [1][2].
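Conceptually, the check that `OLLAMA_ORIGINS` controls behaves like the sketch below. This is a hypothetical illustration, not Ollama's actual implementation; it assumes the documented behavior that the variable holds a comma-separated list of origins and that `*` matches any origin:

```python
# Hypothetical helper illustrating how a comma-separated OLLAMA_ORIGINS
# value could be matched against a browser's Origin header.
# Not Ollama's real code -- an illustration of the wildcard semantics only.
def origin_allowed(allowed: str, origin: str) -> bool:
    patterns = [p.strip() for p in allowed.split(",")]
    return any(p == "*" or p == origin for p in patterns)

print(origin_allowed("*", "http://example.com"))                    # True
print(origin_allowed("http://localhost:3210", "http://evil.com"))   # False
```

With the default (restrictive) configuration, a browser-originated request from LobeChat's frontend is rejected, which is why the connectivity check fails until the wildcard is set.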
To continue talking to Dosu, mention @dosu.
📦 Deployment environment
Docker
📌 Software version
1.22.8
💻 System environment
Other Linux
🌐 Browser
Firefox
🐛 Problem description
When access to the cloud server is restricted by IP, the frontend web page fails the connectivity check when initiating a query, regardless of whether client-side request mode is used.
📷 Reproduction steps
🚦 Expected result
The frontend web page should pass the connectivity check when it is initiated and be able to start a session with the local model.
📝 Additional information
No response