lobehub / lobe-chat

🤯 Lobe Chat - an open-source, modern-design AI chat framework. Supports Multi AI Providers (OpenAI / Claude 3 / Gemini / Ollama / Azure / DeepSeek), Knowledge Base (file upload / knowledge management / RAG), Multi-Modals (Vision / TTS) and plugin system. One-click FREE deployment of your private ChatGPT / Claude application.

[Bug] Connectivity check fails for a locally proxied Ollama instance exposed to the internet #4413

Open fdg157 opened 3 days ago

fdg157 commented 3 days ago

📦 Deployment Environment

Docker

📌 Software Version

1.22.8

💻 System Environment

Other Linux

🌐 Browser

Firefox

🐛 Problem Description

When access to the proxied Ollama endpoint is restricted to the cloud server's IPs (via Cloudflare WAF), the connectivity check fails as soon as the frontend webpage initiates a query, regardless of whether client request mode is enabled.

📷 Steps to Reproduce

  1. Set up a local Ollama instance proxied to the internet, with access restricted to cloud server IPs (Cloudflare WAF).
  2. Initiate a query from the frontend webpage.
  3. The check will fail regardless of whether the client request mode is used.
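
For reference, the failure can be reproduced outside the browser. A sketch, assuming a hypothetical proxy hostname ollama.example.com for the proxied Ollama endpoint:

    # Hypothetical hostname; substitute the real proxied Ollama endpoint.
    # From an IP that is not on the WAF allow list, Cloudflare answers with a 403
    # before the request ever reaches Ollama, so the connectivity check cannot succeed.
    curl -i https://ollama.example.com/api/version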

🚦 Expected Result

The frontend webpage should pass the connectivity check when it is initiated and be able to start a session with the local model.

📝 Additional Information

No response

lobehubbot commented 3 days ago

👀 @fdg157

Thank you for raising an issue. We will investigate the matter and get back to you as soon as possible. Please make sure you have given us as much context as possible.

dosubot[bot] commented 3 days ago

@fdg157 Hello! I'm Dosu, a bot that can help you solve problems, answer questions, and become a contributor. I'll help you while you wait for a human maintainer.

To address the connectivity check failure in LobeChat when using a locally proxied Ollama service with restricted IP access, you need to configure Ollama for cross-origin access. This involves setting the OLLAMA_ORIGINS environment variable to allow access from any origin. Here are the steps based on your operating system:

On macOS:

  1. Set the environment variable using launchctl:
    launchctl setenv OLLAMA_ORIGINS "*"
  2. Restart the Ollama application.
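
A quick way to confirm the variable took effect (a sketch, assuming the default port 11434 and Ollama's /api/version endpoint):

    # Show the value Ollama will inherit the next time it starts.
    launchctl getenv OLLAMA_ORIGINS
    # A request carrying an Origin header should now be answered with CORS headers
    # (Access-Control-Allow-Origin) instead of being rejected.
    curl -i http://127.0.0.1:11434/api/version -H "Origin: https://example.com"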

On Windows:

  1. Exit the Ollama program from the taskbar.
  2. Edit system environment variables from the Control Panel.
  3. Create or edit the OLLAMA_ORIGINS environment variable for your user account, setting the value to *.
  4. Save the changes and restart the system.
  5. Run Ollama again.
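
If you prefer the command line, the same per-user variable can be set from a Command Prompt instead of the Control Panel (a sketch; setx persists the value, but Ollama still has to be restarted afterwards):

    REM Persist OLLAMA_ORIGINS for the current user, then restart Ollama.
    setx OLLAMA_ORIGINS "*"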

On Linux:

  1. Edit the systemd service:
    sudo systemctl edit ollama.service
  2. Add the following under [Service]:
    [Service]
    Environment="OLLAMA_HOST=0.0.0.0"
    Environment="OLLAMA_ORIGINS=*"
  3. Save and exit.
  4. Reload systemd and restart Ollama:
    sudo systemctl daemon-reload
    sudo systemctl restart ollama
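
To confirm the override is active, you can inspect the service environment and make a cross-origin test request (a sketch, assuming the default port 11434):

    # Show the environment systemd passes to the service.
    systemctl show ollama.service --property=Environment
    # A request with an Origin header should now receive CORS headers in the response.
    curl -i http://127.0.0.1:11434/api/version -H "Origin: https://example.com"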

Using Docker:

  1. Run the Docker container with the environment variable:
    docker run -d --gpus=all -v ollama:/root/.ollama -e OLLAMA_ORIGINS="*" -p 11434:11434 --name ollama ollama/ollama
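
A quick check that the container picked up the variable (a sketch, using the container name from the command above):

    # Confirm OLLAMA_ORIGINS is set inside the running container.
    docker exec ollama env | grep OLLAMA_ORIGINS
    # The API should now accept cross-origin requests from the host.
    curl -i http://127.0.0.1:11434/api/version -H "Origin: https://example.com"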

Additionally, ensure that the DEFAULT_BASE_URL in your LobeChat configuration is set to http://127.0.0.1:11434, as this is the default host used by the Ollama service. If you have customized the base URL, make sure it matches the configuration in your Ollama service setup. These steps should help resolve the connectivity issue by allowing cross-origin requests to the Ollama service [1][2].
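
On the LobeChat side, the Ollama endpoint can also be supplied when starting the LobeChat container. A sketch, assuming the Docker deployment and LobeChat's OLLAMA_PROXY_URL environment variable (check the current docs for the exact name, and adjust the URL to wherever your proxied Ollama is reachable from the LobeChat server):

    # Hypothetical values: replace the URL with your proxied Ollama endpoint.
    # host.docker.internal works on Docker Desktop; on Linux you may need
    # --add-host=host.docker.internal:host-gateway or the host's LAN address.
    docker run -d -p 3210:3210 \
      -e OLLAMA_PROXY_URL=http://host.docker.internal:11434 \
      --name lobe-chat lobehub/lobe-chat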

To continue talking to Dosu, mention @dosu.