infiniflow / ragflow

RAGFlow is an open-source RAG (Retrieval-Augmented Generation) engine based on deep document understanding.
https://ragflow.io
Apache License 2.0

When the "docker logs -f ragflow-server" command is run in offline deployment, the request address times out #832

Open liukx362330 opened 1 month ago

liukx362330 commented 1 month ago

Is there an existing issue for the same bug?

Branch name

main

Commit ID

081f922

Other environment information

No response

Actual behavior

I pulled the image down over the Internet; when I migrate it to the intranet for installation, it reports a connection timeout to an external website.

Expected behavior

I'd like to deploy it on the Intranet

Steps to reproduce

Load the related images and copy the source code, then:
1. cd ragflow/docker
2. chmod +x ./entrypoint.sh
3. docker compose -f docker-compose-CN.yml up -d
4. docker logs -f ragflow-server

Additional information

Traceback (most recent call last):
  File "/ragflow/api/ragflow_server.py", line 26, in <module>
    from api.apps import app
  File "/ragflow/api/apps/__init__.py", line 26, in <module>
    from api.db.db_models import close_connection
  File "/ragflow/api/db/db_models.py", line 32, in <module>
    from api.settings import DATABASE, stat_logger, SECRET_KEY
  File "/ragflow/api/settings.py", line 35, in <module>
    from rag.utils.es_conn import ELASTICSEARCH
  File "/ragflow/rag/utils/__init__.py", line 59, in <module>
    encoder = tiktoken.encoding_for_model("gpt-3.5-turbo")
  File "/usr/local/lib/python3.10/dist-packages/tiktoken/model.py", line 101, in encoding_for_model
    return get_encoding(encoding_name_for_model(model_name))
  File "/usr/local/lib/python3.10/dist-packages/tiktoken/registry.py", line 73, in get_encoding
    enc = Encoding(**constructor())
  File "/usr/local/lib/python3.10/dist-packages/tiktoken_ext/openai_public.py", line 72, in cl100k_base
    mergeable_ranks = load_tiktoken_bpe(
  File "/usr/local/lib/python3.10/dist-packages/tiktoken/load.py", line 147, in load_tiktoken_bpe
    contents = read_file_cached(tiktoken_bpe_file, expected_hash)
  File "/usr/local/lib/python3.10/dist-packages/tiktoken/load.py", line 64, in read_file_cached
    contents = read_file(blobpath)
  File "/usr/local/lib/python3.10/dist-packages/tiktoken/load.py", line 25, in read_file
    resp = requests.get(blobpath)
  File "/usr/local/lib/python3.10/dist-packages/requests/api.py", line 73, in get
    return request("get", url, params=params, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/requests/api.py", line 59, in request
    return session.request(method=method, url=url, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/requests/sessions.py", line 589, in request
    resp = self.send(prep, **send_kwargs)
  File "/usr/local/lib/python3.10/dist-packages/requests/sessions.py", line 703, in send
    r = adapter.send(request, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/requests/adapters.py", line 507, in send
    raise ConnectTimeout(e, request=request)
requests.exceptions.ConnectTimeout: HTTPSConnectionPool(host='openaipublic.blob.core.windows.net', port=443): Max retries exceeded with url: /encodings/cl100k_base.tiktoken (Caused by ConnectTimeoutError(<urllib3.connection.HTTPSConnection object at 0x7f4dd75afac0>, 'Connection to openaipublic.blob.core.windows.net timed out. (connect timeout=None)'))
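The traceback shows tiktoken trying to download cl100k_base.tiktoken at startup. One possible offline workaround: tiktoken only downloads the BPE file when it is not already in its local cache, and the cache directory can be pointed at a location you control via the TIKTOKEN_CACHE_DIR environment variable. Below is a minimal sketch of pre-populating that cache on an Internet-connected machine; it assumes your installed tiktoken keys cache files by the SHA-1 of the blob URL (check tiktoken/load.py in your version), and the directory ./tiktoken_cache is just an example path.

```python
# Sketch: pre-populate tiktoken's cache on a machine with Internet access,
# then copy the cache directory to the offline host.
# Assumption: recent tiktoken versions name cache files sha1(blob_url) and
# look them up in TIKTOKEN_CACHE_DIR -- verify against your installed version.
import hashlib
import os

import requests

BLOB_URL = "https://openaipublic.blob.core.windows.net/encodings/cl100k_base.tiktoken"
CACHE_DIR = os.environ.get("TIKTOKEN_CACHE_DIR", "./tiktoken_cache")  # example path

os.makedirs(CACHE_DIR, exist_ok=True)
cache_key = hashlib.sha1(BLOB_URL.encode()).hexdigest()
cache_path = os.path.join(CACHE_DIR, cache_key)

resp = requests.get(BLOB_URL, timeout=60)
resp.raise_for_status()
with open(cache_path, "wb") as f:
    f.write(resp.content)

print(f"Cached cl100k_base.tiktoken at {cache_path}")
```

Copy the resulting directory onto the intranet host (for example, mount it into the ragflow-server container) and set TIKTOKEN_CACHE_DIR to its path in docker-compose-CN.yml, so tiktoken never needs to reach openaipublic.blob.core.windows.net.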

dingsa0210 commented 1 month ago

The tokenizer model should not be hardcoded in this line: tiktoken.encoding_for_model("gpt-3.5-turbo")
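A minimal sketch of what making it configurable could look like, assuming a hypothetical TIKTOKEN_MODEL environment variable (not an existing RAGFlow setting):

```python
import os

import tiktoken

# Hypothetical: read the tokenizer model name from the environment instead of
# hardcoding it, defaulting to the value RAGFlow currently uses.
TIKTOKEN_MODEL = os.environ.get("TIKTOKEN_MODEL", "gpt-3.5-turbo")
encoder = tiktoken.encoding_for_model(TIKTOKEN_MODEL)
```

Note that this alone does not fix the offline timeout: any model name still resolves to a BPE file that tiktoken tries to download unless it is already in the local cache.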

ye-jeck commented 1 month ago

Have you solved it? I also encountered this problem.

liwan14x commented 1 month ago

Maybe you can try adding a proxy to the Docker Compose YAML file, under the environment: section.
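If you go this route, a quick sanity check, run inside the ragflow-server container, that the proxy variables from the compose file are actually visible to Python. requests honours http_proxy/https_proxy from the environment automatically; the URL below is just the one from the traceback:

```python
import os

import requests

# Print the proxy variables that requests will pick up automatically.
for var in ("http_proxy", "https_proxy", "HTTP_PROXY", "HTTPS_PROXY"):
    print(var, "=", os.environ.get(var))

# This is the URL the traceback failed on; it should succeed once a working
# proxy (or direct Internet access) is available.
resp = requests.get(
    "https://openaipublic.blob.core.windows.net/encodings/cl100k_base.tiktoken",
    timeout=30,
)
print("status:", resp.status_code, "bytes:", len(resp.content))
```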

wuangqshd commented 1 week ago

Maybe you can try adding a proxy to the Docker Compose YAML file, under the environment: section.

The Internet-connected environment works fine; the problem is the intranet, where there is no network proxy, so this approach cannot work.