langgenius / dify

Dify is an open-source LLM app development platform. Dify's intuitive interface combines AI workflow, RAG pipeline, agent capabilities, model management, observability features and more, letting you quickly go from prototype to production.
https://dify.ai

[BUG] I got the error "Meta endpoint! Unexpected status code: 503, with response body: None." when I upgraded to Version 0.6.12-fix1 #5893

Closed — haodaohong closed this issue 2 months ago

haodaohong commented 2 months ago

Self Checks

Dify version

Version 0.6.12-fix1

Cloud or Self Hosted

Self Hosted (Docker)

Steps to reproduce

After upgrading to Version 0.6.12-fix1 (I only updated the URLs and ports), I tried to embed files and got the error "Meta endpoint! Unexpected status code: 503, with response body: None.".


I also cannot ask questions successfully.


✔️ Expected Behavior

Embedding should complete successfully, and questions should be answerable.

❌ Actual Behavior

error "Meta endpoint! Unexpected status code: 503, with response body: None."

dosubot[bot] commented 2 months ago

@haodaohong Hello! I'm here to help you with any bugs, questions, or contributions while you wait for a human maintainer.

To resolve the "Meta endpoint! Unexpected status code: 503, with response body: None" error and the issue of the bot not responding to messages after upgrading to Version 0.6.12-fix1 of Dify in a self-hosted Docker setup, follow these steps:

  1. Ensure All Containers Are Running: Check if all the necessary containers are running successfully:

    docker compose ps
  2. Update and Restart Dify: Make sure you have the latest updates and restart the Dify services:

    cd dify/docker
    git pull origin main
    docker compose down
    docker compose pull
    docker compose up -d
  3. Check Environment Variables: Ensure that your .env file is correctly configured. If you made any changes, restart Dify:

    docker compose down
    docker compose up -d
  4. Inspect Logs: Check the logs of the containers to identify any specific issues. For example, to check the logs of the api container:

    docker compose logs api

By following these steps, you should be able to resolve the 503 error and ensure that the bot responds to messages correctly. Additionally, ensure that your Docker setup meets the prerequisites, such as having Docker 19.03 or later and Docker Compose 1.25.1 or later, and that your system resources are adequately allocated (e.g., at least 2 vCPUs and 8 GB of memory for Docker Desktop on macOS) [1][2][3].
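One further check that pairs with step 3: confirm which proxy-related environment variables the containers actually see, since stray HTTP_PROXY/HTTPS_PROXY values can break calls between containers. This is a minimal sketch, assuming the default api and worker service names from dify/docker/docker-compose.yaml:

    # List proxy-related variables visible inside the running containers
    docker compose exec api env | grep -i proxy
    docker compose exec worker env | grep -i proxy

If these print a proxy that should not apply to traffic between the containers, add the internal hostnames to NO_PROXY or remove the proxy settings.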

To continue talking to Dosu, mention @dosu.

crazywoola commented 2 months ago

Something seems wrong with the embedding services; please provide the container logs for the worker and api. This is most often a network connection issue between your server and the embedding service, especially when you are using the Cohere rerank. Please also make sure you have set up a VPN properly.
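For reference, one way to collect those logs in one go (a sketch assuming the default service names in dify/docker/docker-compose.yaml; the output file name is arbitrary):

    # Capture the most recent log lines from the api and worker containers
    docker compose logs --tail=200 api worker > dify-api-worker.log 2>&1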

haodaohong commented 2 months ago

Hi @dosu @crazywoola, the error log is below, please help check it, thanks a lot~

    api-1 | [2024-07-02 17:31:22 +0000] [23] [INFO] Starting gunicorn 22.0.0
    api-1 | [2024-07-02 17:31:22 +0000] [23] [INFO] Listening at: http://0.0.0.0:5001 (23)
    api-1 | [2024-07-02 17:31:22 +0000] [23] [INFO] Using worker: gevent
    api-1 | [2024-07-02 17:31:22 +0000] [38] [INFO] Booting worker with pid: 38
    api-1 | 2024-07-03 03:39:06,791.791 INFO [Dummy-1] [_client.py:1026] - HTTP Request: POST https://azscoai0001.openai.azure.com/openai/deployments/gpt-35-turbo-16k/chat/completions?api-version=2024-02-15-preview "HTTP/1.1 200 OK"
    api-1 | 2024-07-03 03:39:15,825.825 INFO [Dummy-2] [_client.py:1026] - HTTP Request: POST https://azscoai0001.openai.azure.com/openai/deployments/whisper/audio/transcriptions?api-version=2024-05-01-preview "HTTP/1.1 200 OK"
    api-1 | 2024-07-03 03:39:24,401.401 INFO [Dummy-3] [_client.py:1026] - HTTP Request: POST https://azscoai0001.openai.azure.com/openai/deployments/text-embedding-ada-002/embeddings?api-version=2024-04-01-preview "HTTP/1.1 200 OK"
    api-1 | 2024-07-03 03:39:34,792.792 INFO [ThreadPoolExecutor-4_0] [_client.py:1026] - HTTP Request: POST https://azscoai0001.openai.azure.com/openai/deployments/tts-hd/audio/speech?api-version=2024-05-01-preview "HTTP/1.1 200 OK"
    api-1 | 2024-07-03 03:41:07,331.331 ERROR [Thread-7 (_generate_worker)] [app_generator.py:214] - Unknown Error when generating
    api-1 | Traceback (most recent call last):
    api-1 |   File "/app/api/core/app/apps/chat/app_generator.py", line 193, in _generate_worker
    api-1 |     runner.run(
    api-1 |   File "/app/api/core/app/apps/chat/app_runner.py", line 159, in run
    api-1 |     context = dataset_retrieval.retrieve(
    api-1 |   File "/app/api/core/rag/retrieval/dataset_retrieval.py", line 125, in retrieve
    api-1 |     all_documents = self.single_retrieve(
    api-1 |   File "/app/api/core/rag/retrieval/dataset_retrieval.py", line 271, in single_retrieve
    api-1 |     results = RetrievalService.retrieve(
    api-1 |   File "/app/api/core/rag/datasource/retrieval_service.py", line 87, in retrieve
    api-1 |     raise Exception(exception_message)
    api-1 | Exception: Meta endpoint! Unexpected status code: 503, with response body: None.
    api-1 | 2024-07-03 03:41:07,338.338 ERROR [Dummy-6] [base_app_generate_response_converter.py:128] - Meta endpoint! Unexpected status code: 503, with response body: None.

crazywoola commented 2 months ago

It seems to be a network issue on their side. See the documentation for the 503 error here: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/503
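The "Meta endpoint" wording appears to come from the vector store's /v1/meta health check rather than from the model provider itself, so it is worth verifying that the vector store is reachable from the api container. A sketch assuming the default Weaviate store at http://weaviate:8080 (adjust to your configured vector store); curl may not be present in the image, so a Python fallback is included:

    # Probe the vector store's meta endpoint from inside the api container
    docker compose exec api curl -i http://weaviate:8080/v1/meta
    # Fallback if curl is not installed in the image
    docker compose exec api python -c "import requests; r = requests.get('http://weaviate:8080/v1/meta'); print(r.status_code, r.text[:200])"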

haodaohong commented 2 months ago

The issue has been resolved. The cause was that Docker could not pull images at first, so proxy settings were added for Docker; those proxy settings then affected the running program. After deleting the proxy config, everything ran successfully.
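For anyone hitting the same thing: Docker proxy settings typically live in a few places, and any of them can leak into container traffic. A sketch of where to look; the paths are common defaults, not confirmed for this particular setup:

    # Client-side proxies (injected into every container's environment);
    # look for a "proxies" section and remove it
    cat ~/.docker/config.json
    # Daemon-level proxy used for image pulls (systemd drop-in)
    cat /etc/systemd/system/docker.service.d/http-proxy.conf
    # Proxy variables passed through the compose environment
    grep -i proxy dify/docker/.env
    # After removing the offending entries, restart the daemon and the stack (run compose from dify/docker)
    sudo systemctl restart docker
    docker compose up -d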

hi-noikiy commented 3 weeks ago

> The issue has been resolved. The cause was that Docker could not pull images at first, so proxy settings were added for Docker; those proxy settings then affected the running program. After deleting the proxy config, everything ran successfully.

How did you solve it? In my case the problem occurs when indexing documents added to the knowledge base.