FlowiseAI / Flowise

Drag & drop UI to build your customized LLM flow
https://flowiseai.com
Apache License 2.0

[BUG] Request Timeout Accessing Azure API in private network #2797

Closed: bodzebod closed this issue 1 day ago

bodzebod commented 1 month ago

Describe the bug
Flowise, hosted on an AWS EC2 instance running Docker, is unable to invoke an LLM REST API located in AzureAI. Despite whitelisting the required domains, requests still time out. The LLM REST API is deployed by our company in a private network, so it uses a specific base URL ending with *.azure-api.net. The domain name has been added to the firewall whitelist, and I can successfully invoke the LLM via a curl command, both inside and outside the Flowise container. Besides, I have no issue when I run a Flowise Docker instance from a PC at home; I can access the LLM in that situation. I don't understand why a curl command succeeds while Flowise times out. I've run out of ideas :(
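For reference, the successful direct call looks roughly like the sketch below (assuming an Azure OpenAI-style chat completions route behind the company's *.azure-api.net gateway; the host, deployment name, API version, and key are placeholders, not the actual values):

```bash
# Hypothetical example of the direct call that succeeds from inside the container.
# Host, deployment name, api-version, and the key variable are all placeholders.
curl -sS "https://<company-gateway>.azure-api.net/openai/deployments/<deployment>/chat/completions?api-version=2024-02-01" \
  -H "Content-Type: application/json" \
  -H "api-key: $AZURE_OPENAI_API_KEY" \
  -d '{"messages":[{"role":"user","content":"ping"}]}'
```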

To Reproduce
Steps to reproduce the behavior:

  1. Deploy the Flowise AI application on an AWS EC2 instance with Docker (see the sketch after this list).
  2. Configure the application to send a request to the AzureAI REST API in a private network.
  3. Observe the application logs.
  4. See the timeout error.
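A minimal sketch of step 1, using the project's standard Docker invocation (default image and port; the reporter's exact EC2 setup may differ):

```bash
# Run the official Flowise image; the UI listens on port 3000 by default.
docker run -d --name flowise -p 3000:3000 flowiseai/flowise
```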

Expected behavior
The application should successfully send a request to the AzureAI REST API and receive a timely response, as confirmed by direct curl requests.

Screenshots: [image attached]

Flow: N/A

Setup: N/A

Environment Variables: N/A

Additional context
Error message:

```
2024-07-12 12:36:06 [VERBOSE]: [llm/error] [1:chain:RunnableSequence > 13:chain:RunnableSequence > 15:llm:ChatOpenAI] [86.35s] LLM run errored with error: "Request timed out.\n\nTimeoutError: Request timed out.\n at wrapOpenAIClientError (/usr/local/lib/node_modules/flowise/node_modules/@langchain/openai/dist/utils/openai.cjs:13:17)\n at /usr/local/lib/node_modules/flowise/node_modules/@langchain/openai/dist/chat_models.cjs:764:69\n at process.processTicksAndRejections (node:internal/process/task_queues:95:5)\n at async RetryOperation._fn (/usr/local/lib/node_modules/flowise/node_modules/p-retry/index.js:50:12)"
2024-07-12 12:36:06 [VERBOSE]: [chain/error] [1:chain:RunnableSequence > 13:chain:RunnableSequence] [86.35s] Chain run errored with error: "Request timed out.\n\nTimeoutError: Request timed out.\n at wrapOpenAIClientError (/usr/local/lib/node_modules/flowise/node_modules/@langchain/openai/dist/utils/openai.cjs:13:17)\n at /usr/local/lib/node_modules/flowise/node_modules/@langchain/openai/dist/chat_models.cjs:764:69\n at process.processTicksAndRejections (node:internal/process/task_queues:95:5)\n at async RetryOperation._fn (/usr/local/lib/node_modules/flowise/node_modules/p-retry/index.js:50:12)"
2024-07-12 12:36:06 [VERBOSE]: [chain/error] [1:chain:RunnableSequence] [87.97s] Chain run errored with error: "Request timed out.\n\nTimeoutError: Request timed out.\n at wrapOpenAIClientError (/usr/local/lib/node_modules/flowise/node_modules/@langchain/openai/dist/utils/openai.cjs:13:17)\n at /usr/local/lib/node_modules/flowise/node_modules/@langchain/openai/dist/chat_models.cjs:764:69\n at process.processTicksAndRejections (node:internal/process/task_queues:95:5)\n at async RetryOperation._fn (/usr/local/lib/node_modules/flowise/node_modules/p-retry/index.js:50:12)"
2024-07-12 12:36:06 [ERROR]: [server]: Error: Request timed out.
TimeoutError: Request timed out.
    at wrapOpenAIClientError (/usr/local/lib/node_modules/flowise/node_modules/@langchain/openai/dist/utils/openai.cjs:13:17)
    at /usr/local/lib/node_modules/flowise/node_modules/@langchain/openai/dist/chat_models.cjs:764:69
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async RetryOperation._fn (/usr/local/lib/node_modules/flowise/node_modules/p-retry/index.js:50:12)
```

HenryHengZJ commented 1 month ago

I'm guessing this is some proxy issue that makes it unable to reach the URL.

bodzebod commented 1 month ago

The symptoms are similar, but there's no proxy set at the OS level. Do you know how Flowise handles proxy settings? Is it based on the https_proxy environment variable? If so, it is not set.

bodzebod commented 2 weeks ago

Finally, I had to set the following environment variables in the docker-compose file to make it work on Flowise 2.0.2:

  - AZURE_OPENAI_BASE_PATH
  - AZURE_OPENAI_API_DEPLOYMENT_NAME
  - OPENAI_API_KEY
  - AZURE_OPENAI_API_EMBEDDINGS_DEPLOYMENT_NAME
  - AZURE_OPENAI_API_INSTANCE_NAME
  - AZURE_OPENAI_API_VERSION
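A minimal docker-compose sketch of that configuration (every value is a placeholder; in particular, the base-path shape ending at /deployments follows LangChain's azureOpenAIBasePath convention and is an assumption here, not the reporter's actual setup):

```yaml
# Hypothetical docker-compose excerpt; all values below are placeholders.
services:
  flowise:
    image: flowiseai/flowise:2.0.2
    ports:
      - "3000:3000"
    environment:
      - AZURE_OPENAI_BASE_PATH=https://<company-gateway>.azure-api.net/openai/deployments
      - AZURE_OPENAI_API_DEPLOYMENT_NAME=<chat-deployment>
      - OPENAI_API_KEY=<api-key>
      - AZURE_OPENAI_API_EMBEDDINGS_DEPLOYMENT_NAME=<embeddings-deployment>
      - AZURE_OPENAI_API_INSTANCE_NAME=<instance-name>
      - AZURE_OPENAI_API_VERSION=2024-02-01
```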

Then the AzureAI chat node worked. I saw that release 2.0.5 added the ability to set the AzureAI API base path in the AzureAI chat model node. However, this option doesn't exist for the AzureAI embeddings node. I happen to be using a chat model and an embedding model that are not deployed in the same subnet, so I need to define a different base path for each.

At the moment, I have set the base path for the chat model in "base path" in Flowise 2.0.5. I tried to set the one for the embedding model via the environment variable, but it doesn't work. Could you also add this "base path" feature to the AzureAI embeddings node?

bodzebod commented 1 day ago

Base path was added to the AzureAI embeddings node in 2.0.6.