langflow-ai / langflow

Langflow is a low-code app builder for RAG and multi-agent AI applications. It’s Python-based and agnostic to any model, API, or database.
http://www.langflow.org
MIT License

"Server is busy" caused by slow build of flow. #3878

Open 9n opened 2 hours ago

9n commented 2 hours ago

[Screenshot 2024-09-22 12-17-53]

A slow flow build blocks the entire server. I experienced this issue in 1.0.17 and 1.0.18.

I noticed a similar issue #3812 was fixed in 1.0.18, but I can't find which commit fixed it (maybe the event manager?), and I think this is a different problem.

Digging into the code, I found that the code below tries to run the build in a dedicated thread to avoid blocking the entire ASGI server:

https://github.com/langflow-ai/langflow/blob/1caba1cd7c300e3111f7393a844c4eade82655a0/src/backend/base/langflow/api/v1/chat.py#L327

I think it should be build_task = asyncio.create_task(asyncio.to_thread(asyncio.run, _build_vertex(vertex_id, graph, event_manager))), because _build_vertex is async.

I'm new to python, am I missing something?
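To make the distinction concrete, here is a minimal, self-contained sketch (the real _build_vertex also takes graph and event_manager arguments; the one-argument version below is a stand-in). It shows that passing an async function directly to asyncio.to_thread only creates a coroutine object in the worker thread, while handing a coroutine to asyncio.run inside the worker thread, as proposed above, actually executes it there:

```python
import asyncio

async def _build_vertex(vertex_id):
    """Stand-in for Langflow's async build step (simplified signature)."""
    await asyncio.sleep(0.01)
    return f"built {vertex_id}"

async def main():
    # 1. Passing an async function to asyncio.to_thread only *creates*
    #    the coroutine in the worker thread -- nothing runs yet.
    coro = await asyncio.to_thread(_build_vertex, "v1")
    assert asyncio.iscoroutine(coro)
    first = await coro  # it still has to be awaited to execute

    # 2. The pattern proposed in the issue: hand the coroutine to
    #    asyncio.run inside a worker thread. The worker thread has no
    #    running event loop, so asyncio.run is legal there.
    second = await asyncio.to_thread(asyncio.run, _build_vertex("v2"))
    return first, second

print(asyncio.run(main()))  # ('built v1', 'built v2')
```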

dosubot[bot] commented 2 hours ago

Hey @9n! 👋 I'm here to help you with any bugs, questions, or contributions. Let's tackle this together!

To resolve the "Server is busy" issue caused by a slow flow build in versions 1.0.17 and 1.0.18, you can modify the asynchronous task creation in the Langflow repository. Your proposed modification is incorrect because asyncio.run is meant to run the top-level entry point of an asyncio program and cannot be used within an already running event loop.

The original implementation is appropriate for non-blocking behavior in an ASGI context:

build_task = asyncio.create_task(await asyncio.to_thread(_build_vertex, vertex_id, graph, event_manager))

This code correctly runs the _build_vertex function in a separate thread to prevent blocking the ASGI server.
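Note that asyncio.to_thread prevents blocking when the offloaded callable is a synchronous, blocking function. A minimal sketch of that pattern, using a hypothetical blocking_build in place of Langflow's build step:

```python
import asyncio
import time

def blocking_build(vertex_id):
    """A synchronous, slow build step (hypothetical stand-in)."""
    time.sleep(0.05)  # simulate slow work
    return f"built {vertex_id}"

async def main():
    # Offload the blocking call to a worker thread so the event loop
    # can keep serving other requests while it runs.
    build_task = asyncio.create_task(asyncio.to_thread(blocking_build, "v1"))
    # The loop stays free here: other coroutines can make progress.
    heartbeat = await asyncio.sleep(0.01, result="still responsive")
    return heartbeat, await build_task

print(asyncio.run(main()))  # ('still responsive', 'built v1')
```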

Additionally, starting from version 0.5, Langflow introduces the sync flag, which allows users to opt for asynchronous processing of their flows. This can help free up resources and provide better control over long-running tasks. You can include the sync flag in the payload of your POST request to the /api/v1/process/<your_flow_id> endpoint. When set to false, the API will initiate an asynchronous task instead of processing the flow synchronously.

Example API request with the sync flag:

curl -X POST \
  http://localhost:3000/api/v1/process/<your_flow_id> \
  -H 'Content-Type: application/json' \
  -H 'x-api-key: <your_api_key>' \
  -d '{"inputs": {"text": ""}, "tweaks": {}, "sync": false}'

You can check the status of an asynchronous task by making a GET request to the /task/{task_id} endpoint:

curl -X GET \
  http://localhost:3000/api/v1/task/<task_id> \
  -H 'x-api-key: <your_api_key>'

This modification ensures that if the task fails to run in Celery, it will fall back to running the task locally, which can help mitigate the "Server is busy" issue [1][2].

To continue talking to Dosu, mention @dosu.