FlowiseAI / Flowise

Drag & drop UI to build your customized LLM flow
https://flowiseai.com
Apache License 2.0
31.99k stars · 16.69k forks

[BUG] Flowise API is really slow. Between 16 and 28 seconds #2256

Open isaactmb opened 7 months ago

isaactmb commented 7 months ago

I have a very simple flow: a RAG system using Vectara, OpenAI, and memory with Zep and with Upstash (I have tried both).

Flowise is hosted by Digital Ocean on the Basic plan.

The response times in Postman are a disaster: between 16 and 22 seconds.

Is this normal?

How can I improve it?


Expected behavior: a response in 5 to 9 seconds.
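To tell first-call overhead apart from steady-state latency, it can help to time several consecutive calls to the same chatflow. Below is a minimal sketch using only the standard library; the endpoint path follows Flowise's documented `/api/v1/prediction/<chatflow-id>` route, while the host URL, chatflow ID, and question are placeholders to replace with your own deployment details.

```python
import json
import time
import urllib.request


def time_call(fn):
    """Run fn() and return (result, elapsed_seconds)."""
    start = time.perf_counter()
    result = fn()
    return result, time.perf_counter() - start


def ask(base_url, chatflow_id, question, api_key=None):
    """POST a question to the Flowise prediction endpoint and return the JSON reply."""
    req = urllib.request.Request(
        f"{base_url}/api/v1/prediction/{chatflow_id}",
        data=json.dumps({"question": question}).encode(),
        headers={"Content-Type": "application/json"},
    )
    if api_key:
        req.add_header("Authorization", f"Bearer {api_key}")
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


# Example usage (placeholder host and chatflow ID):
#   for i in range(3):
#       _, elapsed = time_call(
#           lambda: ask("https://your-flowise-host", "your-chatflow-id", "What is Vectara?")
#       )
#       print(f"call {i + 1}: {elapsed:.1f}s")
```

If the first call is slow but the following ones land near the expected 5 to 9 seconds, the overhead is likely in flow initialization rather than the LLM itself.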

HenryHengZJ commented 7 months ago

Is it always that slow? I suspect the delay could be from the LLM. Also, feel free to attach the chatflow you were using.

dennisbenner commented 6 months ago

Seconding this: it's very fast in both the playground and directly via Flowise, but incredibly slow via the API. I'm using the Docker image on a self-hosted server.

HenryHengZJ commented 6 months ago

I think I probably have a good idea why: if you have a document loader on the canvas, it will be loaded when you ask the first question. The next question should be relatively fast, since we don't re-execute the whole flow for every question.
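If that is the cause, one workaround is to send a throwaway "warm-up" question right after the deployment starts, so the document loader initializes before real traffic arrives. A sketch under the same assumptions as above (documented `/api/v1/prediction/<chatflow-id>` route; host, chatflow ID, and key are placeholders):

```python
import json
import urllib.request


def build_prediction_request(base_url, chatflow_id, question, api_key=None):
    """Build (without sending) a POST request for Flowise's prediction endpoint."""
    headers = {"Content-Type": "application/json"}
    if api_key:
        headers["Authorization"] = f"Bearer {api_key}"
    return urllib.request.Request(
        f"{base_url}/api/v1/prediction/{chatflow_id}",
        data=json.dumps({"question": question}).encode(),
        headers=headers,
    )


def warm_up(base_url, chatflow_id, api_key=None):
    """Fire one throwaway question so the first real user doesn't pay the
    document-loader initialization cost."""
    req = build_prediction_request(base_url, chatflow_id, "warm-up", api_key)
    with urllib.request.urlopen(req) as resp:
        resp.read()  # discard the answer; we only care about the side effect


# Example usage (placeholders):
#   warm_up("https://your-flowise-host", "your-chatflow-id", api_key="your-key")
```

Calling `warm_up` once from a deploy hook or startup script means the expensive first execution happens before any user-facing request.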