isaactmb opened 7 months ago
Is it always that slow? I suspect it could be delay from the LLM. Also, feel free to attach the chatflow you were using.
Seconding this: it's very fast both in the playground and directly via Flowise, but via the API it's incredibly slow. Using the Docker image on a self-hosted server.
I think I have a good idea why: if you have a document loader on the canvas, it will run the document loader when you ask the first question. The next question should be relatively fast, since we don't re-execute the whole flow for every question.
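One way to check this from outside is to time two consecutive calls to the prediction endpoint: if the first question is paying one-time setup cost, the second call should come back much faster. A minimal sketch using only the standard library, assuming the usual `POST /api/v1/prediction/{chatflowId}` endpoint; the host, chatflow ID, and questions are placeholders:

```python
import json
import time
import urllib.request


def build_request(base_url: str, chatflow_id: str, question: str) -> urllib.request.Request:
    """Build a POST request to the Flowise prediction endpoint."""
    return urllib.request.Request(
        f"{base_url}/api/v1/prediction/{chatflow_id}",
        data=json.dumps({"question": question}).encode(),
        headers={"Content-Type": "application/json"},
    )


def timed_ask(base_url: str, chatflow_id: str, question: str) -> tuple[str, float]:
    """Send one question and return (answer text, elapsed seconds)."""
    start = time.perf_counter()
    with urllib.request.urlopen(build_request(base_url, chatflow_id, question)) as resp:
        answer = json.loads(resp.read()).get("text", "")
    return answer, time.perf_counter() - start


# Usage (replace placeholders with your Flowise host and chatflow ID):
#   for q in ("warm-up question", "follow-up question"):
#       _, secs = timed_ask("http://localhost:3000", "<chatflow-id>", q)
#       print(f"{q!r}: {secs:.1f}s")
```

If the second call is still in the 16–22 s range, the bottleneck is more likely the LLM or vector store round-trips than flow initialization.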
I have a very simple flow: a RAG system with Vectara and OpenAI, and memory with Zep or Upstash (I have tried both).
Flowise is hosted on DigitalOcean's Basic plan.
The response times in Postman are a disaster: between 16 and 22 seconds.
Is this normal?
How can I improve it?
Expected behavior: responses in 5 to 9 seconds.