FlowiseAI / Flowise

Drag & drop UI to build your customized LLM flow
https://flowiseai.com
Apache License 2.0

[Bug] Prediction API behaves differently from internal-prediction API #2828

Closed · sergiozaninotti closed 3 months ago

sergiozaninotti commented 3 months ago

Hello! First of all, thank you for the wonderful work. Flowise is incredible!

Now to the problem: when I use the /api/v1/prediction/{chatId} endpoint, the bot cannot retrieve the conversation history and keeps repeating its first introductory question. The internal endpoint, /api/v1/internal-prediction/{chatId}, works perfectly: it follows the entire prompt context and the flow of the Retrieval QA chain. The external prediction API fails to retrieve the history needed to continue that flow. I'm using Redis as memory; see the images below.

To Reproduce
Steps to reproduce the behavior:


Prediction API video: https://www.awesomescreenshot.com/video/29677894?key=37e090ba9f30b280a5487cda5a38c6bc
Internal-prediction API video: https://www.awesomescreenshot.com/video/29678070?key=60a6c14095cfa93eae61ab33e8c7c949

The videos show that the same questions work perfectly through the internal-prediction API but not through the external prediction API.

Prediction API has no history: [screenshot]

Internal-prediction API has history: [screenshot]

sergiozaninotti commented 3 months ago

Solved.

The problem was that the chatId returned after the first interaction was not being included in the subsequent requests, so the Redis memory could not resolve the session.
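
For anyone who hits the same issue, here is a minimal sketch of the fix in Python with `requests` (the base URL and chatflow ID are placeholders, and the exact response fields may vary by Flowise version): the first response carries a `chatId`, and that same `chatId` has to be sent back in every follow-up request so the memory node can look up the session.

```python
import requests

BASE_URL = "http://localhost:3000"       # assumed local Flowise instance
CHATFLOW_ID = "<your-chatflow-id>"       # placeholder chatflow id
API_URL = f"{BASE_URL}/api/v1/prediction/{CHATFLOW_ID}"


def ask(question, chat_id=None):
    """Send a question; include chatId so Flowise can load the session history."""
    payload = {"question": question}
    if chat_id:
        payload["chatId"] = chat_id
    resp = requests.post(API_URL, json=payload)
    resp.raise_for_status()
    return resp.json()


# First turn: no chatId yet; Flowise creates a session and returns its id.
first = ask("Hello, I want to open a support ticket")
chat_id = first.get("chatId")

# Follow-up turns: reuse the returned chatId so the Redis-backed memory
# resolves to the same session and the history is retrieved.
second = ask("What information do you need from me?", chat_id=chat_id)
print(second.get("text"))
```

Without the second step, every external call starts a fresh session, which is why the bot kept repeating its opening question while the internal API (which manages the chatId itself through the UI) worked fine.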