FlowiseAI / Flowise

Drag & drop UI to build your customized LLM flow
https://flowiseai.com
Apache License 2.0

[BUG] Conversational Retrieval QA Chain #3383

Open doggi87 opened 3 days ago

doggi87 commented 3 days ago

Describe the bug
I have a fully working RAG setup with a Conversational Retrieval QA Chain using a Document Store (Vector). I have also tried loading the PDF directly in the flow with splitters. At first, the Conversational Retrieval QA Chain works fine with the additional parameters: when I ask about something outside the loaded PDF, I get a "sorry, I can't find that" answer. But after some hours, or a day later, when I test the exact same thing, the Conversational Retrieval QA Chain no longer seems to respect the additional parameters, and the model answers directly from knowledge outside the loaded document. It seems the additional parameters stop working as they should. FAISS is used, and the faiss.index file is in its place, so that part works.

To Reproduce
Steps to reproduce the behavior:

  1. Create the document store; it works when tested.
  2. Go to Chatflows.
  3. Add the Conversational Retrieval QA Chain, Document Store (Vector), Ollama LLM, and Buffer Memory. The chat works normally and everything from the document is retrieved.
  4. There is no error, but some time later the additional parameters in the Conversational Retrieval QA Chain have no effect at all.
HenryHengZJ commented 21 hours ago

What do you mean by additional parameters? And does it only happen with FAISS? Because the FAISS index file gets replaced entirely every time you do a new upsert; it does not append, it replaces.
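
The replace-not-append behavior described above can be sketched as follows. This is a minimal, hypothetical illustration of the semantics only: `save_index` and the plain-text file format are stand-ins, not Flowise or FAISS internals (FAISS itself serializes the whole in-memory index via `faiss.write_index`, with the same overwrite effect).

```python
import os
import tempfile

def save_index(path, vectors):
    # Stand-in for faiss.write_index: writes the WHOLE index to `path`,
    # overwriting whatever file was there before. Nothing is merged.
    with open(path, "w") as f:
        for v in vectors:
            f.write(",".join(str(x) for x in v) + "\n")

def load_index(path):
    # Stand-in for faiss.read_index: reads back every stored vector.
    with open(path) as f:
        return [[float(x) for x in line.split(",")] for line in f if line.strip()]

path = os.path.join(tempfile.mkdtemp(), "faiss.index")

# First upsert: three document vectors land in the index file.
save_index(path, [[0.1, 0.2], [0.3, 0.4], [0.5, 0.6]])
print(len(load_index(path)))  # 3

# Second upsert with only one vector: the file is REPLACED, not appended to,
# so the three earlier vectors are gone unless they were re-upserted as well.
save_index(path, [[0.7, 0.8]])
print(len(load_index(path)))  # 1
```

So if a later upsert writes a smaller or different document set, earlier content silently disappears from the index, which would match the "knew the document at first, then stopped" symptom.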