FlowiseAI / Flowise

Drag & drop UI to build your customized LLM flow
https://flowiseai.com
Apache License 2.0
31.72k stars · 16.55k forks

[BUG] LLM Chat models do not work outside Flowise after 2nd prompt, unless it's OpenAI Chat Model #3422

Open TheVoidRoger opened 3 weeks ago

TheVoidRoger commented 3 weeks ago

Describe the bug LLM Chat models do not work outside Flowise after the 2nd prompt, unless the Chat Model is OpenAI. Tried other Chat Models such as Anthropic and Cohere; they all worked inside Flowise, though.

To Reproduce Create a simple RAG app with a Conversational Retrieval QA Chain, Pinecone, OpenAI Embeddings, Upstash Redis-backed Chat Memory, and a Chat Model [Anthropic, OpenAI, Cohere].

Run it inside Flowise: it works fine. Embed it outside Flowise: you can chat with it only once; it won't respond the 2nd time you prompt it.

Expected behaviour It should respond to every prompt outside Flowise, just as it does inside Flowise.

HenryHengZJ commented 2 weeks ago

Not able to replicate this. Are you using the latest embed version? And which version of Flowise are you currently using? This might help - https://docs.flowiseai.com/using-flowise/embed#using-specific-version
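For reference, the linked docs describe pinning the embed widget to a specific `flowise-embed` release by adding `@<version>` to the CDN import path. A minimal sketch of an embed that pins the version (the chatflow id, host, and version number below are placeholders, not values from this issue):

```html
<script type="module">
  // Pin a specific flowise-embed release via the jsDelivr CDN.
  // Replace 1.0.0 with the embed version matching your Flowise server.
  import Chatbot from "https://cdn.jsdelivr.net/npm/flowise-embed@1.0.0/dist/web.js"

  Chatbot.init({
    chatflowid: "your-chatflow-id",      // placeholder: copy from the Flowise embed dialog
    apiHost: "http://localhost:3000",    // placeholder: your Flowise server URL
  })
</script>
```

If an older cached embed script is the cause, pinning (or hard-refreshing to pull the latest `dist/web.js`) can rule out a version mismatch between the embed widget and the Flowise server.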