Describe the bug
LLM chat models stop responding outside Flowise after the second prompt, unless the chat model is the OpenAI Chat Model. I tried other chat models such as Anthropic and Cohere; they all worked fine inside Flowise.
To Reproduce
Create a simple RAG app with a Conversation Retrieval QA Chain, Pinecone, OpenAI Embeddings, Upstash Redis-backed Chat Memory, and a chat model (Anthropic, OpenAI, or Cohere).
Run it inside Flowise. It works fine.
Embed it outside Flowise: you can chat with it once, but it won't respond the second time you prompt it.
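For reference, the second prompt is sent the same way as the first. A minimal sketch of the two request payloads, assuming Flowise's standard `/api/v1/prediction` endpoint, a placeholder chatflow ID, and an `overrideConfig.sessionId` override to tie both prompts to the same Redis-backed memory session:

```python
import json

# Placeholder host and chatflow ID -- substitute your own.
FLOWISE_URL = "http://localhost:3000/api/v1/prediction/<chatflow-id>"

def build_payload(question: str, session_id: str) -> dict:
    # Reusing the same sessionId keeps both prompts in the same
    # Upstash Redis-backed chat memory entry.
    return {
        "question": question,
        "overrideConfig": {"sessionId": session_id},
    }

first = build_payload("What does the document say about pricing?", "test-session-1")
second = build_payload("Can you elaborate on that?", "test-session-1")

# POST each payload in turn, e.g. with requests:
#   requests.post(FLOWISE_URL, json=first).json()   # first prompt responds
#   requests.post(FLOWISE_URL, json=second).json()  # second prompt gets no response
#                                                   # with Anthropic or Cohere
print(json.dumps(second, indent=2))
```

Both payloads are identical in shape; only the question text differs, so the failure on the second prompt points at the memory/chat-model interaction rather than the request itself.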
Expected behaviour
It should keep responding on the second and subsequent prompts outside Flowise, just as it does inside Flowise and with the OpenAI Chat Model.