Open maciejwie opened 4 months ago
I found that the error is due to this line: https://github.com/Chainlit/chainlit/blob/efc2c78359e99bdacc5082468f93f1170085bf3e/backend/chainlit/langchain/callbacks.py#L595
I fixed it this way for my local testing, but I haven't had time to understand why the new change was added or whether the change below is fully safe.
```python
current_step.output = output[0] if isinstance(output, Sequence) else output
```
@qtangs can you please share the entire fix? I was facing a similar issue.
The code I shared above is all you need; just replace the existing line with it. Let me know how it goes.
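For anyone evaluating the workaround, here is a minimal standalone sketch of how that guard behaves in isolation (the `unwrap_output` name is mine, not Chainlit's). One caveat worth noting when judging whether the patch is "fully safe": `str` is itself a `Sequence`, so a bare string hitting this branch would be truncated to its first character.

```python
from collections.abc import Sequence

def unwrap_output(output):
    # Mimics the patched line: take the first element when the callback
    # receives a sequence of outputs instead of a single value.
    return output[0] if isinstance(output, Sequence) else output

print(unwrap_output(["final answer"]))  # -> final answer
print(unwrap_output("final answer"))   # -> f  (str is also a Sequence)
print(unwrap_output(42))               # -> 42 (non-sequence passes through)
```

If plain strings can reach this code path, a stricter guard such as `isinstance(output, Sequence) and not isinstance(output, str)` may be the safer variant.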
**Describe the bug**
A regression was introduced in 1.1.400 (works ok in 1.1.306) which fails to process all Langchain chains correctly and throws errors such as:
After adding some instrumentation in the callback, my project shows:
**To Reproduce**
Steps to reproduce the behavior:
```python
from operator import itemgetter
from typing import Optional

import chainlit as cl
from langchain_core.prompts import (
    ChatPromptTemplate,
    HumanMessagePromptTemplate,
    MessagesPlaceholder,
    SystemMessagePromptTemplate,
)
from langchain_core.runnables import Runnable, RunnableLambda, RunnablePassthrough
from langchain.memory import ConversationBufferWindowMemory
from langchain.schema.runnable.config import RunnableConfig
from langchain.schema import StrOutputParser
from langchain_openai import ChatOpenAI


@cl.on_chat_start
async def start():
    llm = ChatOpenAI()
    # Initialize empty memory
    memory = ConversationBufferWindowMemory(
        memory_key="chat_history",
        return_messages=True,
        k=5,
    )


@cl.on_message
async def run(message: cl.Message):
    # Retrieve session data
```