Chainlit / chainlit

Build Conversational AI in minutes ⚡️
https://docs.chainlit.io
Apache License 2.0

LangchainTracer throwing ValueErrors #1175

Open maciejwie opened 4 months ago

maciejwie commented 4 months ago

**Describe the bug**
A regression was introduced in 1.1.400 (it works correctly in 1.1.306) which fails to process Langchain chains correctly and throws errors such as:

```
Error in LangchainTracer.on_chain_end callback: ValueError('not enough values to unpack (expected 2, got 0)')
Error in LangchainTracer.on_chain_end callback: TypeError('cannot unpack non-iterable NoneType object')
Error in LangchainTracer.on_retriever_end callback: ValueError('too many values to unpack (expected 2)')
Error in LangchainTracer.on_chain_end callback: ValueError('too many values to unpack (expected 2)')
```

After adding some instrumentation in the callback, my project shows:

```
current_step <chainlit.step.Step object at 0x16a3cddf0>
run.outputs {'output': set()}
outputs {'output': set()}
output_keys ['output']
output set()
current_step <chainlit.step.Step object at 0x16a3cddf0>
2024-07-29 15:46:47 - Error in LangchainTracer.on_chain_end callback: ValueError('not enough values to unpack (expected 2, got 0)')
current_step <chainlit.step.Step object at 0x16a410610>
run.outputs {'output': None}
outputs {'output': None}
output_keys ['output']
output None
current_step <chainlit.step.Step object at 0x16a410610>
2024-07-29 15:46:47 - Error in LangchainTracer.on_chain_end callback: TypeError('cannot unpack non-iterable NoneType object')
current_step <chainlit.step.Step object at 0x16a3cd880>
run.outputs {'documents': [Document(real Document data)]}
outputs {'documents': [Document(real Document data)]}
output_keys ['documents']
output [Document(real Document data)]
current_step <chainlit.step.Step object at 0x16a3cd880>
2024-07-29 15:46:49 - Error in LangchainTracer.on_retriever_end callback: ValueError('too many values to unpack (expected 2)')
current_step <chainlit.step.Step object at 0x16a3ffee0>
run.outputs {'chat_history': []}
outputs {'chat_history': []}
output_keys ['chat_history']
output []
current_step <chainlit.step.Step object at 0x16a3ffee0>
2024-07-29 15:46:49 - Error in LangchainTracer.on_chain_end callback: ValueError('not enough values to unpack (expected 2, got 0)')
```
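For what it's worth, all three exception messages above are exactly what plain two-value tuple unpacking produces on the logged output shapes (`set()`, `None`, and a multi-item value). This is a minimal standalone sketch of the failure mode, not Chainlit's actual callback code; `unpack` is a hypothetical stand-in:

```python
# Hypothetical reproduction of the unpack failures seen in the logs.
# Something along the lines of `k, v = output` breaks depending on
# the shape of the value the chain returned.

def unpack(output):
    k, v = output  # the kind of unpacking that raises below
    return k, v

for value in [
    set(),                            # 0 items  -> ValueError: not enough values to unpack (expected 2, got 0)
    None,                             # None     -> TypeError: cannot unpack non-iterable NoneType object
    [("a", 1), ("b", 2), ("c", 3)],   # 3 items  -> ValueError: too many values to unpack (expected 2)
]:
    try:
        unpack(value)
    except (ValueError, TypeError) as e:
        print(type(e).__name__, e)
```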

**To Reproduce**
Steps to reproduce the behavior:

  1. Run this code:

```python
from __future__ import annotations

from operator import itemgetter
from typing import Optional

import chainlit as cl
from langchain_core.prompts import (
    ChatPromptTemplate,
    HumanMessagePromptTemplate,
    MessagesPlaceholder,
    SystemMessagePromptTemplate,
)
from langchain_core.runnables import Runnable, RunnableLambda, RunnablePassthrough
from langchain.memory import ConversationBufferWindowMemory
from langchain.schema.runnable.config import RunnableConfig
from langchain.schema import StrOutputParser
from langchain_openai import ChatOpenAI


@cl.on_chat_start
async def start():
    llm = ChatOpenAI()
    memory = ConversationBufferWindowMemory(
        memory_key="chat_history",
        return_messages=True,
        k=5,
    )

    # Initialize empty memory
    memory.load_memory_variables({})

    template = "You are a helpful chatbot."

    prompt = ChatPromptTemplate.from_messages(
        [
            SystemMessagePromptTemplate.from_template(template),
            MessagesPlaceholder(variable_name="chat_history"),
            HumanMessagePromptTemplate.from_template("{question}"),
        ]
    )

    runnable = RunnablePassthrough.assign(
        chat_history=RunnableLambda(memory.load_memory_variables) | itemgetter("chat_history")
    ).assign(output=prompt | llm | StrOutputParser())

    # Store session data
    cl.user_session.set("runnable", runnable)
    cl.user_session.set("memory", memory)


@cl.on_message
async def run(message: cl.Message):
    # Retrieve session data
    runnable: Optional[Runnable] = cl.user_session.get("runnable")

    # Create message object for the response
    msg = cl.Message(content="")

    # Generate the response and stream the output
    async for chunk in runnable.astream(
        {"question": message.content},
        config=RunnableConfig(callbacks=[cl.LangchainCallbackHandler()]),
    ):
        # Update the message with the response
        if chunk.get("output"):
            await msg.stream_token(chunk.get("output"))

    # Finalize message and update UI
    await msg.send()
```

2. Send any message
  3. Look in the logs for the string `Error in LangchainTracer.on_chain_end callback: ValueError('not enough values to unpack (expected 2, got 0)')`

**Expected behavior**
LangchainTracer should process the chain outputs without throwing ValueErrors.

**Desktop (please complete the following information):**

- OS: macOS
- Langchain version: tried both 0.2.7 and 0.2.11 (latest)

qtangs commented 4 months ago

I found that the error is caused by this line: https://github.com/Chainlit/chainlit/blob/efc2c78359e99bdacc5082468f93f1170085bf3e/backend/chainlit/langchain/callbacks.py#L595

I fixed it this way for my local testing, but I haven't had time to understand why the new change was added or whether the change below is fully safe:

```python
current_step.output = output[0] if isinstance(output, Sequence) else output
```
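For context, here is a standalone sketch of what that guard does (`coerce_output` is a hypothetical stand-in name; `Sequence` would need to come from `collections.abc` if it isn't already imported in callbacks.py):

```python
from collections.abc import Sequence


def coerce_output(output):
    # Mirrors the proposed one-liner: take the first element of a
    # sequence, otherwise pass the value through unchanged.
    return output[0] if isinstance(output, Sequence) else output


print(coerce_output([("answer", 42)]))  # a list is a Sequence -> first element: ('answer', 42)
print(coerce_output(None))              # None is not a Sequence -> passes through
print(coerce_output({"a"}))             # a set is not a Sequence -> passes through
# Caveat: str is also registered as a Sequence, so a plain string
# output would be truncated to its first character by this guard.
print(coerce_output("chainlit"))        # prints: c
```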
adhil-tinvio commented 4 months ago

@qtangs can you please share the entire fix? I was facing a similar issue.

qtangs commented 4 months ago

The code I shared above is all you need; just replace the line with it. Let me know how it goes.