Closed: devinbost closed this issue 7 months ago
Do you have more logs? It seems to be an error in the Python runtime.
After upgrading to LangStream 0.4.3, I get slightly different behavior: now it just seems to do nothing when I send a message through the UI. Here are the logs:
The code to reproduce it is here: https://github.com/devinbost/langstream/tree/azure-chatbot/examples/applications/azure-document-ingestion (The readme is outdated.)
"Missing some input keys" is a LangChain error.
The logs could be better at explaining where the error comes from. We lose some error context during the translation from Python to Java (e.g., the Python stack trace).
Do you have unit tests? They would help you debug locally.
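A minimal pytest-style sketch of what such a test could look like (all class and function names here are hypothetical, not taken from the repo): drive the processor's chain call locally with a stub, so failures surface as ordinary Python stack traces instead of gRPC errors from the LangStream runtime.

```python
class FakeChain:
    """Stub standing in for the LangChain chain; records what it was called with."""
    def __init__(self):
        self.last_input = None

    def run(self, question):
        self.last_input = question
        return "stub answer"


class QAProcessor:
    """Hypothetical stand-in for the real Python processor's answer step."""
    def __init__(self, chain):
        self.answer_chain = chain

    def answer(self, question):
        return self.answer_chain.run(question)


def test_processor_passes_question_to_chain():
    chain = FakeChain()
    result = QAProcessor(chain).answer("What is LangStream?")
    assert chain.last_input == "What is LangStream?"
    assert result == "stub answer"
```

Running this under pytest exercises the chain-invocation logic without deploying the app.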
I think this comes from the prompt template. Since you didn't specify one, it's probably picking a default one, but I don't know which.
You don't put a prompt template in the chain, so the first step is the retriever, which takes a query parameter as input.
See https://python.langchain.com/docs/use_cases/question_answering/vector_db_qa
The param is indeed named query for the retriever.
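To make the key mismatch concrete, here is a dependency-free sketch (MiniRetrievalQA is illustrative, not LangChain code) of the validation a RetrievalQA-style chain performs: its declared input key is query, so a payload keyed question fails with exactly this error.

```python
class MiniRetrievalQA:
    """Illustrative only: mimics how a chain checks inputs against its input keys."""
    input_keys = ["query"]  # RetrievalQA-style chains expect "query"

    def invoke(self, inputs):
        missing = set(self.input_keys) - set(inputs)
        if missing:
            # Same shape as LangChain's "Missing some input keys" ValueError
            raise ValueError(f"Missing some input keys: {missing}")
        return f"retrieved docs for: {inputs['query']}"


chain = MiniRetrievalQA()
chain.invoke({"query": "What is LangStream?"})          # accepted
try:
    chain.invoke({"question": "What is LangStream?"})   # wrong key
except ValueError as e:
    print(e)  # Missing some input keys: {'query'}
```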
@cbornet You're absolutely right; that was a LangChain issue. I remember now that I ran into it once before when I tried adding conversational memory to a Q&A chain. It didn't cross my mind because the stack trace threw me off. I applied a change that fixed the issue in a previous project. However, due to the issue below, I can't determine whether the change fixed it or not.
After I upgraded to LangStream 0.4.3, I'm not getting any logs when I send a message through the UI. What should I check next?
Here's a thread dump:
I found the issue. This was a clue:
17:39:16.394 [ws-consume-1] WARN o.a.k.c.c.i.ConsumerCoordinator -- [Consumer clientId=consumer--1, groupId=] Offset commit failed on partition answers-topic-0 at offset 0: This is not the correct coordinator.
I checked the input and output topics in pipeline.yaml, and somehow I had inadvertently deleted the input line from the pipeline config. After setting it back to "questions-topic", it started working as desired.
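For reference, the relevant part of the pipeline config looks roughly like this (the topic names are from this thread; the surrounding structure is a sketch, so check the LangStream docs for the exact schema):

```yaml
topics:
  - name: "questions-topic"
    creation-mode: create-if-not-exists
  - name: "answers-topic"
    creation-mode: create-if-not-exists
pipeline:
  - name: "answer-question"
    type: "python-processor"
    input: "questions-topic"   # this is the line that had been deleted
    output: "answers-topic"
```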
For anyone else reading this, the fix was to replace:
response = self.answer_chain.invoke({
    "question": question,
    "chat_history": []
})
with:
response = self.answer_chain.run(question)
To implement conversational history, a conversational chain such as ConversationalRetrievalChain should be used instead.
When I try to test my Python processor by entering a question in the LangStream UI, I get this exception.
I looked at how the message is being sent by the UI, and index.html sends the message like this:
handler.producer.send(JSON.stringify({value: message}))
It's not clear to me what the schema of the message object is, but it doesn't appear to have a query property. In my pipeline for the chatbot, I tried doing something like this:
but it's not even getting that far before giving the same gRPC exception.
If I remove that snippet, here is the pipeline:
Here is the Python processor:
Here is the gateway: