Closed: jack-vinitsky closed this issue 1 month ago
Faced the same issue; the responses are also not present in the associated Cosmos DB.
As far as I can see in the backend, there isn't any logic to save the responses to the database. You could create your own endpoint that persists the response in the conversations database and call it from the React code (specifically, the Chat.tsx file).
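Something along these lines could work as a stopgap. This is only a minimal sketch: the route name /history/update_response and the ChatMessage shape are illustrative assumptions, not the project's actual API. It would be called from Chat.tsx once the assistant's answer has finished streaming.

```typescript
// Minimal sketch: the route name and message shape below are illustrative
// assumptions, not the project's actual API.
interface ChatMessage {
  id: string
  role: "user" | "assistant"
  content: string
  date: string
}

// Call from Chat.tsx after the assistant's answer has finished streaming,
// so the completed response gets written to the conversations database.
async function saveAssistantResponse(
  conversationId: string,
  message: ChatMessage
): Promise<boolean> {
  const response = await fetch("/history/update_response", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ conversation_id: conversationId, messages: [message] })
  })
  if (!response.ok) {
    console.error("Failed to persist assistant response:", response.status)
  }
  return response.ok
}
```

The backend side of that endpoint would simply upsert the received message into the conversations container, mirroring whatever is already done for the user prompts.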
Yeah, I also investigated the code and couldn't find any saving of responses. It used to work; I didn't have time to find the commit where the problem was introduced. It's probably easier just to fix it myself.
Here is what I did:
We're seeing the same issue. Also deployed using #1073.
The issue is fixed for me using the latest available image.
I have confirmed as well. Latest update #1116 has resolved the issue.
Glad to see this issue resolved! We had the same problem with LLM responses not being captured in Cosmos DB, and this fix is crucial for teams relying on detailed logs for reporting.
For those exploring conversation analytics, capturing both user prompts and LLM responses helps track key metrics like response times or missed replies. This data is essential for optimizing AI interactions.
That’s why we built AInsights for Power BI, available on AppSource, to help teams analyze conversation data and improve system performance.
Thanks again for the quick fix—it’s a big help!
Describe the bug
Currently the chat history only records the end user's prompts and not the responses from the LLM. Checking the Cosmos DB database shows only the end user's prompts.
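To make the symptom concrete, the stored items look roughly like the sketch below (the field names are an assumed shape for illustration only, not the actual Cosmos DB schema): only user turns are present, with no matching assistant items.

```typescript
// Assumed item shape for illustration only, not the actual Cosmos DB schema.
// After a two-turn conversation, only the user prompts appear in the container:
const storedItems = [
  { type: "message", role: "user", content: "first question...", createdAt: "2024-01-01T10:00:00Z" },
  { type: "message", role: "user", content: "follow-up question...", createdAt: "2024-01-01T10:02:00Z" }
  // Expected but missing: corresponding { role: "assistant", content: "..." } items
]
```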
To Reproduce
Steps to reproduce the behavior:
Expected behavior
The history of the conversation includes the responses from the LLM.
Screenshots
Original Conversation: (screenshot)
Conversation Retrieved from Chat History: (screenshot)
Configuration: Please provide the following
Additional context
This is only happening on one of the 5 instances of this chatbot that we are running, although this instance has the most recent update we have deployed (as of pull request #1073).