microsoft / sample-app-aoai-chatGPT

Sample code for a simple web chat experience through Azure OpenAI, including Azure OpenAI On Your Data.
MIT License

Responses from LLM not being recorded in Chat History DB #1111

Closed jack-vinitsky closed 1 month ago

jack-vinitsky commented 1 month ago

Describe the bug
Currently the chat history records only the end user's prompts, not the responses from the LLM. Checking the Cosmos DB database shows only end-user prompts.

To Reproduce
Steps to reproduce the behavior:

  1. Create a new chat and note the responses.
  2. Start a new chat and then go to chat history.
  3. Click on a prior conversation.
  4. Note that only the end user's prompts are displayed.

Expected behavior
The history of the conversation includes responses from the LLM.

Screenshots
Original Conversation:

(screenshot attached)

Conversation Retrieved from Chat History:

(screenshot attached)


Additional context
This is only happening on one of the five instances of this chatbot that we are running, although this one is the most recent update we have deployed (as of pull request #1073).

dmytroHexagon commented 1 month ago

Faced the same issue; the responses are also not present in the associated Cosmos DB.

Ashutosh2547 commented 1 month ago

As far as I can see in the backend, there isn't any logic to save the responses in the database. You can create your own endpoint to save the response in the conversations database and call it from the React code (specifically, the Chat.tsx file).
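A minimal sketch of what that frontend call might look like. Note the endpoint path `/history/update`, the payload shape, and the helper names here are assumptions for illustration, not confirmed code from this repo:

```typescript
// Hypothetical sketch: endpoint path and payload shape are assumptions,
// not the repo's actual API contract.

type ChatMessage = {
  id: string;
  role: "user" | "assistant";
  content: string;
  date: string;
};

// Build the payload for persisting the full conversation turn,
// including the assistant's response, to the history store.
function buildHistoryUpdatePayload(
  conversationId: string,
  messages: ChatMessage[]
): { conversation_id: string; messages: ChatMessage[] } {
  return { conversation_id: conversationId, messages };
}

// After the assistant's response finishes streaming in Chat.tsx,
// POST the complete message list to the backend so both the user
// prompt and the LLM response land in Cosmos DB.
async function saveConversationTurn(
  conversationId: string,
  messages: ChatMessage[]
): Promise<boolean> {
  const response = await fetch("/history/update", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildHistoryUpdatePayload(conversationId, messages)),
  });
  return response.ok;
}
```

The key design point is that the save must happen after the streamed response completes, so the assistant message is fully assembled before it is written to the database.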

dmytroHexagon commented 1 month ago

Yep, I also investigated the code and haven't found any saving of responses. It used to work; I didn't have time to find the commit where the problem was introduced. It's probably easier to just fix it myself.

Ashutosh2547 commented 1 month ago

Here is what I did:

ukiguy commented 1 month ago

We're seeing the same issue. Also deployed using #1073.

dmytroHexagon commented 1 month ago

The issue is fixed for me using the latest available image.

jack-vinitsky commented 1 month ago

I have confirmed as well. The latest update, #1116, has resolved the issue.

cdrguru commented 1 month ago

Glad to see this issue resolved! We had the same problem with LLM responses not being captured in Cosmos DB, and this fix is crucial for teams relying on detailed logs for reporting.

For those exploring conversation analytics, capturing both user prompts and LLM responses helps track key metrics like response times or missed replies. This data is essential for optimizing AI interactions.

That’s why we built AInsights for Power BI, available on AppSource, to help teams analyze conversation data and improve system performance.

Thanks again for the quick fix—it’s a big help!