microsoft / semantic-kernel

Integrate cutting-edge LLM technology quickly and easily into your apps
https://aka.ms/semantic-kernel
MIT License

Long term memory not working in chatapp sample app #1033

Closed poweihuang0817 closed 1 year ago

poweihuang0817 commented 1 year ago

Describe the bug: Because of a token limit, the response from text completion is not valid JSON, so this call throws when parsing:

SemanticChatMemory memory = SemanticChatMemory.FromJson(result.ToString());

internal static async Task<SemanticChatMemory> ExtractCognitiveMemoryAsync(

{"items": [{"label": "workspaceName", "details": "SoP_High_Memory"}, {"label": "datasetName", "details": "SoP_High_Memory_0718"}, {"label": "columnName", "details": "AvgMemoryConsumption"}, {"label": "tableName", "details": "WabiAllHighRefresh0718"}, {"label...

TaoChenOSU commented 1 year ago

Hello!

Thanks for reporting the issue!

Could you provide more details as to what you mean by token issue? Did the prompt exceed the token limit?

Please note that the models will not always generate valid JSON blocks.

poweihuang0817 commented 1 year ago

warn: Microsoft.SemanticKernel.Connectors.Memory.Qdrant.QdrantMemoryStore[0] No vectors were found.
warn: Microsoft.SemanticKernel.Connectors.Memory.Qdrant.QdrantMemoryStore[0] No vectors were found.
warn: Microsoft.SemanticKernel.Connectors.Memory.Qdrant.QdrantMemoryStore[0] No vectors were found.
warn: Microsoft.SemanticKernel.Connectors.Memory.Qdrant.QdrantMemoryStore[0] No vectors were found.
warn: Microsoft.SemanticKernel.Connectors.Memory.Qdrant.QdrantMemoryStore[0] No vectors were found.
warn: Microsoft.SemanticKernel.Connectors.Memory.Qdrant.QdrantMemoryStore[0] Vectors not found
warn: Microsoft.SemanticKernel.Connectors.Memory.Qdrant.QdrantMemoryStore[0] Vectors not found
warn: Microsoft.SemanticKernel.Connectors.Memory.Qdrant.QdrantMemoryStore[0] No vectors were found.
warn: Microsoft.SemanticKernel.Connectors.Memory.Qdrant.QdrantMemoryStore[0] Vectors not found
warn: Microsoft.SemanticKernel.Connectors.Memory.Qdrant.QdrantMemoryStore[0] Vectors not found

I suspect there is some issue in the memory configuration.

TaoChenOSU commented 1 year ago

Hi Powe!

Could you provide more details on how you encountered the errors/warnings? Logs or screenshots will help greatly.

poweihuang0817 commented 1 year ago

(screenshot attached)

TaoChenOSU commented 1 year ago

Could you let me know what you did? For example:

  1. Did you import a document?
  2. Is this your first message? (I am asking this because the memory is empty when you start a new chat)

craigomatic commented 1 year ago

This may have been caused by a bug impacting the Qdrant connector, which was resolved in #1313.

@poweihuang0817, if you have a spare moment, please try the latest code from the repository (I don't think the fix has made it to NuGet yet) and let us know if the issue persists.

poweihuang0817 commented 1 year ago

Hello Craig and Tao, I mixed two issues up. The main problem I want to report is that the long-term memory output is not in JSON format: GPT does not respond in the format I requested, which then causes a JSON parsing error. I managed to reduce how often this happens by reducing the token count.

TaoChenOSU commented 1 year ago

Hello Powei, it's common for GPT not to respond with correctly formatted JSON. That's why we chose to skip instead of raising errors during memory extraction: https://github.com/microsoft/semantic-kernel/blob/main/samples/apps/copilot-chat-app/webapi/CopilotChat/Skills/ChatSkills/SemanticChatMemoryExtractor.cs#L57.
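For anyone landing here later, a rough sketch of that skip-on-failure shape. This is not the actual SemanticChatMemoryExtractor code; it assumes the SemanticChatMemory type from the Copilot Chat webapi and an ILogger from Microsoft.Extensions.Logging:

```csharp
using System;
using Microsoft.Extensions.Logging;

internal static class MemoryExtractionSketch
{
    // Returns null (and logs a warning) instead of throwing when the model's
    // output cannot be parsed, so a bad completion doesn't fail the whole chat turn.
    public static SemanticChatMemory? TryExtractMemory(string completion, ILogger logger)
    {
        try
        {
            return SemanticChatMemory.FromJson(completion);
        }
        catch (Exception ex)
        {
            logger.LogWarning("Skipping memory extraction; completion was not valid JSON: {Message}", ex.Message);
            return null;
        }
    }
}
```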

Token limits can be another cause: without enough spare tokens for the model to respond, it may be unable to complete the JSON block, which leads to a parsing error.
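To make the token-budget point concrete, a small sketch; the numbers and names here are made up, and the real limits depend on the model and the app's configuration:

```csharp
using System;

// Assumed/hypothetical numbers: leave part of the context window free for the
// completion so the model has room to finish the JSON block.
const int contextWindowTokens = 4096;  // model's total context window (assumption)
const int responseTokenBudget = 1024;  // tokens reserved for the JSON response

int promptTokenLimit = contextWindowTokens - responseTokenBudget;

// Trim chat history / memories so the prompt stays under promptTokenLimit, and
// pass responseTokenBudget as the request's max-token setting so the completion
// isn't cut off mid-JSON.
Console.WriteLine($"Prompt budget: {promptTokenLimit}, response budget: {responseTokenBudget}");
```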

Let us know if that answers your question.

TaoChenOSU commented 1 year ago

Closing now as it appears this issue has been resolved. Please reopen as needed.