At least with Ollama ("llama3.1" or "llama3.2"), the CustomerSupport sample does not work.
The summary works, but the issue is in the chat prompt. The prompt contains `string.Join("\n", searchResults)`, where `searchResults` is a collection of `ManualChunk` objects. `ManualChunk.ToString()` returns just "ManualChunk", so the prompt carries no real context.
So the message passed to `CompleteAsync` ends up being:

```
Using the following data sources as context, answer the user query: Summary?

## Context
ManualChunk
ManualChunk
ManualChunk
ManualChunk
ManualChunk

Response:
```
This amounts to no real context for the LLM (the LLM even complains that all it sees as context is some "ManualChunk" text).
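A minimal sketch of a possible fix, assuming `ManualChunk` exposes its content via a property such as `Text` (the property name here is a guess about the actual type's shape): either project the chunks to their text before joining, or override `ToString()` on `ManualChunk`.

```csharp
using System;
using System.Linq;

// Assumed minimal shape of ManualChunk; the `Text` property name is an assumption.
public class ManualChunk
{
    public string Text { get; set; } = "";

    // Without this override, object.ToString() returns the type name
    // ("ManualChunk"), which is exactly what leaks into the prompt.
    public override string ToString() => Text;
}

public static class Demo
{
    public static void Main()
    {
        var searchResults = new[]
        {
            new ManualChunk { Text = "Step 1: unplug the device." },
            new ManualChunk { Text = "Step 2: hold the reset button." }
        };

        // Alternative to the ToString override: project explicitly when building the prompt.
        var context = string.Join("\n", searchResults.Select(c => c.Text));
        Console.WriteLine(context);
    }
}
```

Either approach would make the `## Context` section carry the actual chunk text instead of the type name.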
Code: https://github.com/dotnet/ai-samples/blob/05a832fac604e5d8f1dec4abdf55425ca1a17ac0/src/chat/CustomerSupport/Utils.cs#L155-L170