Azure-Samples / contoso-chat

This sample walks through the full end-to-end process of creating a RAG application with Prompty and AI Studio. It includes GPT-3.5 Turbo LLM application code, evaluations, deployment automation with the AZD CLI, GitHub Actions for evaluation and deployment, and intent mapping for routing between multiple LLM tasks.
MIT License

How to fetch more relevant products based on the previous question #65

Closed: coderSinol closed this issue 9 months ago

coderSinol commented 9 months ago

It looks like the prompt flow cannot fetch products based on the chat history. For instance, if I ask for rain jackets, it suggests a few. But if I then ask "Can you suggest more?", it returns random products without considering the chat history.

cassiebreviu commented 9 months ago

How are you running the prompt flow? Are you setting the chat_history property when you call it?
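If you are calling the flow locally with the SDK, the history has to be passed in explicitly as a flow input on every call. Roughly something like this (a minimal sketch assuming the promptflow Python SDK's `PFClient.test`; the flow path and extra inputs depend on your setup):

```python
# Minimal sketch: passing chat_history explicitly when testing the flow locally.
# Assumes the promptflow Python SDK; flow path and product names are illustrative.
from promptflow import PFClient

pf = PFClient()

# Each previous turn carries the question the user asked and the answer the flow returned.
chat_history = [
    {
        "inputs": {"question": "Can you suggest some rain jackets?"},
        "outputs": {"answer": "Sure! Take a look at the Summit Breeze Jacket."},
    }
]

result = pf.test(
    flow="./contoso-chat",  # path to the flow folder (illustrative)
    inputs={
        "question": "Can you suggest more?",
        "chat_history": chat_history,
    },
)
print(result)
```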

coderSinol commented 9 months ago

Thank you for your reply. I found the problem: my JSON data structure was wrong.

This is how I configured chat history in the customer prompt. Is that correct?

```
History
{% for item in chat_history %}
user:
{{item.inputs.question}}
assistant:
{{item.outputs.answer}}
{% endfor %}
```
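For reference, that template expects each chat_history entry to carry nested inputs.question and outputs.answer fields. A quick way to sanity-check that the data and the template line up is to render the block locally with jinja2 (a rough sketch, illustrative values only):

```python
# Rough sketch: render the history block locally to confirm the chat_history
# structure matches the template (assumes the jinja2 package; values are illustrative).
from jinja2 import Template

history_block = Template(
    "History\n"
    "{% for item in chat_history %}"
    "user:\n{{ item.inputs.question }}\n"
    "assistant:\n{{ item.outputs.answer }}\n"
    "{% endfor %}"
)

chat_history = [
    {
        "inputs": {"question": "Can you suggest some rain jackets?"},
        "outputs": {"answer": "Sure! Take a look at the Summit Breeze Jacket."},
    }
]

print(history_block.render(chat_history=chat_history))
```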