Problem Description
In some cases, when the LLM generates an answer from the constructed prompt, it produces a low-quality response because the prompt mixes very different topics. In those cases, we need better separation between the topics in the prompt.
Solutions
[ ] To better separate the topics in the prompt: since we feed summaries into the prompt, it would be better to format the summaries as bullet points rather than paragraphs. This can be achieved by updating the summarizer prompt.
[ ] To improve this further, we could produce one document per bullet point of the summary, so that the vector similarity search fetches only the relevant summary points.
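The per-bullet-point idea above could be sketched as follows. This is a minimal, hypothetical illustration; the names (`Document`, `split_summary_into_documents`) and the sample summary text are invented for the example and are not from an existing codebase:

```python
# Sketch: split a bullet-point summary into one document per bullet,
# so a vector similarity search can retrieve only the relevant points
# instead of the whole mixed-topic summary.
from dataclasses import dataclass

@dataclass
class Document:
    source_id: str  # id of the original summary this bullet came from
    text: str       # a single bullet point

def split_summary_into_documents(summary: str, source_id: str) -> list[Document]:
    docs = []
    for line in summary.splitlines():
        line = line.strip()
        # Treat lines starting with a common bullet marker as bullet points.
        if line.startswith(("-", "*", "•")):
            docs.append(Document(source_id=source_id, text=line.lstrip("-*• ").strip()))
    return docs

# Hypothetical summary content, purely for illustration.
summary = """\
- Billing uses monthly invoices.
- Support tickets are triaged within 24 hours.
"""
docs = split_summary_into_documents(summary, source_id="summary-1")
# Each Document would then be embedded and indexed individually.
```

Each resulting `Document` can be embedded separately, so the retrieval step pulls in only the bullet points relevant to the query rather than every topic in the parent summary.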