Closed SummerDream233 closed 1 week ago
I found that my document chunks are too long, which can cause the LLM to lose track of the prompt.
I checked the retrieved chunk, and it contains about 22,000 words.
I think I need to reconsider the chunk size.
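To illustrate re-chunking, here is a minimal sketch of splitting an oversized chunk into smaller, overlapping word-based pieces. The `max_words` and `overlap` values are illustrative choices, not the project's actual defaults, and a real pipeline would likely split on tokens or sentence boundaries instead:

```python
def chunk_text(text, max_words=300, overlap=50):
    """Split text into word-based chunks of at most max_words,
    with `overlap` words repeated between consecutive chunks.

    Parameter values here are illustrative, not project defaults.
    """
    words = text.split()
    chunks = []
    step = max_words - overlap
    for start in range(0, len(words), step):
        chunks.append(" ".join(words[start:start + max_words]))
        if start + max_words >= len(words):
            break
    return chunks

# Simulate a ~22,000-word retrieval chunk, as in the report above.
doc = "word " * 22000
pieces = chunk_text(doc)
print(len(pieces), max(len(p.split()) for p in pieces))
```

With smaller chunks, each retrieved passage stays well within the context the model can attend to, so the prompt instructions are less likely to be drowned out.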
Is there an existing issue for the same bug?
Branch name
main
Commit ID
18ae8a4
Other environment information
No response
Actual behavior
I found that most answers from RAG are too long, about 400-600 words.
Therefore, I want to limit the answer length via the prompt (since there seems to be no other way to limit it).
However, no matter how I phrase the prompt, the answers stay the same.
I suspect this may be a bug. Could you confirm that the prompt parameter is correctly passed into the program when generating the answer? It seems the prompt has no effect.
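For reference, here is a generic sketch of what a length-limiting instruction prepended to a RAG prompt might look like. `build_prompt` and `max_answer_words` are hypothetical names for illustration; the pipeline in question may assemble its prompt differently, which is exactly what this issue asks the maintainers to confirm:

```python
def build_prompt(question, context, max_answer_words=100):
    """Compose a RAG prompt with an explicit answer-length cap.

    This is a generic illustration, not the project's actual
    prompt-assembly code.
    """
    instruction = (
        f"Answer in at most {max_answer_words} words. "
        "If the context is insufficient, say so briefly."
    )
    return f"{instruction}\n\nContext:\n{context}\n\nQuestion: {question}"

prompt = build_prompt("What is RAG?", "Some retrieved context.")
print(prompt)
```

Note that if the generation backend exposes a maximum-output-token parameter, setting it caps the answer length more reliably than prompt wording alone, though it truncates rather than summarizes.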
Expected behavior
No response
Steps to reproduce
Additional information
Details of the prompt:
The output: