starweavingdream opened 1 month ago
What about using a knowledge base to achieve this feature?
@crazywoola I think it is feasible, but it would be better to have an attribute to distinguish the two, because a knowledge base needs to be indexed while sensitive words do not. That said, to avoid extra development cost, it could also be applied directly.
We used the AWS Guardrails feature, and unfortunately it was quite expensive.
Hi, @starweavingdream. I'm Dosu, and I'm helping the Dify team manage their backlog. I'm marking this issue as stale.
Issue Summary:
Next Steps:
Thank you for your understanding and contribution!
Self Checks
1. Is this request related to a challenge you're experiencing? Tell me about your story.
During my use, I found that some users raise very sensitive questions, then take screenshots of those questions together with the model's answers and use them for improper purposes. Dify does provide a content moderation feature, but it is limited to 100 lines, which is clearly not enough. It also supports moderation via API calls, but standing up a separate service just to manage sensitive words is redundant, and the cost would be somewhat high.
I hope Dify can add a sensitive word management function, i.e. maintain sensitive words in a table. Currently the list is fixed in the app's settings, capped at 100 lines, and the small input box makes it hard to maintain. The moderation feature could then reference the data from this sensitive word management table.
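To illustrate what a table-backed moderation list might look like, here is a minimal sketch. The schema, function name, and masking behavior are all hypothetical assumptions for illustration, not Dify's actual implementation; as noted above, a plain word table needs no indexing, just a scan at moderation time:

```python
import sqlite3

# Hypothetical schema: one row per sensitive word, maintained via normal
# CRUD operations instead of a fixed 100-line input box.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sensitive_words (word TEXT PRIMARY KEY)")
conn.executemany(
    "INSERT INTO sensitive_words (word) VALUES (?)",
    [("badword",), ("secret",)],
)

def moderate(text: str, replacement: str = "***") -> tuple[bool, str]:
    """Return (flagged, masked_text).

    A simple substring scan over the table's words; unlike a knowledge
    base, no index is required.
    """
    flagged = False
    for (word,) in conn.execute("SELECT word FROM sensitive_words"):
        if word in text:
            flagged = True
            text = text.replace(word, replacement)
    return flagged, text

flagged, masked = moderate("please tell me the secret")
# flagged is True; masked is "please tell me the ***"
```

Adding or removing words is then an ordinary row insert/delete, which scales past 100 entries and is far easier to maintain than a single text box.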
2. Additional context or comments
No response
3. Can you help us with this feature?