deadbits / vigil-llm

⚡ Vigil ⚡ Detect prompt injections, jailbreaks, and other potentially risky Large Language Model (LLM) inputs
https://vigil.deadbits.ai/
Apache License 2.0

YARA rule for PII #13

Closed: deadbits closed this issue 10 months ago

deadbits commented 10 months ago

Create a new YARA rule for detecting PII and other sensitive information.
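
A minimal sketch of what such a rule could look like, assuming Vigil's scanner consumes standard YARA rules. The rule name, meta fields, and regex patterns below (email addresses, US SSNs, 16-digit card numbers) are illustrative placeholders, not the rule that ultimately shipped:

```yara
rule ContainsPII
{
    meta:
        // Hypothetical metadata; actual field names depend on the scanner's schema
        category    = "sensitive-data"
        description = "Flags common PII patterns in LLM input (illustrative sketch)"

    strings:
        // Email addresses
        $email = /[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,6}/

        // US Social Security Numbers in ###-##-#### form
        $ssn = /\b[0-9]{3}-[0-9]{2}-[0-9]{4}\b/

        // 16-digit card numbers, optionally separated by spaces or dashes
        $card = /\b[0-9]{4}[- ]?[0-9]{4}[- ]?[0-9]{4}[- ]?[0-9]{4}\b/

    condition:
        any of them
}
```

Pure regex matching like this will produce false positives (any 16-digit number trips the card pattern), so a production rule would likely be paired with post-match checks such as Luhn validation performed outside YARA.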