deadbits / vigil-llm

⚡ Vigil ⚡ Detect prompt injections, jailbreaks, and other potentially risky Large Language Model (LLM) inputs
https://vigil.deadbits.ai/
Apache License 2.0
270 stars 32 forks

Accept multiple prompts #34

Open deadbits opened 9 months ago

deadbits commented 9 months ago

Modify the submission endpoint and dispatch logic to allow a list of prompts to be scanned instead of just one. Input should be a string or a list of strings.
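One way the endpoint could normalize its input is sketched below. This is a hedged illustration, not Vigil's actual API: `normalize_prompts`, `scan_all`, and `scan_fn` are hypothetical names standing in for the real submission/dispatch code.

```python
from typing import Callable, List, Union


def normalize_prompts(prompts: Union[str, List[str]]) -> List[str]:
    """Coerce a single prompt or a list of prompts into a list of strings.

    Raises ValueError for any other shape so the endpoint can return a
    clear 4xx error instead of failing partway through a scan.
    """
    if isinstance(prompts, str):
        return [prompts]
    if isinstance(prompts, list) and all(isinstance(p, str) for p in prompts):
        return prompts
    raise ValueError("input must be a string or a list of strings")


def scan_all(prompts: Union[str, List[str]],
             scan_fn: Callable[[str], dict]) -> List[dict]:
    """Dispatch each prompt to the scanner and collect per-prompt results."""
    return [scan_fn(p) for p in normalize_prompts(prompts)]
```

With this shape, existing single-prompt callers keep working unchanged, while batch callers get one result object per prompt in submission order.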