Application which investigates defensive measures against prompt injection attacks on an LLM, with a focus on the exposure of external tools.
Manual Test Guidelines Wiki or ReadME #848
Closed
kclark-scottlogic closed 5 months ago
Documentation is needed somewhere (wiki or README) describing the manual testing steps/guidelines required for regression testing.