twosixlabs / armory-library

Python library for Adversarial ML Evaluation
https://twosixlabs.github.io/armory-library/
MIT License

Research spike: LLM attack design #174

Closed. ashleyha closed this issue 1 month ago.

ashleyha commented 2 months ago

Conduct a literature review to identify an LLM attack to implement. Write follow-on issue(s) for the attack implementation.

Example papers:

ashleyha commented 1 month ago

Completed; the follow-on issue is #181.