gsarti opened this issue 2 years ago
saxenarohit: Is there a plan to add LRP to Inseq?
gsarti: Hi @saxenarohit, in principle the Captum LRP implementation should be directly compatible with Inseq. However, the implementation is very model-specific, with some notable (and, to my knowledge, presently unsolved) issues around skip connections, which are the bread and butter of most transformer architectures used in Inseq (see pytorch/captum#546).
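For context, here is a minimal sketch of the Captum LRP call on a toy feed-forward model (the model and inputs are illustrative, not Inseq code); the unresolved part is that Captum's built-in propagation rules do not cover the residual/skip connections of transformer blocks:

```python
import torch
import torch.nn as nn
from captum.attr import LRP

# Toy model: Captum's default LRP rules handle Linear/ReLU layers,
# but there is no rule for transformer residual (skip) connections,
# which is the open issue referenced above (pytorch/captum#546).
model = nn.Sequential(nn.Linear(16, 8), nn.ReLU(), nn.Linear(8, 2))
model.eval()

inputs = torch.randn(1, 16)
lrp = LRP(model)
attributions = lrp.attribute(inputs, target=0)  # relevance per input feature
print(attributions.shape)  # torch.Size([1, 16])
```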
In general, I think that to proceed with an integration we should make sure that:
🚀 Feature Request
The following is a non-exhaustive list of gradient-based feature attribution methods that could be added to the library; the referenced implementations are collected below (see the usage sketch after the list):
- pytorch/captum
- PAIR-code/saliency
- uclanlp/NLP-Interpretation-Faithfulness
- josephenguehard/time_interpret
- rachtibat/LRP-for-Transformers
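As a point of reference, a minimal sketch of how an existing gradient-based method is invoked through Inseq today (the model name and input text are illustrative; `saliency` is one of the gradient methods Inseq already ships, and a newly added method from this list would plug into the same interface):

```python
import inseq

# Illustrative: attribute a translation with the "saliency" gradient
# method; a new gradient-based method would be selected by passing
# its registered name in the same way.
model = inseq.load_model("Helsinki-NLP/opus-mt-en-fr", "saliency")
out = model.attribute("Hello world, how are you?")
out.show()  # visualize source-to-target token attributions
```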
Notes: