rachtibat/LRP-eXplains-Transformers
Layer-Wise Relevance Propagation for Large Language Models and Vision Transformers [ICML 2024]
Documentation: https://lxt.readthedocs.io
54 stars · 4 forks
Issues
#10 · LLaMA family issues · opened 1 day ago by dvdblk · 3 comments
#9 · add lrp for gpt2 · opened 2 days ago by Tomsawyerhu · 1 comment
#8 · How can i get each layer's lrp score? · opened 2 days ago by Patrick-Ni · 1 comment
#7 · Error with torch.dtype=float16 · by Patrick-Ni · closed 2 days ago · 3 comments
#6 · pip install ./lxt · by GeorgeRodinos · closed 3 weeks ago · 4 comments
#5 · added BERT implementation · by pkhdipraja · closed 1 month ago · 4 comments
#4 · Classification tasks example · by dvdblk · closed 1 month ago · 6 comments
#3 · LLaMA Quickstart repro with Inseq and compatibility question · opened 1 month ago by gsarti · 4 comments
#2 · How to extract the relevance of neurons in FFN layers · by ChangWenhan · closed 2 months ago · 2 comments
#1 · [Desiderata] Captum-like implementation for Inseq compatibility · opened 5 months ago by gsarti · 6 comments