🚀 Feature
Interpreting the BART model: essentially obtaining the same outputs that are already available for BERT. I am especially interested in word attributions and their visualization for sequence classification.
Motivation
When using BART for sequence classification, I have tried the same templates and example notebooks to obtain word attributions, but they do not work. I think the main reason is that BART is not built the same way as BERT: the Embeddings layer used in the BERT examples is not present in BART. Something similar can be accessed through the BARTEncoder, but passing that as input breaks, probably because it is not in the expected format.
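A minimal sketch of one way this could work: since BART has no single Embeddings module like BERT's BertEmbeddings, attributions can instead be hooked onto the encoder's token-embedding layer (`model.model.encoder.embed_tokens` in the Hugging Face implementation) and computed with gradient × input. This is an illustration under assumptions, not the library's supported API; the tiny randomly initialised config is only there so the example runs offline.

```python
# Sketch: numerical word attributions for BartForSequenceClassification
# via gradient x input on the encoder's token-embedding layer.
import torch
from transformers import BartConfig, BartForSequenceClassification

# Tiny randomly initialised model so the sketch runs without downloads;
# in practice you would load a fine-tuned checkpoint with .from_pretrained(...).
config = BartConfig(
    vocab_size=100, d_model=16,
    encoder_layers=1, decoder_layers=1,
    encoder_attention_heads=2, decoder_attention_heads=2,
    encoder_ffn_dim=32, decoder_ffn_dim=32,
    max_position_embeddings=64, num_labels=2,
)
model = BartForSequenceClassification(config)
model.eval()

# <s> tok tok tok </s> -- BART's classification head reads the final </s> token.
input_ids = torch.tensor([[0, 5, 6, 7, 2]])

# Capture the embedding output. The shared embedding module may also be
# called by the decoder, so keep only the first (encoder) invocation.
captured = []
def save_embeddings(module, inputs, output):
    output.retain_grad()
    captured.append(output)

handle = model.model.encoder.embed_tokens.register_forward_hook(save_embeddings)
logits = model(input_ids=input_ids).logits
handle.remove()

logits[0, 1].backward()                 # gradient w.r.t. the class-1 score
emb = captured[0]                       # encoder token embeddings
scores = (emb * emb.grad).sum(dim=-1)   # one attribution score per input token
print(scores)
```

The same hook target should also work with Captum's layer-attribution methods (e.g. LayerIntegratedGradients) in place of the manual gradient × input step, which would give scores closer to the ones in the BERT example notebooks.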
Pitch
If possible, I would like to see the image of a sentence with its words highlighted in green and red when BART is used for sequence classification, and ideally to be able to produce all of the visualizations that can be obtained with BERT.
Alternatives
At the very least, I would like to be able to obtain the numerical word attributions.
Additional context
Also, this might not need to be a new feature: you may already support it, and I simply need to learn how to do it properly; I haven't found anything on the web. Thank you so much for your attention and help.