Closed: lapolonio closed this issue 2 years ago.
@lapolonio this might be relevant to your question: https://github.com/jessevig/bertviz.
Thanks. I thought that library was primarily for visualization? transformers makes that possible by exposing the activations and hidden states at each layer (exposed here: https://github.com/huggingface/transformers/blob/master/src/transformers/modeling_tf_bert.py#L923 and aggregated here: https://github.com/huggingface/transformers/blob/master/src/transformers/modeling_tf_bert.py#L572). I thought this library (innvestigate) was a better candidate, since it uses reflection to add LRP calculations to layers.
I was trying to work through a basic example using transformers. The difficulty I am having is that transformers combines the preprocessing and the model input into one function (https://github.com/huggingface/transformers/blob/master/src/transformers/modeling_tf_bert.py#L566), while innvestigate specifically asks for a model built on a Keras input layer. What do you think is the best way forward? Thanks again.
@heytitle I actually want the LRP outputs, because I want to compare my XAI method against them. LRP's relevance scores may differ significantly from the attention weights!
Hi @lapolonio and @CyanideBoy,
we currently don't implement this functionality, but you might be interested in Ali et al., "XAI for Transformers: Better Explanations through Conservative Propagation".
I'm closing this issue since we currently don't have plans to support Transformers.
If not, where can the community ask about implementation details?