albermax / innvestigate

A toolbox to iNNvestigate neural networks' predictions!

Any plans to support huggingface/transformers? #182

Closed: lapolonio closed this issue 2 years ago

lapolonio commented 4 years ago

If not, where can the community ask about implementation details?

p16i commented 4 years ago

@lapolonio this might be relevant to your question: https://github.com/jessevig/bertviz.

lapolonio commented 4 years ago

Thanks. I thought that library was primarily for visualization? transformers already makes that possible by exposing the activations and hidden states at each layer (exposed here: https://github.com/huggingface/transformers/blob/master/src/transformers/modeling_tf_bert.py#L923 and aggregated here: https://github.com/huggingface/transformers/blob/master/src/transformers/modeling_tf_bert.py#L572). I thought this library was a better candidate, since it uses reflection to add LRP calculations to layers.
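For reference, requesting those per-layer outputs looks roughly like this (a minimal sketch, assuming a recent transformers version and the bert-base-uncased checkpoint; exact attribute names depend on the version):

```python
# Sketch: request per-layer activations and attention weights from a
# Hugging Face TF BERT model (API details depend on the transformers version).
from transformers import BertTokenizer, TFBertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = TFBertModel.from_pretrained(
    "bert-base-uncased",
    output_hidden_states=True,  # return the hidden states of every layer
    output_attentions=True,     # return the attention weights of every layer
)

inputs = tokenizer("iNNvestigate meets transformers", return_tensors="tf")
outputs = model(inputs)

hidden_states = outputs.hidden_states  # tuple: embeddings + one tensor per layer
attentions = outputs.attentions        # tuple: one attention tensor per layer
```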

I was trying to work through a basic example using transformers. The difficulty I am having is that transformers combines the preprocessing and the model input into one function (https://github.com/huggingface/transformers/blob/master/src/transformers/modeling_tf_bert.py#L566), while iNNvestigate specifically asks for a model built from a Keras Input layer.
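For concreteness, here is a hypothetical sketch of what separating the two might look like: tokenize outside the model and build the Keras graph from explicit Input layers (the checkpoint, sequence length, and classifier head below are placeholders, and whether iNNvestigate can then handle the BERT layers internally is exactly the open question):

```python
# Hypothetical sketch: keep tokenization outside the model so the Keras graph
# starts from explicit Input layers, which is the format iNNvestigate expects.
# Whether iNNvestigate can propagate relevance through the BERT layers inside
# is a separate question.
import tensorflow as tf
from transformers import BertTokenizer, TFBertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
bert = TFBertModel.from_pretrained("bert-base-uncased")

max_len = 128
input_ids = tf.keras.Input(shape=(max_len,), dtype=tf.int32, name="input_ids")
attention_mask = tf.keras.Input(shape=(max_len,), dtype=tf.int32, name="attention_mask")

# Run BERT inside a functional Keras graph; index [1] is the pooled [CLS]
# output, fed into a placeholder two-class classifier head.
pooled = bert(input_ids, attention_mask=attention_mask)[1]
logits = tf.keras.layers.Dense(2, name="classifier")(pooled)

keras_model = tf.keras.Model(inputs=[input_ids, attention_mask], outputs=logits)

# Tokenization happens outside the model, so the graph only sees integer tensors.
encoded = tokenizer("example sentence", padding="max_length",
                    max_length=max_len, return_tensors="tf")
preds = keras_model([encoded["input_ids"], encoded["attention_mask"]])
```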

What do you think is the best way forward? Thanks again.

syomantak commented 4 years ago

@heytitle I actually want LRP outputs, as I want to compare my XAI method with them. LRP's relevance scores may differ significantly from the attention weights!
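For context, the usual iNNvestigate LRP call on a supported Keras model looks roughly like the sketch below (`keras_model` and `x_batch` are placeholders; transformer layers are not among the supported layer types, which is what this issue is about):

```python
# Sketch of the standard iNNvestigate (1.x) LRP workflow on a supported
# Keras model. keras_model and x_batch are placeholders.
import innvestigate
import innvestigate.utils as iutils

# LRP should be applied to the pre-softmax scores.
model_wo_softmax = iutils.model_wo_softmax(keras_model)

analyzer = innvestigate.create_analyzer("lrp.epsilon", model_wo_softmax)
relevance = analyzer.analyze(x_batch)  # relevance has the same shape as the input
```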

adrhill commented 2 years ago

Hi @lapolonio and @CyanideBoy,

We currently don't implement this functionality, but you might be interested in Ali et al., "XAI for Transformers: Better Explanations through Conservative Propagation".

I'm closing this issue since we currently don't have plans to support Transformers.