albermax / innvestigate

A toolbox to iNNvestigate neural networks' predictions!

Question regarding the Implementation of Integrated Gradients and LRP-z #156

Open bnaman50 opened 5 years ago

bnaman50 commented 5 years ago

Hey Alber,

  1. I was reading the Integrated Gradients (IG) paper, and my understanding is that they backpropagate through a model with a softmax layer. I also looked at their code to verify this, and the same is stated there. [screenshot of the referenced IG code attached in the original comment]

But looking at your implementation, my understanding is that you pass a model_wo_softmax for analysis. It would be great if you could help me understand why this discrepancy exists, or whether I missed something. (The IG definition is restated after this list for reference.)

  2. While looking at the list of methods available in this toolbox, I found another variation of LRP called LRP-z, but surprisingly I could not find it in the papers; the authors mostly discuss LRP-eps/LRP-alpha-beta. Could you please point me to the paper where LRP-z is discussed?
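For reference, the quantity in question is the Integrated Gradients attribution as defined in the paper (a standard statement of the definition; F denotes the network function whose output is differentiated, which is exactly where the softmax-vs-logits discrepancy arises):

```latex
\mathrm{IG}_i(x) = (x_i - x'_i)\int_0^1
  \frac{\partial F\bigl(x' + \alpha\,(x - x')\bigr)}{\partial x_i}\, d\alpha
```

with $x'$ the baseline input.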

Thanks, Naman

enryH commented 4 years ago

It is briefly mentioned in Appendix A, if I see it correctly. The reference given is:

Montavon, G., Lapuschkin, S., Binder, A., Samek, W., Müller, K.-R.: Explaining nonlinear classification decisions with deep Taylor decomposition. Pattern Recognition 65, 211–222 (2017)

sebastian-lapuschkin commented 4 years ago

Hi @champnaman:

TL;DR: LRP-z = LRP-eps with eps = 0. LRP-z can be regarded as the most basic (naive) decomposition approach; see the rule below.
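For concreteness, here is the epsilon rule in the usual LRP notation ($a_j$: lower-layer activations, $w_{jk}$: weights, $R_k$: relevances of the layer above); setting $\epsilon = 0$ recovers the z-rule:

```latex
R_j = \sum_k \frac{a_j w_{jk}}{\epsilon + \sum_{j'} a_{j'} w_{j'k}}\, R_k
\quad\xrightarrow{\;\epsilon = 0\;}\quad
R_j = \sum_k \frac{a_j w_{jk}}{\sum_{j'} a_{j'} w_{j'k}}\, R_k
```

(Exact conventions differ slightly across papers, e.g. whether the bias term is folded into the sums and whether $\epsilon$ is sign-matched to the denominator.)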

bnaman50 commented 4 years ago

Hi @sebastian-lapuschkin

Thank you, and thanks to @enryH, for your responses to my second question. Could you please help me clear up the confusion about the first question as well?

Thanks, Naman

sebastian-lapuschkin commented 4 years ago

There is a group of methods which avoid using the softmax for explanation, e.g. LRP, Sensitivity Analysis, Deconv (afaik), and some others.

There is an answer/explanation here of why those methods remove the softmax.

Hope this answers the remaining question ;)
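To connect this back to the first question: below is a minimal sketch of the usual iNNvestigate 1.x (Keras) pattern, in which the softmax is stripped before the analyzer is built, so the IG gradients are taken w.r.t. the target logit rather than the softmax probability. `model` and `x` are assumptions here: a trained Keras classifier ending in a softmax, and a batch of preprocessed inputs.

```python
import innvestigate
import innvestigate.utils as iutils

# Strip the final softmax so attributions are computed w.r.t. the logits.
model_wo_softmax = iutils.model_wo_softmax(model)

# Build an Integrated Gradients analyzer on the softmax-free model.
analyzer = innvestigate.create_analyzer("integrated_gradients", model_wo_softmax)

# Attribution maps, one per input in the batch.
attributions = analyzer.analyze(x)
```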

sebastian-lapuschkin commented 4 years ago

Plus, in addition to the above link:

softmax(x) = softmax(x + c) for any c ∈ ℝ
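A one-line check of this shift invariance (standard algebra, not specific to this thread):

```latex
\mathrm{softmax}(x + c)_i
  = \frac{e^{x_i + c}}{\sum_j e^{x_j + c}}
  = \frac{e^{c}\, e^{x_i}}{e^{c} \sum_j e^{x_j}}
  = \mathrm{softmax}(x)_i
```

So the softmax output cannot distinguish logit vectors that differ only by a constant shift, which is one reason these methods explain the logits directly.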