chr5tphr / zennit

Zennit is a high-level framework in Python using PyTorch for explaining/exploring neural networks using attribution methods like LRP.

Changing the value of epsilon in the composite EpsilonPlusFlat creates a bug #203

Closed: StudentAlessandro closed this 8 months ago

StudentAlessandro commented 8 months ago

Hi everyone, I want to describe a problem I encountered when using the code from the tutorial.

```python
print(f'Prediction: {output.argmax(1)[0].item()}')
```

This is the code I'm talking about. When I pass a new value of epsilon inside the parentheses of EpsilonPlusFlat, the resulting heatmap is literally the same, even though I changed epsilon. My setup roughly follows the tutorial, as in the sketch below.
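
A minimal sketch of what I mean, following zennit's documented `Gradient` attributor usage (the input and target here are random placeholders, not my exact code):

```python
import torch
from torchvision.models import vgg16
from zennit.attribution import Gradient
from zennit.composites import EpsilonPlusFlat

# placeholders standing in for the tutorial's preprocessed image and class index
data = torch.randn(1, 3, 224, 224)
target = 0

# pretrained model as in the tutorial (downloads weights on first use)
model = vgg16(weights='IMAGENET1K_V1').eval()

# changing this epsilon value leaves the plotted heatmap unchanged
composite = EpsilonPlusFlat(epsilon=1e-6)

with Gradient(model=model, composite=composite) as attributor:
    output, attribution = attributor(data, torch.eye(1000)[[target]])

print(f'Prediction: {output.argmax(1)[0].item()}')
```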
chr5tphr commented 8 months ago

Hey @StudentAlessandro

the epsilon value in the Epsilon rule adds a stabilizer constant to the denominator, i.e. $$R_i = x_i \sum_j \frac{w_{ji}}{\sum_{i'} x_{i'} w_{ji'} + b_j + \varepsilon} R_j .$$ Since the denominator normalizes the attainable relevance to $1$, the increased $\varepsilon$ acts as lost relevance (as does the bias $b_j$), i.e., the sum over all relevances at that layer will be smaller than at the previous layer: $\sum_i R_i < \sum_j R_j$. Apart from numerical stability, this will not influence your resulting heatmap in usual feed-forward-style networks without parallel layers, as you likely normalize the heatmap to its full range before plotting.
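
To see this concretely, here is a minimal sketch (assuming a toy untrained network in place of the tutorial's VGG16, so the numbers are purely illustrative): the magnitude of the input relevance shrinks as $\varepsilon$ grows, while the heatmap normalized to its full range stays virtually identical.

```python
import torch
from zennit.attribution import Gradient
from zennit.composites import EpsilonPlusFlat

# toy untrained stand-in for the tutorial's VGG16: a conv first layer (mapped
# to the Flat rule by EpsilonPlusFlat), followed by a dense layer (mapped to
# the Epsilon rule)
model = torch.nn.Sequential(
    torch.nn.Conv2d(3, 8, 3, padding=1),
    torch.nn.ReLU(),
    torch.nn.Flatten(),
    torch.nn.Linear(8 * 8 * 8, 4),
).eval()
data = torch.randn(1, 3, 8, 8)

heatmaps = []
for epsilon in (1e-6, 1e-2):
    composite = EpsilonPlusFlat(epsilon=epsilon)
    with Gradient(model=model, composite=composite) as attributor:
        output, relevance = attributor(data, torch.eye(4)[[0]])
    # the overall relevance magnitude shrinks with growing epsilon
    # (lost relevance) ...
    print(f'epsilon={epsilon:.0e}  relevance sum: {relevance.sum().item():.6f}')
    # ... but normalized to its full range, the heatmap barely changes
    heatmaps.append((relevance - relevance.min()) / (relevance.max() - relevance.min()))

print(f'max. deviation of normalized heatmaps: {(heatmaps[0] - heatmaps[1]).abs().max().item():.2e}')
```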

StudentAlessandro commented 8 months ago

Hey @chr5tphr, thank you for the answer; now it's clear.