albermax / innvestigate

A toolbox to iNNvestigate neural networks' predictions!

Add operation of LRP always applies the Z-Rule #180

Open · berleon opened this issue 4 years ago

berleon commented 4 years ago

Hello,

The LRP Add operation always applies the Z-Rule, as stated in its docstring: """Special Add layer handler that applies the Z-Rule""" (https://github.com/albermax/innvestigate/blob/master/innvestigate/analyzer/relevance_based/relevance_analyzer.py#L254).
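
For context, these are the two rules in question, in standard LRP notation from the literature (not code from this repository), with contributions z_ij = x_i w_ij:

```latex
% Z-Rule: relevance flows back in proportion to the raw contributions z_ij.
R_i = \sum_j \frac{z_{ij}}{\sum_{i'} z_{i'j}} \, R_j

% Alpha-Beta rule: positive and negative contributions are normalized separately;
% alpha = 1, beta = 0 keeps only the positive part (the Z+ rule).
R_i = \sum_j \left( \alpha \, \frac{z_{ij}^{+}}{\sum_{i'} z_{i'j}^{+}}
                  - \beta \, \frac{z_{ij}^{-}}{\sum_{i'} z_{i'j}^{-}} \right) R_j
```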

When using the LRPAlpha1Beta0 analyzer, I would expect the Z+ rule to be applied, since a+b can be seen as a linear layer: [1 1] @ [a b]^T. I think a similar problem exists for AveragePoolingReverseLayer.
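
A minimal sketch of that view, assuming tf.keras (illustrative only, not innvestigate code): an Add of two tensors with C channels each is equivalent to a 1x1 Conv2D over their channel-wise concatenation, with fixed weights stacking two identity matrices. Once the Add is written this way, the analyzer's rule for linear layers applies without any special handler.

```python
import numpy as np
from tensorflow import keras

C = 4
a = keras.Input(shape=(8, 8, C))
b = keras.Input(shape=(8, 8, C))
x = keras.layers.Concatenate(axis=-1)([a, b])      # shape (8, 8, 2C)

conv = keras.layers.Conv2D(C, kernel_size=1, use_bias=False)
y = conv(x)
# Kernel shape (1, 1, 2C, C): two stacked identities, so output channel c
# sums input channel c and input channel C + c, i.e. a[c] + b[c].
kernel = np.concatenate([np.eye(C), np.eye(C)], axis=0)
conv.set_weights([kernel.reshape(1, 1, 2 * C, C).astype("float32")])

model = keras.Model([a, b], y)

# Sanity check: the conv reproduces a plain element-wise Add.
xa = np.random.rand(1, 8, 8, C).astype("float32")
xb = np.random.rand(1, 8, 8, C).astype("float32")
assert np.allclose(model.predict([xa, xb]), xa + xb, atol=1e-6)
```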

Thanks, Leon

berleon commented 4 years ago

I implemented custom LRP rules for Add and BatchNorm by constructing conv layers corresponding to the operations. The implementation is a bit hacky and definitely not of a quality to be considered for a pull request, but it works: https://gist.github.com/berleon/47303fe7f0c06eeb79bc294b5a027530
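
For reference, a minimal sketch of the same idea for BatchNorm (the helper name `batchnorm_as_conv` is hypothetical, not taken from the gist): a frozen BatchNormalization is an affine per-channel map y = a*x + b with a = gamma/sqrt(var + eps) and b = beta - a*mean, which can be folded into a diagonal 1x1 Conv2D.

```python
import numpy as np
from tensorflow import keras

def batchnorm_as_conv(bn_layer):
    """Fold a frozen BatchNormalization layer into an equivalent 1x1 Conv2D
    (per-channel scale and shift), so standard LRP rules for linear layers
    can be applied to it."""
    gamma, beta, mean, var = bn_layer.get_weights()
    scale = gamma / np.sqrt(var + bn_layer.epsilon)   # per-channel slope a
    bias = beta - scale * mean                        # per-channel offset b
    C = scale.shape[0]
    conv = keras.layers.Conv2D(C, kernel_size=1, use_bias=True)
    conv.build((None, None, None, C))
    # Diagonal kernel: output channel c depends only on input channel c.
    kernel = np.zeros((1, 1, C, C), dtype="float32")
    kernel[0, 0, np.arange(C), np.arange(C)] = scale
    conv.set_weights([kernel, bias.astype("float32")])
    return conv

# Usage check with a freshly initialized BatchNorm layer.
bn = keras.layers.BatchNormalization()
x = np.random.rand(1, 4, 4, 3).astype("float32")
_ = bn(x, training=False)                 # builds the layer
assert np.allclose(bn(x, training=False), batchnorm_as_conv(bn)(x), atol=1e-5)
```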

For ResNet, the GlobalAveragePooling2D layer does not behave differently under the Z-Rule and the alpha-beta rule, since its input is always positive.
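
A quick numpy check of that observation (illustrative only, not innvestigate's implementation): with non-negative inputs and non-negative weights, every contribution z_i = x_i * w_i is non-negative, so the Z-Rule and the Z+ rule (alpha=1, beta=0) distribute relevance identically.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.random(8)                     # non-negative inputs, e.g. post-ReLU
w = np.ones_like(x) / x.size          # average-pooling weights
R_out = 1.0                           # relevance arriving at the output

z = x * w
R_z = z / z.sum() * R_out             # Z-Rule
z_pos = np.clip(z, 0.0, None)
R_zplus = z_pos / z_pos.sum() * R_out # Z+ rule: identical here, since z >= 0
assert np.allclose(R_z, R_zplus)
```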