chr5tphr / zennit

Zennit is a high-level framework in Python, built on PyTorch, for explaining and exploring neural networks with attribution methods such as LRP.

Flat Rule for Pooling Layers #9

Closed · rachtibat closed this issue 3 years ago

rachtibat commented 3 years ago

Hi Chris,

I am defining a new composite, for instance:

```python
from zennit.composites import LayerMapComposite, register_composite
from zennit.layer import Sum
from zennit.rules import Flat, Norm, Pass
from zennit.types import Activation, AvgPool, Linear


@register_composite('all_flat')
class AllFlat(LayerMapComposite):
    def __init__(self, canonizers=None):
        layer_map = [
            (Linear, Flat()),
            (AvgPool, Flat()),
            (Activation, Pass()),
            (Sum, Norm()),
        ]
        super().__init__(layer_map, canonizers=canonizers)
```

The problem is that the Flat() rule changes the parameters of a layer, but the pooling layers do not define a "weight" parameter. As a consequence, there is a RuntimeError saying that zennit tries to access the parameter "weight", which is not available. The solution would be to define a new rule that does not have a param_modifier, for instance:


```python
import torch

from zennit.core import LinearHook, stabilize


class FlatPooling(LinearHook):
    '''A parameter-less variant of the Flat LRP rule for layers without a
    "weight" parameter, such as pooling layers. As in the Flat (and WSquare)
    rule, the inputs are replaced by ones, but no parameters are modified.
    '''
    def __init__(self):
        super().__init__(
            input_modifiers=[torch.ones_like],
            param_modifiers=[None],
            output_modifiers=[lambda output: output],
            gradient_mapper=(lambda out_grad, outputs: out_grad / stabilize(outputs[0])),
            reducer=(lambda inputs, gradients: gradients[0])
        )
```
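Such a rule could then be mapped to the pooling types in the composite above; a minimal sketch of the adjusted layer_map, reusing the imports from the first snippet and the hypothetical FlatPooling rule just defined:

```python
# adjusted layer_map for the AllFlat composite, assuming the
# hypothetical FlatPooling rule defined above
layer_map = [
    (Linear, Flat()),          # parameterized layers keep the regular Flat rule
    (AvgPool, FlatPooling()),  # pooling layers get the parameter-less variant
    (Activation, Pass()),
    (Sum, Norm()),
]
```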

What do you think?

Best

chr5tphr commented 3 years ago

Ahh, yes, I think the parameter-less Flat rule makes sense.

Though I actually think the expected behavior of Flat would be that it also works with pooling layers.

Maybe getting rid of the [None] behavior and instead allowing non-existent parameters in mod_params would be a better approach; then we would not need an additional rule.

chr5tphr commented 3 years ago

I will think about how to modify mod_params. Maybe a flag to ignore non-existent weights/biases would be better than the 'None' behavior.
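To illustrate the idea, a hypothetical sketch of such a flag; modify_params, param_keys, and ignore_missing are made-up names for illustration, not zennit's actual API:

```python
import torch


def modify_params(module, modifier, param_keys=('weight', 'bias'), ignore_missing=True):
    '''Hypothetical helper: apply `modifier` to each listed parameter of
    `module`, optionally skipping parameters the module does not define
    (e.g. pooling layers, which have neither weight nor bias).
    '''
    for key in param_keys:
        param = getattr(module, key, None)
        if param is None:
            if ignore_missing:
                continue
            raise RuntimeError(f"module {module} has no parameter '{key}'")
        setattr(module, key, torch.nn.Parameter(modifier(param.data)))
```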

chr5tphr commented 3 years ago

Okay, I fixed this with #16. Could you try it out and see whether it works for you?

rachtibat commented 3 years ago

Hi,

yes, now it's working! Good idea to add more control so that the user can choose which parameters to modify.

Thanks
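For readers following along: with #16 merged, the original composite with Flat() mapped to AvgPool should work as-is. A minimal end-to-end sketch, assuming the AllFlat composite registered above; the toy model is made up for illustration:

```python
import torch
from torch.nn import AvgPool2d, Conv2d, Flatten, ReLU, Sequential
from torch.nn import Linear as TorchLinear

from zennit.composites import COMPOSITES

# toy model containing an average pooling layer (made up for illustration)
model = Sequential(
    Conv2d(3, 8, 3, padding=1),
    ReLU(),
    AvgPool2d(2),
    Flatten(),
    TorchLinear(8 * 16 * 16, 10),
)

data = torch.randn(1, 3, 32, 32, requires_grad=True)

# look up the composite registered under 'all_flat' above
composite = COMPOSITES['all_flat']()

# register the rules, compute the modified gradient (the attribution) for
# class 0, and remove the hooks again when leaving the context
with composite.context(model) as modified_model:
    output = modified_model(data)
    relevance, = torch.autograd.grad(output, data, torch.eye(10)[[0]])
```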