Hi,
I was trying to use captum.attr._core.layer_activation.LayerActivation to get the activation of the first convolutional layer in a simple model. Here is my code:
In fact, I computed the activation in two different ways and compared the results afterwards. I expected the printed value to be close to zero; however, this is what I got:
tensor(3.4646, grad_fn=<NormBackward0>)
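For concreteness, here is a minimal sketch of the kind of model and comparison I mean; the architecture, layer names, and input shape are illustrative placeholders, not my exact code:

```python
# Illustrative sketch only (not the original code): a toy model whose first
# conv layer is followed by an inplace ReLU, plus the two ways of getting
# that layer's activation that are being compared.
import torch
import torch.nn as nn
from captum.attr import LayerActivation

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(3, 4, kernel_size=3)
        self.relu1 = nn.ReLU(inplace=True)  # note: inplace
        self.conv2 = nn.Conv2d(4, 8, kernel_size=3)

    def forward(self, x):
        return self.conv2(self.relu1(self.conv1(x)))

model = Net()
input = torch.randn(1, 3, 16, 16)

# Way 1: Captum's LayerActivation hooked on the first conv layer
layer_act = LayerActivation(model, model.conv1)
act_captum = layer_act.attribute(input)

# Way 2: calling the conv layer directly
act_direct = model.conv1(input)

# Expected to be ~0, but the norm comes out clearly nonzero
print(torch.norm(act_captum - act_direct))
```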
I hypothesize that the inplace ReLU layer after the convolutional layer modifies its output in place, since there were many zeros in the activation computed by Captum (i.e., layer_act.attribute(input)). In fact, when I changed the architecture of the network to the following:
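(This is an illustrative sketch of the change, not my exact code; the only difference from the model above is inplace=False on the ReLU.)

```python
# Illustrative sketch of the changed architecture (not the exact code):
# with inplace=False, the conv1 output recorded by Captum's forward hook
# is no longer overwritten by the ReLU.
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(3, 4, kernel_size=3)
        self.relu1 = nn.ReLU(inplace=False)  # was inplace=True
        self.conv2 = nn.Conv2d(4, 8, kernel_size=3)

    def forward(self, x):
        return self.conv2(self.relu1(self.conv1(x)))
```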
then the outputs matched.
System information