@HATEM-ELAZAB, this issue is not related to the Keras repo. Please find the correct repo and post the issue there to get it resolved. Thanks!
I have a model for which I need to compute the gradients of the output w.r.t. the model's input. However, I want to apply custom gradients to some of the layers in my model, which can be tedious to build from scratch. So I tried the code explained in this notebook: https://github.com/Lasagne/Recipes/blob/master/examples/Saliency%20Maps%20and%20Guided%20Backpropagation.ipynb. I added the following two classes:
The helper class that allows us to replace a nonlinearity with an Op that has the same output but a custom gradient, and
the subclass that does guided backpropagation through a nonlinearity.
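For reference, the two classes from the linked Lasagne recipe look roughly like this (a slightly simplified sketch of the notebook's code, omitting its optional GPU-transfer handling; it requires Theano and Lasagne to run):

```python
import types
import theano

class ModifiedBackprop(object):
    """Wraps a nonlinearity in an OpFromGraph whose gradient can be replaced.

    The forward output is identical to the original nonlinearity; only the
    backward pass (defined by a subclass's `grad` method) changes.
    """
    def __init__(self, nonlinearity):
        self.nonlinearity = nonlinearity
        self.ops = {}  # memoizes one Op per input tensor type

    def __call__(self, x):
        tensor_type = x.type
        if tensor_type not in self.ops:
            inp = tensor_type()
            outp = self.nonlinearity(inp)
            op = theano.OpFromGraph([inp], [outp])
            # Attach the custom gradient defined by the subclass.
            op.grad = types.MethodType(self.grad, op)
            self.ops[tensor_type] = op
        return self.ops[tensor_type](x)

class GuidedBackprop(ModifiedBackprop):
    """Guided backprop: only propagate positive gradients through
    positions where the forward input was positive."""
    def grad(self, op, inputs, out_grads):
        (inp,) = inputs
        (grd,) = out_grads
        dtype = inp.dtype
        return (grd * (inp > 0).astype(dtype) * (grd > 0).astype(dtype),)
```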
Then I used them in my code as follows:
```python
relu = nn.nonlinearities.rectify
relu_layers = [layer for layer in nn.layers.get_all_layers(net['l_out'])
               if getattr(layer, 'nonlinearity', None) is relu]
modded_relu = GuidedBackprop(relu)
for layer in relu_layers:
    layer.nonlinearity = modded_relu

prop = nn.layers.get_output(net['l_out'], model_in, deterministic=True)

for sample in range(ini, batch_len):
    model_out = prop[sample, 'z']  # get prop for label 'z'
    gradients = theano.gradient.jacobian(model_out, wrt=model_in)
    gradients = theano.grad(model_out, wrt=model_in)
```
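For intuition, the gradient rule that `GuidedBackprop` applies at each ReLU can be checked with plain NumPy: the incoming gradient is kept only at positions where both the forward input and the incoming gradient are positive (the function name here is illustrative, not part of any library):

```python
import numpy as np

def guided_backprop_grad(inp, out_grad):
    """Guided backprop through a ReLU: pass the gradient only where the
    forward input was positive AND the incoming gradient is positive."""
    mask = (inp > 0) & (out_grad > 0)
    return out_grad * mask.astype(out_grad.dtype)

inp = np.array([1.0, -2.0, 3.0, 0.5])       # forward inputs to the ReLU
out_grad = np.array([0.5, 0.5, -1.0, 2.0])  # gradient arriving from above
print(guided_backprop_grad(inp, out_grad))  # only positions 0 and 3 survive
```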