raghakot / keras-vis

Neural network visualization toolkit for keras
https://raghakot.github.io/keras-vis
MIT License

Support for standalone activation layers (keras.layers.advanced_activations) #56

Closed: ahmedhosny closed this issue 7 years ago

ahmedhosny commented 7 years ago

I have a 3D architecture with standalone LeakyReLU layers (the activation is not applied as an argument to the preceding layers); see #53 for the network architecture. Everything works great with backprop_modifier=None for both saliency and CAM.
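
For illustration, a minimal sketch of that pattern (hypothetical shapes, not the actual #53 architecture): the LeakyReLU is its own layer rather than an `activation=` argument.

```python
from keras.models import Sequential
from keras.layers import Conv3D, Flatten, Dense
from keras.layers.advanced_activations import LeakyReLU

# Standalone activation layer: LeakyReLU is added as its own layer,
# not passed as activation= to Conv3D. Shapes here are made up.
model = Sequential()
model.add(Conv3D(8, (3, 3, 3), input_shape=(32, 32, 32, 1)))
model.add(LeakyReLU(alpha=0.3))
model.add(Flatten())
model.add(Dense(2, activation='softmax'))
```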

When trying to set the modifier to 'Relu' or 'guided', I get an error when the "modified model" is loaded back into Keras (see here): Keras is unable to deserialize the temporary h5 file.

My intuition is that the step where all activations are changed to ReLU is messing up this specific model. I can work on fixing it, but I have a couple of questions.

To recreate the error, download the temporary h5 file generated by `tensorflow_backend.py`, then try loading it like so:

```python
from keras.models import load_model

model = load_model('PATH_TO_H5')
```

raghakot commented 7 years ago

The reason for switching everything to ReLU is so that I can override ReLU's backprop step.
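
For context, a minimal sketch of that general technique in TensorFlow 1.x (illustrative, not the exact keras-vis code; the 'GuidedRelu' name is made up): register a replacement gradient and rebuild the graph under `gradient_override_map`, so every Relu op uses the modified backprop.

```python
import tensorflow as tf

# Illustrative sketch: register a custom gradient for ReLU ops.
@tf.RegisterGradient('GuidedRelu')  # name is arbitrary
def _guided_relu_grad(op, grad):
    # Guided backprop: pass gradient only where both the forward
    # activation and the incoming gradient are positive.
    gate_fwd = tf.cast(op.outputs[0] > 0., grad.dtype)
    gate_bwd = tf.cast(grad > 0., grad.dtype)
    return gate_fwd * gate_bwd * grad

graph = tf.get_default_graph()
with graph.gradient_override_map({'Relu': 'GuidedRelu'}):
    # Rebuild/reload the model here; Relu ops created inside this
    # context use the overridden gradient during backprop.
    pass
```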

I wonder why deserialization fails when we save the model with ReLU. What is your model architecture? Are you on the latest Keras?

raghakot commented 7 years ago

Ok, so the issue is that we only replace `Layer.activation` attributes or `Activation` layers. We need to add checks to replace all advanced activation layer types as well. Relevant code block: https://github.com/raghakot/keras-vis/blob/master/vis/backend/tensorflow_backend.py#L72

It would be nice to do this in a scalable manner in case new functions get added to advanced activations. One way is to inspect the advanced activations module, collect its layer classes into a private dictionary, and use that; see the sketch below. That way, when new activations are added, our code doesn't have to be updated.
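
A sketch of that idea (names are illustrative, not the actual keras-vis code):

```python
import inspect
from keras.layers import advanced_activations

# Gather every layer class defined in keras.layers.advanced_activations
# so newly added activations are picked up automatically.
_ADVANCED_ACTIVATIONS = {
    name: cls
    for name, cls in inspect.getmembers(advanced_activations, inspect.isclass)
    # Keep only classes defined in this module (skips imported base classes).
    if cls.__module__ == advanced_activations.__name__
}

def is_advanced_activation(layer):
    """True if `layer` is a standalone advanced activation layer."""
    return isinstance(layer, tuple(_ADVANCED_ACTIVATIONS.values()))
```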

raghakot commented 7 years ago

You can enumerate a module's members using the `inspect` module in Python.
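
For example (illustrative):

```python
import inspect
import keras.layers.advanced_activations as aa

# Prints the classes the module exposes, e.g. LeakyReLU, PReLU, ELU, ...
print([name for name, _ in inspect.getmembers(aa, inspect.isclass)])
```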

raghakot commented 7 years ago

Just pushed this. It should be fixed.