albermax / innvestigate

A toolbox to iNNvestigate neural networks' predictions!

'Not supposed to happen!' error when analyzing a model combined from two models #187

Closed zzangho closed 4 years ago

zzangho commented 4 years ago

Thank you for the great toolbox.

I tried to analyze a model built by combining two models, but it raised an 'Exception: This is not supposed to happen!' error.

Here is my example code:

import numpy as np
import keras
import innvestigate
from keras.applications import ResNet101

# Create the first model
inp_firstmodel = keras.layers.Input(shape=(448, 448, 3), name='input_first')
x_firstmodel = keras.layers.Conv2D(64, (3, 3))(inp_firstmodel)
x_firstmodel = keras.layers.Activation('relu')(x_firstmodel)
x_firstmodel = keras.layers.Conv2D(128, (3, 3))(x_firstmodel)
x_firstmodel = keras.layers.Activation('relu')(x_firstmodel)
model_first = keras.Model(inputs=[inp_firstmodel], outputs=x_firstmodel)

# Create the second model (a predefined model from the library)
model_second = innvestigate.utils.model_wo_softmax(
    ResNet101(input_shape=(448, 448, 128),
              classes=10,
              weights=None,
              include_top=True,
              pooling='avg'))

# Combine the two models
inp_for_combined = keras.layers.Input(shape=(448, 448, 3), name='input_combined')
res_tensor_from_model1 = model_first([inp_for_combined])
res_tensor_from_model2 = model_second(res_tensor_from_model1)
model_combined = keras.Model(inputs=inp_for_combined, outputs=[res_tensor_from_model2])

analyzer_cls = innvestigate.analyzer.GuidedBackprop(model_combined,
                                                    allow_lambda_layers=True,
                                                    neuron_selection_mode="all")
input_sample = np.random.randn(1, 448, 448, 3)
analyze = analyzer_cls.analyze(input_sample)

Could you advise how to deal with this problem? When I analyze model_second on its own, it works well, but analyzing model_combined raises the following error:

Traceback (most recent call last):

  File "<ipython-input-60-ffcee663d9bb>", line 1, in <module>
    analyze = analyzer_cls.analyze(input_sample)

  File "/home/zzangho/miniconda3/envs/tf36/lib/python3.6/site-packages/innvestigate/analyzer/base.py", line 473, in analyze
    self.create_analyzer_model()

  File "/home/zzangho/miniconda3/envs/tf36/lib/python3.6/site-packages/innvestigate/analyzer/base.py", line 411, in create_analyzer_model
    model, stop_analysis_at_tensors=stop_analysis_at_tensors)

  File "/home/zzangho/miniconda3/envs/tf36/lib/python3.6/site-packages/innvestigate/analyzer/gradient_based.py", line 256, in _create_analysis
    return super(GuidedBackprop, self)._create_analysis(*args, **kwargs)

  File "/home/zzangho/miniconda3/envs/tf36/lib/python3.6/site-packages/innvestigate/analyzer/base.py", line 711, in _create_analysis
    return_all_reversed_tensors=return_all_reversed_tensors)

  File "/home/zzangho/miniconda3/envs/tf36/lib/python3.6/site-packages/innvestigate/analyzer/base.py", line 700, in _reverse_model
    return_all_reversed_tensors=return_all_reversed_tensors)

  File "/home/zzangho/miniconda3/envs/tf36/lib/python3.6/site-packages/innvestigate/utils/keras/graph.py", line 1003, in reverse_model
    raise Exception("This is not supposed to happen!")

Exception: This is not supposed to happen!
zzangho commented 4 years ago

Workaround: this can be solved by having ResNet101 receive the last output of the first model through the input_tensor argument instead of input_shape.
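
For reference, a minimal sketch of that workaround (untested; it reuses the layer shapes from the reproduction above, and the names resnet, analyzer and analysis are placeholders). Because ResNet101 is built directly on the first model's output tensor, Keras traces the graph back to the original Input and returns a single flat model, which presumably avoids the nested-model graph that triggers the error:

import numpy as np
import keras
import innvestigate
from keras.applications import ResNet101

# First model's layers, as in the reproduction above
inp = keras.layers.Input(shape=(448, 448, 3), name='input_first')
x = keras.layers.Conv2D(64, (3, 3))(inp)
x = keras.layers.Activation('relu')(x)
x = keras.layers.Conv2D(128, (3, 3))(x)
x = keras.layers.Activation('relu')(x)

# Build ResNet101 on top of that tensor via input_tensor instead of
# instantiating it with input_shape and calling it as a nested model.
# weights=None is needed here because the incoming tensor has 128 channels.
resnet = ResNet101(input_tensor=x,
                   classes=10,
                   weights=None,
                   include_top=True,
                   pooling='avg')

# The returned model already spans from the original Input to the classifier.
model_combined = innvestigate.utils.model_wo_softmax(resnet)

analyzer = innvestigate.analyzer.GuidedBackprop(model_combined,
                                                allow_lambda_layers=True,
                                                neuron_selection_mode="all")
analysis = analyzer.analyze(np.random.randn(1, 448, 448, 3))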