mohdsinad closed this issue 3 years ago
Hi, @mohdsinad .
Unfortunately, as far as I know, Gradcam can't work with the model above, because the graph of a cascade model (one that contains another model as a layer) is internally disconnected. So you can choose one of the following options.
tf_keras_vis.scorecam.Scorecam
Scorecam can avoid this problem because it is a gradient-independent method. If you want to use Scorecam, please refer to the link below.
https://keisen.github.io/tf-keras-vis-docs/examples/attentions.html#ScoreCAM
If you want to build a non-cascade model, please implement it like below. The code below is the same as your model above, except that its graph is unified.
import tensorflow as tf

# Build on base_model's own input so the whole graph stays connected.
base_model = tf.keras.applications.EfficientNetB0(include_top=False, pooling='avg')
x = base_model.output
x = tf.keras.layers.Dropout(0.2)(x)
x = tf.keras.layers.Dense(1024, activation='relu')(x)
x = tf.keras.layers.Dense(512, activation='relu')(x)
x = tf.keras.layers.Dense(1, activation='sigmoid')(x)
flat_model = tf.keras.Model(inputs=base_model.inputs, outputs=x)
flat_model.summary()
Thanks!
Thanks, @keisen, for the prompt explanation. I will definitely try to implement your suggestions. Just a small doubt: will I be able to use already pre-trained weights in the case where the graph is unified?
P.S. I'm a beginner in using GradCAM
will I be able to use already pre-trained weights in the case where the graph is unified?
Yes, you will. Both the code you posted and the code I posted work the same way for transfer learning. If the pre-trained weights are loaded into the base_model instance, they are also reflected in the flat_model instance.
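This can be checked directly. A small sketch (using a tiny stand-in for EfficientNetB0 so it is self-contained): because flat_model is built from base_model's own input and output tensors, both models hold references to the same layer objects, so setting weights through one is visible through the other.

```python
import numpy as np
import tensorflow as tf

# Tiny stand-in for EfficientNetB0.
base_in = tf.keras.Input(shape=(8, 8, 3))
feat = tf.keras.layers.Conv2D(4, 3, name='conv')(base_in)
pooled = tf.keras.layers.GlobalAveragePooling2D()(feat)
base_model = tf.keras.Model(base_in, pooled)

# Unified model built on top of base_model's own graph.
x = tf.keras.layers.Dense(1, activation='sigmoid')(base_model.output)
flat_model = tf.keras.Model(inputs=base_model.inputs, outputs=x)

# "Load" weights into base_model's conv layer...
conv = base_model.get_layer('conv')
kernel, bias = conv.get_weights()
conv.set_weights([np.ones_like(kernel), bias])

# ...and flat_model sees them too: same layer object, same weights.
shared_kernel = flat_model.get_layer('conv').get_weights()[0]
print(np.array_equal(shared_kernel, np.ones_like(kernel)))  # True
```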
Thank you very much for the clarification, and for helping me out.
@mohdsinad , you're welcome. I hope you like tf-keras-vis! Please star this repository if you do.
Hi, I'm getting an error when I try to get the Grad-CAM visualizations for a custom model, as given below.
Then when I try to call Gradcam, I get:
ValueError: Graph disconnected: cannot obtain value for tensor Tensor("input_1:0", shape=(None, 224, 224, 3), dtype=float32) at layer "rescaling". The following previous layers were accessed without issue: []
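For reference, this error can be reproduced with a minimal cascade model (a sketch; the layer sizes are arbitrary). Building a sub-model from the outer input to a tensor inside the nested model, which is essentially what Gradcam has to do, fails because that tensor belongs to the nested model's own graph, which starts at its own Input:

```python
import tensorflow as tf

# A nested model with its own Input (like EfficientNetB0 used as a layer).
inner_in = tf.keras.Input(shape=(8, 8, 3))
feat = tf.keras.layers.Conv2D(4, 3, name='conv')(inner_in)
inner = tf.keras.Model(inner_in, tf.keras.layers.GlobalAveragePooling2D()(feat))

# Cascade model: the whole inner model is called as a single layer.
outer_in = tf.keras.Input(shape=(8, 8, 3))
out = tf.keras.layers.Dense(1, activation='sigmoid')(inner(outer_in))
cascade = tf.keras.Model(outer_in, out)

# `feat` depends on `inner_in`, not `outer_in`, so the graph from
# cascade's input to the inner conv output is disconnected.
err = None
try:
    tf.keras.Model(cascade.inputs, feat)
except ValueError as e:
    err = e
print(type(err).__name__)  # the "Graph disconnected" ValueError
```

Unifying the graph as shown earlier in the thread removes the nested Input, so this sub-model can be built and Gradcam works.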