hrishikeshv opened this issue 6 years ago (status: Open)
I'm experiencing the same issue. Found this as well:
I am also getting the same error; can this issue be fixed?
I made it work with the following piece of code from https://github.com/tensorflow/tensorflow/issues/783:
```python
def _compute_gradients(tensor, var_list):
    grads = tf.gradients(tensor, var_list)
    return [grad if grad is not None else tf.zeros_like(var)
            for var, grad in zip(var_list, grads)]
```
and transforming

```python
grads = normalize(K.gradients(loss, conv_output)[0])
```

into

```python
grads = normalize(_compute_gradients(loss, [conv_output])[0])
```
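In case it helps, here is a minimal sketch of what the helper does (assuming TF 1.x graph mode, as in this repo's script; the placeholders are illustrative): where `tf.gradients` would return `None` for a tensor the loss does not depend on, it substitutes a zeros tensor of the same shape, so downstream code like `normalize(...)` no longer crashes on a `None`.

```python
import tensorflow as tf  # TF 1.x graph-mode API assumed

def _compute_gradients(tensor, var_list):
    grads = tf.gradients(tensor, var_list)
    return [grad if grad is not None else tf.zeros_like(var)
            for var, grad in zip(var_list, grads)]

a = tf.placeholder(tf.float32, shape=(3,))
b = tf.placeholder(tf.float32, shape=(3,))
loss = tf.reduce_sum(a * 2.0)            # the loss depends only on `a`

grads = _compute_gradients(loss, [a, b])
# grads[0] is d(loss)/d(a); grads[1] is tf.zeros_like(b) instead of None
```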
osimeoni's answer worked out for me
I get another error when I use:

```python
def _compute_gradients(tensor, var_list):
    grads = tf.gradients(tensor, var_list)
    return [grad if grad is not None else tf.zeros_like(var)
            for var, grad in zip(var_list, grads)]
```
The error is:

```
zip argument #1 must support iteration
```
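This error most likely means `conv_output` was passed to the helper as a single tensor rather than a list, so `zip(var_list, grads)` cannot iterate over it. A minimal sketch of the call that avoids it, following the wrapping shown in the earlier comment:

```python
# Fails: conv_output is a single tensor, so zip() cannot iterate over var_list
# grads = normalize(_compute_gradients(loss, conv_output)[0])

# Works: var_list is an actual list
grads = normalize(_compute_gradients(loss, [conv_output])[0])
```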
I finally found the key problem: K.gradients returns None because it encounters an op that cannot be differentiated. I suspect this happens because a Sequential was used to wrap VGG16, which is a Model type. Here is the code in the grad_cam function after my changes:
```python
nb_classes = 1000
target_layer = lambda x: target_category_loss(x, category_index, nb_classes)
last = Lambda(target_layer, output_shape=target_category_loss_output_shape)(input_model.output)
model = Model(inputs=input_model.input, outputs=last)

loss = K.sum(model.layers[-1].output)
# loss = model.layers[-1].output
conv_output = [l for l in model.layers if l.name == layer_name][0].output  # compare names with ==, not `is`
grads = normalize(K.gradients(loss, conv_output)[0])
# grads = normalize(_compute_gradients(loss, conv_output)[0])
gradient_function = K.function([model.inputs[0]], [conv_output, grads])
```
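For completeness, a rough usage sketch of the `gradient_function` built above; the `image` variable and the post-processing below are illustrative (a preprocessed batch of shape (1, 224, 224, 3) is assumed), not necessarily the exact code from this repo:

```python
import numpy as np

# `image` is assumed to be a preprocessed batch of shape (1, 224, 224, 3)
output, grads_val = gradient_function([image])
output, grads_val = output[0, :], grads_val[0, :, :, :]

# Grad-CAM: average the gradients over the spatial dimensions to get one
# weight per feature map, then take a weighted sum of the feature maps.
weights = np.mean(grads_val, axis=(0, 1))
cam = np.maximum(np.dot(output, weights), 0)   # ReLU keeps only positive influence
```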
For everyone who is facing the same issue: I forked this repository and fixed all errors I was facing.
https://github.com/jacobgil/keras-grad-cam/pull/27 https://github.com/PowerOfCreation/keras-grad-cam
@PowerOfCreation Thanks, it really helps. It works!
For anyone who still experiences this error with TensorFlow 2.x: the default argument of tf.gradients (unconnected_gradients='none') causes it. Setting it to 'zero' should solve your problem.
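A minimal sketch of that tip (assuming TF 2.x run in v1-style graph mode, since `tf.gradients` is not supported eagerly; the placeholders are illustrative):

```python
import tensorflow as tf

tf.compat.v1.disable_eager_execution()   # tf.gradients needs graph mode in TF 2.x

x = tf.compat.v1.placeholder(tf.float32, shape=(None, 8))
y = tf.compat.v1.placeholder(tf.float32, shape=(None, 8))   # not connected to the loss
loss = tf.reduce_sum(x * 3.0)

# Default (unconnected_gradients='none'): the gradient w.r.t. y is None.
print(tf.gradients(loss, [x, y]))

# With 'zero' (or tf.UnconnectedGradients.ZERO), a zeros tensor is returned instead.
print(tf.gradients(loss, [x, y], unconnected_gradients=tf.UnconnectedGradients.ZERO))
```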
I am getting the following error when I am trying to run your code on one of the example images. Is this due to some missing package? Thanks!