keisen / tf-keras-vis

Neural network visualization toolkit for tf.keras
https://keisen.github.io/tf-keras-vis-docs/
MIT License

'NoneType' object has no attribute 'ndim' #100

Open · n-garc opened this issue 1 year ago

n-garc commented 1 year ago

I have a TensorFlow 2.10 network with 3D convolutions (Conv3D) and multiple regression outputs (model architecture at the end). I am attempting to use this package to generate GradCAM++ heatmaps, and I am getting the following error:

---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
Cell In[204], line 9
      7 scores = [cscore, InactiveScore(), InactiveScore()]
      8 scores = [cscore, cscore, cscore]
----> 9 cam = gradcam(scores, x, penultimate_layer='conv3d')
     11 # Render
     12 f, ax = plt.subplots(nrows=1, ncols=3, figsize=(12, 4))

Cell In[202], line 56, in GradcamPlusPlus2.__call__(self, score, seed_input, penultimate_layer, seek_penultimate_conv_layer, gradient_modifier, activation_modifier, training, expand_cam, normalize_cam, unconnected_gradients)
     53     score_values = [tf.cast(v, dtype=model.variable_dtype) for v in score_values]
     55 score_values = sum(tf.math.exp(o) for o in score_values)
---> 56 score_values = tf.reshape(score_values, score_values.shape + (1, ) * (grads.ndim - 1))
     58 if gradient_modifier is not None:
     59     grads = gradient_modifier(grads)

AttributeError: 'NoneType' object has no attribute 'ndim'

I am using a custom Score subclass, RegressionScore:

import tensorflow as tf

from tf_keras_vis.utils.scores import Score


class RegressionScore(Score):
    """A score function that collects scores from a regression model output."""

    def __init__(self, target_values) -> None:
        """
        Args:
            target_values: A list of ints/floats.

        Raises:
            ValueError: When target_values is None or an empty list.
        """
        super().__init__('RegressionScore')
        self.target_values = target_values  # listify(target_values, return_empty_list_if_none=False)
        if None in self.target_values:
            raise ValueError(f"Can't accept None. target_values: {target_values}")
        if len(self.target_values) == 0:
            raise ValueError(f"`target_values` is required. target_values: {target_values}")

    def __call__(self, output) -> tf.Tensor:
        if output.ndim < 2:
            raise ValueError("`output` ndim must be 2 or more (batch_size, ..., channels), "
                             f"but was {output.ndim}")
        if output.shape[-1] <= max(self.target_values):
            raise ValueError(
                f"Invalid index value. target_values: {self.target_values}, output.shape: {output.shape}")
        # Convert to a tensor so it broadcasts against the model output.
        target_values = tf.convert_to_tensor(self.target_values, dtype=output.dtype)
        print(target_values.shape)
        print(output[0].shape)

        # Score is the reciprocal of the absolute error between the target
        # values and the first sample's outputs.
        score = tf.math.abs(1.0 / (target_values - output[0]))
        print(score)
        return score
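
For reference, this is how the score is wired up for the multi-output model (the target value below is just a placeholder):

from tf_keras_vis.gradcam_plus_plus import GradcamPlusPlus
from tf_keras_vis.utils.scores import InactiveScore

# Hypothetical target value; one RegressionScore for the output of interest,
# with InactiveScore silencing the other two regression outputs.
cscore = RegressionScore([0.5])
scores = [cscore, InactiveScore(), InactiveScore()]

gradcam = GradcamPlusPlus(model)
cam = gradcam(scores, x, penultimate_layer='conv3d')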

The offending line is grads = tape.gradient(score_values, penultimate_output, unconnected_gradients=unconnected_gradients): the gradients come back as None, which is why the later grads.ndim access raises AttributeError. I have confirmed that score_values is a list of positive-valued tensors and that penultimate_output is the output of the last conv layer.
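
Here is a minimal plain-TensorFlow illustration (nothing tf-keras-vis specific) of how tape.gradient silently returns None when the target has no path back to the source:

import tensorflow as tf

a = tf.constant([[1.0, 2.0, 3.0]])

with tf.GradientTape(persistent=True) as tape:
    tape.watch(a)
    connected = tf.reduce_sum(a ** 2)     # computed from a inside the tape
    unconnected = tf.constant(5.0) * 2.0  # no path back to a

print(tape.gradient(connected, a))    # tf.Tensor([[2. 4. 6.]], shape=(1, 3), dtype=float32)
print(tape.gradient(unconnected, a))  # None -> this is what later fails on .ndim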

Even using CategoricalScore on 5 sample videos, I get the same error:

from tf_keras_vis.gradcam_plus_plus import GradcamPlusPlus
from tf_keras_vis.utils.scores import CategoricalScore

x = data[0:5, :, :, :, :]
gradcam = GradcamPlusPlus(model)
cscore = CategoricalScore([0, 0, 0, 0, 0])
scores = [cscore, cscore, cscore]
cam = gradcam(scores, x, penultimate_layer='conv3d')

n-garc commented 1 year ago

According to https://www.tensorflow.org/api_docs/python/tf/UnconnectedGradients, this is intended behavior: with the default UnconnectedGradients.NONE, tape.gradient returns None to indicate that the source and target are unconnected (UnconnectedGradients.ZERO would return zeros instead). Since tape.gradient itself never raises here, it might be useful for the library to check for a None result and raise an exception carrying this information, instead of failing later with the AttributeError above.
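
Something along these lines inside GradcamPlusPlus.__call__ (just a sketch of the check I mean, not the library's actual code) would surface the problem with a clearer message:

grads = tape.gradient(score_values,
                      penultimate_output,
                      unconnected_gradients=unconnected_gradients)
# tape.gradient does not raise for unconnected tensors; with the default
# UnconnectedGradients.NONE it simply returns None, so an explicit check is
# needed rather than a try/except around the call itself.
if grads is None:
    raise ValueError(
        "Gradients of the score values with respect to the penultimate layer "
        "output are None: the scores appear to be unconnected to the "
        "penultimate layer in the computation graph.")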

I'm not sure why my source and target are unconnected; I'm going to keep digging.
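
In case it helps anyone else hitting this, one way to check connectivity outside of tf-keras-vis (a sketch assuming a functional model, reusing model, x, and the 'conv3d' layer name from the snippet above) is:

import tensorflow as tf

# Probe model that also exposes the conv layer's output, so gradients of the
# summed model outputs can be taken with respect to it.
conv_output = model.get_layer('conv3d').output
probe = tf.keras.Model(inputs=model.inputs,
                       outputs=[conv_output] + list(model.outputs))

with tf.GradientTape() as tape:
    conv_out, *outs = probe(x, training=False)
    target = tf.add_n([tf.reduce_sum(o) for o in outs])

print(tape.gradient(target, conv_out))  # None here would mean the model outputs
                                        # really are unconnected to 'conv3d'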