nengo / keras-lmu

Keras implementation of Legendre Memory Units
https://www.nengo.ai/keras-lmu/

Opting in to supports_masking #36

Open arvoelke opened 3 years ago

arvoelke commented 3 years ago

https://www.tensorflow.org/guide/keras/masking_and_padding#opting-in_to_mask_propagation_on_compatible_layers:

Most layers don't modify the time dimension, so don't need to modify the current mask. However, they may still want to be able to propagate the current mask, unchanged, to the next layer. This is an opt-in behavior. By default, a custom layer will destroy the current mask (since the framework has no way to tell whether propagating the mask is safe to do).

If you have a custom layer that does not modify the time dimension, and if you want it to be able to propagate the current input mask, you should set self.supports_masking = True in the layer constructor. In this case, the default behavior of compute_mask() is to just pass the current mask through.

Here's an example of a layer that is whitelisted for mask propagation:

import tensorflow as tf
from tensorflow import keras


class MyActivation(keras.layers.Layer):
    def __init__(self, **kwargs):
        super(MyActivation, self).__init__(**kwargs)
        # Signal that the layer is safe for mask propagation
        self.supports_masking = True

    def call(self, inputs):
        return tf.nn.relu(inputs)

I believe mask propagation is safe for the LMUCell, since it does not modify the time dimension, so we should be able to opt in by setting self.supports_masking = True in its constructor; a sketch of the change is below.
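
For concreteness, here is a minimal sketch of what the opt-in could look like. It subclasses keras_lmu.LMUCell only to show the one-line change in isolation; the actual fix would set the flag inside LMUCell.__init__ itself. The constructor arguments in the usage snippet (memory_d, order, theta, hidden_cell) are illustrative values chosen for this example, not a recommendation.

from tensorflow import keras

import keras_lmu


class MaskedLMUCell(keras_lmu.LMUCell):
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        # Signal that the cell is safe for mask propagation
        self.supports_masking = True


# Illustrative usage: a Masking layer produces a mask that should then be
# able to propagate through the RNN wrapping the opted-in cell.
inputs = keras.Input(shape=(None, 8))
masked = keras.layers.Masking(mask_value=0.0)(inputs)
cell = MaskedLMUCell(
    memory_d=1, order=4, theta=8, hidden_cell=keras.layers.SimpleRNNCell(16)
)
outputs = keras.layers.RNN(cell)(masked)
model = keras.Model(inputs, outputs)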