From the Keras masking and padding guide (https://www.tensorflow.org/guide/keras/masking_and_padding#opting-in_to_mask_propagation_on_compatible_layers):

Most layers don't modify the time dimension, so they don't need to modify the current mask. However, they may still want to be able to propagate the current mask, unchanged, to the next layer. This is opt-in behavior. By default, a custom layer will destroy the current mask, since the framework has no way to tell whether propagating the mask is safe to do.

If you have a custom layer that does not modify the time dimension, and you want it to be able to propagate the current input mask, set `self.supports_masking = True` in the layer constructor. In that case, the default behavior of `compute_mask()` is to just pass the current mask through.
Here's an example of a layer that is whitelisted for mask propagation:
```python
import tensorflow as tf
from tensorflow import keras


class MyActivation(keras.layers.Layer):
    def __init__(self, **kwargs):
        super(MyActivation, self).__init__(**kwargs)
        # Signal that the layer is safe for mask propagation
        self.supports_masking = True

    def call(self, inputs):
        return tf.nn.relu(inputs)
```
I believe mask propagation is safe for the LMUCell, as it does not modify the time dimension, so we should be able to opt in by setting `self.supports_masking = True` in its constructor.
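For reference, here is a minimal sketch of what that opt-in could look like, written as a subclass so it can be tried without patching keras-lmu itself. The `MaskedLMUCell` name is hypothetical, and the constructor arguments are passed through untouched so no assumptions are made about `LMUCell`'s exact signature:

```python
import keras_lmu


class MaskedLMUCell(keras_lmu.LMUCell):
    """LMUCell that opts in to Keras mask propagation.

    Illustrative wrapper only; the eventual fix would simply set the flag
    inside LMUCell's own constructor.
    """

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        # Signal that the cell is safe for mask propagation (see the guide
        # excerpt above); the cell does not alter the time dimension, so
        # passing the mask through unchanged should be valid.
        self.supports_masking = True
```

Note that this only flags the cell as safe; the actual masking behavior still comes from Keras's default `compute_mask()`, which passes the incoming mask through unchanged.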