I'm implementing a custom `tf.keras.layers.Layer` that needs to support masking.
Consider the following scenario:

```python
inputs = tf.keras.Input(shape=(None,), dtype="int32")  # (batch, time) token ids
embedded = tf.keras.layers.Embedding(input_dim=vocab_size + 1,
                                     output_dim=n_dims,
                                     mask_zero=True)(inputs)
x = MyCustomKerasLayers()(embedded)
```
Now, per the documentation:

> **mask_zero**: Whether or not the input value 0 is a special "padding" value that should be masked out. This is useful when using recurrent layers which may take variable length input. If this is `True`, then all subsequent layers in the model need to support masking or an exception will be raised. If `mask_zero` is set to `True`, as a consequence, index 0 cannot be used in the vocabulary (`input_dim` should equal size of vocabulary + 1).
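To make the quoted behavior concrete, here is a quick way to inspect the mask the embedding produces (the toy vocabulary size and ids are just for illustration):

```python
import tensorflow as tf

emb = tf.keras.layers.Embedding(input_dim=11, output_dim=4, mask_zero=True)
ids = tf.constant([[3, 7, 0, 0]])   # (batch=1, time=4); trailing 0s are padding
out = emb(ids)                      # (1, 4, 4), i.e. (batch, time, output_dim)
print(emb.compute_mask(ids))        # [[ True  True False False]], shape (1, 4), dtype bool
```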
Specifically:

1. How do I support masking in my custom layer?
2. How do I access the mask from the previous layer?
3. Assuming an input of shape `(batch, time, channels)` or `(batch, time)`, would the masks look different? What will be their shapes?
4. How do I pass the mask on to the next layer?
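For reference, here is a minimal sketch of what I think is involved, based on my reading of the `tf.keras.layers.Layer` API (`supports_masking`, `compute_mask`, and the `mask` argument to `call`); the layer itself is a placeholder that just passes its input through:

```python
import tensorflow as tf

class MyCustomKerasLayers(tf.keras.layers.Layer):
    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        # Lets Keras pass the incoming mask to call() and, via the default
        # compute_mask, propagate it unchanged to the next layer.
        self.supports_masking = True

    def call(self, inputs, mask=None):
        # `mask` is the boolean mask from the previous layer, e.g. from
        # Embedding(mask_zero=True); its shape is (batch, time) regardless
        # of whether `inputs` is (batch, time) or (batch, time, channels).
        if mask is not None:
            m = tf.cast(mask, inputs.dtype)
            if inputs.shape.rank == 3:       # (batch, time, channels)
                m = m[..., tf.newaxis]       # broadcast over channels
            inputs = inputs * m              # one possible use: zero padding
        return inputs

    def compute_mask(self, inputs, mask=None):
        # Only needed when the layer alters the time dimension; returning
        # the mask unchanged forwards it to subsequent layers.
        return mask
```

Is setting `supports_masking = True` enough here, or do I also need the explicit `compute_mask` override even when the mask is unchanged?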