Status: Open. fsbashiri opened this issue 1 year ago.
@fsbashiri thanks for reporting! I propose an explanation. I'm not 100% sure, so feel free to challenge me.

My clue is that the TCN works a bit like an RNN, even though it has no internal state the way an LSTM does. The last outputs depend not only on the end of the sequence but also on its beginning. With `padding='post'`, the end is padded with zeros, but the last outputs will still be non-zero because they also depend on the beginning of the sequence, which contains real (non-zero) values. With `padding='pre'`, the beginning is padded with zeros, so the first outputs will be 0.

For your second point about `_keras_mask`: I guess we should not try to access it directly, but it's strange that it does not exist. Does it exist for other Keras layers that support masking? Maybe we should set it somewhere in the layer, since it's not inherited from the `Layer` object. I don't know.
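The pre/post padding behaviour described above can be sketched with a toy causal convolution in plain NumPy (a stand-in for the TCN's dilated causal convolutions, not the library's actual implementation): with `'pre'` padding the padded steps produce zeros, while with `'post'` padding the padded steps still receive non-zero values from earlier real time steps.

```python
import numpy as np

def causal_conv(x, w):
    # y[t] = sum_k w[k] * x[t-k], with zeros assumed before t=0 (causal padding)
    k = len(w)
    xp = np.concatenate([np.zeros(k - 1), x])
    return np.array([np.dot(w[::-1], xp[t:t + k]) for t in range(len(x))])

w = np.array([1.0, 1.0])                 # toy kernel standing in for TCN filters
post = np.array([1., 2., 3., 0., 0.])    # sequence [1, 2, 3] padded 'post'
pre = np.array([0., 0., 1., 2., 3.])     # same sequence padded 'pre'

print(causal_conv(post, w))  # [1. 3. 5. 3. 0.] -> a padded step (index 3) is non-zero
print(causal_conv(pre, w))   # [0. 0. 1. 3. 5.] -> padded steps stay zero
```

The non-zero value at index 3 of the post-padded output comes from the real value 3 at index 2 leaking into the padded region through the kernel's receptive field, which is exactly the effect described above.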
This problem is also mentioned in issue #89, where the author states that Conv1D in Keras lacks support for the Masking layer.
**Describe the bug**
In my project, I am using TCN for sequence-to-sequence analysis of time series data with variable lengths. I have defined a subclass of the `Sequence` class that pads each batch of data to its maximum sequence length (similar to what is suggested here). As for the model, I use a `Masking` layer to compute and pass a mask to the TCN (as suggested in issue #234). Supposedly, layers that support masking will automatically propagate the mask to the next layer. In the simplest form of my model, I have a `Masking` layer, followed by a TCN, and a `Dense` layer with 1 unit.
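As a minimal illustration of the masking mechanics involved (this is not the reporter's original snippet): in eager mode, Keras's `Masking` layer attaches a `_keras_mask` tensor to its output, and whether a downstream layer forwards that mask depends on the layer's `supports_masking` flag.

```python
import numpy as np
import tensorflow as tf

# One 'post'-padded sequence: real values [1, 2, 3], then zero padding.
batch = np.array([[[1.], [2.], [3.], [0.], [0.]]], dtype='float32')

# Masking computes a boolean mask from mask_value and attaches it to its output.
masked = tf.keras.layers.Masking(mask_value=0.0)(batch)
print(hasattr(masked, '_keras_mask'))   # True
print(masked._keras_mask.numpy())       # [[ True  True  True False False]]

# Whether a convolutional layer forwards the mask depends on this flag,
# which has varied across TensorFlow versions.
conv = tf.keras.layers.Conv1D(4, 2, padding='causal')
print(conv.supports_masking)
```

If `supports_masking` is `False` for the convolutional layers inside the TCN, the mask is dropped rather than propagated, which would explain why the TCN output has no `_keras_mask` attribute.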
Here are the two issues I've got:
1. With `padding='post'`, the TCN outputs at the padded time steps are not zero, so the mask does not appear to be applied.
2. The output of the TCN layer has no `_keras_mask` attribute.
**Paste a snippet**
Please see the following simple code:
The output of the code:
**Dependencies**
I am using: keras 2.4.3, keras-tcn 3.1.1, tensorflow-gpu 2.3.1