philipperemy / keras-tcn

Keras Temporal Convolutional Network.
MIT License

Question regarding return_sequences=True #260

Closed: ohilo12 closed this issue 2 weeks ago

ohilo12 commented 1 month ago

Hello, this is not a bug report, but a question about your implementation.

I have implemented a 1-D CNN following the example in the figure. The transition from the CNN layers to the dense layers is done with global max-pooling (not flattening, since the time series can have different lengths).

1st question: How would I realize this structure with the TCN? Would I have to set return_sequences=False, or does the following work?

```python
from tensorflow.keras import Sequential
from tensorflow.keras.layers import GlobalMaxPooling1D
from tcn import TCN

k = 5  # number of input features per timestep

model = Sequential()
model.add(TCN(nb_filters=4, kernel_size=2, nb_stacks=1, dilations=[1, 2, 4],
              padding="causal", use_skip_connections=True, dropout_rate=0.0,
              return_sequences=True, activation="relu", input_shape=(None, k)))
model.add(GlobalMaxPooling1D())
```

2nd question: If I understand correctly, with return_sequences=False I would only get the bottom elements (marked blue in the figure). Is that correct?

Thank you in advance for your explanation.

[Figure: the 1-D CNN architecture described above, with the outputs of the final timestep marked in blue]

philipperemy commented 3 weeks ago

Hey,

Your data is a 3D tensor with shape (batch_size, timesteps, input_dim) where input_dim=K, timesteps=N.

If you set return_sequences=True, the TCN will output a tensor of shape (batch_size, timesteps, features).

If you use GlobalMaxPooling1D, it will compute the maximum over the time axis and squeeze it, outputting a tensor of shape (batch_size, features).

If you set return_sequences=False, the TCN will output (batch_size, features) directly, and you don't need the 1-D max pooling.
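For illustration, here is a minimal sketch contrasting the two settings (not from the original thread; the filter count and the feature dimension K are placeholder values):

```python
import numpy as np
from tensorflow.keras import Input, Model
from tensorflow.keras.layers import GlobalMaxPooling1D
from tcn import TCN

K = 5  # placeholder number of input features per timestep

inp = Input(shape=(None, K))  # (batch_size, timesteps, K), variable length

# Variant A: keep the full sequence, then max-pool over the time axis.
seq = TCN(nb_filters=4, return_sequences=True)(inp)    # (batch_size, timesteps, 4)
pooled = GlobalMaxPooling1D()(seq)                     # (batch_size, 4)

# Variant B: let the TCN return only the last timestep's features.
last = TCN(nb_filters=4, return_sequences=False)(inp)  # (batch_size, 4)

x = np.random.rand(2, 10, K).astype("float32")  # batch of 2, 10 timesteps
print(Model(inp, pooled)(x).shape)  # (2, 4)
print(Model(inp, last)(x).shape)    # (2, 4)
```

Both variants end with a (batch_size, features) tensor; they differ only in whether the features summarize all timesteps (max pooling) or just the last one.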

First question

Correct. The architecture you posted looks good to me.

Second question

Yes, correct. You will get only the blue ones: the TCN returns the feature vector of the last timestep, after processing all the previous steps.
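To make that concrete, a small sketch (my illustration, not from the thread): with return_sequences=True you can recover the "blue" vector yourself by slicing the last timestep, which is the vector the TCN returns when return_sequences=False.

```python
import numpy as np
from tensorflow.keras import Input, Model
from tcn import TCN

inp = Input(shape=(None, 5))  # 5 is a placeholder feature dimension
full = Model(inp, TCN(nb_filters=4, return_sequences=True)(inp))

x = np.random.rand(1, 10, 5).astype("float32")
seq = full(x).numpy()  # (1, 10, 4): one feature vector per timestep
blue = seq[:, -1, :]   # (1, 4): the last ("blue") feature vector,
                       # i.e. what return_sequences=False would return
print(blue.shape)
```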

ohilo12 commented 2 weeks ago

OK, good, then I understood it correctly. I want to take all timesteps into account, not just the blue ones, so I prefer the maximum over the time axis to using only the last (blue) output. Everything is clear now, thanks for your answer!

philipperemy commented 2 weeks ago

You're welcome!