JulesBelveze / time-series-autoencoder

PyTorch Dual-Attention LSTM-Autoencoder For Multivariate Time Series
Apache License 2.0
630 stars · 62 forks

Encoded state #14

Closed TDominiak closed 1 year ago

TDominiak commented 3 years ago

Hi,

If I understand correctly, you create y_hist in order to train with teacher forcing, providing the true target values to the Decoder. However, you only pass y_hist as an argument to the Decoder's forward function. I don't understand why you don't also pass the encoded vector, i.e. the final hidden state produced by the encoder part of the model.

Thanks
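
For context, a minimal sketch of what the question is asking for (the `Encoder`/`Decoder` classes and shapes below are illustrative, not the repository's actual code): the decoder is driven by the lagged true targets `y_hist` for teacher forcing, but is also initialized with the encoder's final `(h, c)` state instead of zeros, so the encoded vector actually reaches the decoder.

```python
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Encodes the multivariate input series into a final (h, c) state."""
    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.lstm = nn.LSTM(input_size, hidden_size, batch_first=True)

    def forward(self, x):
        _, (h_n, c_n) = self.lstm(x)   # h_n, c_n: (1, batch, hidden)
        return h_n, c_n                # the "encoded vector"

class Decoder(nn.Module):
    """Reconstructs the target series with teacher forcing."""
    def __init__(self, output_size, hidden_size):
        super().__init__()
        self.lstm = nn.LSTM(output_size, hidden_size, batch_first=True)
        self.out = nn.Linear(hidden_size, output_size)

    def forward(self, y_hist, encoder_state):
        # Teacher forcing: feed the true lagged targets y_hist, but start
        # the decoder LSTM from the encoder's final state, not from zeros.
        dec_out, _ = self.lstm(y_hist, encoder_state)
        return self.out(dec_out)

batch, seq_len, n_feats, hidden = 4, 10, 3, 16
enc, dec = Encoder(n_feats, hidden), Decoder(1, hidden)
x = torch.randn(batch, seq_len, n_feats)
y_hist = torch.randn(batch, seq_len, 1)   # lagged true targets
y_pred = dec(y_hist, enc(x))
print(y_pred.shape)                        # torch.Size([4, 10, 1])
```

Without the `encoder_state` argument, the decoder's initial state defaults to zeros and the encoding is effectively discarded, which is the bug being reported.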

JulesBelveze commented 3 years ago

Hey @TDominiak, thanks for pointing that out, that does not seem right to me either. Tbh I haven't tried the model without the attention mechanisms! Would you like to open a PR that fixes the case of a regular decoder? 😄 (shouldn't be too much work)

Otherwise I'll try to find some time soon!

TDominiak commented 3 years ago

Yes, sure. After I finish analyzing the solution with attention, I will take care of this bug.

JulesBelveze commented 3 years ago

Hey @TDominiak, just checking in to see if you need some help?