jiaxiang-cheng / PyTorch-LSTM-for-RUL-Prediction

PyTorch implementation of remaining useful life prediction with long-short term memories (LSTM), performing on NASA C-MAPSS data sets. Partially inspired by Zheng, S., Ristovski, K., Farahat, A., & Gupta, C. (2017, June). Long short-term memory network for remaining useful life estimation.
Apache License 2.0

about the form of input data to the model #2

Open wowSUNBOY opened 1 year ago

wowSUNBOY commented 1 year ago

Thanks for your code, which is really useful.

I have a question:

In the model definition, the input shape should be (batch_size, seq_len, input_size), since batch_first=True.

But when running the code, the trajectory data are sent into the LSTM in the shape (seq_len, batch_size, input_size), which is the opposite.

For example, for the 1st trajectory, whose length is 192, the data x sent to the LSTM has shape (192, 1, 15).

Can you help me with this problem? I would really appreciate it.
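To illustrate the point being asked about, here is a minimal sketch (hypothetical sizes, not the repository's exact code) of how PyTorch interprets the two shapes when batch_first=True:

```python
import torch
import torch.nn as nn

# LSTM configured as in the question: input_size=15, batch_first=True
lstm = nn.LSTM(input_size=15, hidden_size=32, num_layers=1, batch_first=True)

# Shape (192, 1, 15): with batch_first=True, PyTorch reads this as a
# batch of 192 independent sequences, each of length 1.
x_as_fed = torch.randn(192, 1, 15)
out1, (hn1, cn1) = lstm(x_as_fed)
# out1 has shape (192, 1, 32); hn1 has shape (1, 192, 32)

# Shape (1, 192, 15): one sequence of length 192, which is presumably
# the intended interpretation of a single trajectory.
x_intended = torch.randn(1, 192, 15)
out2, (hn2, cn2) = lstm(x_intended)
# out2 has shape (1, 192, 32); hn2 has shape (1, 1, 32)
```

In the first case the recurrence never carries state across the 192 time steps, because each step is treated as its own length-1 sequence; in the second case the hidden state is propagated through the whole trajectory.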

jiaxiang-cheng commented 1 year ago

> Thanks for your code, which is really useful.
>
> I have a question:
>
> In the model definition, the input shape should be (batch_size, seq_len, input_size), since batch_first=True.
>
> But when running the code, the trajectory data are sent into the LSTM in the shape (seq_len, batch_size, input_size), which is the opposite.
>
> For example, for the 1st trajectory, whose length is 192, the data x sent to the LSTM has shape (192, 1, 15).
>
> Can you help me with this problem? I would really appreciate it.

Can you point out the exact locations of the issue in the code?

zl13133581232 commented 9 months ago

Hello, may I ask: is there no split into training, test, and validation sets here?

jiaxiang-cheng commented 9 months ago

> Hello, may I ask: is there no split into training, test, and validation sets here?

Hello, the training and test sets are separate. You can carve a validation set out of the training set yourself, based on how training goes.
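For anyone looking for a concrete starting point, one common approach for C-MAPSS is to hold out a fraction of the training engine units as a validation set (splitting by unit, not by row, so trajectories stay intact). This is a hedged sketch, not the repository's code; the unit count of 100 matches FD001 but is an assumption here:

```python
import random

# Hypothetical: FD001 training set contains engine units numbered 1..100.
unit_ids = list(range(1, 101))

random.seed(0)            # fixed seed so the split is reproducible
random.shuffle(unit_ids)

n_val = int(0.2 * len(unit_ids))   # hold out 20% of units for validation
val_units = set(unit_ids[:n_val])
train_units = set(unit_ids[n_val:])

# Rows of the training dataframe would then be routed by their unit ID,
# e.g. df[df["unit"].isin(train_units)] vs. df[df["unit"].isin(val_units)].
```

Splitting by unit avoids leakage: rows from one engine's trajectory never appear in both training and validation.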

mmm656 commented 1 month ago

Hello, may I ask why, in the forward method of class lstm1, hn_o + hn_1 is used as the input to the ReLU layer, rather than hn_1 or output directly? [Screenshot 2024-08-05 165646]
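For context, the pattern being asked about can be sketched roughly as follows. This is a hypothetical reconstruction (class and variable names taken from the question, sizes assumed), not the repository's exact code: with a two-layer LSTM, hn holds the final hidden state of each layer, and the two are summed element-wise before the ReLU, similar to a residual merge of the layers' representations:

```python
import torch
import torch.nn as nn

class LSTM1(nn.Module):
    """Sketch of the questioned pattern: ReLU over hn_o + hn_1."""
    def __init__(self, input_size=15, hidden_size=32):
        super().__init__()
        self.lstm = nn.LSTM(input_size, hidden_size,
                            num_layers=2, batch_first=True)
        self.relu = nn.ReLU()
        self.fc = nn.Linear(hidden_size, 1)   # RUL regression head

    def forward(self, x):
        # hn has shape (num_layers, batch, hidden_size)
        out, (hn, cn) = self.lstm(x)
        hn_o, hn_1 = hn[0], hn[1]     # final hidden states of layers 0 and 1
        h = self.relu(hn_o + hn_1)    # element-wise sum of the two layers
        return self.fc(h)             # shape (batch, 1)

model = LSTM1()
y = model(torch.randn(4, 10, 15))     # batch of 4 sequences, length 10
```

Using hn_1 alone would discard the first layer's final state; the sum lets both layers contribute directly to the regression head. Whether that is the author's actual motivation is not stated in the thread.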