jiaxiang-cheng / PyTorch-LSTM-for-RUL-Prediction

PyTorch implementation of remaining useful life (RUL) prediction with long short-term memory (LSTM) networks, evaluated on the NASA C-MAPSS data sets. Partially inspired by Zheng, S., Ristovski, K., Farahat, A., & Gupta, C. (2017, June). Long short-term memory network for remaining useful life estimation.
Apache License 2.0

Hello, why does the forward of class lstm1 use hn_o + hn_1 as the input to the ReLU layer, rather than hn_1 or output directly? #3

Open mmm656 opened 3 months ago

mmm656 commented 3 months ago
          Hello, why does the forward of class lstm1 use hn_o + hn_1 as the input to the ReLU layer, rather than hn_1 or output directly?

Screenshot 2024-08-05 165646 (attached image, not reproduced in this text export)

Originally posted by @mmm656 in https://github.com/jiaxiang-cheng/PyTorch-LSTM-for-RUL-Prediction/issues/2#issuecomment-2268539314
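Since the attached screenshot does not survive here, a rough sketch of the kind of forward pass the question refers to may help (the class name LSTM1 and the names hn_o and hn_1 follow the question; the exact code lives in the repository's model definition and may differ):

```python
import torch
import torch.nn as nn


class LSTM1(nn.Module):
    """Sketch of an LSTM regressor for RUL prediction (illustrative only)."""

    def __init__(self, input_size, hidden_size, num_layers=2):
        super().__init__()
        self.lstm = nn.LSTM(input_size, hidden_size,
                            num_layers=num_layers, batch_first=True)
        self.relu = nn.ReLU()
        self.fc = nn.Linear(hidden_size, 1)  # regress a single RUL value

    def forward(self, x):
        # output: (batch, seq_len, hidden); hn: (num_layers, batch, hidden)
        output, (hn, cn) = self.lstm(x)
        hn_o = hn[0]  # final hidden state of the first LSTM layer
        hn_1 = hn[1]  # final hidden state of the second (top) LSTM layer
        # The line being asked about: the two layers' final hidden states
        # are summed before the ReLU, rather than feeding hn_1 or
        # output[:, -1, :] alone.
        out = self.relu(hn_o + hn_1)
        return self.fc(out)
```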

jiaxiang-cheng commented 3 months ago
          Hello, why does the forward of class lstm1 use hn_o + hn_1 as the input to the ReLU layer, rather than hn_1 or output directly?

Originally posted by @mmm656 in #2 (comment)

You can choose to use hn_1 or output directly as the output instead.
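Concretely, the alternatives mentioned here would look roughly like the method below when dropped into the sketch above (again an assumption, not the repository's exact implementation). Note that for an unpadded, batch_first sequence, output[:, -1, :] carries the same values as hn[-1], the top layer's final hidden state, so the "hn_1" and "output" choices coincide in that setting:

```python
    def forward_alt(self, x):
        """Hypothetical alternative readout: use only the top layer's
        final hidden state instead of the sum hn_o + hn_1."""
        output, (hn, cn) = self.lstm(x)
        # output[:, -1, :] (last time step of the top layer) holds the same
        # values as hn[-1] for unpadded sequences.
        out = self.relu(hn[-1])  # or: self.relu(output[:, -1, :])
        return self.fc(out)
```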

mmm656 commented 3 months ago
          Hello, why does the forward of class lstm1 use hn_o + hn_1 as the input to the ReLU layer, rather than hn_1 or output directly?

          You can choose to use hn_1 or output directly as the output instead.

However, using output directly as the output leads to a much larger error, possibly because training gets stuck in a local minimum. Was this the consideration behind switching to hn_o + hn_1 in the first place? Is there any theoretical support for doing so, or is it purely empirical?
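For anyone wanting to reproduce this comparison, a minimal evaluation harness might look like the sketch below (hypothetical helper, not the repository's training code; it assumes DataLoaders yielding (sequence, RUL) pairs and either forward variant above):

```python
import torch
import torch.nn as nn


def train_and_eval(model, train_loader, val_loader, epochs=30, lr=1e-3):
    """Train with MSE loss and return validation RMSE (sketch only)."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        model.train()
        for x, y in train_loader:
            opt.zero_grad()
            loss = loss_fn(model(x).squeeze(-1), y)
            loss.backward()
            opt.step()
    # Validation RMSE lets the hn_o + hn_1 readout be compared against
    # the hn_1 / output readout on equal footing.
    model.eval()
    squared_error, count = 0.0, 0
    with torch.no_grad():
        for x, y in val_loader:
            pred = model(x).squeeze(-1)
            squared_error += torch.sum((pred - y) ** 2).item()
            count += y.numel()
    return (squared_error / count) ** 0.5
```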