yuqinie98 / PatchTST

An official implementation of PatchTST: "A Time Series is Worth 64 Words: Long-term Forecasting with Transformers." (ICLR 2023) https://arxiv.org/abs/2211.14730
Apache License 2.0

Positional encoding in the code #12

Closed 369369Geryy closed 1 year ago

369369Geryy commented 1 year ago

In `u = self.dropout(u + self.W_pos)` and `self.W_pos = positional_encoding(pe, learn_pe, q_len, d_model)`, the function `positional_encoding` is not defined.

369369Geryy commented 1 year ago

The positional_encoding file is missing from the supervised-learning code; it can be found in the self-supervised-learning code.

yuqinie98 commented 1 year ago

Hi! positional_encoding is defined in layers.PatchTST_layers.
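For readers hitting the same missing-import error, here is a minimal NumPy sketch of what a function with the signature `positional_encoding(pe, learn_pe, q_len, d_model)` typically computes. This is an illustration only: the scheme names (`'zeros'`, `'sincos'`) and the standard sinusoidal formula are assumptions, not the repository's actual code in `layers.PatchTST_layers`, which also wraps the table in a trainable `torch.nn.Parameter` when `learn_pe` is set.

```python
import numpy as np

def positional_encoding(pe, learn_pe, q_len, d_model):
    # Sketch: build a (q_len, d_model) positional-encoding table.
    # `pe` selects the scheme; `learn_pe` would mark the table as
    # trainable (e.g. via torch.nn.Parameter) in a torch version.
    # Assumes d_model is even for the 'sincos' case.
    if pe == 'zeros':
        W_pos = np.zeros((q_len, d_model))
    elif pe == 'sincos':
        pos = np.arange(q_len)[:, None]                  # positions (q_len, 1)
        div = np.exp(-np.log(10000.0) * np.arange(0, d_model, 2) / d_model)
        W_pos = np.empty((q_len, d_model))
        W_pos[:, 0::2] = np.sin(pos * div)               # even dims: sine
        W_pos[:, 1::2] = np.cos(pos * div)               # odd dims: cosine
    else:
        raise ValueError(f"unknown pe scheme: {pe}")
    return W_pos
```

The table is then broadcast-added to the patch embeddings, as in the `u = self.dropout(u + self.W_pos)` line quoted above.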

369369Geryy commented 1 year ago

OK, thanks.