zhejz / HPTR

Real-Time Motion Prediction via Heterogeneous Polyline Transformer with Relative Pose Encoding. NeurIPS 2023.
https://zhejz.github.io/hptr

Add for loops for each module of intra-class attn #8

Closed. roydenwa closed this 8 months ago.

roydenwa commented 8 months ago

Hi,

in your Wayformer implementation, the intra-class attention modules (tf_tl, tf_map, and tf_other) are nn.ModuleLists, so a for loop is needed to apply each layer in turn. I changed your code accordingly. This is not an issue in your default config, but it becomes one as soon as you increase n_layer_tf.
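For illustration, here is a minimal sketch of the pattern this PR applies. The layer type (nn.TransformerEncoderLayer), constructor arguments, and tensor shapes below are placeholders and not the repo's actual classes; only the idea is the same: an nn.ModuleList has no forward() of its own, so each stacked layer (tf_tl, tf_map, tf_other) must be applied explicitly in a loop.

```python
import torch
from torch import nn


class IntraClassAttn(nn.Module):
    """Sketch: stacked intra-class attention layers held in an nn.ModuleList."""

    def __init__(self, d_model: int = 128, n_head: int = 4, n_layer_tf: int = 2):
        super().__init__()
        # nn.ModuleList only registers the sub-modules; it is not callable itself,
        # so the layers have to be applied one by one.
        self.tf_map = nn.ModuleList(
            nn.TransformerEncoderLayer(d_model, n_head, batch_first=True)
            for _ in range(n_layer_tf)
        )

    def forward(self, x_map: torch.Tensor) -> torch.Tensor:
        # Feed each layer's output into the next instead of calling the list directly.
        for layer in self.tf_map:
            x_map = layer(x_map)
        return x_map


if __name__ == "__main__":
    attn = IntraClassAttn()
    out = attn(torch.randn(2, 10, 128))  # (batch, tokens, d_model)
    print(out.shape)  # torch.Size([2, 10, 128])
```

The same loop applies to tf_tl and tf_other; with n_layer_tf = 1 the loop reduces to a single call, which is why the default config is unaffected.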

zhejz commented 8 months ago

Thanks a lot for the PR!