HaochengWan / PVT

PVT: Point-Voxel Transformer for 3D Deep Learning
https://arxiv.org/abs/2108.06076
MIT License

Relative Attention in Point branch? #1

Closed shanjiayao closed 2 years ago

shanjiayao commented 2 years ago

Hi, this is really nice work, but I have a question about the point branch.

As described in the paper, Relative Attention is performed in the point branch and obtains significant improvements over not using the RPR term, all without adding extra training parameters. However, it seems the corresponding code is not in the repo?

https://github.com/HaochengWan/PVT/blob/master/modules/pvtconv.py#L298
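For reference, here is a minimal sketch (in PyTorch) of the kind of parameter-free relative term I have in mind: the attention logits are augmented with a bias computed directly from pairwise point coordinates, so no extra weights are trained. The function name, tensor shapes, and the specific distance-based bias are my own assumptions for illustration, not the PVT implementation.

```python
import torch

def relative_point_attention(q, k, v, xyz):
    """Scaled dot-product attention over points with a parameter-free
    relative-position bias.

    q, k, v: (B, N, C) per-point query/key/value features
    xyz:     (B, N, 3) point coordinates

    The bias below (negative squared pairwise distance) is one common,
    parameter-free way to inject relative geometry into the attention
    logits; the exact RPR term used in PVT may differ.
    """
    B, N, C = q.shape
    logits = torch.bmm(q, k.transpose(1, 2)) * C ** -0.5  # (B, N, N)
    rel = xyz.unsqueeze(2) - xyz.unsqueeze(1)             # (B, N, N, 3) pairwise offsets
    bias = -(rel ** 2).sum(dim=-1)                        # closer points score higher
    attn = torch.softmax(logits + bias, dim=-1)
    return torch.bmm(attn, v)                             # (B, N, C)

# toy usage
feats = torch.randn(2, 128, 64)
xyz = torch.rand(2, 128, 3)
out = relative_point_attention(feats, feats, feats, xyz)
print(out.shape)  # torch.Size([2, 128, 64])
```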

shanjiayao commented 2 years ago

@HaochengWan

zhangcheng828 commented 2 years ago

Hi, we will release the code for the latest version within two weeks.

zhangcheng828 commented 2 years ago

Hi, the code for Relative Attention has been uploaded.

shanjiayao commented 2 years ago

> Hi, the code for Relative Attention has been uploaded.

Thanks for your reply, it's really helpful!