qq456cvb / Point-Transformers

MIT License

Not like common self-attention #42

Open warmbodytt opened 4 months ago

warmbodytt commented 4 months ago

https://github.com/qq456cvb/Point-Transformers/blob/545ecc4fe55120338447c99016429b54dc022b5a/models/Menghao/model.py#L65 The tensor `attention` here may be transposed relative to common self-attention. I'm not sure, so please have a check.
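To make the concern concrete, here is a minimal NumPy sketch (not the repository's code; shapes and names are illustrative) contrasting common self-attention, where each *row* of the attention matrix sums to 1, with a transposed attention matrix, where the softmax effectively normalizes the other axis and produces a different output:

```python
import numpy as np

def softmax(x, axis):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Toy shapes: n points, d channels (batch dimension omitted for clarity).
rng = np.random.default_rng(0)
n, d = 4, 8
q = rng.standard_normal((n, d))
k = rng.standard_normal((n, d))
v = rng.standard_normal((n, d))

# Common self-attention: softmax over the key axis, so each output
# point is a convex combination of the value vectors.
energy = q @ k.T / np.sqrt(d)          # (n, n)
attn = softmax(energy, axis=-1)        # rows sum to 1
out = attn @ v                         # (n, d)

# If the attention matrix were transposed (equivalently, if softmax ran
# over the query axis instead), the normalization would be over the
# "wrong" dimension and the result would generally differ.
attn_t = softmax(energy.T, axis=-1)    # transposed energy, rows sum to 1
out_t = attn_t @ v

assert np.allclose(attn.sum(axis=-1), 1.0)
assert not np.allclose(out, out_t)
```

The two outputs agree only in degenerate cases (e.g. a symmetric energy matrix), so which orientation of `attention` is used at the line linked above does change the layer's behavior.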