Pointcept / PointTransformerV3

[CVPR'24 Oral] Official repository of Point Transformer V3 (PTv3)
MIT License

Minimum requirement #5

Open darissa opened 5 months ago

darissa commented 5 months ago

Hello, I would like to try out PTv3 with the minimum requirements. May I know what the outcome would be if flash attention is disabled?

Thanks in advance.

Gofinge commented 5 months ago

Hi, FlashAttention makes PTv3 faster; with it enabled, memory cost and forward latency don't increase even with an enlarged patch size. So if you disable it, expect higher memory usage and slower forward passes, especially with larger patch sizes.
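
For reference, here is a minimal sketch of how such a toggle typically looks in a Pointcept-style Python config. The exact parameter names (`enable_flash`, `upcast_attention`, `upcast_softmax`, the patch-size tuples) are assumptions and should be checked against the configs shipped in this repository.

```python
# Minimal sketch, assuming Pointcept-style config dicts and parameter names.
# Verify the keys against the PTv3 configs in this repo before use.
model = dict(
    type="DefaultSegmentorV2",       # assumed segmentor wrapper
    backbone=dict(
        type="PT-v3m1",              # assumed PTv3 backbone identifier
        enable_flash=False,          # fall back to the non-FlashAttention path
        upcast_attention=True,       # assumed: upcast to fp32 for stability without FlashAttention
        upcast_softmax=True,
        # With FlashAttention disabled, smaller patch sizes may help keep
        # memory usage and forward latency manageable (values are illustrative).
        enc_patch_size=(128, 128, 128, 128, 128),
        dec_patch_size=(128, 128, 128, 128),
    ),
)
```

The trade-off in this sketch is the one described above: without FlashAttention you pay for a larger attention window in memory and latency, so shrinking the patch size is the usual lever to stay within the minimum requirements.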