NVlabs / FB-BEV

Official PyTorch implementation of FB-BEV & FB-OCC - Forward-backward view transformation for vision-centric autonomous driving perception

Floating-Point Precision for 3D heads #14

Open JUGGHM opened 10 months ago

JUGGHM commented 10 months ago

Thank you for your great work! I noticed that all 3D heads are forced to use FP32. Will there be a performance drop if we use AMP instead?
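For context, a minimal sketch of what "forcing FP32 on a head" typically looks like in a PyTorch model trained with AMP. This is not the FB-BEV code (mmdetection-based repos often achieve the same effect with the `force_fp32` decorator); the head and its shapes here are hypothetical placeholders.

```python
# Sketch: keep a detection head in FP32 while the rest of the model runs under AMP.
# Assumes a CUDA device for the usage demo below.
import torch
import torch.nn as nn


class Dummy3DHead(nn.Module):
    """Hypothetical 3D head that always computes in full precision."""

    def __init__(self, in_channels: int = 256, num_classes: int = 10):
        super().__init__()
        self.cls_layer = nn.Linear(in_channels, num_classes)

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        # Disable autocast inside the head so it runs in FP32
        # regardless of the surrounding AMP context.
        with torch.cuda.amp.autocast(enabled=False):
            feats = feats.float()  # up-cast incoming FP16/BF16 activations
            return self.cls_layer(feats)


if __name__ == "__main__":
    head = Dummy3DHead().cuda()
    x = torch.randn(2, 256, device="cuda")
    with torch.cuda.amp.autocast():  # backbone/neck would run in reduced precision here
        out = head(x)
    print(out.dtype)  # torch.float32 — the head stayed in full precision
```

The question, then, is whether dropping this FP32 guard and letting the heads run under AMP would hurt accuracy.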