BEVFormer inference on TensorRT, including INT8 Quantization and Custom TensorRT Plugins (float/half/half2/int8).
Did you really run BEVFormer (PyTorch) on a 3090, or did you just copy the result from https://github.com/fundamentalvision/BEVFormer? #91
Open
admyxs opened 11 months ago
Did you really run BEVFormer (PyTorch) on a 3090, or did you just copy the result from https://github.com/fundamentalvision/BEVFormer? According to https://github.com/fundamentalvision/BEVFormer, bevformer_tiny was run on 8 A100-SXM-80GB GPUs, yet you ran it on 8 RTX 3090s and report the same score as that repo, which is odd.