DerryHub / BEVFormer_tensorrt

BEVFormer inference on TensorRT, including INT8 Quantization and Custom TensorRT Plugins (float/half/half2/int8).
Apache License 2.0
425 stars · 69 forks

Could you please share the BEVFormer_tiny ONNX model? #22

Closed RunnerZhong closed 1 year ago

RunnerZhong commented 1 year ago

I want to evaluate the model, and an ONNX model would be the best choice for that. Thanks~

DerryHub commented 1 year ago

Sorry, this ONNX model relies on some custom operator plugins that are not implemented in onnxruntime. Therefore, the ONNX model can only be used as an intermediate representation for building the TensorRT engine.

RunnerZhong commented 1 year ago

Got it, thanks for your reply. Could you share it anyway, just for reference?

DerryHub commented 1 year ago

bevformer_tiny_epoch_24_cp.onnx

RunnerZhong commented 1 year ago

Thanks a lot for sharing ~~