Open Derrick1994 opened 1 year ago
On Orin with JetPack 5.0.1 and TensorRT 8.4.0.11, everything works fine.
But with JetPack 5.0.2 and TensorRT 8.4.1.5, the code still compiles normally; during the conversion step, however, it fails with errors like: Assertion `(old_top == initial_top (av) && old_size == 0) || ((unsigned long) (old_size) >= MINSIZE && prev_inuse (old_top) && ((unsigned long) old_end & (pagesize - 1)) == 0)' failed.
Can you share your method for converting the ONNX model to a TRT file? Thank you very much!
Hello, sorry, maybe I didn't describe it in enough detail. My ONNX model was built on x86 with the tool from this code, and it converts to TRT fine on JetPack 5.0.1 with TensorRT 8.4.0.11. After my device was upgraded to JetPack 5.0.2 with TensorRT 8.4.1.5, the conversion reports the errors above. The ONNX model from the original code does convert to TRT on JetPack 5.0.2, so I suspect my ONNX model depends on the versions in the x86 environment, such as PyTorch, cuDNN, and so on.
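For anyone hitting the same conversion step: a common way to build a TensorRT engine from an ONNX model on Jetson is TensorRT's bundled `trtexec` tool. This is only a sketch, not necessarily the method used in this repo; the file names and the FP16 flag are assumptions to adjust for your model:

```shell
# trtexec ships with TensorRT on JetPack, typically under /usr/src/tensorrt/bin.
# model.onnx / model.trt are placeholder names; --fp16 is optional.
/usr/src/tensorrt/bin/trtexec \
    --onnx=model.onnx \
    --saveEngine=model.trt \
    --fp16 \
    --verbose
```

Running with `--verbose` on both JetPack versions may show which layer or parser step triggers the failure, which helps narrow down whether the ONNX export or the TensorRT version is at fault.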