Open christmas-ww opened 2 years ago
I updated CUDA to version 11.4; it still does not work.
After debugging, it was found that the error appeared in
if (!parser->parseFromFile(modelFile.data(), static_cast<int>(nvinfer1::ILogger::Severity::kWARNING)))
{
std::cerr << ": failed to parse onnx model file, please check the onnx version and trt support op!"
<< std::endl;
exit(-1);
}
Maybe the pointpillar.onnx file has some problem.
Thanks.
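The generic message in the snippet above hides the parser's actual diagnostics. One way to see why parsing fails is to dump the parser's own error records; a minimal sketch against the TensorRT ONNX parser API, assuming `parser` is an `nvonnxparser::IParser*` as in the snippet above (not compiled here, since it needs the TensorRT headers and libraries):

```cpp
#include <iostream>
#include "NvOnnxParser.h"

// Call in place of the bare exit(-1) branch shown above.
void reportParseFailure(nvonnxparser::IParser* parser)
{
    // Each IParserError carries a description and the index of the
    // ONNX node that triggered it, which pinpoints unsupported ops.
    for (int i = 0; i < parser->getNbErrors(); ++i)
    {
        const nvonnxparser::IParserError* err = parser->getError(i);
        std::cerr << "ONNX parse error " << i << ": " << err->desc()
                  << " (node " << err->node() << ")" << std::endl;
    }
    exit(-1);
}
```

Seeing the per-node description should distinguish a genuinely broken ONNX file from an op or plugin that this TensorRT build cannot resolve.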
My machine has an RTX 3090; maybe I need to modify CMakeLists.txt?
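If the project's CMakeLists.txt hard-codes older compute capabilities, an RTX 3090 (Ampere, compute capability 8.6) does need an `sm_86` entry; a hedged sketch (the exact variable name is an assumption and depends on how the project passes NVCC flags):

```cmake
# Hypothetical addition; adapt to the project's existing CUDA flag handling.
set(CMAKE_CUDA_FLAGS "${CMAKE_CUDA_FLAGS} -gencode arch=compute_86,code=sm_86")
```

Note this would explain kernel-launch failures at runtime, but by itself it should not cause the ONNX parser error above.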
> After debugging, it was found that the error appeared in
>
> if (!parser->parseFromFile(modelFile.data(), static_cast<int>(nvinfer1::ILogger::Severity::kWARNING)))
> {
>     std::cerr << ": failed to parse onnx model file, please check the onnx version and trt support op!"
>               << std::endl;
>     exit(-1);
> }
>
> Maybe the pointpillar.onnx file has some problem.
> Thanks.

Hi, I have the same problem, and I also think my own ONNX file may have some problem, but the model from the original code works fine. Did you fix it?
Hello, I can compile, but when I run ./demo I get this error:

Building TRT engine. ../model/pointpillar.onnx
trt_infer: ModelImporter.cpp:773: While parsing node number 6 [PillarScatterPlugin -> "input.3"]:
trt_infer: ModelImporter.cpp:774: --- Begin node ---
demo: malloc.c:2401: sysmalloc: Assertion `(old_top == initial_top (av) && old_size == 0) || ((unsigned long) (old_size) >= MINSIZE && prev_inuse (old_top) && ((unsigned long) old_end & (pagesize - 1)) == 0)' failed.
Aborted (core dumped)

My CUDA and TensorRT versions are:
CUDA: 11.1, cuDNN: 8.4.1, TensorRT: 8.4.1
Thanks in advance.
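Since the failure happens while parsing the PillarScatterPlugin node, one thing worth checking is that the plugin creators are registered before the parser runs; if they are not, the parser cannot resolve the custom op. A minimal sketch, assuming the repo's PillarScatterPlugin is linked into the binary (registered statically via `REGISTER_TENSORRT_PLUGIN`) and that `gLogger` is the application's `nvinfer1::ILogger` instance (not compiled here, since it needs TensorRT):

```cpp
#include "NvInferPlugin.h"  // initLibNvInferPlugins

// Register TensorRT's bundled plugins before building the parser.
// Custom plugins linked into this binary register themselves via the
// REGISTER_TENSORRT_PLUGIN macro, but only if their object files are
// actually linked in (watch for the linker dropping "unused" objects).
initLibNvInferPlugins(&gLogger, "");
```

If the plugin lives in a separate shared library instead, that library has to be loaded before `parseFromFile`. A TensorRT version mismatch between the machine that exported pointpillar.onnx and the one running the demo could also leave the plugin unresolvable.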