Closed — @Robintjhb closed this issue 12 months ago.
@Robintjhb To run the ONNX model with TensorRT, load the serialized engine (.trt) in C++ like this:
#include <fstream>
#include <memory>
#include <vector>
#include <NvInfer.h>
using namespace nvinfer1;
using namespace std;

// read the whole serialized engine file into memory
ifstream tensorrt_file("path/to/model.trt", ios::binary);
tensorrt_file.seekg(0, ios::end);
auto file_size = tensorrt_file.tellg();
tensorrt_file.seekg(0, ios::beg);
vector<char> buffer(file_size);
tensorrt_file.read(buffer.data(), file_size);
tensorrt_file.close();

// deserialize the engine and create an execution context
// ('logger' is your ILogger implementation)
shared_ptr<IRuntime> runtime(createInferRuntime(logger));
shared_ptr<ICudaEngine> engine(runtime->deserializeCudaEngine(buffer.data(), file_size));
shared_ptr<IExecutionContext> engine_context(engine->createExecutionContext());
Without using trtexec, can an ONNX file be converted to a .trt file for TensorRT C++ inference? In some situations trtexec is not available — for example, a TensorRT environment installed via pip does not ship trtexec.
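Yes — trtexec is only a wrapper around the builder API, so the conversion can be done directly in C++ with TensorRT's ONNX parser (nvonnxparser). Below is a minimal sketch for TensorRT 8+; the function name `buildEngineFromOnnx` is illustrative, and it assumes you already have an `ILogger` implementation (the same `logger` used when deserializing). Error handling and builder options (FP16, workspace size, optimization profiles) are omitted.

```cpp
#include <fstream>
#include <memory>
#include <NvInfer.h>
#include <NvOnnxParser.h>
using namespace nvinfer1;

// Illustrative helper: parse an ONNX file, build a serialized engine,
// and write it to disk as a .trt file that the loading code above can read.
bool buildEngineFromOnnx(const char* onnx_path, const char* trt_path, ILogger& logger)
{
    std::unique_ptr<IBuilder> builder(createInferBuilder(logger));
    // ONNX models require an explicit-batch network definition
    const auto flags =
        1U << static_cast<uint32_t>(NetworkDefinitionCreationFlag::kEXPLICIT_BATCH);
    std::unique_ptr<INetworkDefinition> network(builder->createNetworkV2(flags));

    std::unique_ptr<nvonnxparser::IParser> parser(
        nvonnxparser::createParser(*network, logger));
    if (!parser->parseFromFile(onnx_path,
            static_cast<int>(ILogger::Severity::kWARNING)))
        return false;  // parse errors are reported through the logger

    std::unique_ptr<IBuilderConfig> config(builder->createBuilderConfig());
    // buildSerializedNetwork produces the engine bytes in host memory
    std::unique_ptr<IHostMemory> serialized(
        builder->buildSerializedNetwork(*network, *config));
    if (!serialized)
        return false;

    std::ofstream out(trt_path, std::ios::binary);
    out.write(static_cast<const char*>(serialized->data()), serialized->size());
    return out.good();
}
```

Note that the resulting .trt file is specific to the GPU, TensorRT version, and CUDA version it was built with, so the conversion is typically done once on the target machine rather than shipped prebuilt.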