Project-MONAI / MONAI

AI Toolkit for Healthcare Imaging
https://monai.io/
Apache License 2.0

Make Dints TensorRT Convertable #6079

Open binliunls opened 1 year ago

binliunls commented 1 year ago

Is your feature request related to a problem? Please describe. TensorRT is an SDK for high-performance deep learning inference. For trained PyTorch models, Torch-TensorRT is the most convenient way to apply TensorRT acceleration. Therefore, we are trying to accelerate DiNTS this way.

However, the DiNTS network structure in MONAI currently cannot be converted through Torch-TensorRT. After a discussion with the Torch-TensorRT team, we found that this conversion failure is caused by the torch.nn.ModuleDict modules in the network, like this one, which are not supported by Torch-TensorRT.

Describe the solution you'd like As the Torch-TensorRT team suggests, there are two approaches worth trying:

  1. unpack the torch.nn.ModuleDict and lay out the modules in a flat manner during the inference phase.
  2. replace it with a Python-level dictionary.
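The two suggestions could be combined into a minimal sketch like the one below (hypothetical module and op names, not the actual DiNTS cell code): the sub-modules are registered as regular attributes so their parameters are tracked, while name-based lookup goes through a plain Python dict that is resolved at trace time, so the traced graph never contains an nn.ModuleDict for Torch-TensorRT to reject.

```python
import torch
import torch.nn as nn


class CellFlat(nn.Module):
    """Sketch of the suggested workaround (illustrative names only).

    Instead of storing ops in an nn.ModuleDict, each op is registered as
    a regular attribute (so parameters are tracked by the module), and a
    plain Python dict maps names to those attributes for lookup.
    """

    def __init__(self):
        super().__init__()
        # registered as attributes: parameters stay visible to .parameters()
        self.conv = nn.Conv2d(1, 1, 3, padding=1)
        self.act = nn.ReLU()
        # plain Python dict (not nn.ModuleDict): the lookup is resolved
        # while tracing, so no ModuleDict appears in the traced graph
        self._ops = {"conv": self.conv, "act": self.act}

    def forward(self, x):
        for name in ("conv", "act"):
            x = self._ops[name](x)
        return x
```

Note this pattern relies on tracing (the dict lookup happens in Python during the trace); it would not be accepted by torch.jit.script, so it fits a trace-based Torch-TensorRT workflow.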


Nic-Ma commented 1 year ago

Hi @dongyang0122 ,

Could you please help take a look at this ticket? Is it possible to adjust the implementation of DiNTS according to the suggestion?

Thanks in advance.

dongyang0122 commented 1 year ago

Hi @binliunls, how to verify if the conversion of Torch-TensorRT is successful?

binliunls commented 1 year ago

> Hi @binliunls, how to verify if the conversion of Torch-TensorRT is successful?

Hi @dongyang0122 , I am working on a PR to add trt_export to MONAI. Currently it works fine for me, and I am going to add some logging support for it. You can use this API for verification: run the convert commands shown below with fp32 and fp16 precision to check whether the model is convertible, like what we did for converting TorchScript models. Please note that the conversion process may take several minutes.

python -m monai.bundle trt_export network_def --filepath models/model_trt_fp32.ts --ckpt_file models/model.pt --meta_file configs/metadata.json --config_file configs/inference.json --precision fp32

python -m monai.bundle trt_export network_def --filepath models/model_trt_fp16.ts --ckpt_file models/model.pt --meta_file configs/metadata.json --config_file configs/inference.json --precision fp16

Alternatively, you can call the API directly from Python and adjust its parameters.

Thanks, Bin

dongyang0122 commented 1 year ago

Hi @binliunls, I cannot run the commands successfully. The command cannot find trt_export. Do you mean ckpt_export here?

binliunls commented 1 year ago

> Hi @binliunls, I cannot run the commands successfully. The command cannot find trt_export. Do you mean ckpt_export here?

Hi @dongyang0122 Sorry for not making this clear: this API is still in a WIP PR and is not in the main branch yet, so running it from main will not find trt_export. Did you perhaps use the main branch? I would like to merge this into the main branch ASAP. Thanks, Bin