MIC-DKFZ / nnUNet


Inference time #567

Closed harsharaman closed 3 years ago

harsharaman commented 3 years ago

Hi Fabian,

I searched the common questions and the existing issues, but I could not find anything on this.

I would like to measure the inference time of nnUNet on the Liver dataset. More precisely, I need the time taken by the trained network on the individual patches, not the time spent on preprocessing and postprocessing. Could you please share this information if you have it? Also, since the official PyTorch repo of UNetPlusPlus is built on your excellent repository and I need its inference time as well, could you point me to the section of your repo where I could use torch.cuda.Event(enable_timing=True) to get this measurement?

Thanks a lot and have a nice day.

Best, Harsha

FabianIsensee commented 3 years ago

Hi, all you would have to do is time the forward pass during inference. That should be rather easy. Either tap directly into Generic_UNet or use this line (3D U-Net only):

https://github.com/MIC-DKFZ/nnUNet/blob/447cf802669de019b55c3968d3ec6fb2232f472b/nnunet/network_architecture/neural_network.py#L522
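For reference, here is a minimal, self-contained sketch of timing only a network forward pass with `torch.cuda.Event`. The `Conv3d` model and patch shape are placeholders, not nnU-Net code; in nnU-Net the same `record()`/`elapsed_time()` pattern could be placed around the network call at the line linked above.

```python
# Minimal sketch (not nnU-Net's own code) of timing a forward pass with CUDA events.
import torch
import torch.nn as nn

model = nn.Conv3d(1, 2, kernel_size=3, padding=1).cuda().eval()  # stand-in for the trained network
patch = torch.randn(1, 1, 128, 128, 128, device="cuda")          # stand-in for one input patch

start = torch.cuda.Event(enable_timing=True)
end = torch.cuda.Event(enable_timing=True)

with torch.no_grad():
    # warm-up so CUDA initialization does not skew the measurement
    for _ in range(3):
        model(patch)
    torch.cuda.synchronize()

    start.record()
    model(patch)  # in nnU-Net, this would be the network's forward pass on one patch
    end.record()
    torch.cuda.synchronize()  # events are asynchronous; sync before reading the time

print(f"forward pass: {start.elapsed_time(end):.2f} ms")  # elapsed_time returns milliseconds
```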

Best, Fabian

harsharaman commented 3 years ago

Thank you 👍🏾