I searched the common questions and the existing issues, but I could not find this asked before.
I would like to measure the inference time of nnUNet on the Liver dataset. More precisely, I want the time the trained network spends on the individual patches, excluding the time taken by preprocessing and postprocessing. Could you please provide this information if you have it? Also, since the official PyTorch repo of UNetPlusPlus is built on your excellent repository and I would like its inference time as well, could you point me to the section of your repo where I could use torch.cuda.Event(enable_timing=True) to obtain this measurement?
Hi,
all you would have to do is time the forward pass during inference. That should be rather easy: either you tap directly into Generic_UNet, or you use this line (3d U-Net only):
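A minimal sketch of what such timing could look like, assuming `network` is the loaded model (e.g. a Generic_UNet instance) and `patch` is an already-preprocessed input tensor; the helper `time_forward` below is hypothetical, not part of nnUNet. On GPU it uses `torch.cuda.Event(enable_timing=True)` with explicit synchronization so only the forward pass is measured; on CPU it falls back to a wall-clock timer:

```python
import time
import torch

def time_forward(network, patch, warmup=3, repeats=10):
    """Return the mean forward-pass time in milliseconds for one patch.

    Hypothetical helper: warms up the network first, then times only
    network(patch), excluding any pre-/postprocessing.
    """
    network.eval()
    times = []
    with torch.no_grad():
        # Warm-up iterations so lazy initialization/kernel compilation
        # does not distort the measurement.
        for _ in range(warmup):
            network(patch)
        if patch.is_cuda:
            # CUDA kernels run asynchronously, so use CUDA events and
            # synchronize before reading the elapsed time.
            start = torch.cuda.Event(enable_timing=True)
            end = torch.cuda.Event(enable_timing=True)
            torch.cuda.synchronize()
            for _ in range(repeats):
                start.record()
                network(patch)
                end.record()
                torch.cuda.synchronize()
                times.append(start.elapsed_time(end))  # already in ms
        else:
            # CPU fallback: plain wall-clock timing.
            for _ in range(repeats):
                t0 = time.perf_counter()
                network(patch)
                times.append((time.perf_counter() - t0) * 1000.0)
    return sum(times) / len(times)
```

Averaging over several repeats (after a few warm-up passes) is important, since the first forward pass on GPU typically includes one-time setup costs.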
Hi Fabian,
Thanks a lot and have a nice day.
Best, Harsha