MIC-DKFZ / nnUNet

Apache License 2.0

I want to increase the batch size during inference #2315

Open Body123 opened 6 days ago

Body123 commented 6 days ago

I want to increase the batch size during inference with this command nnUNetv2_predict

ykirchhoff commented 6 days ago

Hi @Body123,

nnUNet does not support any batch_size!=1 during inference in the nnUNetv2_predict command. Is there a specific reason you would like to increase the batch size?

Best, Yannick

Body123 commented 6 days ago

Because I want to predict a huge amount of images and inference is too slow. Can you suggest something that could help me, please?
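One common workaround when batched inference is unavailable is to split the input folder into N chunks and run one `nnUNetv2_predict` process per chunk (e.g. one per GPU). The sketch below is not part of nnUNet's API; the helper names, the `*.nii.gz` pattern, and the round-robin scheme are illustrative assumptions.

```python
# Sketch: partition an input folder so several nnUNetv2_predict
# processes can each work on a disjoint subset of the images.
from pathlib import Path
import shutil


def split_into_parts(files, num_parts):
    """Round-robin assignment of files into num_parts chunks."""
    return [files[i::num_parts] for i in range(num_parts)]


def make_part_dirs(input_dir, output_root, num_parts):
    """Copy each chunk into its own folder (part_0, part_1, ...),
    so each folder can be passed as -i to a separate predict run."""
    files = sorted(Path(input_dir).glob("*.nii.gz"))
    parts = split_into_parts(files, num_parts)
    for i, part in enumerate(parts):
        part_dir = Path(output_root) / f"part_{i}"
        part_dir.mkdir(parents=True, exist_ok=True)
        for f in part:
            shutil.copy(f, part_dir / f.name)
    return parts
```

`nnUNetv2_predict` also appears to expose `-num_parts`/`-part_id` flags that do this splitting internally, which would avoid copying files at all; check `nnUNetv2_predict -h` for the exact options in your installed version.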

Body123 commented 6 days ago

Are there any updates?

Body123 commented 5 days ago

Or is there any similar command that supports a batch size larger than one?