Open Body123 opened 6 days ago
Hi @Body123,
nnUNet does not support any batch size other than 1 during inference with the nnUNetv2_predict command. Is there a specific reason you would like to increase the batch size?
Best, Yannick
Because I want to predict a huge amount of images, and the inference is too slow. Can you suggest something that can help me, please?
Are there any updates?
Or is there a similar command that lets us use a batch size larger than one?
I want to increase the batch size during inference with the nnUNetv2_predict command.
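For what it's worth, a common workaround for slow inference over many images is to run several prediction processes in parallel rather than increasing the batch size. The sketch below assumes nnU-Net v2's `-num_parts`/`-part_id` options (which split the input folder across independent processes) and one GPU per part; the paths, dataset ID, and configuration are placeholders, so check `nnUNetv2_predict -h` for your installed version before relying on it.

```shell
# Sketch: split inference across 4 parallel processes, one per GPU.
# Assumes the -num_parts / -part_id flags exist in your nnU-Net v2 install;
# all paths and the dataset/configuration values are placeholders.

INPUT=/path/to/images
OUTPUT=/path/to/predictions
NUM_PARTS=4   # e.g. one part per available GPU

for PART in $(seq 0 $((NUM_PARTS - 1))); do
  CUDA_VISIBLE_DEVICES=$PART \
    nnUNetv2_predict -i "$INPUT" -o "$OUTPUT" \
      -d 001 -c 3d_fullres \
      -num_parts "$NUM_PARTS" -part_id "$PART" &
done
wait  # block until all parts have finished
```

Each process handles a disjoint subset of the input cases, so total wall-clock time drops roughly in proportion to the number of parts, without touching the batch size.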