MASILab / 3DUX-Net


Is it normal that training is very slow? #26

Closed yml-bit closed 1 year ago

yml-bit commented 1 year ago

Hello, GPU utilization keeps fluctuating between 0% and 100%. Is it normal for training to be this slow?

leeh43 commented 1 year ago

Would you mind telling me the time shown in the tqdm bar for one epoch (the estimated time for all batches)? I believe you are using a kernel size of 7, so training is slower than SwinUNETR, but it shouldn't be too slow.
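For reference, a minimal sketch of how the per-epoch time can be read from a tqdm-wrapped loader; the names `model`, `train_loader`, `optimizer`, `loss_fn`, `device`, and `num_epochs` are assumed to be defined as in your own training script, not taken from the 3DUX-Net code:

```python
from tqdm import tqdm

# Wrapping the training loader in tqdm shows elapsed and estimated total
# time for the epoch directly on the progress bar.
for epoch in range(num_epochs):
    progress = tqdm(train_loader, desc=f"epoch {epoch}")
    for batch in progress:
        images = batch["image"].to(device)
        labels = batch["label"].to(device)
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()
        progress.set_postfix(loss=f"{loss.item():.4f}")
```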

leeh43 commented 1 year ago

I believe it will use around 14-15 GB of memory during training if you keep using 96x96x96 patches, but memory usage will increase during validation because of MONAI's sliding window inference strategy. If GPU utilization sits at only 1% for roughly two thirds of the time, that is pretty weird.
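A minimal sketch of the sliding window validation step being referred to, assuming a 96x96x96 ROI and an sw_batch_size of 4 (adjust to your own config); `model`, `val_loader`, and `device` are assumed to come from your training script:

```python
import torch
from monai.inferers import sliding_window_inference

# Sliding-window validation: each 96^3 window is forwarded separately and
# the outputs are stitched back together, which raises peak GPU memory
# compared with plain patch-based training.
model.eval()
with torch.no_grad():
    for batch in val_loader:
        val_images = batch["image"].to(device)
        val_outputs = sliding_window_inference(
            inputs=val_images,
            roi_size=(96, 96, 96),
            sw_batch_size=4,
            predictor=model,
            overlap=0.5,
        )
```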