Project-MONAI / research-contributions

Implementations of recent research prototypes/demonstrations using MONAI.
https://monai.io/
Apache License 2.0

Can you share the loss curve of the Swin-UNETR pre-training process #73

Open upupming opened 2 years ago

upupming commented 2 years ago

Hi, thanks for your great work on Swin-UNETR. I am trying to run pre-training on another dataset (~2000 CTs), but the loss curve does not seem to decrease:

[image: pre-training loss curve]

Could you share your loss curve on the 5050 CTs dataset? Thank you very much!

I am pre-training the model on a single GPU with batch size 2.

ahatamiz commented 2 years ago

Hi @upupming

I see. We trained the model on 4 nodes with 8 GPUs each (32 GPUs in total). We found that scaling the pre-training up to multiple nodes was important for convergence.
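For reference, a minimal multi-node DDP sketch is below. It is not the repository's training script: the toy model and synthetic batches stand in for Swin-UNETR and the CT data loader, and the `torchrun` invocation in the comment is an assumed launch command, not one documented here.

```python
# Minimal DDP sketch. Launch across 4 nodes x 8 GPUs with e.g.:
#   torchrun --nnodes=4 --nproc_per_node=8 pretrain_ddp.py
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

def main():
    # torchrun sets RANK, LOCAL_RANK, WORLD_SIZE, MASTER_ADDR/PORT per process
    dist.init_process_group(backend="nccl")
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)

    model = torch.nn.Sequential(            # toy stand-in for Swin-UNETR
        torch.nn.Conv3d(1, 8, 3, padding=1),
        torch.nn.ReLU(),
        torch.nn.Conv3d(8, 1, 3, padding=1),
    ).cuda(local_rank)
    model = DDP(model, device_ids=[local_rank])
    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

    for step in range(10):
        x = torch.randn(2, 1, 32, 32, 32, device=local_rank)  # fake CT patch
        loss = torch.nn.functional.mse_loss(model(x), x)      # toy recon loss
        optimizer.zero_grad()
        loss.backward()               # DDP all-reduces gradients across ranks
        optimizer.step()
        if dist.get_rank() == 0:
            print(f"step {step}: loss {loss.item():.4f}")

    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```

With this setup, each process holds its own micro-batch, so the global batch size grows with the number of GPUs; gradients are averaged across all ranks during `backward()`.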

GaoHuaZhang commented 1 year ago

Hi @ahatamiz, I am running into the same problem while pre-training the model on a single GPU with batch size 2. My input and ground truth are shown in the images below, but the reconstruction bears no resemblance to the original image. Do you mean that training on multiple GPUs is necessary for good results?

[images: x1_aug, x1_gt, x1_recon]

tangy5 commented 1 year ago

Hi @GaoHuaZhang,

Thanks. Batch size is likely a key factor here. We no longer have the records for this training run, but https://github.com/Project-MONAI/tutorials/tree/main/self_supervised_pretraining uses a similar pre-training strategy, and you can see loss curves in that tutorial.

Thanks
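If only a single GPU is available, one common way to approximate a larger effective batch is gradient accumulation. The sketch below is a hedged illustration under that assumption; the `Linear` model and random batches are placeholders, not anything from this repository or the tutorial.

```python
import torch

model = torch.nn.Linear(16, 16)             # toy stand-in for the encoder
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
accum_steps = 16                            # micro-batch 2 x 16 = effective batch 32

optimizer.zero_grad()
for i in range(64):
    x = torch.randn(2, 16)                  # micro-batch of 2, as in the issue
    # divide so the accumulated gradient equals the average over the effective batch
    loss = torch.nn.functional.mse_loss(model(x), x) / accum_steps
    loss.backward()                         # gradients accumulate across micro-batches
    if (i + 1) % accum_steps == 0:
        optimizer.step()                    # one update per effective batch
        optimizer.zero_grad()
```

One caveat: if the pre-training objective includes a contrastive term, accumulation does not enlarge the pool of in-batch negatives, so it only partially substitutes for a genuinely larger batch.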

lyangfan commented 1 year ago

> Hi @ahatamiz, I am running into the same problem while pre-training the model on a single GPU with batch size 2. [...]

@GaoHuaZhang Have you solved this problem? I am seeing the same issue.