MungoMeng / Registration-CorrMLP

[CVPR2024 Oral && Best Paper Candidate] CorrMLP: Correlation-aware MLP-based networks for deformable medical image registration
GNU General Public License v3.0

24GB graphics card memory is not enough #5

Open · Miraclerice opened this issue 2 months ago

Miraclerice commented 2 months ago

Hello, I tried to run your network on a 3090 and found that the GPU memory was insufficient. My image size is the same as yours. Is there any way to reduce the memory usage? I see that your server uses a 4090, which also has 24GB. Please reply when you have time.

MungoMeng commented 2 months ago

Please make sure that "use_checkpoint" is set to "True". Turning on the checkpointing function reduces GPU memory consumption by more than 50%.
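For reference, a minimal sketch of how this flag is typically used. The `CorrMLP` import and constructor keyword below are assumptions based on the repository's `networks.py`; check the actual class and argument names there. The helper `forward_block` only illustrates what such a flag usually does internally via `torch.utils.checkpoint`.

```python
import torch
from torch.utils.checkpoint import checkpoint
from networks import CorrMLP  # assumed module/class name from CorrMLP/networks.py

# Construct the model with checkpointing enabled (assumed keyword name):
model = CorrMLP(use_checkpoint=True).cuda()

def forward_block(block: torch.nn.Module, x: torch.Tensor, use_checkpoint: bool = True):
    """Illustration only: with checkpointing, the block's activations are
    not stored in the forward pass and are recomputed during backward,
    trading extra compute for lower peak GPU memory."""
    if use_checkpoint and x.requires_grad:
        return checkpoint(block, x, use_reentrant=False)
    return block(x)
```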

Miraclerice commented 2 months ago

> Please make sure that "use_checkpoint" is set to "True". Turning on the checkpointing function reduces GPU memory consumption by more than 50%.

Thank you very much for your answer. I kept the network's parameter settings unchanged, and 'use_checkpoint' already defaults to True. Is there any other setting I can adjust? https://github.com/MungoMeng/Registration-CorrMLP/blob/da5ce37276a9a233bd85865e8dc487ac4a8047da/CorrMLP/networks.py#L22

MungoMeng commented 2 months ago

This might be because the checkpointing function saves more GPU memory on a 4090 than on a 3090. Perhaps you can reduce 'enc_channels' and 'dec_channels' a little.
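As a rough sketch of that suggestion: the exact default widths, and whether these arguments are scalars or per-stage lists, depend on `networks.py`, so the values below are placeholders. The idea is simply to shrink the encoder/decoder channel counts while keeping checkpointing on.

```python
from networks import CorrMLP  # assumed module/class name from CorrMLP/networks.py

# Placeholder values: halve whatever widths networks.py uses by default.
model = CorrMLP(
    use_checkpoint=True,   # keep activation checkpointing enabled
    enc_channels=8,        # assumed argument name; smaller than the default
    dec_channels=16,       # assumed argument name; smaller than the default
).cuda()
```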

Miraclerice commented 2 months ago

Okay, I'll try. Thank you.

ZhaiJiaKai commented 2 days ago

> Okay, I'll try. Thank you.

Hello, have you solved this problem? I encountered the same problem during training, and I'm using a 4090.