milesial / Pytorch-UNet

PyTorch implementation of the U-Net for image semantic segmentation with high quality images
GNU General Public License v3.0

Gradients are clipped before the unscaling #468

Open marcovisentin opened 1 year ago

marcovisentin commented 1 year ago

At lines 114-115 in train.py, I believe `scaler.unscale_(optimizer)` should be added before the gradient clipping, so that the clipping operates on the true (unscaled) gradients; see the sketch below.
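
A minimal sketch of the proposed ordering, following the gradient-clipping pattern from the PyTorch AMP docs. The model, optimizer, loss, and clipping threshold here are hypothetical stand-ins for whatever train.py actually uses around those lines:

```python
import torch
import torch.nn as nn

# Stand-ins for the real U-Net, its optimizer, and the configured threshold.
model = nn.Linear(10, 2).cuda()
optimizer = torch.optim.RMSprop(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()
max_grad_norm = 1.0

scaler = torch.cuda.amp.GradScaler()

for _ in range(3):  # stand-in for the loop over training batches
    x = torch.randn(8, 10, device='cuda')
    y = torch.randint(0, 2, (8,), device='cuda')

    optimizer.zero_grad(set_to_none=True)
    with torch.autocast(device_type='cuda'):
        loss = loss_fn(model(x), y)
    scaler.scale(loss).backward()

    scaler.unscale_(optimizer)  # proposed fix: bring gradients back to their true scale...
    torch.nn.utils.clip_grad_norm_(model.parameters(), max_grad_norm)  # ...so the clip threshold is meaningful
    scaler.step(optimizer)  # step() sees grads were already unscaled and won't unscale again
    scaler.update()
```

Without the `unscale_` call, `clip_grad_norm_` sees gradients multiplied by the loss scale (often tens of thousands), so a threshold like 1.0 clips essentially everything.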

tensorctn commented 10 months ago

In my opinion, scaler.step(optimizer) already includes the unscaling. It does two things: first, it unscales the gradients if you didn't unscale manually beforehand; second, it checks whether any overflowed. If there are no NaN/Inf values, it executes the optimizer's step; if there are, it skips this iteration's parameter update. So if the gradients are clipped after scaler.step, I think it makes no sense. Gradient clipping only aims to avoid gradient explosion, but if the gradients do explode, scaler.step will skip this iteration's parameter update anyway, so there is absolutely no need for clipping.
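
For what it's worth, the skip behavior described above can be observed through the scale factor, since `GradScaler.update()` shrinks the scale whenever the internal inf/NaN check trips. A small sketch, with a contrived model and a deliberately forced Inf gradient just for illustration:

```python
import torch
import torch.nn as nn

model = nn.Linear(4, 1).cuda()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scaler = torch.cuda.amp.GradScaler()

x = torch.randn(2, 4, device='cuda')
with torch.autocast(device_type='cuda'):
    loss = model(x).sum()
scaler.scale(loss).backward()

# Force an Inf gradient to exercise the skip path.
next(model.parameters()).grad.fill_(float('inf'))

scale_before = scaler.get_scale()
scaler.step(optimizer)  # unscales internally, finds Inf, skips optimizer.step()
scaler.update()         # shrinks the scale because Inf/NaN was found
assert scaler.get_scale() < scale_before  # evidence the update was skipped
```

Note, though, that the skip only triggers on Inf/NaN; large-but-finite gradients pass the check, which is the case clipping is meant to handle.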