zbf1991 / RRM

Reliability Does Matter: An End-to-End Weakly Supervised Semantic Segmentation Approach
94 stars 11 forks

Loss NAN #13

Closed convnets closed 3 years ago

convnets commented 3 years ago

I trained the network with train_cls_weight.py and initialized the model with the res38_cls.pth you provided, but the loss becomes NaN. Can you help check why?

dloss weight 1e-07
{'gpu_id': '0', 'LISTpath': 'voc12/train_aug(id).txt', 'IMpath': '/home/test/VOCdevkit/VOC2012/JPEGImages', 'SAVEpath': './output/model_weights', 'batch_size': 4, 'max_step': 20000, 'network': 'network.RRM', 'lr': 0.0007, 'num_workers': 16, 'wt_dec': 1e-05, 'weights': './netWeights/res38_cls.pth', 'session_name': 'RRM_', 'crop_size': 321, 'class_numbers': 20, 'crf_la_value': 4, 'crf_ha_value': 32, 'densecrfloss': 1e-07, 'rloss_scale': 0.5, 'sigma_rgb': 15.0, 'sigma_xy': 100}
Session started:  Sun Sep 27 12:50:05 2020
Iter:    0/20000 Loss:6.9927 imps:1.0 Fin:Mon Sep 28 10:08:25 2020 lr: 0.0007
closs: 0.0044 celoss: 9.4394 dloss: -0.6275
closs: 0.1205 celoss: 5.1730 dloss: -0.3368
closs: 0.1477 celoss: 5.8635 dloss: -0.2407
closs: 0.5365 celoss: 4.6322 dloss: -0.2464
closs: 0.2427 celoss: 3.2945 dloss: -0.3354
closs: 0.0825 celoss: 5.2225 dloss: -1.0677
closs: 0.0979 celoss: 9.1182 dloss: -1.2027
closs: 0.1901 celoss: 8.9809 dloss: -0.4709
closs: 0.3256 celoss: 1.4215 dloss: -0.1852
closs: 0.5821 celoss: 3.6866 dloss: -0.0932
closs: 0.4300 celoss: 3.5157 dloss: -0.2225
closs: 0.5388 celoss: 4.0678 dloss: -0.4153
closs: 0.2153 celoss: 0.7157 dloss: -0.1775
closs: 0.2832 celoss: 6.4929 dloss: -0.2787
closs: 0.3324 celoss: 3.6478 dloss: -0.1633
closs: 0.3553 celoss: 3.6112 dloss: -0.2413
closs: 0.2878 celoss: 2.4347 dloss: -0.2414
closs: 0.2624 celoss: 2.8557 dloss: -0.5394
closs: 0.0829 celoss: 3.8166 dloss: -0.4282
closs: 0.0380 celoss: 1.8013 dloss: -0.3638
closs: 0.0930 celoss: 3.9196 dloss: -0.6067
closs: 0.0150 celoss: 7.1038 dloss: -0.5287
closs: 0.0909 celoss: 0.9512 dloss: -0.4204
closs: 0.0452 celoss: 0.8439 dloss: -0.7635
closs: 0.0099 celoss: 5.4013 dloss: -0.7359
closs: 0.0284 celoss: 0.4117 dloss: -0.9293
closs: 0.0791 celoss: 2.7661 dloss: -0.5141
closs: 0.1172 celoss: 0.9836 dloss: -0.7071
closs: 0.2513 celoss: 2.3509 dloss: -0.4201
closs: 0.0033 celoss: 1.8597 dloss: -0.5371
closs: 0.0681 celoss: 2.0706 dloss: -0.8229
closs: 0.1058 celoss: 1.3337 dloss: -0.8299
closs: 0.0597 celoss: 1.5510 dloss: -1.1047
closs: 0.0299 celoss: 1.5824 dloss: -0.7586
closs: 0.0992 celoss: 2.3143 dloss: -1.0210
closs: 0.0791 celoss: 1.9394 dloss: -0.8417
closs: 0.0222 celoss: 0.4387 dloss: -0.4944
closs: 0.0288 celoss: 0.6625 dloss: -0.5628
closs: 0.0135 celoss: 4.5136 dloss: -0.6194
closs: 0.1020 celoss: 3.7248 dloss: -1.1224
closs: 0.0648 celoss: 1.3399 dloss: -0.3393
closs: 0.2042 celoss: 5.2593 dloss: -0.9651
closs: 0.0111 celoss: 1.0034 dloss: -0.3220
closs: 0.1350 celoss: 1.2016 dloss: -0.3987
closs: 0.1801 celoss: 1.7029 dloss: -0.6153
closs: 0.0074 celoss: 0.4721 dloss: -0.3116
closs: 0.0672 celoss: 0.2015 dloss: -0.8397
closs: 0.0198 celoss: 1.8567 dloss: -0.6717
closs: 0.0164 celoss: 0.7002 dloss: -0.7289
closs: 0.1535 celoss: 0.4099 dloss: -1.0501
Iter:   50/20000 Loss:2.5912 imps:0.5 Fin:Tue Sep 29 06:15:09 2020 lr: 0.0007
closs: 0.0186 celoss: 0.5534 dloss: -0.7522
closs: 0.0935 celoss: 2.2394 dloss: -0.9134
closs: 0.0657 celoss: 0.0391 dloss: -0.9369
closs: 0.0918 celoss: 3.0824 dloss: -1.0096
closs: 0.0200 celoss: 0.2935 dloss: -0.6191
closs: 0.0463 celoss: 0.5819 dloss: -0.7374
closs: 0.0586 celoss: 1.6121 dloss: -0.5797
closs: 0.0387 celoss: 1.6065 dloss: -0.6697
closs: 0.1740 celoss: 1.4451 dloss: -0.8965
closs: 0.0808 celoss: 1.7410 dloss: -0.3401
closs: 0.0106 celoss: 0.4464 dloss: -1.0401
closs: 0.1394 celoss: 0.8560 dloss: -1.4389
closs: 0.0874 celoss: 1.5509 dloss: -0.8592
closs: 0.1644 celoss: 1.1979 dloss: -1.9854
closs: 0.1198 celoss: 0.9197 dloss: -0.9195
closs: 0.1065 celoss: 1.2109 dloss: -0.6373
closs: 0.0588 celoss: 1.0543 dloss: -0.5841
closs: 0.0050 celoss: 1.8805 dloss: -1.6217
closs: 0.2087 celoss: 4.1127 dloss: -0.5923
closs: 0.0425 celoss: 0.4801 dloss: -0.6069
closs: 0.0653 celoss: 0.5786 dloss: -0.2389
closs: 0.2031 celoss: 1.5584 dloss: -0.4824
closs: 0.2697 celoss: 0.9956 dloss: -0.7905
closs: 0.0298 celoss: 1.3744 dloss: -0.5592
closs: 0.0852 celoss: 1.8596 dloss: -0.2178
closs: 0.1423 celoss: 3.2036 dloss: -0.8012
closs: 0.0112 celoss: 0.7400 dloss: -0.7337
closs: 0.0452 celoss: 0.0967 dloss: -0.9006
closs: 0.1238 celoss: 0.9540 dloss: -0.4174
closs: 0.1372 celoss: 0.4022 dloss: -0.4906
closs: 0.0215 celoss: 0.7961 dloss: -0.7226
closs: 0.2029 celoss: 0.7700 dloss: -1.0545
closs: 0.2938 celoss: 1.3062 dloss: -0.4279
closs: 0.1345 celoss: 0.1922 dloss: -0.4158
closs: 0.1429 celoss: 3.1052 dloss: -0.6353
closs: 0.1459 celoss: 1.4670 dloss: -0.8719
closs: 0.0420 celoss: 0.3905 dloss: -1.0492
closs: 0.1515 celoss: 0.7394 dloss: -1.0913
closs: 0.0812 celoss: 2.3438 dloss: -0.7482
closs: 0.0015 celoss: 0.2092 dloss: -0.6132
closs: 0.0264 celoss: 0.9715 dloss: -0.5522
closs: 0.0316 celoss: 0.0637 dloss: -0.7099
closs: 0.2186 celoss: 2.0530 dloss: -0.8735
closs: 0.0712 celoss: 0.9859 dloss: -1.4845
closs: 0.1248 celoss: 0.9471 dloss: -0.4290
closs: 0.1304 celoss: 1.3958 dloss: -0.4913
closs: 0.0504 celoss: 0.7157 dloss: -1.0267
closs: 0.0273 celoss: 1.2975 dloss: -0.4367
closs: 0.0822 celoss: 1.7112 dloss: -0.4826
closs: 0.1391 celoss: 1.1379 dloss: -0.7480
Iter:  100/20000 Loss:0.5579 imps:0.5 Fin:Tue Sep 29 10:22:49 2020 lr: 0.0007
closs: 0.0558 celoss: 1.2818 dloss: -0.6047
closs: 0.0448 celoss: 0.1854 dloss: -0.6605
closs: 0.0711 celoss: 0.7248 dloss: -0.4592
closs: 0.0851 celoss: 1.1428 dloss: -0.6737
closs: 0.0606 celoss: 0.6375 dloss: -1.0760
closs: 0.1295 celoss: 1.7562 dloss: -0.6098
closs: 0.0650 celoss: 0.8115 dloss: -0.5792
closs: 0.0359 celoss: 0.6091 dloss: -0.3923
closs: 0.1426 celoss: 0.7211 dloss: -0.4499
closs: 0.0850 celoss: 0.5728 dloss: -1.3284
closs: 0.0153 celoss: 1.0915 dloss: -0.5173
closs: 0.0204 celoss: 0.6239 dloss: -0.5377
closs: 0.0717 celoss: 0.8700 dloss: -0.6135
closs: 0.1348 celoss: 0.9156 dloss: -0.6275
closs: 0.0953 celoss: 0.6156 dloss: -0.5429
closs: 0.0737 celoss: 1.3502 dloss: -0.3975
closs: 0.1044 celoss: 0.6590 dloss: -0.7249
closs: 0.0416 celoss: 0.1282 dloss: -0.7763
closs: 0.0668 celoss: 0.4131 dloss: -0.8943
closs: 0.1354 celoss: 2.1792 dloss: -0.7357
closs: 0.1223 celoss: 0.3392 dloss: -0.6979
closs: 0.0507 celoss: 1.1406 dloss: -0.7358
closs: 0.0800 celoss: 0.5590 dloss: -1.5224
closs: 0.1624 celoss: 1.3533 dloss: -0.4674
closs: 0.0646 celoss: 0.4244 dloss: -0.6408
closs: 0.0753 celoss: 1.2881 dloss: -1.0271
closs: 0.0161 celoss: 0.0051 dloss: -2.0526
closs: 0.1103 celoss: 2.5632 dloss: -0.7065
closs: 0.0040 celoss: 0.0918 dloss: -0.6101
closs: 0.0128 celoss: 0.6130 dloss: -0.8321
closs: 0.0266 celoss: 0.2074 dloss: -0.7096
closs: 0.0671 celoss: 0.2723 dloss: -0.7433
closs: 0.0234 celoss: 0.3458 dloss: -0.8951
closs: 0.1489 celoss: 0.5624 dloss: -0.7095
closs: 0.0199 celoss: 1.2300 dloss: -1.1305
closs: 0.1348 celoss: 0.8209 dloss: -0.6234
closs: 0.1849 celoss: 1.1255 dloss: -0.5713
closs: 0.0594 celoss: 0.8439 dloss: -0.7159
closs: 0.0013 celoss: 0.0482 dloss: -1.5655
closs: 0.0320 celoss: 0.3643 dloss: -0.6873
closs: 0.0661 celoss: 1.1748 dloss: -0.6629
closs: 0.0166 celoss: 1.2994 dloss: -0.5595
closs: 0.0715 celoss: 0.2884 dloss: -1.0227
closs: 0.0228 celoss: 0.9772 dloss: -0.6697
closs: 0.2057 celoss: 0.2226 dloss: -0.6602
closs: 0.0184 celoss: 1.2756 dloss: -0.8218
closs: 0.0130 celoss: 0.2805 dloss: -0.5967
closs: 0.1203 celoss: 1.2384 dloss: -0.5204
closs: 0.1348 celoss: 0.4711 dloss: -0.6461
closs: 0.0274 celoss: 0.2033 dloss: -0.6084
Iter:  150/20000 Loss:0.0987 imps:0.5 Fin:Tue Sep 29 12:06:49 2020 lr: 0.0007
closs: 0.4298 celoss: 1.5731 dloss: -0.9354
closs: 0.0787 celoss: 3.2816 dloss: -0.7151
closs: 0.0946 celoss: 1.3763 dloss: -0.9236
closs: 0.1766 celoss: 0.9614 dloss: -0.7405
closs: 0.0379 celoss: 2.2669 dloss: -0.2658
closs: 0.1181 celoss: 0.6767 dloss: -0.7966
closs: 0.2176 celoss: 1.7115 dloss: -1.3958
closs: 0.2362 celoss: 0.9105 dloss: -0.9004
closs: 0.1016 celoss: 0.8248 dloss: -0.7804
closs: 0.0429 celoss: 2.7413 dloss: -1.1431
closs: 0.0257 celoss: 0.7514 dloss: -0.6107
closs: 0.5349 celoss: 0.9333 dloss: -0.5871
closs: 0.1494 celoss: 0.2364 dloss: -0.5980
closs: 0.2620 celoss: 0.7616 dloss: -0.5764
closs: 0.0516 celoss: 0.8762 dloss: -0.6489
closs: 0.2253 celoss: 0.2913 dloss: -0.5292
closs: 0.2918 celoss: 2.4652 dloss: -0.7501
closs: 0.3032 celoss: 3.9186 dloss: -0.3560
closs: 0.7368 celoss: 5.9555 dloss: -0.1446
closs: 3.4404 celoss: 28.8456 dloss: -0.2489
closs: 66284.4844 celoss: 1845719.8750 dloss: -0.1338
closs: nan celoss: nan dloss: nan
closs: nan celoss: nan dloss: nan
closs: nan celoss: nan dloss: nan
convnets commented 3 years ago

It seems that the issue is this code snippet:

    # Build background-only and foreground-only label maps, using 255 as
    # the ignore index so each CE term only sees its own pixels.
    seg_label_copy = torch.squeeze(seg_label_tensor.clone())
    bg_label = seg_label_copy.clone()
    fg_label = seg_label_copy.clone()
    bg_label[seg_label_copy != 0] = 255  # mask out all foreground pixels
    fg_label[seg_label_copy == 0] = 255  # mask out all background pixels
    bg_celoss = critersion(pred, bg_label.long().cuda())
    fg_celoss = critersion(pred, fg_label.long().cuda())
    celoss = bg_celoss + fg_celoss

Separating bg_celoss and fg_celoss makes training very unstable. I'm not sure why applying the criterion twice to the pred tensor would drive the loss to NaN. I hope the author can explain why the code is designed this way. @zbf1991
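
If the instability comes from one of the two label maps being entirely 255 (i.e. no valid seed pixels for that term), a guard such as the following might help. This is only a sketch written against the snippet above, assuming `critersion = nn.CrossEntropyLoss(ignore_index=255)`; the `bg_valid`/`fg_valid` checks and the zero fallback are illustrative, not part of the repo:

    # Sketch (not repo code): with mean reduction, a CE term whose target
    # consists entirely of ignore_index pixels reduces 0/0 and returns NaN,
    # which then poisons the whole backward pass. Skip such a term instead
    # of computing it.
    zero = torch.tensor(0.0, device=pred.device)
    bg_valid = (bg_label != 255).any()
    fg_valid = (fg_label != 255).any()
    bg_celoss = critersion(pred, bg_label.long().cuda()) if bg_valid else zero
    fg_celoss = critersion(pred, fg_label.long().cuda()) if fg_valid else zero
    celoss = bg_celoss + fg_celoss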

zbf1991 commented 3 years ago

Maybe in some cases there is no bg/fg seed in the pseudo label from the classification branch, which makes the prediction unstable. I tried using a single celoss, but the performance was not satisfactory. Using a larger batch size may make the model more stable. Sorry about that.
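
Besides a larger batch size, a common stopgap for this kind of blow-up (the celoss jumps to 1845719.8750 right before the NaNs) is gradient clipping. This is a generic sketch, not code from this repo; `model`, `optimizer`, and `loss` stand for the usual training-loop objects:

    # Generic mitigation sketch (not from this repo): clip the global
    # gradient norm before the optimizer step so a single bad batch
    # cannot blow up the weights.
    loss.backward()
    torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=10.0)
    optimizer.step()
    optimizer.zero_grad()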