Closed Akinpzx closed 2 years ago
@Akinpzx Hi, thanks for your attention. Could you share the config?
I think you are probably a Chinese speaker, so I'll just write in Chinese. The config I used is the ADE20K config file from the segmentation folder in your repo, with some modifications. Could that be the main cause?
@Akinpzx I'll try running this dataset and get back to you a bit later.
If possible, and if your results look reasonable, could you share the config file with me?
Sure.
Any progress so far? How do the results look?
GPUs have been in short supply lately; the training job just got scheduled. I can give you a preliminary result tonight.
@Akinpzx Which version of mmseg are you using? It looks like the Potsdam dataset requires upgrading to v0.21 or later.
The mmseg version I'm using is the one specified in your README.
2022-08-30 17:43:17,895 - mmseg - INFO - Iter [650/80000] lr: 6.162e-07, eta: 16:46:51, time: 0.734, data_time: 0.011, memory: 14739, decode.loss_cls: 1.1344, decode.loss_mask: 1.8733, decode.loss_dice: 2.7332, decode.d0.loss_cls: 3.9326, decode.d0.loss_mask: 1.8059, decode.d0.loss_dice: 2.8472, decode.d1.loss_cls: 1.0693, decode.d1.loss_mask: 1.7949, decode.d1.loss_dice: 2.6332, decode.d2.loss_cls: 0.8967, decode.d2.loss_mask: 1.8853, decode.d2.loss_dice: 2.6762, decode.d3.loss_cls: 0.8909, decode.d3.loss_mask: 1.8976, decode.d3.loss_dice: 2.6680, decode.d4.loss_cls: 0.9083, decode.d4.loss_mask: 1.9004, decode.d4.loss_dice: 2.6463, decode.d5.loss_cls: 0.9199, decode.d5.loss_mask: 1.9178, decode.d5.loss_dice: 2.6426, decode.d6.loss_cls: 0.9489, decode.d6.loss_mask: 1.9419, decode.d6.loss_dice: 2.6575, decode.d7.loss_cls: 1.0075, decode.d7.loss_mask: 1.9297, decode.d7.loss_dice: 2.6561, decode.d8.loss_cls: 1.0386, decode.d8.loss_mask: 1.9060, decode.d8.loss_dice: 2.6876, loss: 58.4478
2022-08-30 17:43:55,960 - mmseg - INFO - Iter [700/80000] lr: 6.632e-07, eta: 16:46:14, time: 0.762, data_time: 0.010, memory: 14739, decode.loss_cls: 1.0483, decode.loss_mask: 1.7637, decode.loss_dice: 2.6110, decode.d0.loss_cls: 3.9144, decode.d0.loss_mask: 1.7102, decode.d0.loss_dice: 2.8163, decode.d1.loss_cls: 1.0022, decode.d1.loss_mask: 1.7091, decode.d1.loss_dice: 2.5921, decode.d2.loss_cls: 0.8398, decode.d2.loss_mask: 1.7653, decode.d2.loss_dice: 2.6079, decode.d3.loss_cls: 0.8310, decode.d3.loss_mask: 1.7782, decode.d3.loss_dice: 2.5665, decode.d4.loss_cls: 0.8056, decode.d4.loss_mask: 1.8115, decode.d4.loss_dice: 2.5830, decode.d5.loss_cls: 0.8257, decode.d5.loss_mask: 1.7929, decode.d5.loss_dice: 2.5601, decode.d6.loss_cls: 0.8534, decode.d6.loss_mask: 1.8186, decode.d6.loss_dice: 2.5559, decode.d7.loss_cls: 0.8931, decode.d7.loss_mask: 1.7994, decode.d7.loss_dice: 2.5683, decode.d8.loss_cls: 0.9442, decode.d8.loss_mask: 1.7963, decode.d8.loss_dice: 2.5709, loss: 55.7350
2022-08-30 17:44:35,093 - mmseg - INFO - Iter [750/80000] lr: 7.102e-07, eta: 16:47:28, time: 0.782, data_time: 0.010, memory: 14739, decode.loss_cls: 0.9884, decode.loss_mask: 1.7916, decode.loss_dice: 2.5596, decode.d0.loss_cls: 3.9064, decode.d0.loss_mask: 1.7048, decode.d0.loss_dice: 2.7288, decode.d1.loss_cls: 0.9465, decode.d1.loss_mask: 1.7243, decode.d1.loss_dice: 2.5194, decode.d2.loss_cls: 0.7897, decode.d2.loss_mask: 1.7706, decode.d2.loss_dice: 2.5032, decode.d3.loss_cls: 0.7638, decode.d3.loss_mask: 1.8169, decode.d3.loss_dice: 2.4766, decode.d4.loss_cls: 0.7462, decode.d4.loss_mask: 1.8093, decode.d4.loss_dice: 2.4968, decode.d5.loss_cls: 0.7585, decode.d5.loss_mask: 1.8079, decode.d5.loss_dice: 2.4886, decode.d6.loss_cls: 0.7973, decode.d6.loss_mask: 1.8149, decode.d6.loss_dice: 2.5059, decode.d7.loss_cls: 0.8530, decode.d7.loss_mask: 1.8230, decode.d7.loss_dice: 2.5218, decode.d8.loss_cls: 0.9020, decode.d8.loss_mask: 1.8219, decode.d8.loss_dice: 2.5423, loss: 54.6799
2022-08-30 17:45:11,404 - mmseg - INFO - Iter [800/80000] lr: 7.572e-07, eta: 16:43:49, time: 0.726, data_time: 0.009, memory: 14739, decode.loss_cls: 0.9292, decode.loss_mask: 1.7690, decode.loss_dice: 2.4137, decode.d0.loss_cls: 3.8851, decode.d0.loss_mask: 1.7016, decode.d0.loss_dice: 2.6155, decode.d1.loss_cls: 0.8755, decode.d1.loss_mask: 1.7264, decode.d1.loss_dice: 2.3946, decode.d2.loss_cls: 0.7130, decode.d2.loss_mask: 1.7955, decode.d2.loss_dice: 2.3777, decode.d3.loss_cls: 0.7156, decode.d3.loss_mask: 1.8411, decode.d3.loss_dice: 2.3394, decode.d4.loss_cls: 0.6896, decode.d4.loss_mask: 1.8178, decode.d4.loss_dice: 2.3553, decode.d5.loss_cls: 0.6966, decode.d5.loss_mask: 1.8421, decode.d5.loss_dice: 2.3723, decode.d6.loss_cls: 0.7374, decode.d6.loss_mask: 1.8207, decode.d6.loss_dice: 2.3756, decode.d7.loss_cls: 0.7677, decode.d7.loss_mask: 1.8179, decode.d7.loss_dice: 2.3912, decode.d8.loss_cls: 0.8384, decode.d8.loss_mask: 1.7937, decode.d8.loss_dice: 2.3839, loss: 52.7929
2022-08-30 17:45:51,429 - mmseg - INFO - Iter [850/80000] lr: 8.040e-07, eta: 16:46:17, time: 0.800, data_time: 0.009, memory: 14739, decode.loss_cls: 0.8088, decode.loss_mask: 1.6895, decode.loss_dice: 2.3235, decode.d0.loss_cls: 3.8720, decode.d0.loss_mask: 1.6149, decode.d0.loss_dice: 2.5795, decode.d1.loss_cls: 0.7906, decode.d1.loss_mask: 1.6328, decode.d1.loss_dice: 2.3551, decode.d2.loss_cls: 0.5908, decode.d2.loss_mask: 1.7062, decode.d2.loss_dice: 2.3469, decode.d3.loss_cls: 0.5752, decode.d3.loss_mask: 1.7307, decode.d3.loss_dice: 2.3066, decode.d4.loss_cls: 0.5777, decode.d4.loss_mask: 1.7321, decode.d4.loss_dice: 2.2958, decode.d5.loss_cls: 0.5975, decode.d5.loss_mask: 1.7426, decode.d5.loss_dice: 2.3051, decode.d6.loss_cls: 0.6213, decode.d6.loss_mask: 1.7161, decode.d6.loss_dice: 2.3165, decode.d7.loss_cls: 0.6618, decode.d7.loss_mask: 1.7041, decode.d7.loss_dice: 2.3114, decode.d8.loss_cls: 0.7209, decode.d8.loss_mask: 1.7046, decode.d8.loss_dice: 2.3104, loss: 50.2410
2022-08-30 17:46:31,371 - mmseg - INFO - Iter [900/80000] lr: 8.509e-07, eta: 16:48:17, time: 0.799, data_time: 0.063, memory: 14739, decode.loss_cls: 0.7840, decode.loss_mask: 1.7451, decode.loss_dice: 2.3271, decode.d0.loss_cls: 3.8630, decode.d0.loss_mask: 1.6362, decode.d0.loss_dice: 2.5202, decode.d1.loss_cls: 0.7439, decode.d1.loss_mask: 1.6850, decode.d1.loss_dice: 2.3303, decode.d2.loss_cls: 0.5860, decode.d2.loss_mask: 1.7597, decode.d2.loss_dice: 2.3119, decode.d3.loss_cls: 0.5553, decode.d3.loss_mask: 1.8050, decode.d3.loss_dice: 2.2938, decode.d4.loss_cls: 0.5440, decode.d4.loss_mask: 1.8150, decode.d4.loss_dice: 2.3085, decode.d5.loss_cls: 0.5553, decode.d5.loss_mask: 1.8015, decode.d5.loss_dice: 2.3181, decode.d6.loss_cls: 0.5917, decode.d6.loss_mask: 1.7841, decode.d6.loss_dice: 2.2859, decode.d7.loss_cls: 0.6297, decode.d7.loss_mask: 1.7732, decode.d7.loss_dice: 2.3119, decode.d8.loss_cls: 0.6862, decode.d8.loss_mask: 1.7670, decode.d8.loss_dice: 2.3097, loss: 50.4283
What about mIoU, acc, etc.? As for the loss, I can't open my log file right now so I don't remember exactly, but I recall it started at 160-something and finally converged to around 20. The performance still seems rather poor?
@Akinpzx This is the mIoU at 8000/80000 iters; the full run still needs another 16 hours.
2022-08-30 19:21:49,013 - mmseg - INFO - Iter [8000/80000] lr: 1.292e-06, eta: 16:00:51, time: 1.284, data_time: 0.010, memory: 14739, decode.loss_cls: 0.1364, decode.loss_mask: 1.0678, decode.loss_dice: 1.2613, decode.d0.loss_cls: 1.1983, decode.d0.loss_mask: 1.0618, decode.d0.loss_dice: 1.2902, decode.d1.loss_cls: 0.1540, decode.d1.loss_mask: 1.0613, decode.d1.loss_dice: 1.2739, decode.d2.loss_cls: 0.1577, decode.d2.loss_mask: 1.0582, decode.d2.loss_dice: 1.2563, decode.d3.loss_cls: 0.1552, decode.d3.loss_mask: 1.0524, decode.d3.loss_dice: 1.2437, decode.d4.loss_cls: 0.1612, decode.d4.loss_mask: 1.0587, decode.d4.loss_dice: 1.2411, decode.d5.loss_cls: 0.1574, decode.d5.loss_mask: 1.0585, decode.d5.loss_dice: 1.2433, decode.d6.loss_cls: 0.1523, decode.d6.loss_mask: 1.0612, decode.d6.loss_dice: 1.2478, decode.d7.loss_cls: 0.1546, decode.d7.loss_mask: 1.0591, decode.d7.loss_dice: 1.2467, decode.d8.loss_cls: 0.1480, decode.d8.loss_mask: 1.0647, decode.d8.loss_dice: 1.2537, loss: 25.7369
[>>>>>>>>>>>>>>>>>>>>>>>>>>>] 2016/2016, 8.5 task/s, elapsed: 237s, ETA: 0s
2022-08-30 19:25:46,489 - mmseg - INFO - per class results:
2022-08-30 19:25:46,498 - mmseg - INFO -
+--------------------+-------+-------+
| Class | IoU | Acc |
+--------------------+-------+-------+
| impervious_surface | 87.92 | 94.09 |
| building | 94.29 | 97.25 |
| low_vegetation | 78.19 | 89.95 |
| tree | 80.55 | 87.61 |
| car | 91.87 | 97.72 |
| clutter | 42.05 | 52.84 |
+--------------------+-------+-------+
2022-08-30 19:25:46,498 - mmseg - INFO - Summary:
2022-08-30 19:25:46,498 - mmseg - INFO -
+------+-------+-------+
| aAcc | mIoU | mAcc |
+------+-------+-------+
| 91.2 | 79.14 | 86.58 |
+------+-------+-------+
2022-08-30 19:26:08,547 - mmseg - INFO - Now best checkpoint is saved as best_mIoU_iter_8000.pth.
2022-08-30 19:26:08,558 - mmseg - INFO - Best mIoU is 0.7914 at 8000 iter.
2022-08-30 19:26:08,574 - mmseg - INFO - Exp name: mask2former_beit_adapter_large_512_80k_potsdam_ss.py
Wow. I hope you can share the config file; my low numbers must have come from a misconfiguration on my end. I haven't been studying semantic segmentation for long, so I'm still not very familiar with it.
@Akinpzx I've updated the config; you can give it a try. Also, please confirm your mmseg version: if it is v0.20.2, you can't run the Potsdam dataset directly, because it was only added in v0.21.
2022-08-30 21:08:50,453 - mmseg - INFO - Exp name: mask2former_beit_adapter_large_512_80k_potsdam_ss.py
2022-08-30 21:08:50,459 - mmseg - INFO - Iter [16000/80000] lr: 1.149e-06, eta: 14:15:07, time: 1.269, data_time: 0.064, memory: 14739, decode.loss_cls: 0.1133, decode.loss_mask: 0.8411, decode.loss_dice: 1.0110, decode.d0.loss_cls: 0.3088, decode.d0.loss_mask: 0.8615, decode.d0.loss_dice: 1.0595, decode.d1.loss_cls: 0.1243, decode.d1.loss_mask: 0.8513, decode.d1.loss_dice: 1.0372, decode.d2.loss_cls: 0.1146, decode.d2.loss_mask: 0.8482, decode.d2.loss_dice: 1.0249, decode.d3.loss_cls: 0.1102, decode.d3.loss_mask: 0.8433, decode.d3.loss_dice: 1.0234, decode.d4.loss_cls: 0.1071, decode.d4.loss_mask: 0.8428, decode.d4.loss_dice: 1.0266, decode.d5.loss_cls: 0.1121, decode.d5.loss_mask: 0.8383, decode.d5.loss_dice: 1.0219, decode.d6.loss_cls: 0.1120, decode.d6.loss_mask: 0.8406, decode.d6.loss_dice: 1.0128, decode.d7.loss_cls: 0.1144, decode.d7.loss_mask: 0.8400, decode.d7.loss_dice: 1.0127, decode.d8.loss_cls: 0.1059, decode.d8.loss_mask: 0.8423, decode.d8.loss_dice: 1.0103, loss: 20.0125
[>>>>>>>>>>>>>>>>>>>>>>>>>>>] 2016/2016, 58.8 task/s, elapsed: 34s, ETA: 0s
2022-08-30 21:09:25,874 - mmseg - INFO - per class results:
2022-08-30 21:09:25,883 - mmseg - INFO -
+--------------------+-------+-------+
| Class | IoU | Acc |
+--------------------+-------+-------+
| impervious_surface | 88.24 | 93.48 |
| building | 94.79 | 98.11 |
| low_vegetation | 78.63 | 90.54 |
| tree | 81.08 | 88.79 |
| car | 92.88 | 96.93 |
| clutter | 43.8 | 52.93 |
+--------------------+-------+-------+
2022-08-30 21:09:25,883 - mmseg - INFO - Summary:
2022-08-30 21:09:25,883 - mmseg - INFO -
+-------+------+------+
| aAcc | mIoU | mAcc |
+-------+------+------+
| 91.54 | 79.9 | 86.8 |
+-------+------+------+
2022-08-30 21:09:26,071 - mmseg - INFO - The previous best checkpoint /mnt/petrelfs/chenzhe1/workspace/ViT-Adapter/segmentation/work_dirs/mask2former_beit_adapter_large_512_80k_potsdam_ss/best_mIoU_iter_8000.pth was removed
2022-08-30 21:09:49,308 - mmseg - INFO - Now best checkpoint is saved as best_mIoU_iter_16000.pth.
2022-08-30 21:09:49,322 - mmseg - INFO - Best mIoU is 0.7990 at 16000 iter.
2022-08-30 21:09:49,338 - mmseg - INFO - Exp name: mask2former_beit_adapter_large_512_80k_potsdam_ss.py
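As an aside, the version gate mentioned above (Potsdam support landing in mmseg v0.21) can be checked programmatically. The sketch below is illustrative and uses only the standard library; the helper names are mine, and in practice you would pass in `mmseg.__version__` from your installed MMSegmentation:

```python
# Minimal sketch: check whether an installed mmseg version string meets
# the v0.21 requirement for the Potsdam dataset. Helper names are
# hypothetical; only the v0.21 threshold comes from the thread above.

def parse_version(v: str) -> tuple:
    """Turn a version string like '0.20.2' into a comparable int tuple."""
    return tuple(int(part) for part in v.split(".") if part.isdigit())

def supports_potsdam(version: str) -> bool:
    # Potsdam dataset support was added in mmseg v0.21.
    return parse_version(version) >= (0, 21)

if __name__ == "__main__":
    # In a real environment: import mmseg; supports_potsdam(mmseg.__version__)
    print(supports_potsdam("0.20.2"))  # False: needs an upgrade
    print(supports_potsdam("0.21.1"))  # True
```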
Got it, thanks a lot for your help. I'll go try it right away.
@Akinpzx The final result after the full training run is around 80.
2022-08-31 10:56:42,526 - mmseg - INFO - Iter [80000/80000] lr: 1.795e-11, eta: 0:00:00, time: 1.268, data_time: 0.021, memory: 14739, decode.loss_cls: 0.0268, decode.loss_mask: 0.4506, decode.loss_dice: 0.5791, decode.d0.loss_cls: 0.0967, decode.d0.loss_mask: 0.4644, decode.d0.loss_dice: 0.6125, decode.d1.loss_cls: 0.0379, decode.d1.loss_mask: 0.4511, decode.d1.loss_dice: 0.5910, decode.d2.loss_cls: 0.0320, decode.d2.loss_mask: 0.4499, decode.d2.loss_dice: 0.5850, decode.d3.loss_cls: 0.0243, decode.d3.loss_mask: 0.4511, decode.d3.loss_dice: 0.5793, decode.d4.loss_cls: 0.0291, decode.d4.loss_mask: 0.4510, decode.d4.loss_dice: 0.5734, decode.d5.loss_cls: 0.0285, decode.d5.loss_mask: 0.4506, decode.d5.loss_dice: 0.5794, decode.d6.loss_cls: 0.0260, decode.d6.loss_mask: 0.4507, decode.d6.loss_dice: 0.5725, decode.d7.loss_cls: 0.0252, decode.d7.loss_mask: 0.4511, decode.d7.loss_dice: 0.5835, decode.d8.loss_cls: 0.0252, decode.d8.loss_mask: 0.4507, decode.d8.loss_dice: 0.5796, loss: 10.7083
[>>>>>>>>>>>>>>>>>>>>>>>>>>>] 2016/2016, 57.2 task/s, elapsed: 35s, ETA: 0s
2022-08-31 10:57:18,443 - mmseg - INFO - per class results:
2022-08-31 10:57:18,448 - mmseg - INFO -
+--------------------+-------+-------+
| Class | IoU | Acc |
+--------------------+-------+-------+
| impervious_surface | 88.18 | 94.11 |
| building | 94.59 | 98.03 |
| low_vegetation | 78.84 | 90.53 |
| tree | 81.02 | 88.54 |
| car | 93.33 | 96.91 |
| clutter | 43.91 | 50.91 |
+--------------------+-------+-------+
2022-08-31 10:57:18,448 - mmseg - INFO - Summary:
2022-08-31 10:57:18,449 - mmseg - INFO -
+-------+-------+-------+
| aAcc | mIoU | mAcc |
+-------+-------+-------+
| 91.58 | 79.98 | 86.51 |
+-------+-------+-------+
2022-08-31 10:57:18,456 - mmseg - INFO - Exp name: mask2former_beit_adapter_large_512_80k_potsdam_ss.py
Thanks a lot for your help!
@czczup One more question I'd like to ask: where do I set up resuming training from a checkpoint?
sh dist_train.sh <config> <gpu-num> --resume-from <checkpoint-path>
Use --resume-from to specify the checkpoint.
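Alternatively, resuming can be set in the config file itself. This is a hedged sketch of an mmseg-style config fragment; the checkpoint path below is illustrative, not taken from the thread:

```python
# Hypothetical mmseg config fragment (path is illustrative).
# resume_from restores the model weights, the optimizer state, AND the
# iteration counter, so training continues where it stopped.
resume_from = 'work_dirs/mask2former_beit_adapter_large_512_80k_potsdam_ss/latest.pth'

# By contrast, load_from only initializes the model weights and starts
# training again from iteration 0:
# load_from = 'path/to/pretrained.pth'
```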
@czczup I ran into an error.
@czczup I solved the problem by replacing sh with bash, though I don't really understand why that works.
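For what it's worth, a likely explanation (an assumption, since the script contents aren't shown here): distributed launch scripts like dist_train.sh often use bash-only features such as `${@:3}` slicing of positional arguments, which a strict POSIX `sh` (e.g. dash) rejects. A minimal illustration of such a bash-only construct:

```shell
# Illustration only: "${@:2}" (slice positional args from the 2nd onward)
# is a bash feature. A script using it fails under POSIX sh but works
# under bash, which would explain the sh -> bash fix above.
pass_through() {
    echo "${@:2}"
}
result=$(pass_through config.py 2 --resume-from ckpt.pth)
echo "$result"
```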
The mIoU and acc are very low but the loss is high