Closed 754467737 closed 1 year ago
Another problem: I generated backbone-corssformer-s.pth following the CrossFormer segmentation instructions and then followed the Training instructions exactly, yet the test result after training is poor.
| monitor             |   0.0 |   0.0 |
| bulletin board      |   0.0 |   0.0 |
| shower              |   0.0 |   0.0 |
| radiator            |   0.0 |   0.0 |
| glass               |   0.0 |   0.0 |
| clock               |   0.0 |   0.0 |
| flag                |   0.0 |   0.0 |
+---------------------+-------+-------+
2022-11-04 14:55:17,955 - mmseg - INFO - Summary:
2022-11-04 14:55:17,955 - mmseg - INFO -
+--------+-------+-------+-------+
| Scope  | mIoU  | mAcc  | aAcc  |
+--------+-------+-------+-------+
| global | 17.36 | 24.02 | 70.89 |
+--------+-------+-------+-------+
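A summary like this is easier to interpret with the metric definition in mind: mIoU is the unweighted mean of per-class IoU over all 150 ADE20K classes, so a large block of 0.0 classes drags it far below aAcc. A minimal sketch, using a purely hypothetical class split (not taken from the actual log) that happens to reproduce the reported 17.36:

```python
# Sketch: mIoU averages IoU uniformly over classes, so classes stuck at 0.0
# dominate the mean even when overall pixel accuracy (aAcc) looks tolerable.
def mean_iou(ious):
    return sum(ious) / len(ious)

# Hypothetical split of the 150 classes: 120 fail completely, 30 average 86.8 IoU.
ious = [0.0] * 120 + [86.8] * 30
miou = mean_iou(ious)
```

With this split, `round(miou, 2)` equals the 17.36 shown in the table, illustrating how a poorly initialized backbone can leave most classes at zero.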
At the very beginning of the training log, we record whether the pretrained weights were loaded correctly. The key words are Missing keys: xxxx, Unexpected keys: xxxxx, or something similar. In theory, the unexpected keys should only contain the classification-head weights. If the unexpected keys contain anything else, the model may not have been loaded correctly (perhaps because of wrong weight names).
The following log indicates the model is loaded correctly, meaning only the classification-head weights are not loaded.
It seems you did not load the pretrained weights successfully. You should check whether your mmdet version is 2.8.0 as we require, since in higher versions of mmdet you have to call model.init_weights() in train.py manually. Anyway, as cheerss mentioned, you should see Unexpected keys: xxxxx at the very beginning of the training log if the weights load successfully.
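The version caveat can be made mechanical. A hedged sketch (the 2.8.0 threshold comes from the comment above; the parsing is a simplification that ignores rc/dev suffixes) of deciding whether a manual model.init_weights() call is needed:

```python
# Sketch: compare the installed mmdet version against 2.8.0; per the comment
# above, newer versions require calling model.init_weights() manually.
def needs_manual_init_weights(version: str) -> bool:
    parts = tuple(int(p) for p in version.split(".")[:3])
    return parts > (2, 8, 0)
```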
My English is poor, so let me describe exactly what I did following the steps you both gave; please tell me where it goes wrong:
1. Downloaded your CrossFormer-S model pretrained on ImageNet.
2. Generated backbone-corssformer-s.pth as instructed:
import torch
ckpt = torch.load("crossformer-s.pth") ## load classification checkpoint
torch.save(ckpt["model"], "backbone-corssformer-s.pth") ## only model weights are needed
3. Ran training and testing with ./dist_train.sh configs/fpn_crossformer_b_ade20k_40k.py 8 path/to/backbone-corssformer-s.pth. The config and training process are shown above; no loading errors appeared, but the final result is still very poor.
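One quick sanity check on step 2 (a hedged sketch, with plain dicts standing in for the torch checkpoint): confirm that what was saved is the inner state_dict, not the wrapper dict.

```python
# Sketch: classification checkpoints are often saved as a wrapper
# {"model": state_dict, ...}; saving the wrapper instead of ckpt["model"]
# makes every weight an "unexpected key" at load time.
def looks_like_state_dict(obj):
    # Real state_dict keys are dotted module paths, e.g. "patch_embed.projs.0.weight".
    return isinstance(obj, dict) and bool(obj) and all("." in k for k in obj)

wrapper = {"model": {"patch_embed.projs.0.weight": None}, "epoch": 300}
```

Here `looks_like_state_dict(wrapper)` is False while `looks_like_state_dict(wrapper["model"])` is True, which is the shape the extraction step should produce.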
I followed your steps exactly, and I cannot tell where it went wrong.
The log file does not contain the problem you described. The model and training-start sections look like this:
2022-11-04 13:43:24,067 - mmseg - INFO - Distributed training: True
2022-11-04 13:43:24,449 - mmseg - INFO - Config:
norm_cfg = dict(type='SyncBN', requires_grad=True)
model = dict(
    type='EncoderDecoder',
    pretrained='backbone-corssformer-s.pth',
    backbone=dict(
        type='CrossFormer_S',
        depth=50,
        num_stages=4,
        out_indices=(0, 1, 2, 3),
        dilations=(1, 1, 1, 1),
        strides=(1, 2, 2, 2),
        norm_cfg=dict(type='SyncBN', requires_grad=True),
        norm_eval=False,
        style='pytorch',
        contract_dilation=True,
        group_size=[14, 14, 7, 7],
        crs_interval=[16, 8, 2, 1]),
    neck=dict(
        type='FPN',
        in_channels=[96, 192, 384, 768],
        out_channels=256,
        num_outs=4),
    decode_head=dict(
        type='FPNHead',
        in_channels=[256, 256, 256, 256],
        in_index=[0, 1, 2, 3],
        feature_strides=[4, 8, 16, 32],
        channels=128,
        dropout_ratio=0.1,
        num_classes=150,
        norm_cfg=dict(type='SyncBN', requires_grad=True),
        align_corners=False,
        loss_decode=dict(
            type='CrossEntropyLoss', use_sigmoid=False, loss_weight=1.0)),
    train_cfg=dict(),
    test_cfg=dict(mode='whole'))
2022-11-04 13:43:29,940 - mmseg - INFO - Hooks will be executed in the following order:
before_run:
(VERY_HIGH ) PolyLrUpdaterHook
(NORMAL ) CheckpointHook
(VERY_LOW ) TextLoggerHook
before_train_epoch:
(VERY_HIGH ) PolyLrUpdaterHook
(LOW ) IterTimerHook
(VERY_LOW ) TextLoggerHook
before_train_iter:
(VERY_HIGH ) PolyLrUpdaterHook
(LOW ) IterTimerHook
after_train_iter:
(ABOVE_NORMAL) OptimizerHook
(NORMAL ) CheckpointHook
(NORMAL ) DistEvalHook
(LOW ) IterTimerHook
(VERY_LOW ) TextLoggerHook
after_train_epoch:
(NORMAL ) CheckpointHook
(NORMAL ) DistEvalHook
(VERY_LOW ) TextLoggerHook
before_val_epoch:
(LOW ) IterTimerHook
(VERY_LOW ) TextLoggerHook
before_val_iter:
(LOW ) IterTimerHook
after_val_iter:
(LOW ) IterTimerHook
after_val_epoch:
(VERY_LOW ) TextLoggerHook
after_run:
(VERY_LOW ) TextLoggerHook
2022-11-04 13:43:29,944 - mmseg - INFO - workflow: [('train', 1)], max: 80000 iters
2022-11-04 13:43:29,944 - mmseg - INFO - Checkpoints will be saved to /home/**/code/Transformer/CrossFormer-main/segmentation/seg-output by HardDiskBackend.
2022-11-04 13:44:24,903 - mmseg - INFO - Iter [50/80000] lr: 9.994e-05, eta: 12:04:39, time: 0.544, data_time: 0.010, memory: 8092, decode.loss_seg: 3.4683, decode.acc_seg: 16.6169, loss: 3.4683
2022-11-04 13:44:49,215 - mmseg - INFO - Iter [100/80000] lr: 9.989e-05, eta: 11:25:51, time: 0.486, data_time: 0.005, memory: 8092, decode.loss_seg: 2.8443, decode.acc_seg: 26.7270, loss: 2.8443
2022-11-04 13:45:14,513 - mmseg - INFO - Iter [150/80000] lr: 9.983e-05, eta: 11:21:23, time: 0.506, data_time: 0.005, memory: 8092, decode.loss_seg: 2.4701, decode.acc_seg: 31.2440, loss: 2.4701
2022-11-04 13:45:40,163 - mmseg - INFO - Iter [200/80000] lr: 9.978e-05, eta: 11:21:17, time: 0.513, data_time: 0.005, memory: 8092, decode.loss_seg: 2.2140, decode.acc_seg: 36.1699, loss: 2.2140
2022-11-04 13:46:05,905 - mmseg - INFO - Iter [250/80000] lr: 9.972e-05, eta: 11:21:33, time: 0.515, data_time: 0.005, memory: 8092, decode.loss_seg: 2.0427, decode.acc_seg: 39.1546, loss: 2.0427
2022-11-04 13:46:31,843 - mmseg - INFO - Iter [300/80000] lr: 9.966e-05, eta: 11:22:26, time: 0.519, data_time: 0.005, memory: 8092, decode.loss_seg: 2.1060, decode.acc_seg: 34.4056, loss: 2.1060
2022-11-04 13:46:57,434 - mmseg - INFO - Iter [350/80000] lr: 9.961e-05, eta: 11:21:39, time: 0.512, data_time: 0.005, memory: 8092, decode.loss_seg: 2.0308, decode.acc_seg: 39.0574, loss: 2.0308
2022-11-04 13:47:23,171 - mmseg - INFO - Iter [400/80000] lr: 9.955e-05, eta: 11:21:25, time: 0.515, data_time: 0.005, memory: 8092, decode.loss_seg: 1.8884, decode.acc_seg: 40.4996, loss: 1.8884
2022-11-04 13:47:49,108 - mmseg - INFO - Iter [450/80000] lr: 9.949e-05, eta: 11:21:45, time: 0.519, data_time: 0.005, memory: 8092, decode.loss_seg: 1.9453, decode.acc_seg: 41.3078, loss: 1.9453
2022-11-04 13:48:14,501 - mmseg - INFO - Iter [500/80000] lr: 9.944e-05, eta: 11:20:28, time: 0.508, data_time: 0.005, memory: 8092, decode.loss_seg: 2.0508, decode.acc_seg: 41.2130, loss: 2.0508
2022-11-04 13:48:40,141 - mmseg - INFO - Iter [550/80000] lr: 9.938e-05, eta: 11:19:57, time: 0.513, data_time: 0.005, memory: 8092, decode.loss_seg: 1.8725, decode.acc_seg: 41.9013, loss: 1.8725
2022-11-04 13:49:06,029 - mmseg - INFO - Iter [600/80000] lr: 9.933e-05, eta: 11:20:00, time: 0.518, data_time: 0.005, memory: 8092, decode.loss_seg: 1.6608, decode.acc_seg: 42.7472, loss: 1.6608
2022-11-04 13:49:32,040 - mmseg - INFO - Iter [650/80000] lr: 9.927e-05, eta: 11:20:13, time: 0.520, data_time: 0.005, memory: 8092, decode.loss_seg: 1.6577, decode.acc_seg: 44.3362, loss: 1.6577
2022-11-04 13:49:57,890 - mmseg - INFO - Iter [700/80000] lr: 9.921e-05, eta: 11:20:02, time: 0.517, data_time: 0.005, memory: 8092, decode.loss_seg: 1.5889, decode.acc_seg: 46.3231, loss: 1.5889
2022-11-04 13:50:23,732 - mmseg - INFO - Iter [750/80000] lr: 9.916e-05, eta: 11:19:48, time: 0.517, data_time: 0.005, memory: 8092, decode.loss_seg: 1.7936, decode.acc_seg: 42.7250, loss: 1.7936
2022-11-04 13:50:49,560 - mmseg - INFO - Iter [800/80000] lr: 9.910e-05, eta: 11:19:32, time: 0.517, data_time: 0.005, memory: 8092, decode.loss_seg: 1.6898, decode.acc_seg: 41.8700, loss: 1.6898
2022-11-04 13:51:15,230 - mmseg - INFO - Iter [850/80000] lr: 9.904e-05, eta: 11:19:00, time: 0.513, data_time: 0.005, memory: 8092, decode.loss_seg: 1.8207, decode.acc_seg: 42.4152, loss: 1.8207
2022-11-04 13:51:41,152 - mmseg - INFO - Iter [900/80000] lr: 9.899e-05, eta: 11:18:50, time: 0.518, data_time: 0.005, memory: 8092, decode.loss_seg: 1.5190, decode.acc_seg: 45.6741, loss: 1.5190
2022-11-04 13:52:06,839 - mmseg - INFO - Iter [950/80000] lr: 9.893e-05, eta: 11:18:19, time: 0.514, data_time: 0.005, memory: 8092, decode.loss_seg: 1.6854, decode.acc_seg: 42.2716, loss: 1.6854
2022-11-04 13:52:32,443 - mmseg - INFO - Exp name: fpn_crossformer_s_ade20k_40k.py
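A log excerpt like this is easier to scan programmatically. A minimal sketch (the regex assumes the mmseg text-log format shown above; the sample line is copied from that log) that extracts decode.acc_seg from one line:

```python
import re

# Sketch: pull decode.acc_seg out of an mmseg TextLoggerHook line so the
# training accuracy curve can be plotted over iterations.
line = ("2022-11-04 13:44:24,903 - mmseg - INFO - Iter [50/80000] "
        "lr: 9.994e-05, decode.loss_seg: 3.4683, decode.acc_seg: 16.6169, loss: 3.4683")
match = re.search(r"decode\.acc_seg: ([0-9.]+)", line)
acc_seg = float(match.group(1))
```

Plotting these values makes the symptom obvious: with missing pretrained weights, decode.acc_seg plateaus in the low 40s within the first 1000 iterations instead of climbing steadily.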
sys.platform: linux
Python: 3.6.15 | packaged by conda-forge | (default, Dec 3 2021, 18:49:41) [GCC 9.4.0]
CUDA available: True
GPU 0,1,2,3: TITAN X (Pascal)
CUDA_HOME: /home/*****/usr/local/cuda-10.2
NVCC: Cuda compilation tools, release 10.2, V10.2.89
GCC: gcc (Ubuntu 5.4.0-6ubuntu1~16.04.12) 5.4.0 20160609
PyTorch: 1.10.1+cu102
PyTorch compiling details: PyTorch built with:
2022-11-04 13:43:24,067 - mmseg - INFO - Distributed training: True
2022-11-04 13:43:24,449 - mmseg - INFO - Config:
norm_cfg = dict(type='SyncBN', requires_grad=True)
model = dict(
    type='EncoderDecoder',
    pretrained='backbone-corssformer-s.pth',
    backbone=dict(
        type='CrossFormer_S',
        depth=50,
        num_stages=4,
        out_indices=(0, 1, 2, 3),
        dilations=(1, 1, 1, 1),
        strides=(1, 2, 2, 2),
        norm_cfg=dict(type='SyncBN', requires_grad=True),
        norm_eval=False,
        style='pytorch',
        contract_dilation=True,
        group_size=[14, 14, 7, 7],
        crs_interval=[16, 8, 2, 1]),
    neck=dict(
        type='FPN',
        in_channels=[96, 192, 384, 768],
        out_channels=256,
        num_outs=4),
    decode_head=dict(
        type='FPNHead',
        in_channels=[256, 256, 256, 256],
        in_index=[0, 1, 2, 3],
        feature_strides=[4, 8, 16, 32],
        channels=128,
        dropout_ratio=0.1,
        num_classes=150,
        norm_cfg=dict(type='SyncBN', requires_grad=True),
        align_corners=False,
        loss_decode=dict(
            type='CrossEntropyLoss', use_sigmoid=False, loss_weight=1.0)),
    train_cfg=dict(),
    test_cfg=dict(mode='whole'))
dataset_type = 'ADE20KDataset'
data_root = './ADEChallengeData2016'
img_norm_cfg = dict(
    mean=[123.675, 116.28, 103.53], std=[58.395, 57.12, 57.375], to_rgb=True)
crop_size = (512, 512)
train_pipeline = [
    dict(type='LoadImageFromFile'),
    dict(type='LoadAnnotations', reduce_zero_label=True),
    dict(type='Resize', img_scale=(2048, 512), ratio_range=(0.5, 2.0)),
    dict(type='RandomCrop', crop_size=(512, 512), cat_max_ratio=0.75),
    dict(type='RandomFlip', prob=0.5),
    dict(type='PhotoMetricDistortion'),
    dict(
        type='Normalize',
        mean=[123.675, 116.28, 103.53],
        std=[58.395, 57.12, 57.375],
        to_rgb=True),
    dict(type='Pad', size=(512, 512), pad_val=0, seg_pad_val=255),
    dict(type='DefaultFormatBundle'),
    dict(type='Collect', keys=['img', 'gt_semantic_seg'])
]
test_pipeline = [
    dict(type='LoadImageFromFile'),
    dict(
        type='MultiScaleFlipAug',
        img_scale=(2048, 512),
        flip=False,
        transforms=[
            dict(type='AlignResize', keep_ratio=True, size_divisor=32),
            dict(type='RandomFlip'),
            dict(
                type='Normalize',
                mean=[123.675, 116.28, 103.53],
                std=[58.395, 57.12, 57.375],
                to_rgb=True),
            dict(type='ImageToTensor', keys=['img']),
            dict(type='Collect', keys=['img'])
        ])
]
data = dict(
    samples_per_gpu=4,
    workers_per_gpu=4,
    train=dict(
        type='RepeatDataset',
        times=50,
        dataset=dict(
            type='ADE20KDataset',
            data_root='./ADEChallengeData2016',
            img_dir='images/training',
            ann_dir='annotations/training',
            pipeline=[
                dict(type='LoadImageFromFile'),
                dict(type='LoadAnnotations', reduce_zero_label=True),
                dict(type='Resize', img_scale=(2048, 512), ratio_range=(0.5, 2.0)),
                dict(type='RandomCrop', crop_size=(512, 512), cat_max_ratio=0.75),
                dict(type='RandomFlip', prob=0.5),
                dict(type='PhotoMetricDistortion'),
                dict(
                    type='Normalize',
                    mean=[123.675, 116.28, 103.53],
                    std=[58.395, 57.12, 57.375],
                    to_rgb=True),
                dict(type='Pad', size=(512, 512), pad_val=0, seg_pad_val=255),
                dict(type='DefaultFormatBundle'),
                dict(type='Collect', keys=['img', 'gt_semantic_seg'])
            ])),
    val=dict(
        type='ADE20KDataset',
        data_root='./ADEChallengeData2016',
        img_dir='images/validation',
        ann_dir='annotations/validation',
        pipeline=[
            dict(type='LoadImageFromFile'),
            dict(
                type='MultiScaleFlipAug',
                img_scale=(2048, 512),
                flip=False,
                transforms=[
                    dict(type='AlignResize', keep_ratio=True, size_divisor=32),
                    dict(type='RandomFlip'),
                    dict(
                        type='Normalize',
                        mean=[123.675, 116.28, 103.53],
                        std=[58.395, 57.12, 57.375],
                        to_rgb=True),
                    dict(type='ImageToTensor', keys=['img']),
                    dict(type='Collect', keys=['img'])
                ])
        ]),
    test=dict(
        type='ADE20KDataset',
        data_root='./ADEChallengeData2016',
        img_dir='images/validation',
        ann_dir='annotations/validation',
        pipeline=[
            dict(type='LoadImageFromFile'),
            dict(
                type='MultiScaleFlipAug',
                img_scale=(2048, 512),
                flip=False,
                transforms=[
                    dict(type='AlignResize', keep_ratio=True, size_divisor=32),
                    dict(type='RandomFlip'),
                    dict(
                        type='Normalize',
                        mean=[123.675, 116.28, 103.53],
                        std=[58.395, 57.12, 57.375],
                        to_rgb=True),
                    dict(type='ImageToTensor', keys=['img']),
                    dict(type='Collect', keys=['img'])
                ])
        ]))
log_config = dict(
    interval=50, hooks=[dict(type='TextLoggerHook', by_epoch=False)])
dist_params = dict(backend='nccl')
log_level = 'INFO'
load_from = None
resume_from = None
workflow = [('train', 1)]
cudnn_benchmark = True
gpu_multiples = 1
optimizer = dict(type='AdamW', lr=0.0001, weight_decay=0.0001)
optimizer_config = dict()
lr_config = dict(policy='poly', power=0.9, min_lr=0.0, by_epoch=False)
runner = dict(type='IterBasedRunner', max_iters=80000)
checkpoint_config = dict(by_epoch=False, interval=8000)
evaluation = dict(interval=8000, metric='mIoU')
work_dir = './seg-output'
gpu_ids = range(0, 1)
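One consistency check worth knowing when editing this config (a sketch; the doubling rule is read off the stage dims visible in the config, 96, 192, 384, 768): the FPN neck's in_channels must equal the backbone's per-stage output dims.

```python
# Sketch: CrossFormer-S starts at embedding dim 96 and doubles it at each of
# its 4 stages; the neck's in_channels must list exactly these dims in order.
embed_dim = 96
num_stages = 4
stage_dims = [embed_dim * (2 ** i) for i in range(num_stages)]

neck_in_channels = [96, 192, 384, 768]  # value taken from the config above
```

If the two lists ever diverge (e.g. when swapping in CrossFormer-B), the lateral 1x1 convs in the FPN fail with a channel mismatch rather than a loading warning.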
2022-11-04 13:43:25,326 - mmseg - INFO - EncoderDecoder(
(backbone): CrossFormer_S(
(patch_embed): PatchEmbed(
(projs): ModuleList(
(0): Conv2d(3, 48, kernel_size=(4, 4), stride=(4, 4))
(1): Conv2d(3, 24, kernel_size=(8, 8), stride=(4, 4), padding=(2, 2))
(2): Conv2d(3, 12, kernel_size=(16, 16), stride=(4, 4), padding=(6, 6))
(3): Conv2d(3, 12, kernel_size=(32, 32), stride=(4, 4), padding=(14, 14))
)
(norm): LayerNorm((96,), eps=1e-05, elementwise_affine=True)
)
(pos_drop): Dropout(p=0.0, inplace=False)
(layers): ModuleList(
(0): Stage(
dim=96, depth=2
(blocks): ModuleList(
(0): CrossFormerBlock(
dim=96, input_resolution=(256, 256), num_heads=3, group_size=14, lsda_flag=0, mlp_ratio=4
(norm1): LayerNorm((96,), eps=1e-05, elementwise_affine=True)
(attn): Attention(
dim=96, num_heads=3
(pos): DynamicPosBias(
(pos_proj): Linear(in_features=2, out_features=6, bias=True)
(pos1): Sequential(
(0): LayerNorm((6,), eps=1e-05, elementwise_affine=True)
(1): ReLU(inplace=True)
(2): Linear(in_features=6, out_features=6, bias=True)
)
(pos2): Sequential(
(0): LayerNorm((6,), eps=1e-05, elementwise_affine=True)
(1): ReLU(inplace=True)
(2): Linear(in_features=6, out_features=6, bias=True)
)
(pos3): Sequential(
(0): LayerNorm((6,), eps=1e-05, elementwise_affine=True)
(1): ReLU(inplace=True)
(2): Linear(in_features=6, out_features=3, bias=True)
)
)
(qkv): Linear(in_features=96, out_features=288, bias=True)
(attn_drop): Dropout(p=0.0, inplace=False)
(proj): Linear(in_features=96, out_features=96, bias=True)
(proj_drop): Dropout(p=0.0, inplace=False)
(softmax): Softmax(dim=-1)
)
(drop_path): Identity()
(norm2): LayerNorm((96,), eps=1e-05, elementwise_affine=True)
(mlp): Mlp(
(fc1): Linear(in_features=96, out_features=384, bias=True)
(act): GELU()
(fc2): Linear(in_features=384, out_features=96, bias=True)
(drop): Dropout(p=0.0, inplace=False)
)
)
(1): CrossFormerBlock(
dim=96, input_resolution=(256, 256), num_heads=3, group_size=14, lsda_flag=1, mlp_ratio=4
(norm1): LayerNorm((96,), eps=1e-05, elementwise_affine=True)
(attn): Attention(
dim=96, num_heads=3
(pos): DynamicPosBias(
(pos_proj): Linear(in_features=2, out_features=6, bias=True)
(pos1): Sequential(
(0): LayerNorm((6,), eps=1e-05, elementwise_affine=True)
(1): ReLU(inplace=True)
(2): Linear(in_features=6, out_features=6, bias=True)
)
(pos2): Sequential(
(0): LayerNorm((6,), eps=1e-05, elementwise_affine=True)
(1): ReLU(inplace=True)
(2): Linear(in_features=6, out_features=6, bias=True)
)
(pos3): Sequential(
(0): LayerNorm((6,), eps=1e-05, elementwise_affine=True)
(1): ReLU(inplace=True)
(2): Linear(in_features=6, out_features=3, bias=True)
)
)
(qkv): Linear(in_features=96, out_features=288, bias=True)
(attn_drop): Dropout(p=0.0, inplace=False)
(proj): Linear(in_features=96, out_features=96, bias=True)
(proj_drop): Dropout(p=0.0, inplace=False)
(softmax): Softmax(dim=-1)
)
(drop_path): DropPath(drop_prob=0.018)
(norm2): LayerNorm((96,), eps=1e-05, elementwise_affine=True)
(mlp): Mlp(
(fc1): Linear(in_features=96, out_features=384, bias=True)
(act): GELU()
(fc2): Linear(in_features=384, out_features=96, bias=True)
(drop): Dropout(p=0.0, inplace=False)
)
)
)
(downsample): PatchMerging(
input_resolution=(256, 256), dim=96
(reductions): ModuleList(
(0): Conv2d(96, 96, kernel_size=(2, 2), stride=(2, 2))
(1): Conv2d(96, 96, kernel_size=(4, 4), stride=(2, 2), padding=(1, 1))
)
(norm): LayerNorm((96,), eps=1e-05, elementwise_affine=True)
)
)
(1): Stage(
dim=192, depth=2
(blocks): ModuleList(
(0): CrossFormerBlock(
dim=192, input_resolution=(128, 128), num_heads=6, group_size=14, lsda_flag=0, mlp_ratio=4
(norm1): LayerNorm((192,), eps=1e-05, elementwise_affine=True)
(attn): Attention(
dim=192, num_heads=6
(pos): DynamicPosBias(
(pos_proj): Linear(in_features=2, out_features=12, bias=True)
(pos1): Sequential(
(0): LayerNorm((12,), eps=1e-05, elementwise_affine=True)
(1): ReLU(inplace=True)
(2): Linear(in_features=12, out_features=12, bias=True)
)
(pos2): Sequential(
(0): LayerNorm((12,), eps=1e-05, elementwise_affine=True)
(1): ReLU(inplace=True)
(2): Linear(in_features=12, out_features=12, bias=True)
)
(pos3): Sequential(
(0): LayerNorm((12,), eps=1e-05, elementwise_affine=True)
(1): ReLU(inplace=True)
(2): Linear(in_features=12, out_features=6, bias=True)
)
)
(qkv): Linear(in_features=192, out_features=576, bias=True)
(attn_drop): Dropout(p=0.0, inplace=False)
(proj): Linear(in_features=192, out_features=192, bias=True)
(proj_drop): Dropout(p=0.0, inplace=False)
(softmax): Softmax(dim=-1)
)
(drop_path): DropPath(drop_prob=0.036)
(norm2): LayerNorm((192,), eps=1e-05, elementwise_affine=True)
(mlp): Mlp(
(fc1): Linear(in_features=192, out_features=768, bias=True)
(act): GELU()
(fc2): Linear(in_features=768, out_features=192, bias=True)
(drop): Dropout(p=0.0, inplace=False)
)
)
(1): CrossFormerBlock(
dim=192, input_resolution=(128, 128), num_heads=6, group_size=14, lsda_flag=1, mlp_ratio=4
(norm1): LayerNorm((192,), eps=1e-05, elementwise_affine=True)
(attn): Attention(
dim=192, num_heads=6
(pos): DynamicPosBias(
(pos_proj): Linear(in_features=2, out_features=12, bias=True)
(pos1): Sequential(
(0): LayerNorm((12,), eps=1e-05, elementwise_affine=True)
(1): ReLU(inplace=True)
(2): Linear(in_features=12, out_features=12, bias=True)
)
(pos2): Sequential(
(0): LayerNorm((12,), eps=1e-05, elementwise_affine=True)
(1): ReLU(inplace=True)
(2): Linear(in_features=12, out_features=12, bias=True)
)
(pos3): Sequential(
(0): LayerNorm((12,), eps=1e-05, elementwise_affine=True)
(1): ReLU(inplace=True)
(2): Linear(in_features=12, out_features=6, bias=True)
)
)
(qkv): Linear(in_features=192, out_features=576, bias=True)
(attn_drop): Dropout(p=0.0, inplace=False)
(proj): Linear(in_features=192, out_features=192, bias=True)
(proj_drop): Dropout(p=0.0, inplace=False)
(softmax): Softmax(dim=-1)
)
(drop_path): DropPath(drop_prob=0.055)
(norm2): LayerNorm((192,), eps=1e-05, elementwise_affine=True)
(mlp): Mlp(
(fc1): Linear(in_features=192, out_features=768, bias=True)
(act): GELU()
(fc2): Linear(in_features=768, out_features=192, bias=True)
(drop): Dropout(p=0.0, inplace=False)
)
)
)
(downsample): PatchMerging(
input_resolution=(128, 128), dim=192
(reductions): ModuleList(
(0): Conv2d(192, 192, kernel_size=(2, 2), stride=(2, 2))
(1): Conv2d(192, 192, kernel_size=(4, 4), stride=(2, 2), padding=(1, 1))
)
(norm): LayerNorm((192,), eps=1e-05, elementwise_affine=True)
)
)
(2): Stage(
dim=384, depth=6
(blocks): ModuleList(
(0): CrossFormerBlock(
dim=384, input_resolution=(64, 64), num_heads=12, group_size=7, lsda_flag=0, mlp_ratio=4
(norm1): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(attn): Attention(
dim=384, num_heads=12
(pos): DynamicPosBias(
(pos_proj): Linear(in_features=2, out_features=24, bias=True)
(pos1): Sequential(
(0): LayerNorm((24,), eps=1e-05, elementwise_affine=True)
(1): ReLU(inplace=True)
(2): Linear(in_features=24, out_features=24, bias=True)
)
(pos2): Sequential(
(0): LayerNorm((24,), eps=1e-05, elementwise_affine=True)
(1): ReLU(inplace=True)
(2): Linear(in_features=24, out_features=24, bias=True)
)
(pos3): Sequential(
(0): LayerNorm((24,), eps=1e-05, elementwise_affine=True)
(1): ReLU(inplace=True)
(2): Linear(in_features=24, out_features=12, bias=True)
)
)
(qkv): Linear(in_features=384, out_features=1152, bias=True)
(attn_drop): Dropout(p=0.0, inplace=False)
(proj): Linear(in_features=384, out_features=384, bias=True)
(proj_drop): Dropout(p=0.0, inplace=False)
(softmax): Softmax(dim=-1)
)
(drop_path): DropPath(drop_prob=0.073)
(norm2): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(mlp): Mlp(
(fc1): Linear(in_features=384, out_features=1536, bias=True)
(act): GELU()
(fc2): Linear(in_features=1536, out_features=384, bias=True)
(drop): Dropout(p=0.0, inplace=False)
)
)
(1): CrossFormerBlock(
dim=384, input_resolution=(64, 64), num_heads=12, group_size=7, lsda_flag=1, mlp_ratio=4
(norm1): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(attn): Attention(
dim=384, num_heads=12
(pos): DynamicPosBias(
(pos_proj): Linear(in_features=2, out_features=24, bias=True)
(pos1): Sequential(
(0): LayerNorm((24,), eps=1e-05, elementwise_affine=True)
(1): ReLU(inplace=True)
(2): Linear(in_features=24, out_features=24, bias=True)
)
(pos2): Sequential(
(0): LayerNorm((24,), eps=1e-05, elementwise_affine=True)
(1): ReLU(inplace=True)
(2): Linear(in_features=24, out_features=24, bias=True)
)
(pos3): Sequential(
(0): LayerNorm((24,), eps=1e-05, elementwise_affine=True)
(1): ReLU(inplace=True)
(2): Linear(in_features=24, out_features=12, bias=True)
)
)
(qkv): Linear(in_features=384, out_features=1152, bias=True)
(attn_drop): Dropout(p=0.0, inplace=False)
(proj): Linear(in_features=384, out_features=384, bias=True)
(proj_drop): Dropout(p=0.0, inplace=False)
(softmax): Softmax(dim=-1)
)
(drop_path): DropPath(drop_prob=0.091)
(norm2): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(mlp): Mlp(
(fc1): Linear(in_features=384, out_features=1536, bias=True)
(act): GELU()
(fc2): Linear(in_features=1536, out_features=384, bias=True)
(drop): Dropout(p=0.0, inplace=False)
)
)
(2): CrossFormerBlock(
dim=384, input_resolution=(64, 64), num_heads=12, group_size=7, lsda_flag=0, mlp_ratio=4
(norm1): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(attn): Attention(
dim=384, num_heads=12
(pos): DynamicPosBias(
(pos_proj): Linear(in_features=2, out_features=24, bias=True)
(pos1): Sequential(
(0): LayerNorm((24,), eps=1e-05, elementwise_affine=True)
(1): ReLU(inplace=True)
(2): Linear(in_features=24, out_features=24, bias=True)
)
(pos2): Sequential(
(0): LayerNorm((24,), eps=1e-05, elementwise_affine=True)
(1): ReLU(inplace=True)
(2): Linear(in_features=24, out_features=24, bias=True)
)
(pos3): Sequential(
(0): LayerNorm((24,), eps=1e-05, elementwise_affine=True)
(1): ReLU(inplace=True)
(2): Linear(in_features=24, out_features=12, bias=True)
)
)
(qkv): Linear(in_features=384, out_features=1152, bias=True)
(attn_drop): Dropout(p=0.0, inplace=False)
(proj): Linear(in_features=384, out_features=384, bias=True)
(proj_drop): Dropout(p=0.0, inplace=False)
(softmax): Softmax(dim=-1)
)
(drop_path): DropPath(drop_prob=0.109)
(norm2): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(mlp): Mlp(
(fc1): Linear(in_features=384, out_features=1536, bias=True)
(act): GELU()
(fc2): Linear(in_features=1536, out_features=384, bias=True)
(drop): Dropout(p=0.0, inplace=False)
)
)
(3): CrossFormerBlock(
dim=384, input_resolution=(64, 64), num_heads=12, group_size=7, lsda_flag=1, mlp_ratio=4
(norm1): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(attn): Attention(
dim=384, num_heads=12
(pos): DynamicPosBias(
(pos_proj): Linear(in_features=2, out_features=24, bias=True)
(pos1): Sequential(
(0): LayerNorm((24,), eps=1e-05, elementwise_affine=True)
(1): ReLU(inplace=True)
(2): Linear(in_features=24, out_features=24, bias=True)
)
(pos2): Sequential(
(0): LayerNorm((24,), eps=1e-05, elementwise_affine=True)
(1): ReLU(inplace=True)
(2): Linear(in_features=24, out_features=24, bias=True)
)
(pos3): Sequential(
(0): LayerNorm((24,), eps=1e-05, elementwise_affine=True)
(1): ReLU(inplace=True)
(2): Linear(in_features=24, out_features=12, bias=True)
)
)
(qkv): Linear(in_features=384, out_features=1152, bias=True)
(attn_drop): Dropout(p=0.0, inplace=False)
(proj): Linear(in_features=384, out_features=384, bias=True)
(proj_drop): Dropout(p=0.0, inplace=False)
(softmax): Softmax(dim=-1)
)
(drop_path): DropPath(drop_prob=0.127)
(norm2): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(mlp): Mlp(
(fc1): Linear(in_features=384, out_features=1536, bias=True)
(act): GELU()
(fc2): Linear(in_features=1536, out_features=384, bias=True)
(drop): Dropout(p=0.0, inplace=False)
)
)
(4): CrossFormerBlock(
dim=384, input_resolution=(64, 64), num_heads=12, group_size=7, lsda_flag=0, mlp_ratio=4
(norm1): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(attn): Attention(
dim=384, num_heads=12
(pos): DynamicPosBias(
(pos_proj): Linear(in_features=2, out_features=24, bias=True)
(pos1): Sequential(
(0): LayerNorm((24,), eps=1e-05, elementwise_affine=True)
(1): ReLU(inplace=True)
(2): Linear(in_features=24, out_features=24, bias=True)
)
(pos2): Sequential(
(0): LayerNorm((24,), eps=1e-05, elementwise_affine=True)
(1): ReLU(inplace=True)
(2): Linear(in_features=24, out_features=24, bias=True)
)
(pos3): Sequential(
(0): LayerNorm((24,), eps=1e-05, elementwise_affine=True)
(1): ReLU(inplace=True)
(2): Linear(in_features=24, out_features=12, bias=True)
)
)
(qkv): Linear(in_features=384, out_features=1152, bias=True)
(attn_drop): Dropout(p=0.0, inplace=False)
(proj): Linear(in_features=384, out_features=384, bias=True)
(proj_drop): Dropout(p=0.0, inplace=False)
(softmax): Softmax(dim=-1)
)
(drop_path): DropPath(drop_prob=0.145)
(norm2): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(mlp): Mlp(
(fc1): Linear(in_features=384, out_features=1536, bias=True)
(act): GELU()
(fc2): Linear(in_features=1536, out_features=384, bias=True)
(drop): Dropout(p=0.0, inplace=False)
)
)
(5): CrossFormerBlock(
dim=384, input_resolution=(64, 64), num_heads=12, group_size=7, lsda_flag=1, mlp_ratio=4
(norm1): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(attn): Attention(
dim=384, num_heads=12
(pos): DynamicPosBias(
(pos_proj): Linear(in_features=2, out_features=24, bias=True)
(pos1): Sequential(
(0): LayerNorm((24,), eps=1e-05, elementwise_affine=True)
(1): ReLU(inplace=True)
(2): Linear(in_features=24, out_features=24, bias=True)
)
(pos2): Sequential(
(0): LayerNorm((24,), eps=1e-05, elementwise_affine=True)
(1): ReLU(inplace=True)
(2): Linear(in_features=24, out_features=24, bias=True)
)
(pos3): Sequential(
(0): LayerNorm((24,), eps=1e-05, elementwise_affine=True)
(1): ReLU(inplace=True)
(2): Linear(in_features=24, out_features=12, bias=True)
)
)
(qkv): Linear(in_features=384, out_features=1152, bias=True)
(attn_drop): Dropout(p=0.0, inplace=False)
(proj): Linear(in_features=384, out_features=384, bias=True)
(proj_drop): Dropout(p=0.0, inplace=False)
(softmax): Softmax(dim=-1)
)
(drop_path): DropPath(drop_prob=0.164)
(norm2): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
(mlp): Mlp(
(fc1): Linear(in_features=384, out_features=1536, bias=True)
(act): GELU()
(fc2): Linear(in_features=1536, out_features=384, bias=True)
(drop): Dropout(p=0.0, inplace=False)
)
)
)
(downsample): PatchMerging(
input_resolution=(64, 64), dim=384
(reductions): ModuleList(
(0): Conv2d(384, 384, kernel_size=(2, 2), stride=(2, 2))
(1): Conv2d(384, 384, kernel_size=(4, 4), stride=(2, 2), padding=(1, 1))
)
(norm): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
)
)
(3): Stage(
dim=768, depth=2
(blocks): ModuleList(
(0): CrossFormerBlock(
dim=768, input_resolution=(32, 32), num_heads=24, group_size=7, lsda_flag=0, mlp_ratio=4
(norm1): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(attn): Attention(
dim=768, num_heads=24
(pos): DynamicPosBias(
(pos_proj): Linear(in_features=2, out_features=48, bias=True)
(pos1): Sequential(
(0): LayerNorm((48,), eps=1e-05, elementwise_affine=True)
(1): ReLU(inplace=True)
(2): Linear(in_features=48, out_features=48, bias=True)
)
(pos2): Sequential(
(0): LayerNorm((48,), eps=1e-05, elementwise_affine=True)
(1): ReLU(inplace=True)
(2): Linear(in_features=48, out_features=48, bias=True)
)
(pos3): Sequential(
(0): LayerNorm((48,), eps=1e-05, elementwise_affine=True)
(1): ReLU(inplace=True)
(2): Linear(in_features=48, out_features=24, bias=True)
)
)
(qkv): Linear(in_features=768, out_features=2304, bias=True)
(attn_drop): Dropout(p=0.0, inplace=False)
(proj): Linear(in_features=768, out_features=768, bias=True)
(proj_drop): Dropout(p=0.0, inplace=False)
(softmax): Softmax(dim=-1)
)
(drop_path): DropPath(drop_prob=0.182)
(norm2): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(mlp): Mlp(
(fc1): Linear(in_features=768, out_features=3072, bias=True)
(act): GELU()
(fc2): Linear(in_features=3072, out_features=768, bias=True)
(drop): Dropout(p=0.0, inplace=False)
)
)
(1): CrossFormerBlock(
dim=768, input_resolution=(32, 32), num_heads=24, group_size=7, lsda_flag=1, mlp_ratio=4
(norm1): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(attn): Attention(
dim=768, num_heads=24
(pos): DynamicPosBias(
(pos_proj): Linear(in_features=2, out_features=48, bias=True)
(pos1): Sequential(
(0): LayerNorm((48,), eps=1e-05, elementwise_affine=True)
(1): ReLU(inplace=True)
(2): Linear(in_features=48, out_features=48, bias=True)
)
(pos2): Sequential(
(0): LayerNorm((48,), eps=1e-05, elementwise_affine=True)
(1): ReLU(inplace=True)
(2): Linear(in_features=48, out_features=48, bias=True)
)
(pos3): Sequential(
(0): LayerNorm((48,), eps=1e-05, elementwise_affine=True)
(1): ReLU(inplace=True)
(2): Linear(in_features=48, out_features=24, bias=True)
)
)
(qkv): Linear(in_features=768, out_features=2304, bias=True)
(attn_drop): Dropout(p=0.0, inplace=False)
(proj): Linear(in_features=768, out_features=768, bias=True)
(proj_drop): Dropout(p=0.0, inplace=False)
(softmax): Softmax(dim=-1)
)
(drop_path): DropPath(drop_prob=0.200)
(norm2): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
(mlp): Mlp(
(fc1): Linear(in_features=768, out_features=3072, bias=True)
(act): GELU()
(fc2): Linear(in_features=3072, out_features=768, bias=True)
(drop): Dropout(p=0.0, inplace=False)
)
)
)
)
)
)
(neck): FPN(
(lateral_convs): ModuleList(
(0): ConvModule(
(conv): Conv2d(96, 256, kernel_size=(1, 1), stride=(1, 1))
)
(1): ConvModule(
(conv): Conv2d(192, 256, kernel_size=(1, 1), stride=(1, 1))
)
(2): ConvModule(
(conv): Conv2d(384, 256, kernel_size=(1, 1), stride=(1, 1))
)
(3): ConvModule(
(conv): Conv2d(768, 256, kernel_size=(1, 1), stride=(1, 1))
)
)
(fpn_convs): ModuleList(
(0): ConvModule(
(conv): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
)
(1): ConvModule(
(conv): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
)
(2): ConvModule(
(conv): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
)
(3): ConvModule(
(conv): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
)
)
)
(decode_head): FPNHead(
input_transform=multiple_select, ignore_index=255, align_corners=False
(loss_decode): CrossEntropyLoss()
(conv_seg): Conv2d(128, 150, kernel_size=(1, 1), stride=(1, 1))
(dropout): Dropout2d(p=0.1, inplace=False)
(scale_heads): ModuleList(
(0): Sequential(
(0): ConvModule(
(conv): Conv2d(256, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
(bn): SyncBatchNorm(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(activate): ReLU(inplace=True)
)
)
(1): Sequential(
(0): ConvModule(
(conv): Conv2d(256, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
(bn): SyncBatchNorm(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(activate): ReLU(inplace=True)
)
(1): Upsample(scale_factor=2.0, mode=bilinear)
)
(2): Sequential(
(0): ConvModule(
(conv): Conv2d(256, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
(bn): SyncBatchNorm(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(activate): ReLU(inplace=True)
)
(1): Upsample(scale_factor=2.0, mode=bilinear)
(2): ConvModule(
(conv): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
(bn): SyncBatchNorm(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(activate): ReLU(inplace=True)
)
(3): Upsample(scale_factor=2.0, mode=bilinear)
)
(3): Sequential(
(0): ConvModule(
(conv): Conv2d(256, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
(bn): SyncBatchNorm(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(activate): ReLU(inplace=True)
)
(1): Upsample(scale_factor=2.0, mode=bilinear)
(2): ConvModule(
(conv): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
(bn): SyncBatchNorm(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(activate): ReLU(inplace=True)
)
(3): Upsample(scale_factor=2.0, mode=bilinear)
(4): ConvModule(
(conv): Conv2d(128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
(bn): SyncBatchNorm(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
(activate): ReLU(inplace=True)
)
(5): Upsample(scale_factor=2.0, mode=bilinear)
)
)
)
)
2022-11-04 13:43:25,787 - mmseg - INFO - Loaded 20210 images
2022-11-04 13:43:29,938 - mmseg - INFO - Loaded 2000 images
2022-11-04 13:43:29,940 - mmseg - INFO - Start running, host: @bj05, work_dir: /home//code/Transformer/CrossFormer-main/segmentation/seg-output
2022-11-04 13:43:29,940 - mmseg - INFO - Hooks will be executed in the following order:
before_run:
(VERY_HIGH ) PolyLrUpdaterHook
(NORMAL ) CheckpointHook
(VERY_LOW ) TextLoggerHook
before_train_epoch:
(VERY_HIGH ) PolyLrUpdaterHook
(LOW ) IterTimerHook
(VERY_LOW ) TextLoggerHook
before_train_iter:
(VERY_HIGH ) PolyLrUpdaterHook
(LOW ) IterTimerHook
after_train_iter:
(ABOVE_NORMAL) OptimizerHook
(NORMAL ) CheckpointHook
(NORMAL ) DistEvalHook
(LOW ) IterTimerHook
(VERY_LOW ) TextLoggerHook
after_train_epoch:
(NORMAL ) CheckpointHook
(NORMAL ) DistEvalHook
(VERY_LOW ) TextLoggerHook
before_val_epoch:
(LOW ) IterTimerHook
(VERY_LOW ) TextLoggerHook
before_val_iter:
(LOW ) IterTimerHook
after_val_iter:
(LOW ) IterTimerHook
after_val_epoch:
(VERY_LOW ) TextLoggerHook
after_run:
(VERY_LOW ) TextLoggerHook
2022-11-04 13:43:29,944 - mmseg - INFO - workflow: [('train', 1)], max: 80000 iters
2022-11-04 13:43:29,944 - mmseg - INFO - Checkpoints will be saved to /home/*****/code/Transformer/CrossFormer-main/segmentation/seg-output by HardDiskBackend.
2022-11-04 13:44:24,903 - mmseg - INFO - Iter [50/80000] lr: 9.994e-05, eta: 12:04:39, time: 0.544, data_time: 0.010, memory: 8092, decode.loss_seg: 3.4683, decode.acc_seg: 16.6169, loss: 3.4683
2022-11-04 13:44:49,215 - mmseg - INFO - Iter [100/80000] lr: 9.989e-05, eta: 11:25:51, time: 0.486, data_time: 0.005, memory: 8092, decode.loss_seg: 2.8443, decode.acc_seg: 26.7270, loss: 2.8443
2022-11-04 13:45:14,513 - mmseg - INFO - Iter [150/80000] lr: 9.983e-05, eta: 11:21:23, time: 0.506, data_time: 0.005, memory: 8092, decode.loss_seg: 2.4701, decode.acc_seg: 31.2440, loss: 2.4701
2022-11-04 13:45:40,163 - mmseg - INFO - Iter [200/80000] lr: 9.978e-05, eta: 11:21:17, time: 0.513, data_time: 0.005, memory: 8092, decode.loss_seg: 2.2140, decode.acc_seg: 36.1699, loss: 2.2140
2022-11-04 13:46:05,905 - mmseg - INFO - Iter [250/80000] lr: 9.972e-05, eta: 11:21:33, time: 0.515, data_time: 0.005, memory: 8092, decode.loss_seg: 2.0427, decode.acc_seg: 39.1546, loss: 2.0427
2022-11-04 13:46:31,843 - mmseg - INFO - Iter [300/80000] lr: 9.966e-05, eta: 11:22:26, time: 0.519, data_time: 0.005, memory: 8092, decode.loss_seg: 2.1060, decode.acc_seg: 34.4056, loss: 2.1060
2022-11-04 13:46:57,434 - mmseg - INFO - Iter [350/80000] lr: 9.961e-05, eta: 11:21:39, time: 0.512, data_time: 0.005, memory: 8092, decode.loss_seg: 2.0308, decode.acc_seg: 39.0574, loss: 2.0308
2022-11-04 13:47:23,171 - mmseg - INFO - Iter [400/80000] lr: 9.955e-05, eta: 11:21:25, time: 0.515, data_time: 0.005, memory: 8092, decode.loss_seg: 1.8884, decode.acc_seg: 40.4996, loss: 1.8884
2022-11-04 13:47:49,108 - mmseg - INFO - Iter [450/80000] lr: 9.949e-05, eta: 11:21:45, time: 0.519, data_time: 0.005, memory: 8092, decode.loss_seg: 1.9453, decode.acc_seg: 41.3078, loss: 1.9453
2022-11-04 13:48:14,501 - mmseg - INFO - Iter [500/80000] lr: 9.944e-05, eta: 11:20:28, time: 0.508, data_time: 0.005, memory: 8092, decode.loss_seg: 2.0508, decode.acc_seg: 41.2130, loss: 2.0508
2022-11-04 13:48:40,141 - mmseg - INFO - Iter [550/80000] lr: 9.938e-05, eta: 11:19:57, time: 0.513, data_time: 0.005, memory: 8092, decode.loss_seg: 1.8725, decode.acc_seg: 41.9013, loss: 1.8725
2022-11-04 13:49:06,029 - mmseg - INFO - Iter [600/80000] lr: 9.933e-05, eta: 11:20:00, time: 0.518, data_time: 0.005, memory: 8092, decode.loss_seg: 1.6608, decode.acc_seg: 42.7472, loss: 1.6608
2022-11-04 13:49:32,040 - mmseg - INFO - Iter [650/80000] lr: 9.927e-05, eta: 11:20:13, time: 0.520, data_time: 0.005, memory: 8092, decode.loss_seg: 1.6577, decode.acc_seg: 44.3362, loss: 1.6577
2022-11-04 13:49:57,890 - mmseg - INFO - Iter [700/80000] lr: 9.921e-05, eta: 11:20:02, time: 0.517, data_time: 0.005, memory: 8092, decode.loss_seg: 1.5889, decode.acc_seg: 46.3231, loss: 1.5889
2022-11-04 13:50:23,732 - mmseg - INFO - Iter [750/80000] lr: 9.916e-05, eta: 11:19:48, time: 0.517, data_time: 0.005, memory: 8092, decode.loss_seg: 1.7936, decode.acc_seg: 42.7250, loss: 1.7936
2022-11-04 13:50:49,560 - mmseg - INFO - Iter [800/80000] lr: 9.910e-05, eta: 11:19:32, time: 0.517, data_time: 0.005, memory: 8092, decode.loss_seg: 1.6898, decode.acc_seg: 41.8700, loss: 1.6898
2022-11-04 13:51:15,230 - mmseg - INFO - Iter [850/80000] lr: 9.904e-05, eta: 11:19:00, time: 0.513, data_time: 0.005, memory: 8092, decode.loss_seg: 1.8207, decode.acc_seg: 42.4152, loss: 1.8207
2022-11-04 13:51:41,152 - mmseg - INFO - Iter [900/80000] lr: 9.899e-05, eta: 11:18:50, time: 0.518, data_time: 0.005, memory: 8092, decode.loss_seg: 1.5190, decode.acc_seg: 45.6741, loss: 1.5190
2022-11-04 13:52:06,839 - mmseg - INFO - Iter [950/80000] lr: 9.893e-05, eta: 11:18:19, time: 0.514, data_time: 0.005, memory: 8092, decode.loss_seg: 1.6854, decode.acc_seg: 42.2716, loss: 1.6854
2022-11-04 13:52:32,443 - mmseg - INFO - Exp name: fpn_crossformer_s_ade20k_40k.py
2022-11-04 13:52:32,443 - mmseg - INFO - Iter [1000/80000] lr: 9.888e-05, eta: 11:17:43, time: 0.512, data_time: 0.005, memory: 8092, decode.loss_seg: 1.6862, decode.acc_seg: 44.6595, loss: 1.6862
2022-11-04 13:52:58,196 - mmseg - INFO - Iter [1050/80000] lr: 9.882e-05, eta: 11:17:18, time: 0.515, data_time: 0.005, memory: 8092, decode.loss_seg: 1.5941, decode.acc_seg: 49.2542, loss: 1.5941
2022-11-04 13:53:24,082 - mmseg - INFO - Iter [1100/80000] lr: 9.876e-05, eta: 11:17:03, time: 0.518, data_time: 0.005, memory: 8092, decode.loss_seg: 1.4567, decode.acc_seg: 45.4291, loss: 1.4567
2022-11-04 13:53:50,016 - mmseg - INFO - Iter [1150/80000] lr: 9.871e-05, eta: 11:16:50, time: 0.519, data_time: 0.005, memory: 8092, decode.loss_seg: 1.6102, decode.acc_seg: 42.4937, loss: 1.6102
2022-11-04 13:54:15,907 - mmseg - INFO - Iter [1200/80000] lr: 9.865e-05, eta: 11:16:34, time: 0.518, data_time: 0.005, memory: 8092, decode.loss_seg: 1.5074, decode.acc_seg: 45.4749, loss: 1.5074
2022-11-04 13:54:41,556 - mmseg - INFO - Iter [1250/80000] lr: 9.859e-05, eta: 11:16:01, time: 0.513, data_time: 0.005, memory: 8092, decode.loss_seg: 1.5415, decode.acc_seg: 45.3889, loss: 1.5415
2022-11-04 13:55:07,309 - mmseg - INFO - Iter [1300/80000] lr: 9.854e-05, eta: 11:15:35, time: 0.515, data_time: 0.005, memory: 8092, decode.loss_seg: 1.6641, decode.acc_seg: 46.5552, loss: 1.6641
2022-11-04 13:55:33,227 - mmseg - INFO - Iter [1350/80000] lr: 9.848e-05, eta: 11:15:19, time: 0.518, data_time: 0.005, memory: 8092, decode.loss_seg: 1.5377, decode.acc_seg: 48.5784, loss: 1.5377
2022-11-04 13:55:59,127 - mmseg - INFO - Iter [1400/80000] lr: 9.842e-05, eta: 11:15:01, time: 0.518, data_time: 0.005, memory: 8092, decode.loss_seg: 1.3044, decode.acc_seg: 51.3704, loss: 1.3044
2022-11-04 13:56:24,728 - mmseg - INFO - Iter [1450/80000] lr: 9.837e-05, eta: 11:14:26, time: 0.512, data_time: 0.005, memory: 8092, decode.loss_seg: 1.5430, decode.acc_seg: 46.1576, loss: 1.5430
2022-11-04 13:56:50,358 - mmseg - INFO - Iter [1500/80000] lr: 9.831e-05, eta: 11:13:54, time: 0.513, data_time: 0.005, memory: 8092, decode.loss_seg: 1.5724, decode.acc_seg: 43.2211, loss: 1.5724
2022-11-04 13:57:16,026 - mmseg - INFO - Iter [1550/80000] lr: 9.826e-05, eta: 11:13:24, time: 0.513, data_time: 0.005, memory: 8092, decode.loss_seg: 1.6610, decode.acc_seg: 45.4836, loss: 1.6610
2022-11-04 13:57:41,765 - mmseg - INFO - Iter [1600/80000] lr: 9.820e-05, eta: 11:12:57, time: 0.515, data_time: 0.005, memory: 8092, decode.loss_seg: 1.6041, decode.acc_seg: 45.3171, loss: 1.6041
2022-11-04 13:58:07,479 - mmseg - INFO - Iter [1650/80000] lr: 9.814e-05, eta: 11:12:30, time: 0.514, data_time: 0.005, memory: 8092, decode.loss_seg: 1.5666, decode.acc_seg: 43.8042, loss: 1.5666
2022-11-04 13:58:33,418 - mmseg - INFO - Iter [1700/80000] lr: 9.809e-05, eta: 11:12:13, time: 0.519, data_time: 0.005, memory: 8092, decode.loss_seg: 1.4654, decode.acc_seg: 45.7577, loss: 1.4654
2022-11-04 13:58:59,164 - mmseg - INFO - Iter [1750/80000] lr: 9.803e-05, eta: 11:11:47, time: 0.515, data_time: 0.005, memory: 8092, decode.loss_seg: 1.4059, decode.acc_seg: 47.0010, loss: 1.4059
2022-11-04 13:59:25,233 - mmseg - INFO - Iter [1800/80000] lr: 9.797e-05, eta: 11:11:34, time: 0.521, data_time: 0.005, memory: 8092, decode.loss_seg: 1.4029, decode.acc_seg: 47.7166, loss: 1.4029
2022-11-04 13:59:51,034 - mmseg - INFO - Iter [1850/80000] lr: 9.792e-05, eta: 11:11:10, time: 0.516, data_time: 0.006, memory: 8092, decode.loss_seg: 1.4239, decode.acc_seg: 48.1321, loss: 1.4239
2022-11-04 14:00:16,888 - mmseg - INFO - Iter [1900/80000] lr: 9.786e-05, eta: 11:10:48, time: 0.517, data_time: 0.005, memory: 8092, decode.loss_seg: 1.4782, decode.acc_seg: 46.2986, loss: 1.4782
2022-11-04 14:00:42,700 - mmseg - INFO - Iter [1950/80000] lr: 9.780e-05, eta: 11:10:24, time: 0.516, data_time: 0.005, memory: 8092, decode.loss_seg: 1.3927, decode.acc_seg: 48.4327, loss: 1.3927
2022-11-04 14:01:08,398 - mmseg - INFO - Exp name: fpn_crossformer_s_ade20k_40k.py
2022-11-04 14:01:08,399 - mmseg - INFO - Iter [2000/80000] lr: 9.775e-05, eta: 11:09:56, time: 0.514, data_time: 0.005, memory: 8092, decode.loss_seg: 1.4079, decode.acc_seg: 48.5354, loss: 1.4079
2022-11-04 14:01:33,865 - mmseg - INFO - Iter [2050/80000] lr: 9.769e-05, eta: 11:09:18, time: 0.509, data_time: 0.005, memory: 8092, decode.loss_seg: 1.4431, decode.acc_seg: 48.6974, loss: 1.4431
2022-11-04 14:01:59,707 - mmseg - INFO - Iter [2100/80000] lr: 9.764e-05, eta: 11:08:56, time: 0.517, data_time: 0.005, memory: 8092, decode.loss_seg: 1.3548, decode.acc_seg: 46.7421, loss: 1.3548
2022-11-04 14:02:25,639 - mmseg - INFO - Iter [2150/80000] lr: 9.758e-05, eta: 11:08:36, time: 0.519, data_time: 0.005, memory: 8092, decode.loss_seg: 1.4530, decode.acc_seg: 44.5672, loss: 1.4530
2022-11-04 14:02:51,441 - mmseg - INFO - Iter [2200/80000] lr: 9.752e-05, eta: 11:08:12, time: 0.516, data_time: 0.005, memory: 8092, decode.loss_seg: 1.4339, decode.acc_seg: 48.3714, loss: 1.4339
2022-11-04 14:03:17,067 - mmseg - INFO - Iter [2250/80000] lr: 9.747e-05, eta: 11:07:41, time: 0.512, data_time: 0.005, memory: 8092, decode.loss_seg: 1.3695, decode.acc_seg: 47.9080, loss: 1.3695
2022-11-04 14:03:43,030 - mmseg - INFO - Iter [2300/80000] lr: 9.741e-05, eta: 11:07:22, time: 0.519, data_time: 0.005, memory: 8092, decode.loss_seg: 1.3732, decode.acc_seg: 45.3119, loss: 1.3732
2022-11-04 14:04:08,692 - mmseg - INFO - Iter [2350/80000] lr: 9.735e-05, eta: 11:06:53, time: 0.513, data_time: 0.005, memory: 8092, decode.loss_seg: 1.4142, decode.acc_seg: 48.2505, loss: 1.4142
2022-11-04 14:04:34,412 - mmseg - INFO - Iter [2400/80000] lr: 9.730e-05, eta: 11:06:25, time: 0.514, data_time: 0.005, memory: 8092, decode.loss_seg: 1.3220, decode.acc_seg: 47.9718, loss: 1.3220
2022-11-04 14:04:59,932 - mmseg - INFO - Iter [2450/80000] lr: 9.724e-05, eta: 11:05:52, time: 0.510, data_time: 0.005, memory: 8092, decode.loss_seg: 1.2838, decode.acc_seg: 49.7242, loss: 1.2838
2022-11-04 14:05:25,699 - mmseg - INFO - Iter [2500/80000] lr: 9.718e-05, eta: 11:05:26, time: 0.515, data_time: 0.005, memory: 8092, decode.loss_seg: 1.4605, decode.acc_seg: 47.4277, loss: 1.4605
2022-11-04 14:05:51,392 - mmseg - INFO - Iter [2550/80000] lr: 9.713e-05, eta: 11:04:59, time: 0.514, data_time: 0.005, memory: 8092, decode.loss_seg: 1.3958, decode.acc_seg: 46.4153, loss: 1.3958
2022-11-04 14:06:17,213 - mmseg - INFO - Iter [2600/80000] lr: 9.707e-05, eta: 11:04:35, time: 0.516, data_time: 0.005, memory: 8092, decode.loss_seg: 1.3851, decode.acc_seg: 49.0836, loss: 1.3851
2022-11-04 14:06:42,818 - mmseg - INFO - Iter [2650/80000] lr: 9.701e-05, eta: 11:04:04, time: 0.512, data_time: 0.006, memory: 8092, decode.loss_seg: 1.4383, decode.acc_seg: 46.1636, loss: 1.4383
2022-11-04 14:07:08,400 - mmseg - INFO - Iter [2700/80000] lr: 9.696e-05, eta: 11:03:34, time: 0.512, data_time: 0.005, memory: 8092, decode.loss_seg: 1.3615, decode.acc_seg: 46.4198, loss: 1.3615
2022-11-04 14:07:34,004 - mmseg - INFO - Iter [2750/80000] lr: 9.690e-05, eta: 11:03:04, time: 0.512, data_time: 0.005, memory: 8092, decode.loss_seg: 1.4707, decode.acc_seg: 46.3678, loss: 1.4707
2022-11-04 14:07:59,516 - mmseg - INFO - Iter [2800/80000] lr: 9.685e-05, eta: 11:02:31, time: 0.510, data_time: 0.005, memory: 8092, decode.loss_seg: 1.4111, decode.acc_seg: 48.2788, loss: 1.4111
2022-11-04 14:08:25,237 - mmseg - INFO - Iter [2850/80000] lr: 9.679e-05, eta: 11:02:05, time: 0.514, data_time: 0.006, memory: 8092, decode.loss_seg: 1.3652, decode.acc_seg: 48.7730, loss: 1.3652
2022-11-04 14:08:51,254 - mmseg - INFO - Iter [2900/80000] lr: 9.673e-05, eta: 11:01:46, time: 0.520, data_time: 0.006, memory: 8092, decode.loss_seg: 1.3638, decode.acc_seg: 46.0210, loss: 1.3638
2022-11-04 14:09:16,839 - mmseg - INFO - Iter [2950/80000] lr: 9.668e-05, eta: 11:01:16, time: 0.512, data_time: 0.006, memory: 8092, decode.loss_seg: 1.2732, decode.acc_seg: 49.0852, loss: 1.2732
2022-11-04 14:09:42,173 - mmseg - INFO - Exp name: fpn_crossformer_s_ade20k_40k.py
2022-11-04 14:09:42,173 - mmseg - INFO - Iter [3000/80000] lr: 9.662e-05, eta: 11:00:40, time: 0.507, data_time: 0.005, memory: 8092, decode.loss_seg: 1.4224, decode.acc_seg: 45.6549, loss: 1.4224
2022-11-04 14:10:07,917 - mmseg - INFO - Iter [3050/80000] lr: 9.656e-05, eta: 11:00:14, time: 0.515, data_time: 0.006, memory: 8092, decode.loss_seg: 1.4463, decode.acc_seg: 46.3708, loss: 1.4463
2022-11-04 14:10:33,291 - mmseg - INFO - Iter [3100/80000] lr: 9.651e-05, eta: 10:59:39, time: 0.507, data_time: 0.006, memory: 8092, decode.loss_seg: 1.2327, decode.acc_seg: 49.7993, loss: 1.2327
2022-11-04 14:10:58,957 - mmseg - INFO - Iter [3150/80000] lr: 9.645e-05, eta: 10:59:12, time: 0.513, data_time: 0.005, memory: 8092, decode.loss_seg: 1.3103, decode.acc_seg: 50.6094, loss: 1.3103
2022-11-04 14:11:24,379 - mmseg - INFO - Iter [3200/80000] lr: 9.639e-05, eta: 10:58:39, time: 0.508, data_time: 0.006, memory: 8092, decode.loss_seg: 1.2830, decode.acc_seg: 50.3910, loss: 1.2830
2022-11-04 14:11:49,879 - mmseg - INFO - Iter [3250/80000] lr: 9.634e-05, eta: 10:58:08, time: 0.510, data_time: 0.006, memory: 8092, decode.loss_seg: 1.1890, decode.acc_seg: 51.5413, loss: 1.1890
2022-11-04 14:12:15,284 - mmseg - INFO - Iter [3300/80000] lr: 9.628e-05, eta: 10:57:34, time: 0.508, data_time: 0.006, memory: 8092, decode.loss_seg: 1.2103, decode.acc_seg: 50.5873, loss: 1.2103
2022-11-04 14:12:40,692 - mmseg - INFO - Iter [3350/80000] lr: 9.622e-05, eta: 10:57:02, time: 0.508, data_time: 0.005, memory: 8092, decode.loss_seg: 1.2148, decode.acc_seg: 48.3905, loss: 1.2148
2022-11-04 14:13:06,694 - mmseg - INFO - Iter [3400/80000] lr: 9.617e-05, eta: 10:56:42, time: 0.520, data_time: 0.006, memory: 8092, decode.loss_seg: 1.4202, decode.acc_seg: 48.1635, loss: 1.4202
2022-11-04 14:13:32,363 - mmseg - INFO - Iter [3450/80000] lr: 9.611e-05, eta: 10:56:15, time: 0.513, data_time: 0.005, memory: 8092, decode.loss_seg: 1.3033, decode.acc_seg: 48.5054, loss: 1.3033
2022-11-04 14:13:57,980 - mmseg - INFO - Iter [3500/80000] lr: 9.605e-05, eta: 10:55:48, time: 0.512, data_time: 0.005, memory: 8092, decode.loss_seg: 1.3325, decode.acc_seg: 49.3714, loss: 1.3325
2022-11-04 14:14:23,675 - mmseg - INFO - Iter [3550/80000] lr: 9.600e-05, eta: 10:55:21, time: 0.514, data_time: 0.005, memory: 8092, decode.loss_seg: 1.3439, decode.acc_seg: 51.6929, loss: 1.3439
2022-11-04 14:14:48,969 - mmseg - INFO - Iter [3600/80000] lr: 9.594e-05, eta: 10:54:47, time: 0.506, data_time: 0.005, memory: 8092, decode.loss_seg: 1.1934, decode.acc_seg: 52.9994, loss: 1.1934
2022-11-04 14:15:14,582 - mmseg - INFO - Iter [3650/80000] lr: 9.589e-05, eta: 10:54:19, time: 0.512, data_time: 0.006, memory: 8092, decode.loss_seg: 1.3977, decode.acc_seg: 48.0340, loss: 1.3977
2022-11-04 14:15:40,202 - mmseg - INFO - Iter [3700/80000] lr: 9.583e-05, eta: 10:53:51, time: 0.512, data_time: 0.006, memory: 8092, decode.loss_seg: 1.2206, decode.acc_seg: 51.8168, loss: 1.2206
2022-11-04 14:16:05,558 - mmseg - INFO - Iter [3750/80000] lr: 9.577e-05, eta: 10:53:18, time: 0.507, data_time: 0.005, memory: 8092, decode.loss_seg: 1.2243, decode.acc_seg: 53.6773, loss: 1.2243
2022-11-04 14:16:30,976 - mmseg - INFO - Iter [3800/80000] lr: 9.572e-05, eta: 10:52:47, time: 0.508, data_time: 0.005, memory: 8092, decode.loss_seg: 1.1981, decode.acc_seg: 50.4609, loss: 1.1981
2022-11-04 14:16:56,442 - mmseg - INFO - Iter [3850/80000] lr: 9.566e-05, eta: 10:52:17, time: 0.509, data_time: 0.006, memory: 8092, decode.loss_seg: 1.1697, decode.acc_seg: 51.5022, loss: 1.1697
2022-11-04 14:17:22,036 - mmseg - INFO - Iter [3900/80000] lr: 9.560e-05, eta: 10:51:49, time: 0.512, data_time: 0.006, memory: 8092, decode.loss_seg: 1.2443, decode.acc_seg: 51.1766, loss: 1.2443
2022-11-04 14:17:47,746 - mmseg - INFO - Iter [3950/80000] lr: 9.555e-05, eta: 10:51:23, time: 0.514, data_time: 0.005, memory: 8092, decode.loss_seg: 1.3240, decode.acc_seg: 49.2007, loss: 1.3240
2022-11-04 14:18:13,430 - mmseg - INFO - Exp name: fpn_crossformer_s_ade20k_40k.py
2022-11-04 14:18:13,430 - mmseg - INFO - Iter [4000/80000] lr: 9.549e-05, eta: 10:50:57, time: 0.514, data_time: 0.005, memory: 8092, decode.loss_seg: 1.2307, decode.acc_seg: 49.3739, loss: 1.2307
2022-11-04 14:18:39,043 - mmseg - INFO - Iter [4050/80000] lr: 9.543e-05, eta: 10:50:30, time: 0.512, data_time: 0.005, memory: 8092, decode.loss_seg: 1.3449, decode.acc_seg: 50.7860, loss: 1.3449
2022-11-04 14:19:04,884 - mmseg - INFO - Iter [4100/80000] lr: 9.538e-05, eta: 10:50:07, time: 0.517, data_time: 0.005, memory: 8092, decode.loss_seg: 1.2381, decode.acc_seg: 49.8337, loss: 1.2381
2022-11-04 14:19:30,187 - mmseg - INFO - Iter [4150/80000] lr: 9.532e-05, eta: 10:49:34, time: 0.506, data_time: 0.005, memory: 8092, decode.loss_seg: 1.2762, decode.acc_seg: 51.9111, loss: 1.2762
2022-11-04 14:19:55,697 - mmseg - INFO - Iter [4200/80000] lr: 9.526e-05, eta: 10:49:05, time: 0.510, data_time: 0.005, memory: 8092, decode.loss_seg: 1.4459, decode.acc_seg: 47.5079, loss: 1.4459
2022-11-04 14:20:21,170 - mmseg - INFO - Iter [4250/80000] lr: 9.521e-05, eta: 10:48:36, time: 0.509, data_time: 0.005, memory: 8092, decode.loss_seg: 1.2267, decode.acc_seg: 51.2204, loss: 1.2267
2022-11-04 14:20:46,687 - mmseg - INFO - Iter [4300/80000] lr: 9.515e-05, eta: 10:48:07, time: 0.510, data_time: 0.006, memory: 8092, decode.loss_seg: 1.3219, decode.acc_seg: 52.4775, loss: 1.3219
2022-11-04 14:21:12,338 - mmseg - INFO - Iter [4350/80000] lr: 9.509e-05, eta: 10:47:41, time: 0.513, data_time: 0.005, memory: 8092, decode.loss_seg: 1.0620, decode.acc_seg: 51.0188, loss: 1.0620
2022-11-04 14:21:37,847 - mmseg - INFO - Iter [4400/80000] lr: 9.504e-05, eta: 10:47:12, time: 0.510, data_time: 0.006, memory: 8092, decode.loss_seg: 1.1613, decode.acc_seg: 48.9003, loss: 1.1613
2022-11-04 14:22:03,213 - mmseg - INFO - Iter [4450/80000] lr: 9.498e-05, eta: 10:46:41, time: 0.507, data_time: 0.005, memory: 8092, decode.loss_seg: 1.3079, decode.acc_seg: 49.9059, loss: 1.3079
2022-11-04 14:22:28,560 - mmseg - INFO - Iter [4500/80000] lr: 9.492e-05, eta: 10:46:10, time: 0.507, data_time: 0.005, memory: 8092, decode.loss_seg: 1.3142, decode.acc_seg: 48.3735, loss: 1.3142
2022-11-04 14:22:54,145 - mmseg - INFO - Iter [4550/80000] lr: 9.487e-05, eta: 10:45:42, time: 0.512, data_time: 0.005, memory: 8092, decode.loss_seg: 1.1134, decode.acc_seg: 48.2435, loss: 1.1134
2022-11-04 14:23:19,502 - mmseg - INFO - Iter [4600/80000] lr: 9.481e-05, eta: 10:45:12, time: 0.507, data_time: 0.005, memory: 8092, decode.loss_seg: 1.0928, decode.acc_seg: 52.9743, loss: 1.0928
2022-11-04 14:23:45,204 - mmseg - INFO - Iter [4650/80000] lr: 9.475e-05, eta: 10:44:46, time: 0.514, data_time: 0.005, memory: 8092, decode.loss_seg: 1.1834, decode.acc_seg: 49.5523, loss: 1.1834
2022-11-04 14:24:10,786 - mmseg - INFO - Iter [4700/80000] lr: 9.470e-05, eta: 10:44:19, time: 0.512, data_time: 0.005, memory: 8092, decode.loss_seg: 1.2363, decode.acc_seg: 52.0498, loss: 1.2363
2022-11-04 14:24:36,290 - mmseg - INFO - Iter [4750/80000] lr: 9.464e-05, eta: 10:43:51, time: 0.510, data_time: 0.005, memory: 8092, decode.loss_seg: 1.1217, decode.acc_seg: 53.9506, loss: 1.1217
2022-11-04 14:25:01,846 - mmseg - INFO - Iter [4800/80000] lr: 9.458e-05, eta: 10:43:24, time: 0.511, data_time: 0.005, memory: 8092, decode.loss_seg: 1.2784, decode.acc_seg: 48.7381, loss: 1.2784
2022-11-04 14:25:26,981 - mmseg - INFO - Iter [4850/80000] lr: 9.453e-05, eta: 10:42:50, time: 0.503, data_time: 0.005, memory: 8092, decode.loss_seg: 1.2098, decode.acc_seg: 55.0698, loss: 1.2098
2022-11-04 14:25:52,625 - mmseg - INFO - Iter [4900/80000] lr: 9.447e-05, eta: 10:42:24, time: 0.513, data_time: 0.005, memory: 8092, decode.loss_seg: 1.2256, decode.acc_seg: 49.8625, loss: 1.2256
2022-11-04 14:26:18,256 - mmseg - INFO - Iter [4950/80000] lr: 9.441e-05, eta: 10:41:58, time: 0.513, data_time: 0.005, memory: 8092, decode.loss_seg: 1.1211, decode.acc_seg: 50.8198, loss: 1.1211
2022-11-04 14:26:43,861 - mmseg - INFO - Exp name: fpn_crossformer_s_ade20k_40k.py
2022-11-04 14:26:43,862 - mmseg - INFO - Iter [5000/80000] lr: 9.436e-05, eta: 10:41:31, time: 0.512, data_time: 0.005, memory: 8092, decode.loss_seg: 1.0685, decode.acc_seg: 52.5552, loss: 1.0685
2022-11-04 14:27:09,464 - mmseg - INFO - Iter [5050/80000] lr: 9.430e-05, eta: 10:41:04, time: 0.512, data_time: 0.006, memory: 8092, decode.loss_seg: 1.1475, decode.acc_seg: 54.2037, loss: 1.1475
2022-11-04 14:27:35,095 - mmseg - INFO - Iter [5100/80000] lr: 9.424e-05, eta: 10:40:38, time: 0.513, data_time: 0.005, memory: 8092, decode.loss_seg: 1.3505, decode.acc_seg: 47.2249, loss: 1.3505
2022-11-04 14:28:00,616 - mmseg - INFO - Iter [5150/80000] lr: 9.419e-05, eta: 10:40:11, time: 0.510, data_time: 0.006, memory: 8092, decode.loss_seg: 1.2653, decode.acc_seg: 52.0987, loss: 1.2653
2022-11-04 14:28:25,761 - mmseg - INFO - Iter [5200/80000] lr: 9.413e-05, eta: 10:39:38, time: 0.503, data_time: 0.006, memory: 8092, decode.loss_seg: 1.1646, decode.acc_seg: 54.0833, loss: 1.1646
2022-11-04 14:28:50,724 - mmseg - INFO - Iter [5250/80000] lr: 9.408e-05, eta: 10:39:02, time: 0.499, data_time: 0.005, memory: 8092, decode.loss_seg: 1.1352, decode.acc_seg: 53.0767, loss: 1.1352
2022-11-04 14:29:15,883 - mmseg - INFO - Iter [5300/80000] lr: 9.402e-05, eta: 10:38:30, time: 0.503, data_time: 0.005, memory: 8092, decode.loss_seg: 1.1585, decode.acc_seg: 51.0351, loss: 1.1585
2022-11-04 14:29:41,495 - mmseg - INFO - Iter [5350/80000] lr: 9.396e-05, eta: 10:38:04, time: 0.512, data_time: 0.005, memory: 8092, decode.loss_seg: 1.2224, decode.acc_seg: 51.2331, loss: 1.2224
2022-11-04 14:30:06,730 - mmseg - INFO - Iter [5400/80000] lr: 9.391e-05, eta: 10:37:32, time: 0.505, data_time: 0.005, memory: 8092, decode.loss_seg: 1.2235, decode.acc_seg: 52.5105, loss: 1.2235
2022-11-04 14:30:31,938 - mmseg - INFO - Iter [5450/80000] lr: 9.385e-05, eta: 10:37:01, time: 0.504, data_time: 0.006, memory: 8092, decode.loss_seg: 1.1074, decode.acc_seg: 51.1702, loss: 1.1074
2022-11-04 14:30:57,045 - mmseg - INFO - Iter [5500/80000] lr: 9.379e-05, eta: 10:36:28, time: 0.502, data_time: 0.005, memory: 8092, decode.loss_seg: 1.1601, decode.acc_seg: 51.8448, loss: 1.1601
2022-11-04 14:31:22,496 - mmseg - INFO - Iter [5550/80000] lr: 9.374e-05, eta: 10:36:00, time: 0.509, data_time: 0.006, memory: 8092, decode.loss_seg: 1.0683, decode.acc_seg: 48.2804, loss: 1.0683
2022-11-04 14:31:47,620 - mmseg - INFO - Iter [5600/80000] lr: 9.368e-05, eta: 10:35:28, time: 0.502, data_time: 0.005, memory: 8092, decode.loss_seg: 1.1493, decode.acc_seg: 51.3996, loss: 1.1493
2022-11-04 14:32:13,243 - mmseg - INFO - Iter [5650/80000] lr: 9.362e-05, eta: 10:35:02, time: 0.512, data_time: 0.005, memory: 8092, decode.loss_seg: 1.2124, decode.acc_seg: 50.6224, loss: 1.2124
2022-11-04 14:32:38,487 - mmseg - INFO - Iter [5700/80000] lr: 9.357e-05, eta: 10:34:31, time: 0.505, data_time: 0.006, memory: 8092, decode.loss_seg: 1.2267, decode.acc_seg: 50.9951, loss: 1.2267
2022-11-04 14:33:03,615 - mmseg - INFO - Iter [5750/80000] lr: 9.351e-05, eta: 10:33:59, time: 0.503, data_time: 0.006, memory: 8092, decode.loss_seg: 1.2065, decode.acc_seg: 49.5261, loss: 1.2065
2022-11-04 14:33:29,007 - mmseg - INFO - Iter [5800/80000] lr: 9.345e-05, eta: 10:33:31, time: 0.508, data_time: 0.006, memory: 8092, decode.loss_seg: 1.1523, decode.acc_seg: 53.7562, loss: 1.1523
2022-11-04 14:33:53,793 - mmseg - INFO - Iter [5850/80000] lr: 9.340e-05, eta: 10:32:55, time: 0.496, data_time: 0.005, memory: 8092, decode.loss_seg: 1.1585, decode.acc_seg: 53.8582, loss: 1.1585
2022-11-04 14:34:19,264 - mmseg - INFO - Iter [5900/80000] lr: 9.334e-05, eta: 10:32:27, time: 0.509, data_time: 0.005, memory: 8092, decode.loss_seg: 1.1380, decode.acc_seg: 51.5328, loss: 1.1380
2022-11-04 14:34:44,542 - mmseg - INFO - Iter [5950/80000] lr: 9.328e-05, eta: 10:31:58, time: 0.506, data_time: 0.006, memory: 8092, decode.loss_seg: 1.1912, decode.acc_seg: 50.7204, loss: 1.1912
2022-11-04 14:35:09,888 - mmseg - INFO - Exp name: fpn_crossformer_s_ade20k_40k.py
2022-11-04 14:35:09,889 - mmseg - INFO - Iter [6000/80000] lr: 9.323e-05, eta: 10:31:29, time: 0.507, data_time: 0.006, memory: 8092, decode.loss_seg: 1.1868, decode.acc_seg: 47.0040, loss: 1.1868
2022-11-04 14:35:35,156 - mmseg - INFO - Iter [6050/80000] lr: 9.317e-05, eta: 10:30:59, time: 0.505, data_time: 0.005, memory: 8092, decode.loss_seg: 1.2965, decode.acc_seg: 50.0766, loss: 1.2965
2022-11-04 14:36:00,139 - mmseg - INFO - Iter [6100/80000] lr: 9.311e-05, eta: 10:30:26, time: 0.500, data_time: 0.006, memory: 8092, decode.loss_seg: 1.1056, decode.acc_seg: 51.9845, loss: 1.1056
2022-11-04 14:36:25,264 - mmseg - INFO - Iter [6150/80000] lr: 9.306e-05, eta: 10:29:55, time: 0.502, data_time: 0.005, memory: 8092, decode.loss_seg: 1.1517, decode.acc_seg: 51.8227, loss: 1.1517
2022-11-04 14:36:50,543 - mmseg - INFO - Iter [6200/80000] lr: 9.300e-05, eta: 10:29:26, time: 0.506, data_time: 0.006, memory: 8092, decode.loss_seg: 1.0768, decode.acc_seg: 51.6382, loss: 1.0768
2022-11-04 14:37:15,904 - mmseg - INFO - Iter [6250/80000] lr: 9.294e-05, eta: 10:28:57, time: 0.507, data_time: 0.006, memory: 8092, decode.loss_seg: 1.1544, decode.acc_seg: 51.7665, loss: 1.1544
2022-11-04 14:37:40,692 - mmseg - INFO - Iter [6300/80000] lr: 9.288e-05, eta: 10:28:23, time: 0.496, data_time: 0.005, memory: 8092, decode.loss_seg: 1.1554, decode.acc_seg: 49.4828, loss: 1.1554
2022-11-04 14:38:05,694 - mmseg - INFO - Iter [6350/80000] lr: 9.283e-05, eta: 10:27:50, time: 0.500, data_time: 0.005, memory: 8092, decode.loss_seg: 1.0684, decode.acc_seg: 52.1434, loss: 1.0684
2022-11-04 14:38:30,911 - mmseg - INFO - Iter [6400/80000] lr: 9.277e-05, eta: 10:27:21, time: 0.504, data_time: 0.005, memory: 8092, decode.loss_seg: 1.1044, decode.acc_seg: 49.0757, loss: 1.1044
2022-11-04 14:38:55,883 - mmseg - INFO - Iter [6450/80000] lr: 9.271e-05, eta: 10:26:48, time: 0.499, data_time: 0.005, memory: 8092, decode.loss_seg: 1.1877, decode.acc_seg: 54.8154, loss: 1.1877
2022-11-04 14:39:20,672 - mmseg - INFO - Iter [6500/80000] lr: 9.266e-05, eta: 10:26:14, time: 0.496, data_time: 0.005, memory: 8092, decode.loss_seg: 1.1320, decode.acc_seg: 53.1972, loss: 1.1320
2022-11-04 14:39:45,951 - mmseg - INFO - Iter [6550/80000] lr: 9.260e-05, eta: 10:25:45, time: 0.506, data_time: 0.005, memory: 8092, decode.loss_seg: 1.1438, decode.acc_seg: 49.7748, loss: 1.1438
2022-11-04 14:40:10,943 - mmseg - INFO - Iter [6600/80000] lr: 9.254e-05, eta: 10:25:13, time: 0.500, data_time: 0.005, memory: 8092, decode.loss_seg: 1.2165, decode.acc_seg: 49.8544, loss: 1.2165
2022-11-04 14:40:36,064 - mmseg - INFO - Iter [6650/80000] lr: 9.249e-05, eta: 10:24:43, time: 0.502, data_time: 0.005, memory: 8092, decode.loss_seg: 1.0096, decode.acc_seg: 54.5719, loss: 1.0096
2022-11-04 14:41:01,357 - mmseg - INFO - Iter [6700/80000] lr: 9.243e-05, eta: 10:24:15, time: 0.506, data_time: 0.005, memory: 8092, decode.loss_seg: 1.0554, decode.acc_seg: 52.3424, loss: 1.0554
2022-11-04 14:41:26,399 - mmseg - INFO - Iter [6750/80000] lr: 9.237e-05, eta: 10:23:43, time: 0.501, data_time: 0.005, memory: 8092, decode.loss_seg: 1.1545, decode.acc_seg: 53.1592, loss: 1.1545
2022-11-04 14:41:51,652 - mmseg - INFO - Iter [6800/80000] lr: 9.232e-05, eta: 10:23:15, time: 0.505, data_time: 0.005, memory: 8092, decode.loss_seg: 1.1432, decode.acc_seg: 50.7684, loss: 1.1432
2022-11-04 14:42:16,576 - mmseg - INFO - Iter [6850/80000] lr: 9.226e-05, eta: 10:22:43, time: 0.498, data_time: 0.005, memory: 8092, decode.loss_seg: 1.2101, decode.acc_seg: 52.4039, loss: 1.2101
2022-11-04 14:42:41,551 - mmseg - INFO - Iter [6900/80000] lr: 9.220e-05, eta: 10:22:11, time: 0.500, data_time: 0.005, memory: 8092, decode.loss_seg: 1.2556, decode.acc_seg: 52.2726, loss: 1.2556
2022-11-04 14:43:06,463 - mmseg - INFO - Iter [6950/80000] lr: 9.215e-05, eta: 10:21:39, time: 0.498, data_time: 0.005, memory: 8092, decode.loss_seg: 1.1287, decode.acc_seg: 53.3726, loss: 1.1287
2022-11-04 14:43:31,599 - mmseg - INFO - Exp name: fpn_crossformer_s_ade20k_40k.py
2022-11-04 14:43:31,599 - mmseg - INFO - Iter [7000/80000] lr: 9.209e-05, eta: 10:21:09, time: 0.503, data_time: 0.005, memory: 8092, decode.loss_seg: 1.1263, decode.acc_seg: 52.3125, loss: 1.1263
2022-11-04 14:43:56,888 - mmseg - INFO - Iter [7050/80000] lr: 9.203e-05, eta: 10:20:41, time: 0.506, data_time: 0.005, memory: 8092, decode.loss_seg: 1.0607, decode.acc_seg: 52.1683, loss: 1.0607
2022-11-04 14:44:22,066 - mmseg - INFO - Iter [7100/80000] lr: 9.198e-05, eta: 10:20:12, time: 0.504, data_time: 0.006, memory: 8092, decode.loss_seg: 0.9869, decode.acc_seg: 53.4469, loss: 0.9869
2022-11-04 14:44:47,006 - mmseg - INFO - Iter [7150/80000] lr: 9.192e-05, eta: 10:19:41, time: 0.499, data_time: 0.005, memory: 8092, decode.loss_seg: 1.1478, decode.acc_seg: 49.7663, loss: 1.1478
2022-11-04 14:45:12,227 - mmseg - INFO - Iter [7200/80000] lr: 9.186e-05, eta: 10:19:12, time: 0.504, data_time: 0.006, memory: 8092, decode.loss_seg: 1.1080, decode.acc_seg: 51.7933, loss: 1.1080
2022-11-04 14:45:37,228 - mmseg - INFO - Iter [7250/80000] lr: 9.181e-05, eta: 10:18:42, time: 0.500, data_time: 0.005, memory: 8092, decode.loss_seg: 1.0993, decode.acc_seg: 54.4874, loss: 1.0993
2022-11-04 14:46:01,989 - mmseg - INFO - Iter [7300/80000] lr: 9.175e-05, eta: 10:18:09, time: 0.495, data_time: 0.005, memory: 8092, decode.loss_seg: 1.1173, decode.acc_seg: 53.5056, loss: 1.1173
2022-11-04 14:46:27,204 - mmseg - INFO - Iter [7350/80000] lr: 9.169e-05, eta: 10:17:40, time: 0.504, data_time: 0.005, memory: 8092, decode.loss_seg: 1.0296, decode.acc_seg: 51.2524, loss: 1.0296
2022-11-04 14:46:52,702 - mmseg - INFO - Iter [7400/80000] lr: 9.164e-05, eta: 10:17:15, time: 0.510, data_time: 0.005, memory: 8092, decode.loss_seg: 1.1449, decode.acc_seg: 50.4905, loss: 1.1449
2022-11-04 14:47:17,805 - mmseg - INFO - Iter [7450/80000] lr: 9.158e-05, eta: 10:16:45, time: 0.502, data_time: 0.005, memory: 8092, decode.loss_seg: 1.0572, decode.acc_seg: 52.1370, loss: 1.0572
2022-11-04 14:47:43,290 - mmseg - INFO - Iter [7500/80000] lr: 9.152e-05, eta: 10:16:19, time: 0.510, data_time: 0.005, memory: 8092, decode.loss_seg: 1.0862, decode.acc_seg: 53.8615, loss: 1.0862
2022-11-04 14:48:08,621 - mmseg - INFO - Iter [7550/80000] lr: 9.147e-05, eta: 10:15:52, time: 0.507, data_time: 0.005, memory: 8092, decode.loss_seg: 1.1500, decode.acc_seg: 49.2943, loss: 1.1500
2022-11-04 14:48:33,690 - mmseg - INFO - Iter [7600/80000] lr: 9.141e-05, eta: 10:15:23, time: 0.501, data_time: 0.005, memory: 8092, decode.loss_seg: 1.0267, decode.acc_seg: 53.7445, loss: 1.0267
2022-11-04 14:48:58,724 - mmseg - INFO - Iter [7650/80000] lr: 9.135e-05, eta: 10:14:53, time: 0.501, data_time: 0.006, memory: 8092, decode.loss_seg: 1.1593, decode.acc_seg: 53.0468, loss: 1.1593
2022-11-04 14:49:23,854 - mmseg - INFO - Iter [7700/80000] lr: 9.130e-05, eta: 10:14:24, time: 0.503, data_time: 0.005, memory: 8092, decode.loss_seg: 1.0488, decode.acc_seg: 51.6885, loss: 1.0488
2022-11-04 14:49:48,809 - mmseg - INFO - Iter [7750/80000] lr: 9.124e-05, eta: 10:13:53, time: 0.499, data_time: 0.005, memory: 8092, decode.loss_seg: 1.1812, decode.acc_seg: 54.0784, loss: 1.1812
2022-11-04 14:50:13,782 - mmseg - INFO - Iter [7800/80000] lr: 9.118e-05, eta: 10:13:23, time: 0.499, data_time: 0.005, memory: 8092, decode.loss_seg: 1.0519, decode.acc_seg: 52.2050, loss: 1.0519
2022-11-04 14:50:38,565 - mmseg - INFO - Iter [7850/80000] lr: 9.112e-05, eta: 10:12:51, time: 0.496, data_time: 0.005, memory: 8092, decode.loss_seg: 1.2474, decode.acc_seg: 51.2928, loss: 1.2474
2022-11-04 14:51:03,737 - mmseg - INFO - Iter [7900/80000] lr: 9.107e-05, eta: 10:12:23, time: 0.503, data_time: 0.005, memory: 8092, decode.loss_seg: 1.0264, decode.acc_seg: 56.0549, loss: 1.0264
2022-11-04 14:51:29,084 - mmseg - INFO - Iter [7950/80000] lr: 9.101e-05, eta: 10:11:56, time: 0.507, data_time: 0.005, memory: 8092, decode.loss_seg: 1.1163, decode.acc_seg: 51.0711, loss: 1.1163
2022-11-04 14:51:53,904 - mmseg - INFO - Saving checkpoint at 8000 iterations
2022-11-04 14:55:17,950 - mmseg - INFO - per class results:
2022-11-04 14:55:17,955 - mmseg - INFO -
+---------------------+-------+-------+
| Class               |  IoU  |  Acc  |
+---------------------+-------+-------+
| wall                | 63.43 | 82.86 |
| building            | 72.77 | 89.12 |
| sky                 |  91.2 | 94.36 |
| floor               | 69.61 | 86.03 |
| tree                | 64.96 | 88.84 |
| ceiling             | 71.56 | 81.13 |
| road                | 69.19 | 86.97 |
| bed                 | 64.97 | 92.63 |
| windowpane          | 42.03 |  76.8 |
| grass               | 54.86 | 65.06 |
| cabinet             | 42.62 | 66.73 |
| sidewalk            | 45.29 | 61.01 |
| person              | 67.71 | 86.44 |
| earth               | 31.28 | 49.57 |
| door                |   7.0 |  7.84 |
| table               | 30.05 | 41.31 |
| mountain            | 50.14 | 67.98 |
| plant               | 40.14 |  46.8 |
| curtain             | 47.54 | 75.58 |
| chair               | 24.07 | 27.15 |
| car                 | 70.68 |  88.7 |
| water               | 37.47 | 49.85 |
| painting            |  43.7 | 73.54 |
| sofa                | 38.85 | 69.66 |
| shelf               | 25.91 |  45.9 |
| house               | 36.35 | 79.96 |
| sea                 | 38.03 |  78.7 |
| mirror              | 20.92 | 34.11 |
| rug                 |  38.5 | 45.35 |
| field               | 20.53 | 60.66 |
| armchair            | 11.15 | 18.15 |
| seat                | 33.53 | 39.63 |
| fence               |  4.68 |  4.98 |
| desk                | 13.88 | 16.64 |
| rock                | 20.19 | 27.73 |
| wardrobe            |  9.86 |  10.0 |
| lamp                | 34.23 | 57.94 |
| bathtub             | 33.84 | 52.59 |
| railing             |  8.68 |  9.26 |
| cushion             | 23.35 | 37.67 |
| base                |  3.42 |  4.11 |
| box                 |  7.65 | 14.45 |
| column              |  0.18 |  0.18 |
| signboard           | 18.67 | 21.55 |
| chest of drawers    | 20.35 | 29.69 |
| counter             |   1.5 |  1.51 |
| sand                | 16.98 | 21.61 |
| sink                | 25.48 | 61.99 |
| skyscraper          | 32.11 | 39.59 |
| fireplace           |   nan |   nan |
| refrigerator        |  8.59 |  8.64 |
| grandstand          | 27.69 | 36.05 |
| path                |   0.0 |   0.0 |
| stairs              |  1.66 |  1.67 |
| runway              | 18.61 | 19.69 |
| case                |  12.9 |  14.5 |
| pool table          | 68.76 | 95.18 |
| pillow              | 20.52 | 24.63 |
screen door | 0.01 | 0.01 | | stairway | 2.92 | 3.53 | | river | 0.0 | 0.0 | | bridge | 5.69 | 8.6 | | bookcase | 0.14 | 0.14 | | blind | nan | nan | | coffee table | 21.39 | 27.75 | | toilet | 35.13 | 41.91 | | flower | 22.16 | 34.12 | | book | 35.79 | 50.08 | | hill | 0.0 | 0.0 | | bench | 0.0 | 0.0 | | countertop | 10.35 | 12.48 | | stove | 31.11 | 42.27 | | palm | 27.26 | 29.65 | | kitchen island | 0.02 | 0.02 | | computer | 32.41 | 38.14 | | swivel chair | 16.34 | 46.51 | | boat | 0.28 | 0.29 | | bar | 0.0 | 0.0 | | arcade machine | 0.84 | 1.02 | | hovel | 0.0 | 0.0 | | bus | 10.47 | 10.49 | | towel | 11.62 | 14.7 | | light | 0.0 | 0.0 | | truck | 1.19 | 1.92 | | tower | 0.0 | 0.0 | | chandelier | 32.9 | 48.85 | | awning | 0.0 | 0.0 | | streetlight | 0.0 | 0.0 | | booth | 2.92 | 3.0 | | television receiver | nan | nan | | airplane | 38.07 | 58.89 | | dirt track | 0.0 | 0.0 | | apparel | 12.98 | 15.07 | | pole | 0.0 | 0.0 | | land | 0.0 | 0.0 | | bannister | 0.0 | 0.0 | | escalator | 0.0 | 0.0 | | ottoman | 0.0 | 0.0 | | bottle | nan | nan | | buffet | 0.0 | 0.0 | | poster | 0.0 | 0.0 | | stage | 0.0 | 0.0 | | van | 0.0 | 0.0 | | ship | 0.0 | 0.0 | | fountain | 0.0 | 0.0 | | conveyer belt | 0.0 | 0.0 | | canopy | 0.0 | 0.0 | | washer | 47.03 | 55.21 | | plaything | 0.0 | 0.0 | | swimming pool | 0.87 | 0.88 | | stool | 0.0 | 0.0 | | barrel | 0.0 | 0.0 | | basket | 0.49 | 0.49 | | waterfall | 26.74 | 28.23 | | tent | 91.75 | 97.04 | | bag | 0.0 | 0.0 | | minibike | 26.47 | 40.16 | | cradle | nan | nan | | oven | 0.0 | 0.0 | | ball | 43.36 | 63.97 | | food | 0.0 | 0.0 | | step | 0.0 | 0.0 | | tank | 0.0 | 0.0 | | trade name | 0.83 | 0.84 | | microwave | 9.77 | 10.2 | | pot | 0.03 | 0.03 | | animal | nan | nan | | bicycle | 1.13 | 1.14 | | lake | 0.0 | 0.0 | | dishwasher | 0.0 | 0.0 | | screen | 0.0 | 0.0 | | blanket | 0.0 | 0.0 | | sculpture | 0.0 | 0.0 | | hood | 0.0 | 0.0 | | sconce | 0.0 | 0.0 | | vase | 0.18 | 0.18 | | traffic light | 0.0 | 0.0 | | tray | 0.0 
| 0.0 | | ashcan | 0.0 | 0.0 | | fan | 0.0 | 0.0 | | pier | 0.0 | 0.0 | | crt screen | 0.0 | 0.0 | | plate | 0.04 | 0.04 | | monitor | 0.0 | 0.0 | | bulletin board | 0.0 | 0.0 | | shower | 0.0 | 0.0 | | radiator | 0.0 | 0.0 | | glass | 0.0 | 0.0 | | clock | 0.0 | 0.0 | | flag | 0.0 | 0.0 | +---------------------+-------+-------+ 2022-11-04 14:55:17,955 - mmseg - INFO - Summary: 2022-11-04 14:55:17,955 - mmseg - INFO - +--------+-------+-------+-------+ | Scope | mIoU | mAcc | aAcc | +--------+-------+-------+-------+ | global | 17.36 | 24.02 | 70.89 | +--------+-------+-------+-------+ 2022-11-04 14:55:17,998 - mmseg - INFO - Exp name: fpn_crossformer_s_ade20k_40k.py
`Unexpected keys: xxxxx` did not appear in your log, which means the pretrained weights were never actually loaded. This failure is silent (no warning is printed) when the mmcv and mmsegmentation versions are incompatible. Please strictly follow our instructions and use `pip3 install mmcv-full==1.2.7 mmsegmentation==0.12.0`.
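As a quick sanity check that does not depend on mmcv's logging, you can diff the checkpoint's `state_dict` keys against the model's yourself. The sketch below is illustrative (the helper name `diff_state_dicts` and the toy key names are not from this repo); per the note above, a correctly converted backbone should yield `unexpected` keys only from the classification head:

```python
def diff_state_dicts(model_keys, ckpt_keys):
    """Mimic the load report: keys the checkpoint lacks vs. keys it has extra."""
    missing = sorted(set(model_keys) - set(ckpt_keys))
    unexpected = sorted(set(ckpt_keys) - set(model_keys))
    return missing, unexpected

# Toy example: a converted classification checkpoint still carries its head.
model_keys = ["backbone.stem.weight", "backbone.stage1.weight", "decode_head.conv.weight"]
ckpt_keys = ["backbone.stem.weight", "backbone.stage1.weight",
             "head.fc.weight", "head.fc.bias"]

missing, unexpected = diff_state_dicts(model_keys, ckpt_keys)
print("Missing keys:", missing)        # decode_head weights, trained from scratch
print("Unexpected keys:", unexpected)  # only classification-head leftovers -> OK
```

With a real checkpoint you would load it via `torch.load(path, map_location='cpu')` and compare its `state_dict` keys against `model.state_dict().keys()`; if `unexpected` contains backbone weights, the weight names are wrong.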
OK, I'll try. Thanks.
> `Unexpected keys: xxxxx` didn't appear in your log, which means you didn't load the pretrained weights actually. [...] Please strictly follow our instruction and use `pip3 install mmcv-full==1.2.7 mmsegmentation==0.12.0`.
I'm using mmcv-full==1.2.7 and mmsegmentation==0.11.0. The version difference shouldn't matter that much; I'll retrain.
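For what it's worth, a 0.11.0-vs-0.12.0 gap is exactly the kind of mismatch that can break checkpoint loading silently. A small stdlib-only sketch to verify the environment before retraining (the pins are taken from the install command above; the helper `check_pins` is hypothetical, and `importlib.metadata` needs Python 3.8+, so use `pkg_resources` on 3.6):

```python
from importlib import metadata

# Pins from the repo's install instruction; adjust if yours differ.
REQUIRED = {"mmcv-full": "1.2.7", "mmsegmentation": "0.12.0"}

def check_pins(required):
    """Return {package: (installed_or_None, required)} for every absent or mismatched pin."""
    problems = {}
    for pkg, want in required.items():
        try:
            have = metadata.version(pkg)
        except metadata.PackageNotFoundError:
            have = None
        if have != want:
            problems[pkg] = (have, want)
    return problems

if __name__ == "__main__":
    for pkg, (have, want) in check_pins(REQUIRED).items():
        print(f"{pkg}: installed {have}, required {want}")
```

Running this before `train.py` makes the "wrong version" failure loud instead of silent.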
```
    main()
  File "./test.py", line 127, in main
    checkpoint = load_checkpoint(model, args.checkpoint, map_location='cpu')
  File "/home/wangnan/anaconda3/envs/yolo-v5/lib/python3.6/site-packages/mmcv/runner/checkpoint.py", line 522, in load_checkpoint
    checkpoint = _load_checkpoint(filename, map_location, logger)
  File "/home/wangnan/anaconda3/envs/yolo-v5/lib/python3.6/site-packages/mmcv/runner/checkpoint.py", line 466, in _load_checkpoint
    return CheckpointLoader.load_checkpoint(filename, map_location, logger)
  File "/home/wangnan/anaconda3/envs/yolo-v5/lib/python3.6/site-packages/mmcv/runner/checkpoint.py", line 243, in load_checkpoint
    return checkpoint_loader(filename, map_location)
  File "/home/wangnan/anaconda3/envs/yolo-v5/lib/python3.6/site-packages/mmcv/runner/checkpoint.py", line 260, in load_from_local
    checkpoint = torch.load(filename, map_location=map_location)
  File "/home/wangnan/anaconda3/envs/yolo-v5/lib/python3.6/site-packages/torch/serialization.py", line 594, in load
    return _load(opened_zipfile, map_location, pickle_module, **pickle_load_args)
  File "/home/wangnan/anaconda3/envs/yolo-v5/lib/python3.6/site-packages/torch/serialization.py", line 853, in _load
    result = unpickler.load()
  File "/home/wangnan/anaconda3/envs/yolo-v5/lib/python3.6/site-packages/torch/serialization.py", line 845, in persistent_load
    load_tensor(data_type, size, key, _maybe_decode_ascii(location))
  File "/home/wangnan/anaconda3/envs/yolo-v5/lib/python3.6/site-packages/torch/serialization.py", line 833, in load_tensor
    storage = zip_file.get_storage_from_record(name, size, dtype).storage()
RuntimeError: [enforce fail at inline_container.cc:145] . PytorchStreamReader failed reading file data/2154620144: invalid header or archive is corrupted
```
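The `PytorchStreamReader ... invalid header or archive is corrupted` error means the `.pth` file itself is truncated or damaged (typically an incomplete download or an interrupted save), not a key-name or version problem. Since modern PyTorch checkpoints are zip archives, a stdlib-only pre-check can catch this before a long training run; `looks_like_torch_zip` below is an illustrative helper, not part of mmcv:

```python
import os
import zipfile

def looks_like_torch_zip(path):
    """Cheap integrity check for a zip-format PyTorch checkpoint.

    Returns (ok, reason). A truncated download usually fails is_zipfile()
    or testzip(); legacy (pre-1.6 pickle-format) checkpoints also report
    not-a-zip, so treat this as a heuristic, not a guarantee.
    """
    if not os.path.isfile(path):
        return False, "file does not exist"
    if not zipfile.is_zipfile(path):
        return False, "not a zip archive (corrupted, truncated, or legacy format)"
    with zipfile.ZipFile(path) as zf:
        bad = zf.testzip()  # CRC-check every member
        if bad is not None:
            return False, "corrupted member: " + bad
    return True, "zip structure intact"
```

If the check fails, re-download or regenerate your converted backbone `.pth` and compare its size (or checksum) with the source before retraining.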