open-mmlab / mmdetection

OpenMMLab Detection Toolbox and Benchmark
https://mmdetection.readthedocs.io
Apache License 2.0

ERROR - The testing results of the whole dataset is empty - YOLOX and COCO #6265

Closed: tikitong closed this issue 2 years ago

tikitong commented 2 years ago

The evaluation of the results never takes place. It seems the problem may come from this line:
evaluation = dict(interval=1, metric='bbox', classwise=True)

An error is also raised when I set the save_best argument like this: evaluation = dict(interval=1, classwise=True, metric='bbox', save_best='bbox_mAP')

...
Traceback (most recent call last):
  File "tools/train.py", line 189, in <module>
    main()
  File "tools/train.py", line 185, in main
    meta=meta)
  File "/home/tiki/mmdetection/mmdet/apis/train.py", line 174, in train_detector
    runner.run(data_loaders, cfg.workflow)
  File "/home/tiki/.conda/envs/new/lib/python3.7/site-packages/mmcv/runner/epoch_based_runner.py", line 127, in run
    epoch_runner(data_loaders[i], **kwargs)
  File "/home/tiki/.conda/envs/new/lib/python3.7/site-packages/mmcv/runner/epoch_based_runner.py", line 54, in train
    self.call_hook('after_train_epoch')
  File "/home/tiki/.conda/envs/new/lib/python3.7/site-packages/mmcv/runner/base_runner.py", line 307, in call_hook
    getattr(hook, fn_name)(self)
  File "/home/tiki/.conda/envs/new/lib/python3.7/site-packages/mmcv/runner/hooks/evaluation.py", line 237, in after_train_epoch
    self._do_evaluate(runner)
  File "/home/tiki/mmdetection/mmdet/core/evaluation/eval_hooks.py", line 20, in _do_evaluate
    key_score = self.evaluate(runner, results)
  File "/home/tiki/.conda/envs/new/lib/python3.7/site-packages/mmcv/runner/hooks/evaluation.py", line 334, in evaluate
    return eval_res[self.key_indicator]
KeyError: 'bbox_mAP'
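
If I read the traceback right, the two errors are linked: when the whole-dataset results are empty, the dict returned by the evaluation apparently has no 'bbox_mAP' entry, so the save_best lookup fails. A minimal sketch of what I think happens (not the actual mmcv code, just the behaviour I am assuming):

# Assumed behaviour, simplified: the evaluation seems to return a dict
# without a 'bbox_mAP' entry when there are no detections at all.
eval_res = {}                        # empty evaluation results (assumption)
key_indicator = 'bbox_mAP'           # what save_best='bbox_mAP' asks for
key_score = eval_res[key_indicator]  # raises KeyError: 'bbox_mAP'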

I'm using the same dataset, with the same annotations in COCO format, for models like VFNet and Faster R-CNN, and it works fine. I don't see what is wrong in my YOLOX setting. Do you have any idea? Thanks a lot!
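
For reference, here is how I understand custom class names are usually wired into CocoDataset entries in MMDetection 2.x configs. My config below only defines classes = ('circle', 'triangle', 'rectangle') at the top level and never passes it to the train/val/test dicts, so maybe that is related. This is only a sketch of my understanding, not a confirmed fix; train_pipeline and test_pipeline refer to the pipelines already defined in the config below:

classes = ('circle', 'triangle', 'rectangle')
data = dict(
    samples_per_gpu=8,
    workers_per_gpu=2,
    train=dict(
        type='MultiImageMixDataset',
        dataset=dict(
            type='CocoDataset',
            classes=classes,  # added: otherwise the default 80 COCO names are assumed
            ann_file='mainset107/set/train.json',
            img_prefix='mainset107/image',
            pipeline=[
                dict(type='LoadImageFromFile', to_float32=True),
                dict(type='LoadAnnotations', with_bbox=True)
            ],
            filter_empty_gt=False),
        pipeline=train_pipeline,  # same outer pipeline as in the config below
        dynamic_scale=(640, 640)),
    val=dict(
        type='CocoDataset',
        classes=classes,  # added
        ann_file='mainset107/set/val.json',
        img_prefix='mainset107/image',
        pipeline=test_pipeline),
    test=dict(
        type='CocoDataset',
        classes=classes,  # added
        ann_file='mainset107/set/val.json',
        img_prefix='mainset107/image',
        pipeline=test_pipeline))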

Running python script...
2021-10-12 17:02:39,537 - mmdet - INFO - Environment info:
------------------------------------------------------------
sys.platform: linux
Python: 3.7.11 (default, Jul 27 2021, 14:32:16) [GCC 7.5.0]
CUDA available: True
GPU 0: NVIDIA Tesla V100-SXM2-32GB
CUDA_HOME: None
GCC: gcc (GCC) 4.8.5 20150623 (Red Hat 4.8.5-44)
PyTorch: 1.9.0
PyTorch compiling details: PyTorch built with:
  - GCC 7.3
  - C++ Version: 201402
  - Intel(R) Math Kernel Library Version 2020.0.2 Product Build 20200624 for Intel(R) 64 architecture applications
  - Intel(R) MKL-DNN v2.1.2 (Git Hash 98be7e8afa711dc9b66c8ff3504129cb82013cdb)
  - OpenMP 201511 (a.k.a. OpenMP 4.5)
  - NNPACK is enabled
  - CPU capability usage: AVX2
  - CUDA Runtime 11.1
  - NVCC architecture flags: -gencode;arch=compute_37,code=sm_37;-gencode;arch=compute_50,code=sm_50;-gencode;arch=compute_60,code=sm_60;-gencode;arch=compute_61,code=sm_61;-gencode;arch=compute_70,code=sm_70;-gencode;arch=compute_75,code=sm_75;-gencode;arch=compute_80,code=sm_80;-gencode;arch=compute_86,code=sm_86;-gencode;arch=compute_37,code=compute_37
  - CuDNN 8.0.5
  - Magma 2.5.2
  - Build settings: BLAS_INFO=mkl, BUILD_TYPE=Release, CUDA_VERSION=11.1, CUDNN_VERSION=8.0.5, CXX_COMPILER=/opt/rh/devtoolset-7/root/usr/bin/c++, CXX_FLAGS= -Wno-deprecated -fvisibility-inlines-hidden -DUSE_PTHREADPOOL -fopenmp -DNDEBUG -DUSE_KINETO -DUSE_FBGEMM -DUSE_QNNPACK -DUSE_PYTORCH_QNNPACK -DUSE_XNNPACK -DSYMBOLICATE_MOBILE_DEBUG_HANDLE -O2 -fPIC -Wno-narrowing -Wall -Wextra -Werror=return-type -Wno-missing-field-initializers -Wno-type-limits -Wno-array-bounds -Wno-unknown-pragmas -Wno-sign-compare -Wno-unused-parameter -Wno-unused-variable -Wno-unused-function -Wno-unused-result -Wno-unused-local-typedefs -Wno-strict-overflow -Wno-strict-aliasing -Wno-error=deprecated-declarations -Wno-stringop-overflow -Wno-psabi -Wno-error=pedantic -Wno-error=redundant-decls -Wno-error=old-style-cast -fdiagnostics-color=always -faligned-new -Wno-unused-but-set-variable -Wno-maybe-uninitialized -fno-math-errno -fno-trapping-math -Werror=format -Wno-stringop-overflow, LAPACK_INFO=mkl, PERF_WITH_AVX=1, PERF_WITH_AVX2=1, PERF_WITH_AVX512=1, TORCH_VERSION=1.9.0, USE_CUDA=ON, USE_CUDNN=ON, USE_EXCEPTION_PTR=1, USE_GFLAGS=OFF, USE_GLOG=OFF, USE_MKL=ON, USE_MKLDNN=ON, USE_MPI=OFF, USE_NCCL=ON, USE_NNPACK=ON, USE_OPENMP=ON, 

TorchVision: 0.2.1
OpenCV: 4.5.3
MMCV: 1.3.13
MMCV Compiler: GCC 10.2
MMCV CUDA Compiler: 11.1
MMDetection: 2.16.0+b1f97c1
------------------------------------------------------------

2021-10-12 17:02:39,885 - mmdet - INFO - Distributed training: False
2021-10-12 17:02:40,103 - mmdet - INFO - Config:
optimizer = dict(
    type='SGD',
    lr=0.00125,
    momentum=0.9,
    weight_decay=0.0005,
    nesterov=True,
    paramwise_cfg=dict(norm_decay_mult=0.0, bias_decay_mult=0.0))
optimizer_config = dict(grad_clip=dict(max_norm=35, norm_type=2))
lr_config = dict(
    policy='YOLOX',
    warmup='exp',
    by_epoch=False,
    warmup_by_epoch=True,
    warmup_ratio=1,
    warmup_iters=5,
    num_last_epochs=15,
    min_lr_ratio=0.05)
runner = dict(type='EpochBasedRunner', max_epochs=120)
checkpoint_config = None
log_config = dict(interval=50, hooks=[dict(type='TextLoggerHook')])
custom_hooks = [
    dict(type='YOLOXModeSwitchHook', num_last_epochs=15, priority=48),
    dict(
        type='SyncRandomSizeHook',
        ratio_range=(10, 20),
        img_scale=(640, 640),
        interval=10,
        priority=48),
    dict(type='SyncNormHook', num_last_epochs=15, interval=10, priority=48),
    dict(type='ExpMomentumEMAHook', resume_from=None, priority=49)
]
dist_params = dict(backend='nccl')
log_level = 'INFO'
load_from = 'weight/yolox_tiny_8x8_300e_coco_20210806_234250-4ff3b67e.pth'
resume_from = None
workflow = [('train', 1)]
model = dict(
    type='YOLOX',
    backbone=dict(type='CSPDarknet', deepen_factor=0.33, widen_factor=0.375),
    neck=dict(
        type='YOLOXPAFPN',
        in_channels=[96, 192, 384],
        out_channels=96,
        num_csp_blocks=1),
    bbox_head=dict(
        type='YOLOXHead', num_classes=3, in_channels=96, feat_channels=96),
    train_cfg=dict(assigner=dict(type='SimOTAAssigner', center_radius=2.5)),
    test_cfg=dict(score_thr=0.01, nms=dict(type='nms', iou_threshold=0.65)))
data_root = 'mainset107'
dataset_type = 'CocoDataset'
img_norm_cfg = dict(
    mean=[123.675, 116.28, 103.53], std=[58.395, 57.12, 57.375], to_rgb=True)
img_scale = (640, 640)
train_pipeline = [
    dict(type='LoadImageFromFile'),
    dict(type='LoadAnnotations', with_bbox=True),
    dict(type='Resize', img_scale=(640, 640), keep_ratio=True),
    dict(type='RandomFlip', flip_ratio=0.5),
    dict(
        type='RandomAffine', max_rotate_degree=180.0,
        max_translate_ratio=0.25),
    dict(
        type='Normalize',
        mean=[123.675, 116.28, 103.53],
        std=[58.395, 57.12, 57.375],
        to_rgb=True),
    dict(type='DefaultFormatBundle'),
    dict(type='Collect', keys=['img', 'gt_bboxes', 'gt_labels'])
]
train_dataset = dict(
    type='MultiImageMixDataset',
    dataset=dict(
        type='CocoDataset',
        ann_file='mainset107/set/train.json',
        img_prefix='mainset107/image',
        pipeline=[
            dict(type='LoadImageFromFile', to_float32=True),
            dict(type='LoadAnnotations', with_bbox=True)
        ],
        filter_empty_gt=False),
    pipeline=[
        dict(type='LoadImageFromFile'),
        dict(type='LoadAnnotations', with_bbox=True),
        dict(type='Resize', img_scale=(640, 640), keep_ratio=True),
        dict(type='RandomFlip', flip_ratio=0.5),
        dict(
            type='RandomAffine',
            max_rotate_degree=180.0,
            max_translate_ratio=0.25),
        dict(
            type='Normalize',
            mean=[123.675, 116.28, 103.53],
            std=[58.395, 57.12, 57.375],
            to_rgb=True),
        dict(type='DefaultFormatBundle'),
        dict(type='Collect', keys=['img', 'gt_bboxes', 'gt_labels'])
    ],
    dynamic_scale=(640, 640))
test_pipeline = [
    dict(type='LoadImageFromFile'),
    dict(
        type='MultiScaleFlipAug',
        img_scale=(640, 640),
        flip=False,
        transforms=[
            dict(type='Resize', keep_ratio=True),
            dict(type='RandomFlip'),
            dict(
                type='Normalize',
                mean=[123.675, 116.28, 103.53],
                std=[58.395, 57.12, 57.375],
                to_rgb=True),
            dict(type='DefaultFormatBundle'),
            dict(type='Collect', keys=['img'])
        ])
]
data = dict(
    samples_per_gpu=8,
    workers_per_gpu=2,
    train=dict(
        type='MultiImageMixDataset',
        dataset=dict(
            type='CocoDataset',
            ann_file='mainset107/set/train.json',
            img_prefix='mainset107/image',
            pipeline=[
                dict(type='LoadImageFromFile', to_float32=True),
                dict(type='LoadAnnotations', with_bbox=True)
            ],
            filter_empty_gt=False),
        pipeline=[
            dict(type='LoadImageFromFile'),
            dict(type='LoadAnnotations', with_bbox=True),
            dict(type='Resize', img_scale=(640, 640), keep_ratio=True),
            dict(type='RandomFlip', flip_ratio=0.5),
            dict(
                type='RandomAffine',
                max_rotate_degree=180.0,
                max_translate_ratio=0.25),
            dict(
                type='Normalize',
                mean=[123.675, 116.28, 103.53],
                std=[58.395, 57.12, 57.375],
                to_rgb=True),
            dict(type='DefaultFormatBundle'),
            dict(type='Collect', keys=['img', 'gt_bboxes', 'gt_labels'])
        ],
        dynamic_scale=(640, 640)),
    val=dict(
        type='CocoDataset',
        ann_file='mainset107/set/val.json',
        img_prefix='mainset107/image',
        pipeline=[
            dict(type='LoadImageFromFile'),
            dict(
                type='MultiScaleFlipAug',
                img_scale=(640, 640),
                flip=False,
                transforms=[
                    dict(type='Resize', keep_ratio=True),
                    dict(type='RandomFlip'),
                    dict(
                        type='Normalize',
                        mean=[123.675, 116.28, 103.53],
                        std=[58.395, 57.12, 57.375],
                        to_rgb=True),
                    dict(type='DefaultFormatBundle'),
                    dict(type='Collect', keys=['img'])
                ])
        ]),
    test=dict(
        type='CocoDataset',
        ann_file='mainset107/set/val.json',
        img_prefix='mainset107/image',
        pipeline=[
            dict(type='LoadImageFromFile'),
            dict(
                type='MultiScaleFlipAug',
                img_scale=(640, 640),
                flip=False,
                transforms=[
                    dict(type='Resize', keep_ratio=True),
                    dict(type='RandomFlip'),
                    dict(
                        type='Normalize',
                        mean=[123.675, 116.28, 103.53],
                        std=[58.395, 57.12, 57.375],
                        to_rgb=True),
                    dict(type='DefaultFormatBundle'),
                    dict(type='Collect', keys=['img'])
                ])
        ]))
interval = 10
evaluation = dict(interval=1, metric='bbox', classwise=True)
classes = ('circle', 'triangle', 'rectangle')
total_epochs = 120
seed = 0
gpu_ids = range(0, 1)
work_dir = './work_dirs/yolox'

2021-10-12 17:02:40,544 - mmdet - INFO - initialize CSPDarknet with init_cfg {'type': 'Kaiming', 'layer': 'Conv2d', 'a': 2.23606797749979, 'distribution': 'uniform', 'mode': 'fan_in', 'nonlinearity': 'leaky_relu'}
2021-10-12 17:02:40,565 - mmdet - INFO - initialize YOLOXPAFPN with init_cfg {'type': 'Kaiming', 'layer': 'Conv2d', 'a': 2.23606797749979, 'distribution': 'uniform', 'mode': 'fan_in', 'nonlinearity': 'leaky_relu'}
2021-10-12 17:02:40,580 - mmdet - INFO - initialize YOLOXHead with init_cfg {'type': 'Kaiming', 'layer': 'Conv2d', 'a': 2.23606797749979, 'distribution': 'uniform', 'mode': 'fan_in', 'nonlinearity': 'leaky_relu'}
2021-10-12 17:02:43,759 - mmdet - INFO - load checkpoint from weight/yolox_tiny_8x8_300e_coco_20210806_234250-4ff3b67e.pth
2021-10-12 17:02:43,759 - mmdet - INFO - Use load_from_local loader
2021-10-12 17:02:44,396 - mmdet - WARNING - The model and loaded state dict do not match exactly

size mismatch for bbox_head.multi_level_conv_cls.0.weight: copying a param with shape torch.Size([80, 96, 1, 1]) from checkpoint, the shape in current model is torch.Size([3, 96, 1, 1]).
size mismatch for bbox_head.multi_level_conv_cls.0.bias: copying a param with shape torch.Size([80]) from checkpoint, the shape in current model is torch.Size([3]).
size mismatch for bbox_head.multi_level_conv_cls.1.weight: copying a param with shape torch.Size([80, 96, 1, 1]) from checkpoint, the shape in current model is torch.Size([3, 96, 1, 1]).
size mismatch for bbox_head.multi_level_conv_cls.1.bias: copying a param with shape torch.Size([80]) from checkpoint, the shape in current model is torch.Size([3]).
size mismatch for bbox_head.multi_level_conv_cls.2.weight: copying a param with shape torch.Size([80, 96, 1, 1]) from checkpoint, the shape in current model is torch.Size([3, 96, 1, 1]).
size mismatch for bbox_head.multi_level_conv_cls.2.bias: copying a param with shape torch.Size([80]) from checkpoint, the shape in current model is torch.Size([3]).
2021-10-12 17:02:44,397 - mmdet - INFO - Start running, host: tiki@gpu609-03, work_dir: /home/tiki/mmdetection/work_dirs/yolox
2021-10-12 17:02:44,397 - mmdet - INFO - Hooks will be executed in the following order:
before_run:
(VERY_HIGH   ) YOLOXLrUpdaterHook                 
(49          ) ExpMomentumEMAHook                 
(LOW         ) EvalHook                           
(VERY_LOW    ) TextLoggerHook                     
 -------------------- 
before_train_epoch:
(VERY_HIGH   ) YOLOXLrUpdaterHook                 
(48          ) YOLOXModeSwitchHook                
(48          ) SyncNormHook                       
(49          ) ExpMomentumEMAHook                 
(LOW         ) IterTimerHook                      
(LOW         ) EvalHook                           
(VERY_LOW    ) TextLoggerHook                     
 -------------------- 
before_train_iter:
(VERY_HIGH   ) YOLOXLrUpdaterHook                 
(LOW         ) IterTimerHook                      
(LOW         ) EvalHook                           
 -------------------- 
after_train_iter:
(ABOVE_NORMAL) OptimizerHook                      
(48          ) SyncRandomSizeHook                 
(49          ) ExpMomentumEMAHook                 
(LOW         ) IterTimerHook                      
(LOW         ) EvalHook                           
(VERY_LOW    ) TextLoggerHook                     
 -------------------- 
after_train_epoch:
(48          ) SyncNormHook                       
(49          ) ExpMomentumEMAHook                 
(LOW         ) EvalHook                           
(VERY_LOW    ) TextLoggerHook                     
 -------------------- 
before_val_epoch:
(LOW         ) IterTimerHook                      
(VERY_LOW    ) TextLoggerHook                     
 -------------------- 
before_val_iter:
(LOW         ) IterTimerHook                      
 -------------------- 
after_val_iter:
(LOW         ) IterTimerHook                      
 -------------------- 
after_val_epoch:
(VERY_LOW    ) TextLoggerHook                     
 -------------------- 
2021-10-12 17:02:44,397 - mmdet - INFO - workflow: [('train', 1)], max: 120 epochs
loading annotations into memory...
Done (t=0.02s)
creating index...
index created!
loading annotations into memory...
Done (t=0.01s)
creating index...
index created!
/home/tiki/.conda/envs/new/lib/python3.7/site-packages/torch/nn/functional.py:718: UserWarning: Named tensors and all their associated APIs are an experimental feature and subject to change. Please do not use them for anything important until they are released as stable. (Triggered internally at  /opt/conda/conda-bld/pytorch_1623448265233/work/c10/core/TensorImpl.h:1156.)
  return torch.max_pool2d(input, kernel_size, stride, padding, dilation, ceil_mode)
2021-10-12 17:03:30,246 - mmdet - INFO - Epoch [1][50/157]  lr: 5.071e-06, eta: 4:46:53, time: 0.916, data_time: 0.725, memory: 2987, loss_cls: 0.0000, loss_bbox: 0.0000, loss_obj: 123.8029, loss: 123.8029, grad_norm: 6465.0170
2021-10-12 17:04:13,861 - mmdet - INFO - Epoch [1][100/157] lr: 2.028e-05, eta: 4:39:17, time: 0.872, data_time: 0.700, memory: 2987, loss_cls: 0.0000, loss_bbox: 0.0000, loss_obj: 38.6317, loss: 38.6317, grad_norm: 2347.8628
2021-10-12 17:04:57,390 - mmdet - INFO - Epoch [1][150/157] lr: 4.564e-05, eta: 4:36:05, time: 0.871, data_time: 0.697, memory: 2987, loss_cls: 0.0000, loss_bbox: 0.0000, loss_obj: 3.5303, loss: 3.5303, grad_norm: 104.8619
[                                                  ] 0/538, elapsed: 0s, ETA:
[                                 ] 1/538, 2.7 task/s, elapsed: 0s, ETA:   202s
[                                 ] 2/538, 4.9 task/s, elapsed: 0s, ETA:   110s
[                                 ] 3/538, 6.4 task/s, elapsed: 0s, ETA:    83s
[                                 ] 4/538, 8.2 task/s, elapsed: 0s, ETA:    65s
[                                 ] 5/538, 8.9 task/s, elapsed: 1s, ETA:    60s
[                                ] 6/538, 10.2 task/s, elapsed: 1s, ETA:    52s
[                                ] 7/538, 10.8 task/s, elapsed: 1s, ETA:    49s
[                                ] 8/538, 12.0 task/s, elapsed: 1s, ETA:    44s
[                                ] 9/538, 12.3 task/s, elapsed: 1s, ETA:    43s
[                               ] 10/538, 13.2 task/s, elapsed: 1s, ETA:    40s
[                               ] 11/538, 13.4 task/s, elapsed: 1s, ETA:    39s
[                               ] 12/538, 14.1 task/s, elapsed: 1s, ETA:    37s
[                               ] 13/538, 14.3 task/s, elapsed: 1s, ETA:    37s
[                               ] 14/538, 15.1 task/s, elapsed: 1s, ETA:    35s
[                               ] 15/538, 14.9 task/s, elapsed: 1s, ETA:    35s
[                               ] 16/538, 15.5 task/s, elapsed: 1s, ETA:    34s
[                               ] 17/538, 15.3 task/s, elapsed: 1s, ETA:    34s
[>                              ] 18/538, 15.8 task/s, elapsed: 1s, ETA:    33s
[>                              ] 19/538, 16.0 task/s, elapsed: 1s, ETA:    32s
[>                              ] 20/538, 16.5 task/s, elapsed: 1s, ETA:    31s
[>                              ] 21/538, 16.3 task/s, elapsed: 1s, ETA:    32s
[>                              ] 22/538, 16.7 task/s, elapsed: 1s, ETA:    31s
[>                              ] 23/538, 16.7 task/s, elapsed: 1s, ETA:    31s
[>                              ] 24/538, 17.0 task/s, elapsed: 1s, ETA:    30s
[>                              ] 25/538, 17.3 task/s, elapsed: 1s, ETA:    30s
[>                              ] 26/538, 17.6 task/s, elapsed: 1s, ETA:    29s
[>                              ] 27/538, 18.0 task/s, elapsed: 2s, ETA:    28s
[>                              ] 28/538, 18.3 task/s, elapsed: 2s, ETA:    28s
[>                              ] 29/538, 18.4 task/s, elapsed: 2s, ETA:    28s
[>                              ] 30/538, 18.7 task/s, elapsed: 2s, ETA:    27s
[>                              ] 31/538, 18.8 task/s, elapsed: 2s, ETA:    27s
[>                              ] 32/538, 19.0 task/s, elapsed: 2s, ETA:    27s
[>                              ] 33/538, 19.1 task/s, elapsed: 2s, ETA:    27s
[>                              ] 34/538, 19.3 task/s, elapsed: 2s, ETA:    26s
[>>                             ] 35/538, 18.9 task/s, elapsed: 2s, ETA:    27s
[>>                             ] 36/538, 19.1 task/s, elapsed: 2s, ETA:    26s
[>>                             ] 37/538, 19.1 task/s, elapsed: 2s, ETA:    26s
[>>                             ] 38/538, 19.4 task/s, elapsed: 2s, ETA:    26s
[>>                             ] 39/538, 19.1 task/s, elapsed: 2s, ETA:    26s
[>>                             ] 40/538, 19.4 task/s, elapsed: 2s, ETA:    26s
[>>                             ] 41/538, 18.9 task/s, elapsed: 2s, ETA:    26s
[>>                             ] 42/538, 19.1 task/s, elapsed: 2s, ETA:    26s
[>>                             ] 43/538, 18.6 task/s, elapsed: 2s, ETA:    27s
[>>                             ] 44/538, 18.8 task/s, elapsed: 2s, ETA:    26s
[>>                             ] 45/538, 18.8 task/s, elapsed: 2s, ETA:    26s
[>>                             ] 46/538, 19.0 task/s, elapsed: 2s, ETA:    26s
[>>                             ] 47/538, 19.0 task/s, elapsed: 2s, ETA:    26s
[>>                             ] 48/538, 19.2 task/s, elapsed: 3s, ETA:    26s
[>>                             ] 49/538, 19.0 task/s, elapsed: 3s, ETA:    26s
[>>                             ] 50/538, 19.1 task/s, elapsed: 3s, ETA:    26s
[>>                             ] 51/538, 19.1 task/s, elapsed: 3s, ETA:    25s
[>>                             ] 52/538, 19.3 task/s, elapsed: 3s, ETA:    25s
[>>>                            ] 53/538, 19.2 task/s, elapsed: 3s, ETA:    25s
[>>>                            ] 54/538, 19.4 task/s, elapsed: 3s, ETA:    25s
[>>>                            ] 55/538, 19.1 task/s, elapsed: 3s, ETA:    25s
[>>>                            ] 56/538, 19.2 task/s, elapsed: 3s, ETA:    25s
[>>>                            ] 57/538, 19.3 task/s, elapsed: 3s, ETA:    25s
[>>>                            ] 58/538, 19.4 task/s, elapsed: 3s, ETA:    25s
[>>>                            ] 59/538, 19.3 task/s, elapsed: 3s, ETA:    25s
[>>>                            ] 60/538, 19.5 task/s, elapsed: 3s, ETA:    25s
[>>>                            ] 61/538, 19.5 task/s, elapsed: 3s, ETA:    25s
[>>>                            ] 62/538, 19.6 task/s, elapsed: 3s, ETA:    24s
[>>>                            ] 63/538, 19.6 task/s, elapsed: 3s, ETA:    24s
[>>>                            ] 64/538, 19.8 task/s, elapsed: 3s, ETA:    24s
[>>>                            ] 65/538, 19.7 task/s, elapsed: 3s, ETA:    24s
[>>>                            ] 66/538, 19.8 task/s, elapsed: 3s, ETA:    24s
[>>>                            ] 67/538, 19.9 task/s, elapsed: 3s, ETA:    24s
[>>>                            ] 68/538, 19.9 task/s, elapsed: 3s, ETA:    24s
[>>>                            ] 69/538, 20.0 task/s, elapsed: 3s, ETA:    23s
[>>>>                           ] 70/538, 19.9 task/s, elapsed: 4s, ETA:    23s
[>>>>                           ] 71/538, 20.0 task/s, elapsed: 4s, ETA:    23s
[>>>>                           ] 72/538, 20.0 task/s, elapsed: 4s, ETA:    23s
[>>>>                           ] 73/538, 20.1 task/s, elapsed: 4s, ETA:    23s
[>>>>                           ] 74/538, 19.7 task/s, elapsed: 4s, ETA:    24s
[>>>>                           ] 75/538, 19.8 task/s, elapsed: 4s, ETA:    23s
[>>>>                           ] 76/538, 19.8 task/s, elapsed: 4s, ETA:    23s
[>>>>                           ] 77/538, 20.0 task/s, elapsed: 4s, ETA:    23s
[>>>>                           ] 78/538, 20.1 task/s, elapsed: 4s, ETA:    23s
[>>>>                           ] 79/538, 20.0 task/s, elapsed: 4s, ETA:    23s
[>>>>                           ] 80/538, 20.1 task/s, elapsed: 4s, ETA:    23s
[>>>>                           ] 81/538, 20.2 task/s, elapsed: 4s, ETA:    23s
[>>>>                           ] 82/538, 20.2 task/s, elapsed: 4s, ETA:    23s
[>>>>                           ] 83/538, 20.2 task/s, elapsed: 4s, ETA:    23s
[>>>>                           ] 84/538, 20.2 task/s, elapsed: 4s, ETA:    22s
[>>>>                           ] 85/538, 20.2 task/s, elapsed: 4s, ETA:    22s
[>>>>                           ] 86/538, 20.3 task/s, elapsed: 4s, ETA:    22s
[>>>>>                          ] 87/538, 20.2 task/s, elapsed: 4s, ETA:    22s
[>>>>>                          ] 88/538, 20.3 task/s, elapsed: 4s, ETA:    22s
[>>>>>                          ] 89/538, 20.2 task/s, elapsed: 4s, ETA:    22s
[>>>>>                          ] 90/538, 20.3 task/s, elapsed: 4s, ETA:    22s
[>>>>>                          ] 91/538, 20.2 task/s, elapsed: 5s, ETA:    22s
[>>>>>                          ] 92/538, 20.3 task/s, elapsed: 5s, ETA:    22s
[>>>>>                          ] 93/538, 20.2 task/s, elapsed: 5s, ETA:    22s
[>>>>>                          ] 94/538, 20.2 task/s, elapsed: 5s, ETA:    22s
[>>>>>                          ] 95/538, 20.2 task/s, elapsed: 5s, ETA:    22s
[>>>>>                          ] 96/538, 20.3 task/s, elapsed: 5s, ETA:    22s
[>>>>>                          ] 97/538, 20.2 task/s, elapsed: 5s, ETA:    22s
[>>>>>                          ] 98/538, 20.3 task/s, elapsed: 5s, ETA:    22s
[>>>>>                          ] 99/538, 20.3 task/s, elapsed: 5s, ETA:    22s
[>>>>>                         ] 100/538, 20.4 task/s, elapsed: 5s, ETA:    22s
[>>>>>                         ] 101/538, 20.3 task/s, elapsed: 5s, ETA:    22s
[>>>>>                         ] 102/538, 20.4 task/s, elapsed: 5s, ETA:    21s
[>>>>>                         ] 103/538, 20.3 task/s, elapsed: 5s, ETA:    21s
[>>>>>                         ] 104/538, 20.4 task/s, elapsed: 5s, ETA:    21s
[>>>>>                         ] 105/538, 20.3 task/s, elapsed: 5s, ETA:    21s
[>>>>>                         ] 106/538, 20.4 task/s, elapsed: 5s, ETA:    21s
[>>>>>                         ] 107/538, 20.4 task/s, elapsed: 5s, ETA:    21s
[>>>>>>                        ] 108/538, 20.4 task/s, elapsed: 5s, ETA:    21s
[>>>>>>                        ] 109/538, 20.3 task/s, elapsed: 5s, ETA:    21s
[>>>>>>                        ] 110/538, 20.4 task/s, elapsed: 5s, ETA:    21s
[>>>>>>                        ] 111/538, 20.4 task/s, elapsed: 5s, ETA:    21s
[>>>>>>                        ] 112/538, 20.5 task/s, elapsed: 5s, ETA:    21s
[>>>>>>                        ] 113/538, 20.5 task/s, elapsed: 6s, ETA:    21s
[>>>>>>                        ] 114/538, 20.6 task/s, elapsed: 6s, ETA:    21s
[>>>>>>                        ] 115/538, 20.5 task/s, elapsed: 6s, ETA:    21s
[>>>>>>                        ] 116/538, 20.6 task/s, elapsed: 6s, ETA:    21s
[>>>>>>                        ] 117/538, 20.5 task/s, elapsed: 6s, ETA:    21s
[>>>>>>                        ] 118/538, 20.6 task/s, elapsed: 6s, ETA:    20s
[>>>>>>                        ] 119/538, 20.5 task/s, elapsed: 6s, ETA:    20s
[>>>>>>                        ] 120/538, 20.6 task/s, elapsed: 6s, ETA:    20s
[>>>>>>                        ] 121/538, 20.6 task/s, elapsed: 6s, ETA:    20s
[>>>>>>                        ] 122/538, 20.7 task/s, elapsed: 6s, ETA:    20s
[>>>>>>                        ] 123/538, 20.6 task/s, elapsed: 6s, ETA:    20s
[>>>>>>                        ] 124/538, 20.7 task/s, elapsed: 6s, ETA:    20s
[>>>>>>                        ] 125/538, 20.6 task/s, elapsed: 6s, ETA:    20s
[>>>>>>>                       ] 126/538, 20.7 task/s, elapsed: 6s, ETA:    20s
[>>>>>>>                       ] 127/538, 20.6 task/s, elapsed: 6s, ETA:    20s
[>>>>>>>                       ] 128/538, 20.6 task/s, elapsed: 6s, ETA:    20s
[>>>>>>>                       ] 129/538, 20.6 task/s, elapsed: 6s, ETA:    20s
[>>>>>>>                       ] 130/538, 20.7 task/s, elapsed: 6s, ETA:    20s
[>>>>>>>                       ] 131/538, 20.6 task/s, elapsed: 6s, ETA:    20s
[>>>>>>>                       ] 132/538, 20.7 task/s, elapsed: 6s, ETA:    20s
[>>>>>>>                       ] 133/538, 20.6 task/s, elapsed: 6s, ETA:    20s
[>>>>>>>                       ] 134/538, 20.6 task/s, elapsed: 6s, ETA:    20s
[>>>>>>>                       ] 135/538, 20.6 task/s, elapsed: 7s, ETA:    20s
[>>>>>>>                       ] 136/538, 20.7 task/s, elapsed: 7s, ETA:    19s
[>>>>>>>                       ] 137/538, 20.7 task/s, elapsed: 7s, ETA:    19s
[>>>>>>>                       ] 138/538, 20.7 task/s, elapsed: 7s, ETA:    19s
[>>>>>>>                       ] 139/538, 20.7 task/s, elapsed: 7s, ETA:    19s
[>>>>>>>                       ] 140/538, 20.7 task/s, elapsed: 7s, ETA:    19s
[>>>>>>>                       ] 141/538, 20.7 task/s, elapsed: 7s, ETA:    19s
[>>>>>>>                       ] 142/538, 20.8 task/s, elapsed: 7s, ETA:    19s
[>>>>>>>                       ] 143/538, 20.7 task/s, elapsed: 7s, ETA:    19s
[>>>>>>>>                      ] 144/538, 20.8 task/s, elapsed: 7s, ETA:    19s
[>>>>>>>>                      ] 145/538, 20.8 task/s, elapsed: 7s, ETA:    19s
[>>>>>>>>                      ] 146/538, 20.9 task/s, elapsed: 7s, ETA:    19s
[>>>>>>>>                      ] 147/538, 20.9 task/s, elapsed: 7s, ETA:    19s
[>>>>>>>>                      ] 148/538, 20.9 task/s, elapsed: 7s, ETA:    19s
[>>>>>>>>                      ] 149/538, 20.9 task/s, elapsed: 7s, ETA:    19s
[>>>>>>>>                      ] 150/538, 21.0 task/s, elapsed: 7s, ETA:    19s
[>>>>>>>>                      ] 151/538, 21.0 task/s, elapsed: 7s, ETA:    18s
[>>>>>>>>                      ] 152/538, 21.1 task/s, elapsed: 7s, ETA:    18s
[>>>>>>>>                      ] 153/538, 21.0 task/s, elapsed: 7s, ETA:    18s
[>>>>>>>>                      ] 154/538, 21.1 task/s, elapsed: 7s, ETA:    18s
[>>>>>>>>                      ] 155/538, 21.1 task/s, elapsed: 7s, ETA:    18s
[>>>>>>>>                      ] 156/538, 21.1 task/s, elapsed: 7s, ETA:    18s
[>>>>>>>>                      ] 157/538, 21.1 task/s, elapsed: 7s, ETA:    18s
[>>>>>>>>                      ] 158/538, 21.2 task/s, elapsed: 7s, ETA:    18s
[>>>>>>>>                      ] 159/538, 21.2 task/s, elapsed: 8s, ETA:    18s
[>>>>>>>>                      ] 160/538, 21.2 task/s, elapsed: 8s, ETA:    18s
[>>>>>>>>                      ] 161/538, 21.2 task/s, elapsed: 8s, ETA:    18s
[>>>>>>>>>                     ] 162/538, 21.3 task/s, elapsed: 8s, ETA:    18s
[>>>>>>>>>                     ] 163/538, 21.3 task/s, elapsed: 8s, ETA:    18s
[>>>>>>>>>                     ] 164/538, 21.3 task/s, elapsed: 8s, ETA:    18s
[>>>>>>>>>                     ] 165/538, 21.3 task/s, elapsed: 8s, ETA:    17s
[>>>>>>>>>                     ] 166/538, 21.4 task/s, elapsed: 8s, ETA:    17s
[>>>>>>>>>                     ] 167/538, 21.4 task/s, elapsed: 8s, ETA:    17s
[>>>>>>>>>                     ] 168/538, 21.4 task/s, elapsed: 8s, ETA:    17s
[>>>>>>>>>                     ] 169/538, 21.4 task/s, elapsed: 8s, ETA:    17s
[>>>>>>>>>                     ] 170/538, 21.5 task/s, elapsed: 8s, ETA:    17s
[>>>>>>>>>                     ] 171/538, 21.4 task/s, elapsed: 8s, ETA:    17s
[>>>>>>>>>                     ] 172/538, 21.5 task/s, elapsed: 8s, ETA:    17s
[>>>>>>>>>                     ] 173/538, 21.5 task/s, elapsed: 8s, ETA:    17s
[>>>>>>>>>                     ] 174/538, 21.5 task/s, elapsed: 8s, ETA:    17s
[>>>>>>>>>                     ] 175/538, 21.5 task/s, elapsed: 8s, ETA:    17s
[>>>>>>>>>                     ] 176/538, 21.5 task/s, elapsed: 8s, ETA:    17s
[>>>>>>>>>                     ] 177/538, 21.6 task/s, elapsed: 8s, ETA:    17s
[>>>>>>>>>                     ] 178/538, 21.6 task/s, elapsed: 8s, ETA:    17s
[>>>>>>>>>                     ] 179/538, 21.6 task/s, elapsed: 8s, ETA:    17s
[>>>>>>>>>>                    ] 180/538, 21.7 task/s, elapsed: 8s, ETA:    17s
[>>>>>>>>>>                    ] 181/538, 21.7 task/s, elapsed: 8s, ETA:    16s
[>>>>>>>>>>                    ] 182/538, 21.7 task/s, elapsed: 8s, ETA:    16s
[>>>>>>>>>>                    ] 183/538, 21.7 task/s, elapsed: 8s, ETA:    16s
[>>>>>>>>>>                    ] 184/538, 21.8 task/s, elapsed: 8s, ETA:    16s
[>>>>>>>>>>                    ] 185/538, 21.7 task/s, elapsed: 9s, ETA:    16s
[>>>>>>>>>>                    ] 186/538, 21.8 task/s, elapsed: 9s, ETA:    16s
[>>>>>>>>>>                    ] 187/538, 21.8 task/s, elapsed: 9s, ETA:    16s
[>>>>>>>>>>                    ] 188/538, 21.8 task/s, elapsed: 9s, ETA:    16s
[>>>>>>>>>>                    ] 189/538, 21.8 task/s, elapsed: 9s, ETA:    16s
[>>>>>>>>>>                    ] 190/538, 21.9 task/s, elapsed: 9s, ETA:    16s
[>>>>>>>>>>                    ] 191/538, 21.8 task/s, elapsed: 9s, ETA:    16s
[>>>>>>>>>>                    ] 192/538, 21.9 task/s, elapsed: 9s, ETA:    16s
[>>>>>>>>>>                    ] 193/538, 21.9 task/s, elapsed: 9s, ETA:    16s
[>>>>>>>>>>                    ] 194/538, 21.9 task/s, elapsed: 9s, ETA:    16s
[>>>>>>>>>>                    ] 195/538, 21.9 task/s, elapsed: 9s, ETA:    16s
[>>>>>>>>>>                    ] 196/538, 21.9 task/s, elapsed: 9s, ETA:    16s
[>>>>>>>>>>                    ] 197/538, 22.0 task/s, elapsed: 9s, ETA:    16s
[>>>>>>>>>>>                   ] 198/538, 22.0 task/s, elapsed: 9s, ETA:    15s
[>>>>>>>>>>>                   ] 199/538, 22.0 task/s, elapsed: 9s, ETA:    15s
[>>>>>>>>>>>                   ] 200/538, 22.0 task/s, elapsed: 9s, ETA:    15s
[>>>>>>>>>>>                   ] 201/538, 22.0 task/s, elapsed: 9s, ETA:    15s
[>>>>>>>>>>>                   ] 202/538, 22.0 task/s, elapsed: 9s, ETA:    15s
[>>>>>>>>>>>                   ] 203/538, 22.1 task/s, elapsed: 9s, ETA:    15s
[>>>>>>>>>>>                   ] 204/538, 22.1 task/s, elapsed: 9s, ETA:    15s
[>>>>>>>>>>>                   ] 205/538, 22.1 task/s, elapsed: 9s, ETA:    15s
[>>>>>>>>>>>                   ] 206/538, 22.1 task/s, elapsed: 9s, ETA:    15s
[>>>>>>>>>>>                   ] 207/538, 22.1 task/s, elapsed: 9s, ETA:    15s
[>>>>>>>>>>>                   ] 208/538, 22.1 task/s, elapsed: 9s, ETA:    15s
[>>>>>>>>>>>                   ] 209/538, 22.1 task/s, elapsed: 9s, ETA:    15s
[>>>>>>>>>>>                   ] 210/538, 22.1 task/s, elapsed: 9s, ETA:    15s
[>>>>>>>>>>>                  ] 211/538, 22.2 task/s, elapsed: 10s, ETA:    15s
[>>>>>>>>>>>                  ] 212/538, 22.1 task/s, elapsed: 10s, ETA:    15s
[>>>>>>>>>>>                  ] 213/538, 22.2 task/s, elapsed: 10s, ETA:    15s
[>>>>>>>>>>>                  ] 214/538, 22.2 task/s, elapsed: 10s, ETA:    15s
[>>>>>>>>>>>                  ] 215/538, 22.2 task/s, elapsed: 10s, ETA:    15s
[>>>>>>>>>>>                  ] 216/538, 22.2 task/s, elapsed: 10s, ETA:    15s
[>>>>>>>>>>>                  ] 217/538, 22.2 task/s, elapsed: 10s, ETA:    14s
[>>>>>>>>>>>                  ] 218/538, 22.2 task/s, elapsed: 10s, ETA:    14s
[>>>>>>>>>>>                  ] 219/538, 22.2 task/s, elapsed: 10s, ETA:    14s
[>>>>>>>>>>>                  ] 220/538, 22.2 task/s, elapsed: 10s, ETA:    14s
[>>>>>>>>>>>                  ] 221/538, 22.2 task/s, elapsed: 10s, ETA:    14s
[>>>>>>>>>>>                  ] 222/538, 22.2 task/s, elapsed: 10s, ETA:    14s
[>>>>>>>>>>>>                 ] 223/538, 22.2 task/s, elapsed: 10s, ETA:    14s
[>>>>>>>>>>>>                 ] 224/538, 22.2 task/s, elapsed: 10s, ETA:    14s
[>>>>>>>>>>>>                 ] 225/538, 22.3 task/s, elapsed: 10s, ETA:    14s
[>>>>>>>>>>>>                 ] 226/538, 22.2 task/s, elapsed: 10s, ETA:    14s
[>>>>>>>>>>>>                 ] 227/538, 22.3 task/s, elapsed: 10s, ETA:    14s
[>>>>>>>>>>>>                 ] 228/538, 22.3 task/s, elapsed: 10s, ETA:    14s
[>>>>>>>>>>>>                 ] 229/538, 22.3 task/s, elapsed: 10s, ETA:    14s
[>>>>>>>>>>>>                 ] 230/538, 22.3 task/s, elapsed: 10s, ETA:    14s
[>>>>>>>>>>>>                 ] 231/538, 22.4 task/s, elapsed: 10s, ETA:    14s
[>>>>>>>>>>>>                 ] 232/538, 22.3 task/s, elapsed: 10s, ETA:    14s
[>>>>>>>>>>>>                 ] 233/538, 22.3 task/s, elapsed: 10s, ETA:    14s
[>>>>>>>>>>>>                 ] 234/538, 22.4 task/s, elapsed: 10s, ETA:    14s
[>>>>>>>>>>>>                 ] 235/538, 22.4 task/s, elapsed: 10s, ETA:    14s
[>>>>>>>>>>>>                 ] 236/538, 22.4 task/s, elapsed: 11s, ETA:    14s
[>>>>>>>>>>>>                 ] 237/538, 22.4 task/s, elapsed: 11s, ETA:    13s
[>>>>>>>>>>>>                 ] 238/538, 22.4 task/s, elapsed: 11s, ETA:    13s
[>>>>>>>>>>>>                 ] 239/538, 22.4 task/s, elapsed: 11s, ETA:    13s
[>>>>>>>>>>>>                 ] 240/538, 22.4 task/s, elapsed: 11s, ETA:    13s
[>>>>>>>>>>>>                 ] 241/538, 22.4 task/s, elapsed: 11s, ETA:    13s
[>>>>>>>>>>>>>                ] 242/538, 22.4 task/s, elapsed: 11s, ETA:    13s
[>>>>>>>>>>>>>                ] 243/538, 22.5 task/s, elapsed: 11s, ETA:    13s
[>>>>>>>>>>>>>                ] 244/538, 22.5 task/s, elapsed: 11s, ETA:    13s
[>>>>>>>>>>>>>                ] 245/538, 22.4 task/s, elapsed: 11s, ETA:    13s
[>>>>>>>>>>>>>                ] 246/538, 22.4 task/s, elapsed: 11s, ETA:    13s
[>>>>>>>>>>>>>                ] 247/538, 22.4 task/s, elapsed: 11s, ETA:    13s
[>>>>>>>>>>>>>                ] 248/538, 22.5 task/s, elapsed: 11s, ETA:    13s
[>>>>>>>>>>>>>                ] 249/538, 22.4 task/s, elapsed: 11s, ETA:    13s
[>>>>>>>>>>>>>                ] 250/538, 22.5 task/s, elapsed: 11s, ETA:    13s
[>>>>>>>>>>>>>                ] 251/538, 22.4 task/s, elapsed: 11s, ETA:    13s
[>>>>>>>>>>>>>                ] 252/538, 22.5 task/s, elapsed: 11s, ETA:    13s
[>>>>>>>>>>>>>                ] 253/538, 22.4 task/s, elapsed: 11s, ETA:    13s
[>>>>>>>>>>>>>                ] 254/538, 22.5 task/s, elapsed: 11s, ETA:    13s
[>>>>>>>>>>>>>                ] 255/538, 22.4 task/s, elapsed: 11s, ETA:    13s
[>>>>>>>>>>>>>                ] 256/538, 22.4 task/s, elapsed: 11s, ETA:    13s
[>>>>>>>>>>>>>                ] 257/538, 22.5 task/s, elapsed: 11s, ETA:    13s
[>>>>>>>>>>>>>                ] 258/538, 22.5 task/s, elapsed: 11s, ETA:    12s
[>>>>>>>>>>>>>                ] 259/538, 22.5 task/s, elapsed: 12s, ETA:    12s
[>>>>>>>>>>>>>>               ] 260/538, 22.5 task/s, elapsed: 12s, ETA:    12s
[>>>>>>>>>>>>>>               ] 261/538, 22.5 task/s, elapsed: 12s, ETA:    12s
[>>>>>>>>>>>>>>               ] 262/538, 22.5 task/s, elapsed: 12s, ETA:    12s
[>>>>>>>>>>>>>>               ] 263/538, 22.4 task/s, elapsed: 12s, ETA:    12s
[>>>>>>>>>>>>>>               ] 264/538, 22.4 task/s, elapsed: 12s, ETA:    12s
[>>>>>>>>>>>>>>               ] 265/538, 22.4 task/s, elapsed: 12s, ETA:    12s
[>>>>>>>>>>>>>>               ] 266/538, 22.4 task/s, elapsed: 12s, ETA:    12s
[>>>>>>>>>>>>>>               ] 267/538, 22.4 task/s, elapsed: 12s, ETA:    12s
[>>>>>>>>>>>>>>               ] 268/538, 22.5 task/s, elapsed: 12s, ETA:    12s
[>>>>>>>>>>>>>>               ] 269/538, 22.4 task/s, elapsed: 12s, ETA:    12s
[>>>>>>>>>>>>>>               ] 270/538, 22.5 task/s, elapsed: 12s, ETA:    12s
[>>>>>>>>>>>>>>               ] 271/538, 22.5 task/s, elapsed: 12s, ETA:    12s
[>>>>>>>>>>>>>>               ] 272/538, 22.5 task/s, elapsed: 12s, ETA:    12s
[>>>>>>>>>>>>>>               ] 273/538, 22.5 task/s, elapsed: 12s, ETA:    12s
[>>>>>>>>>>>>>>               ] 274/538, 22.5 task/s, elapsed: 12s, ETA:    12s
[>>>>>>>>>>>>>>               ] 275/538, 22.5 task/s, elapsed: 12s, ETA:    12s
[>>>>>>>>>>>>>>               ] 276/538, 22.5 task/s, elapsed: 12s, ETA:    12s
[>>>>>>>>>>>>>>               ] 277/538, 22.5 task/s, elapsed: 12s, ETA:    12s
[>>>>>>>>>>>>>>               ] 278/538, 22.6 task/s, elapsed: 12s, ETA:    12s
[>>>>>>>>>>>>>>>              ] 279/538, 22.5 task/s, elapsed: 12s, ETA:    11s
[>>>>>>>>>>>>>>>              ] 280/538, 22.5 task/s, elapsed: 12s, ETA:    11s
[>>>>>>>>>>>>>>>              ] 281/538, 22.6 task/s, elapsed: 12s, ETA:    11s
[>>>>>>>>>>>>>>>              ] 282/538, 22.6 task/s, elapsed: 12s, ETA:    11s
[>>>>>>>>>>>>>>>              ] 283/538, 22.6 task/s, elapsed: 13s, ETA:    11s
[>>>>>>>>>>>>>>>              ] 284/538, 22.6 task/s, elapsed: 13s, ETA:    11s
[>>>>>>>>>>>>>>>              ] 285/538, 22.6 task/s, elapsed: 13s, ETA:    11s
[>>>>>>>>>>>>>>>              ] 286/538, 22.6 task/s, elapsed: 13s, ETA:    11s
[>>>>>>>>>>>>>>>              ] 287/538, 22.6 task/s, elapsed: 13s, ETA:    11s
[>>>>>>>>>>>>>>>              ] 288/538, 22.7 task/s, elapsed: 13s, ETA:    11s
[>>>>>>>>>>>>>>>              ] 289/538, 22.6 task/s, elapsed: 13s, ETA:    11s
[>>>>>>>>>>>>>>>              ] 290/538, 22.7 task/s, elapsed: 13s, ETA:    11s
[>>>>>>>>>>>>>>>              ] 291/538, 22.5 task/s, elapsed: 13s, ETA:    11s
[>>>>>>>>>>>>>>>              ] 292/538, 22.6 task/s, elapsed: 13s, ETA:    11s
[>>>>>>>>>>>>>>>              ] 293/538, 22.5 task/s, elapsed: 13s, ETA:    11s
[>>>>>>>>>>>>>>>              ] 294/538, 22.6 task/s, elapsed: 13s, ETA:    11s
[>>>>>>>>>>>>>>>              ] 295/538, 22.5 task/s, elapsed: 13s, ETA:    11s
[>>>>>>>>>>>>>>>              ] 296/538, 22.5 task/s, elapsed: 13s, ETA:    11s
[>>>>>>>>>>>>>>>>             ] 297/538, 22.5 task/s, elapsed: 13s, ETA:    11s
[>>>>>>>>>>>>>>>>             ] 298/538, 22.5 task/s, elapsed: 13s, ETA:    11s
[>>>>>>>>>>>>>>>>             ] 299/538, 22.5 task/s, elapsed: 13s, ETA:    11s
[>>>>>>>>>>>>>>>>             ] 300/538, 22.5 task/s, elapsed: 13s, ETA:    11s
[>>>>>>>>>>>>>>>>             ] 301/538, 22.4 task/s, elapsed: 13s, ETA:    11s
[>>>>>>>>>>>>>>>>             ] 302/538, 22.5 task/s, elapsed: 13s, ETA:    11s
[>>>>>>>>>>>>>>>>             ] 303/538, 22.5 task/s, elapsed: 13s, ETA:    10s
[>>>>>>>>>>>>>>>>             ] 304/538, 22.5 task/s, elapsed: 14s, ETA:    10s
[>>>>>>>>>>>>>>>>             ] 305/538, 22.5 task/s, elapsed: 14s, ETA:    10s
[>>>>>>>>>>>>>>>>             ] 306/538, 22.5 task/s, elapsed: 14s, ETA:    10s
[>>>>>>>>>>>>>>>>             ] 307/538, 22.4 task/s, elapsed: 14s, ETA:    10s
[>>>>>>>>>>>>>>>>             ] 308/538, 22.5 task/s, elapsed: 14s, ETA:    10s
[>>>>>>>>>>>>>>>>             ] 309/538, 22.5 task/s, elapsed: 14s, ETA:    10s
[>>>>>>>>>>>>>>>>             ] 310/538, 22.5 task/s, elapsed: 14s, ETA:    10s
[>>>>>>>>>>>>>>>>             ] 311/538, 22.5 task/s, elapsed: 14s, ETA:    10s
[>>>>>>>>>>>>>>>>             ] 312/538, 22.5 task/s, elapsed: 14s, ETA:    10s
[>>>>>>>>>>>>>>>>             ] 313/538, 22.5 task/s, elapsed: 14s, ETA:    10s
[>>>>>>>>>>>>>>>>             ] 314/538, 22.5 task/s, elapsed: 14s, ETA:    10s
[>>>>>>>>>>>>>>>>             ] 315/538, 22.5 task/s, elapsed: 14s, ETA:    10s
[>>>>>>>>>>>>>>>>>            ] 316/538, 22.5 task/s, elapsed: 14s, ETA:    10s
[>>>>>>>>>>>>>>>>>            ] 317/538, 22.5 task/s, elapsed: 14s, ETA:    10s
[>>>>>>>>>>>>>>>>>            ] 318/538, 22.5 task/s, elapsed: 14s, ETA:    10s
[>>>>>>>>>>>>>>>>>            ] 319/538, 22.5 task/s, elapsed: 14s, ETA:    10s
[>>>>>>>>>>>>>>>>>            ] 320/538, 22.5 task/s, elapsed: 14s, ETA:    10s
[>>>>>>>>>>>>>>>>>            ] 321/538, 22.5 task/s, elapsed: 14s, ETA:    10s
[>>>>>>>>>>>>>>>>>            ] 322/538, 22.4 task/s, elapsed: 14s, ETA:    10s
[>>>>>>>>>>>>>>>>>            ] 323/538, 22.5 task/s, elapsed: 14s, ETA:    10s
[>>>>>>>>>>>>>>>>>            ] 324/538, 22.4 task/s, elapsed: 14s, ETA:    10s
[>>>>>>>>>>>>>>>>>            ] 325/538, 22.5 task/s, elapsed: 14s, ETA:     9s
[>>>>>>>>>>>>>>>>>            ] 326/538, 22.4 task/s, elapsed: 15s, ETA:     9s
[>>>>>>>>>>>>>>>>>            ] 327/538, 22.5 task/s, elapsed: 15s, ETA:     9s
[>>>>>>>>>>>>>>>>>            ] 328/538, 22.5 task/s, elapsed: 15s, ETA:     9s
[>>>>>>>>>>>>>>>>>            ] 329/538, 22.5 task/s, elapsed: 15s, ETA:     9s
[>>>>>>>>>>>>>>>>>            ] 330/538, 22.5 task/s, elapsed: 15s, ETA:     9s
[>>>>>>>>>>>>>>>>>            ] 331/538, 22.5 task/s, elapsed: 15s, ETA:     9s
[>>>>>>>>>>>>>>>>>            ] 332/538, 22.5 task/s, elapsed: 15s, ETA:     9s
[>>>>>>>>>>>>>>>>>            ] 333/538, 22.5 task/s, elapsed: 15s, ETA:     9s
[>>>>>>>>>>>>>>>>>>           ] 334/538, 22.5 task/s, elapsed: 15s, ETA:     9s
[>>>>>>>>>>>>>>>>>>           ] 335/538, 22.5 task/s, elapsed: 15s, ETA:     9s
[>>>>>>>>>>>>>>>>>>           ] 336/538, 22.4 task/s, elapsed: 15s, ETA:     9s
[>>>>>>>>>>>>>>>>>>           ] 337/538, 22.4 task/s, elapsed: 15s, ETA:     9s
[>>>>>>>>>>>>>>>>>>           ] 338/538, 22.4 task/s, elapsed: 15s, ETA:     9s
[>>>>>>>>>>>>>>>>>>           ] 339/538, 22.4 task/s, elapsed: 15s, ETA:     9s
[>>>>>>>>>>>>>>>>>>           ] 340/538, 22.4 task/s, elapsed: 15s, ETA:     9s
[>>>>>>>>>>>>>>>>>>           ] 341/538, 22.4 task/s, elapsed: 15s, ETA:     9s
[>>>>>>>>>>>>>>>>>>           ] 342/538, 22.3 task/s, elapsed: 15s, ETA:     9s
[>>>>>>>>>>>>>>>>>>           ] 343/538, 22.3 task/s, elapsed: 15s, ETA:     9s
[>>>>>>>>>>>>>>>>>>           ] 344/538, 22.3 task/s, elapsed: 15s, ETA:     9s
[>>>>>>>>>>>>>>>>>>           ] 345/538, 22.3 task/s, elapsed: 15s, ETA:     9s
[>>>>>>>>>>>>>>>>>>           ] 346/538, 22.3 task/s, elapsed: 15s, ETA:     9s
[>>>>>>>>>>>>>>>>>>           ] 347/538, 22.4 task/s, elapsed: 16s, ETA:     9s
[>>>>>>>>>>>>>>>>>>           ] 348/538, 22.3 task/s, elapsed: 16s, ETA:     9s
[>>>>>>>>>>>>>>>>>>           ] 349/538, 22.3 task/s, elapsed: 16s, ETA:     8s
[>>>>>>>>>>>>>>>>>>           ] 350/538, 22.3 task/s, elapsed: 16s, ETA:     8s
[>>>>>>>>>>>>>>>>>>           ] 351/538, 22.3 task/s, elapsed: 16s, ETA:     8s
[>>>>>>>>>>>>>>>>>>           ] 352/538, 22.3 task/s, elapsed: 16s, ETA:     8s
[>>>>>>>>>>>>>>>>>>>          ] 353/538, 22.4 task/s, elapsed: 16s, ETA:     8s
[>>>>>>>>>>>>>>>>>>>          ] 354/538, 22.3 task/s, elapsed: 16s, ETA:     8s
[>>>>>>>>>>>>>>>>>>>          ] 355/538, 22.3 task/s, elapsed: 16s, ETA:     8s
[>>>>>>>>>>>>>>>>>>>          ] 356/538, 22.3 task/s, elapsed: 16s, ETA:     8s
[>>>>>>>>>>>>>>>>>>>          ] 357/538, 22.3 task/s, elapsed: 16s, ETA:     8s
[>>>>>>>>>>>>>>>>>>>          ] 358/538, 22.3 task/s, elapsed: 16s, ETA:     8s
[>>>>>>>>>>>>>>>>>>>          ] 359/538, 22.3 task/s, elapsed: 16s, ETA:     8s
[>>>>>>>>>>>>>>>>>>>          ] 360/538, 22.3 task/s, elapsed: 16s, ETA:     8s
[>>>>>>>>>>>>>>>>>>>          ] 361/538, 22.4 task/s, elapsed: 16s, ETA:     8s
[>>>>>>>>>>>>>>>>>>>          ] 362/538, 22.3 task/s, elapsed: 16s, ETA:     8s
[>>>>>>>>>>>>>>>>>>>          ] 363/538, 22.4 task/s, elapsed: 16s, ETA:     8s
[>>>>>>>>>>>>>>>>>>>          ] 364/538, 22.4 task/s, elapsed: 16s, ETA:     8s
[>>>>>>>>>>>>>>>>>>>          ] 365/538, 22.3 task/s, elapsed: 16s, ETA:     8s
[>>>>>>>>>>>>>>>>>>>          ] 366/538, 22.4 task/s, elapsed: 16s, ETA:     8s
[>>>>>>>>>>>>>>>>>>>          ] 367/538, 22.4 task/s, elapsed: 16s, ETA:     8s
[>>>>>>>>>>>>>>>>>>>          ] 368/538, 22.4 task/s, elapsed: 16s, ETA:     8s
[>>>>>>>>>>>>>>>>>>>          ] 369/538, 22.3 task/s, elapsed: 17s, ETA:     8s
[>>>>>>>>>>>>>>>>>>>          ] 370/538, 22.4 task/s, elapsed: 17s, ETA:     8s
[>>>>>>>>>>>>>>>>>>>          ] 371/538, 22.3 task/s, elapsed: 17s, ETA:     7s
[>>>>>>>>>>>>>>>>>>>>         ] 372/538, 22.4 task/s, elapsed: 17s, ETA:     7s
[>>>>>>>>>>>>>>>>>>>>         ] 373/538, 22.3 task/s, elapsed: 17s, ETA:     7s
[>>>>>>>>>>>>>>>>>>>>         ] 374/538, 22.3 task/s, elapsed: 17s, ETA:     7s
[>>>>>>>>>>>>>>>>>>>>         ] 375/538, 22.3 task/s, elapsed: 17s, ETA:     7s
[>>>>>>>>>>>>>>>>>>>>         ] 376/538, 22.4 task/s, elapsed: 17s, ETA:     7s
[>>>>>>>>>>>>>>>>>>>>         ] 377/538, 22.4 task/s, elapsed: 17s, ETA:     7s
[>>>>>>>>>>>>>>>>>>>>         ] 378/538, 22.4 task/s, elapsed: 17s, ETA:     7s
[>>>>>>>>>>>>>>>>>>>>         ] 379/538, 22.4 task/s, elapsed: 17s, ETA:     7s
[>>>>>>>>>>>>>>>>>>>>         ] 380/538, 22.4 task/s, elapsed: 17s, ETA:     7s
[>>>>>>>>>>>>>>>>>>>>         ] 381/538, 22.3 task/s, elapsed: 17s, ETA:     7s
[>>>>>>>>>>>>>>>>>>>>         ] 382/538, 22.4 task/s, elapsed: 17s, ETA:     7s
[>>>>>>>>>>>>>>>>>>>>         ] 383/538, 22.3 task/s, elapsed: 17s, ETA:     7s
[>>>>>>>>>>>>>>>>>>>>         ] 384/538, 22.4 task/s, elapsed: 17s, ETA:     7s
[>>>>>>>>>>>>>>>>>>>>         ] 385/538, 22.3 task/s, elapsed: 17s, ETA:     7s
[>>>>>>>>>>>>>>>>>>>>         ] 386/538, 22.4 task/s, elapsed: 17s, ETA:     7s
[>>>>>>>>>>>>>>>>>>>>         ] 387/538, 22.4 task/s, elapsed: 17s, ETA:     7s
[>>>>>>>>>>>>>>>>>>>>         ] 388/538, 22.4 task/s, elapsed: 17s, ETA:     7s
[>>>>>>>>>>>>>>>>>>>>         ] 389/538, 22.4 task/s, elapsed: 17s, ETA:     7s
[>>>>>>>>>>>>>>>>>>>>>        ] 390/538, 22.4 task/s, elapsed: 17s, ETA:     7s
[>>>>>>>>>>>>>>>>>>>>>        ] 391/538, 22.3 task/s, elapsed: 17s, ETA:     7s
[>>>>>>>>>>>>>>>>>>>>>        ] 392/538, 22.4 task/s, elapsed: 18s, ETA:     7s
[>>>>>>>>>>>>>>>>>>>>>        ] 393/538, 22.3 task/s, elapsed: 18s, ETA:     6s
[>>>>>>>>>>>>>>>>>>>>>        ] 394/538, 22.4 task/s, elapsed: 18s, ETA:     6s
[>>>>>>>>>>>>>>>>>>>>>        ] 395/538, 22.3 task/s, elapsed: 18s, ETA:     6s
[>>>>>>>>>>>>>>>>>>>>>        ] 396/538, 22.4 task/s, elapsed: 18s, ETA:     6s
[>>>>>>>>>>>>>>>>>>>>>        ] 397/538, 22.3 task/s, elapsed: 18s, ETA:     6s
[>>>>>>>>>>>>>>>>>>>>>        ] 398/538, 22.3 task/s, elapsed: 18s, ETA:     6s
[>>>>>>>>>>>>>>>>>>>>>        ] 399/538, 22.3 task/s, elapsed: 18s, ETA:     6s
[>>>>>>>>>>>>>>>>>>>>>        ] 400/538, 22.3 task/s, elapsed: 18s, ETA:     6s
[>>>>>>>>>>>>>>>>>>>>>        ] 401/538, 22.2 task/s, elapsed: 18s, ETA:     6s
[>>>>>>>>>>>>>>>>>>>>>        ] 402/538, 22.2 task/s, elapsed: 18s, ETA:     6s
[>>>>>>>>>>>>>>>>>>>>>        ] 403/538, 22.2 task/s, elapsed: 18s, ETA:     6s
[>>>>>>>>>>>>>>>>>>>>>        ] 404/538, 22.2 task/s, elapsed: 18s, ETA:     6s
[>>>>>>>>>>>>>>>>>>>>>        ] 405/538, 22.2 task/s, elapsed: 18s, ETA:     6s
[>>>>>>>>>>>>>>>>>>>>>        ] 406/538, 22.2 task/s, elapsed: 18s, ETA:     6s
[>>>>>>>>>>>>>>>>>>>>>        ] 407/538, 22.2 task/s, elapsed: 18s, ETA:     6s
[>>>>>>>>>>>>>>>>>>>>>        ] 408/538, 22.2 task/s, elapsed: 18s, ETA:     6s
[>>>>>>>>>>>>>>>>>>>>>>       ] 409/538, 22.3 task/s, elapsed: 18s, ETA:     6s
[>>>>>>>>>>>>>>>>>>>>>>       ] 410/538, 22.2 task/s, elapsed: 18s, ETA:     6s
[>>>>>>>>>>>>>>>>>>>>>>       ] 411/538, 22.3 task/s, elapsed: 18s, ETA:     6s
[>>>>>>>>>>>>>>>>>>>>>>       ] 412/538, 22.2 task/s, elapsed: 19s, ETA:     6s
[>>>>>>>>>>>>>>>>>>>>>>       ] 413/538, 22.2 task/s, elapsed: 19s, ETA:     6s
[>>>>>>>>>>>>>>>>>>>>>>       ] 414/538, 22.2 task/s, elapsed: 19s, ETA:     6s
[>>>>>>>>>>>>>>>>>>>>>>       ] 415/538, 22.3 task/s, elapsed: 19s, ETA:     6s
[>>>>>>>>>>>>>>>>>>>>>>       ] 416/538, 22.2 task/s, elapsed: 19s, ETA:     5s
[>>>>>>>>>>>>>>>>>>>>>>       ] 417/538, 22.2 task/s, elapsed: 19s, ETA:     5s
[>>>>>>>>>>>>>>>>>>>>>>       ] 418/538, 22.2 task/s, elapsed: 19s, ETA:     5s
[>>>>>>>>>>>>>>>>>>>>>>       ] 419/538, 22.3 task/s, elapsed: 19s, ETA:     5s
[>>>>>>>>>>>>>>>>>>>>>>       ] 420/538, 22.3 task/s, elapsed: 19s, ETA:     5s
[>>>>>>>>>>>>>>>>>>>>>>       ] 421/538, 22.3 task/s, elapsed: 19s, ETA:     5s
[>>>>>>>>>>>>>>>>>>>>>>       ] 422/538, 22.3 task/s, elapsed: 19s, ETA:     5s
[>>>>>>>>>>>>>>>>>>>>>>       ] 423/538, 22.3 task/s, elapsed: 19s, ETA:     5s
[>>>>>>>>>>>>>>>>>>>>>>       ] 424/538, 22.3 task/s, elapsed: 19s, ETA:     5s
[>>>>>>>>>>>>>>>>>>>>>>       ] 425/538, 22.3 task/s, elapsed: 19s, ETA:     5s
[>>>>>>>>>>>>>>>>>>>>>>       ] 426/538, 22.3 task/s, elapsed: 19s, ETA:     5s
[>>>>>>>>>>>>>>>>>>>>>>>      ] 427/538, 22.3 task/s, elapsed: 19s, ETA:     5s
[>>>>>>>>>>>>>>>>>>>>>>>      ] 428/538, 22.3 task/s, elapsed: 19s, ETA:     5s
[>>>>>>>>>>>>>>>>>>>>>>>      ] 429/538, 22.3 task/s, elapsed: 19s, ETA:     5s
[>>>>>>>>>>>>>>>>>>>>>>>      ] 430/538, 22.3 task/s, elapsed: 19s, ETA:     5s
[>>>>>>>>>>>>>>>>>>>>>>>      ] 431/538, 22.3 task/s, elapsed: 19s, ETA:     5s
[>>>>>>>>>>>>>>>>>>>>>>>      ] 432/538, 22.3 task/s, elapsed: 19s, ETA:     5s
[>>>>>>>>>>>>>>>>>>>>>>>      ] 433/538, 22.4 task/s, elapsed: 19s, ETA:     5s
[>>>>>>>>>>>>>>>>>>>>>>>      ] 434/538, 22.4 task/s, elapsed: 19s, ETA:     5s
[>>>>>>>>>>>>>>>>>>>>>>>      ] 435/538, 22.4 task/s, elapsed: 19s, ETA:     5s
[>>>>>>>>>>>>>>>>>>>>>>>      ] 436/538, 22.4 task/s, elapsed: 19s, ETA:     5s
[>>>>>>>>>>>>>>>>>>>>>>>      ] 437/538, 22.4 task/s, elapsed: 20s, ETA:     5s
[>>>>>>>>>>>>>>>>>>>>>>>      ] 438/538, 22.4 task/s, elapsed: 20s, ETA:     4s
[>>>>>>>>>>>>>>>>>>>>>>>      ] 439/538, 22.4 task/s, elapsed: 20s, ETA:     4s
[>>>>>>>>>>>>>>>>>>>>>>>      ] 440/538, 22.4 task/s, elapsed: 20s, ETA:     4s
[>>>>>>>>>>>>>>>>>>>>>>>      ] 441/538, 22.4 task/s, elapsed: 20s, ETA:     4s
[>>>>>>>>>>>>>>>>>>>>>>>      ] 442/538, 22.4 task/s, elapsed: 20s, ETA:     4s
[>>>>>>>>>>>>>>>>>>>>>>>      ] 443/538, 22.4 task/s, elapsed: 20s, ETA:     4s
[>>>>>>>>>>>>>>>>>>>>>>>      ] 444/538, 22.4 task/s, elapsed: 20s, ETA:     4s
[>>>>>>>>>>>>>>>>>>>>>>>      ] 445/538, 22.4 task/s, elapsed: 20s, ETA:     4s
[>>>>>>>>>>>>>>>>>>>>>>>>     ] 446/538, 22.4 task/s, elapsed: 20s, ETA:     4s
[>>>>>>>>>>>>>>>>>>>>>>>>     ] 447/538, 22.5 task/s, elapsed: 20s, ETA:     4s
[>>>>>>>>>>>>>>>>>>>>>>>>     ] 448/538, 22.5 task/s, elapsed: 20s, ETA:     4s
[>>>>>>>>>>>>>>>>>>>>>>>>     ] 449/538, 22.5 task/s, elapsed: 20s, ETA:     4s
[>>>>>>>>>>>>>>>>>>>>>>>>     ] 450/538, 22.5 task/s, elapsed: 20s, ETA:     4s
[>>>>>>>>>>>>>>>>>>>>>>>>     ] 451/538, 22.5 task/s, elapsed: 20s, ETA:     4s
[>>>>>>>>>>>>>>>>>>>>>>>>     ] 452/538, 22.5 task/s, elapsed: 20s, ETA:     4s
...
[>>>>>>>>>>>>>>>>>>>>>>>>>>>>>] 538/538, 22.7 task/s, elapsed: 24s, ETA:     0s
2021-10-12 17:05:26,696 - mmdet - INFO - Evaluating bbox...
2021-10-12 17:05:26,697 - mmdet - ERROR - The testing results of the whole dataset is empty.
2021-10-12 17:05:26,700 - mmdet - INFO - Exp name: yolox.py
2021-10-12 17:05:26,700 - mmdet - INFO - Epoch(val) [1][538]    
Loading and preparing results...
2021-10-12 17:06:11,531 - mmdet - INFO - Epoch [2][50/157]  lr: 8.692e-05, eta: 4:26:39, time: 0.896, data_time: 0.775, memory: 2987, loss_cls: 0.0000, loss_bbox: 0.0000, loss_obj: 0.3131, loss: 0.3131, grad_norm: 5.3416
2021-10-12 17:06:51,297 - mmdet - INFO - Epoch [2][100/157] lr: 1.340e-04, eta: 4:22:07, time: 0.795, data_time: 0.676, memory: 2987, loss_cls: 0.0000, loss_bbox: 0.0000, loss_obj: 0.2228, loss: 0.2228, grad_norm: 3.0542
2021-10-12 17:07:30,941 - mmdet - INFO - Epoch [2][150/157] lr: 1.912e-04, eta: 4:18:43, time: 0.793, data_time: 0.671, memory: 2987, loss_cls: 0.0000, loss_bbox: 0.0000, loss_obj: 0.1744, loss: 0.1744, grad_norm: 2.0940
[                                                  ] 0/538, elapsed: 0s, ETA:
[                                 ] 1/538, 3.4 task/s, elapsed: 0s, ETA:   158s
...
hhaAndroid commented 2 years ago

@tikitong The setting should be evaluation = dict(interval=1, classwise=True, metric='bbox', save_best='auto'). If the output is empty, the program will still report an error; we will fix that as soon as possible. Thank you.
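For reference, a minimal sketch of how that suggestion fits into a YOLOX config in MMDetection 2.x (the interval value here is illustrative, not prescribed in this thread):

# save_best='auto' lets the hook pick the first returned metric key
# (bbox_mAP for COCO-style bbox evaluation) instead of hard-coding it,
# which is what triggered the KeyError above when no results were produced.
evaluation = dict(
    interval=1,        # evaluate every epoch (illustrative)
    classwise=True,
    metric='bbox',
    save_best='auto')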

neel04 commented 2 years ago

@hhaAndroid I applied your change, but now I get an error during the post-epoch validation, whereas validation was successful before 🤔

Traceback (most recent call last):
  File "tools/train.py", line 187, in <module>
    main()
  File "tools/train.py", line 183, in main
    meta=meta)
  File "/kaggle/working/Swin-Transformer-Object-Detection/mmdet/apis/train.py", line 185, in train_detector
    runner.run(data_loaders, cfg.workflow)
  File "/opt/conda/lib/python3.7/site-packages/mmcv/runner/epoch_based_runner.py", line 127, in run
    epoch_runner(data_loaders[i], **kwargs)
  File "/opt/conda/lib/python3.7/site-packages/mmcv/runner/epoch_based_runner.py", line 54, in train
    self.call_hook('after_train_epoch')
  File "/opt/conda/lib/python3.7/site-packages/mmcv/runner/base_runner.py", line 307, in call_hook
    getattr(hook, fn_name)(self)
  File "/kaggle/working/Swin-Transformer-Object-Detection/mmdet/core/evaluation/eval_hooks.py", line 149, in after_train_epoch
    self.save_best_checkpoint(runner, key_score)
  File "/kaggle/working/Swin-Transformer-Object-Detection/mmdet/core/evaluation/eval_hooks.py", line 166, in save_best_checkpoint
    last_ckpt = runner.meta['hook_msgs']['last_ckpt']
KeyError: 'last_ckpt' 
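The traceback shows save_best_checkpoint reading runner.meta['hook_msgs']['last_ckpt'] before any checkpoint path has been recorded there. A purely illustrative, defensive sketch of that lookup (not the repository's actual code):

# Hypothetical helper: fetch the last saved checkpoint path without raising
# KeyError when CheckpointHook has not recorded one yet.
def get_last_checkpoint(runner_meta):
    return (runner_meta or {}).get('hook_msgs', {}).get('last_ckpt')

In practice this KeyError suggests the best-checkpoint logic ran before any checkpoint had been saved; making sure checkpoints are saved at least as often as evaluation runs, or moving to an mmcv version where EvalHook handles save_best itself (see the mmcv fix linked further down the thread), is usually the cleaner way out.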
okotaku commented 2 years ago

@tikitong

classes = ('circle', 'triangle', 'rectangle')
data = dict(
    train=dict(dataset=dict(classes=classes)),
    val=dict(classes=classes),
    test=dict(classes=classes))

Since loss_cls is 0, the reason may be that you are not passing the classes to the dataset.
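A quick way to confirm the classes were actually picked up is to build the training dataset from the config and print its CLASSES; the config path below is a placeholder, not from this thread:

# Sanity check: does the built dataset carry the custom classes?
# 'configs/yolox/my_yolox_config.py' is a placeholder path.
from mmcv import Config
from mmdet.datasets import build_dataset

cfg = Config.fromfile('configs/yolox/my_yolox_config.py')
dataset = build_dataset(cfg.data.train)
print(dataset.CLASSES)  # expect ('circle', 'triangle', 'rectangle'), not the 80 COCO names

If this still prints the COCO class names, the category names in your annotations never match the head, which is consistent with loss_cls and loss_bbox staying at 0.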

fhf526 commented 2 years ago

I also encountered this problem. How did you solve it?

zhouzaida commented 2 years ago

I also encountered this problem. How did you solve it?

The issue can be resolved by https://github.com/open-mmlab/mmcv/pull/1398
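If you still see the KeyError, it is worth checking whether your installed mmcv already includes that patch (compare the printed version against the first release listed on the PR page):

# Check the installed mmcv version against the release containing the fix.
import mmcv
print(mmcv.__version__)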

LuisFelipeLeivaH commented 2 years ago

I got the same problem running MaskRCNN with a custom dataset annotated with VGG VIA and exported in COCO style:

sys.platform: linux
Python: 3.8.10 (default, Jun 16 2021, 14:20:20) [GCC 9.3.0]
CUDA available: True
GPU 0: Tesla P100-PCIE-12GB
CUDA_HOME: /cvmfs/soft.computecanada.ca/easybuild/software/2020/Core/cudacore/11.1.1
NVCC: Build cuda_11.1.TC455_06.29190527_0
GCC: gcc (GCC) 9.3.0
PyTorch: 1.9.0+cu111
PyTorch compiling details: PyTorch built with:

TorchVision: 0.10.0+cu111
OpenCV: 4.5.3
MMCV: 1.4.4
MMCV Compiler: GCC 9.3
MMCV CUDA Compiler: 11.1
MMDetection: 2.20.0+

2022-02-09 17:36:56,669 - mmdet - INFO - Distributed training: False
2022-02-09 17:36:57,771 - mmdet - INFO - Config:
model = dict(
    type='MaskRCNN',
    backbone=dict( type='ResNet', depth=50, num_stages=4, out_indices=(0, 1, 2, 3), frozen_stages=1, norm_cfg=dict(type='BN', requires_grad=True), norm_eval=True, style='pytorch', init_cfg=dict(type='Pretrained', checkpoint='torchvision://resnet50')),
    neck=dict( type='FPN', in_channels=[256, 512, 1024, 2048], out_channels=256, num_outs=5),
    rpn_head=dict( type='RPNHead', in_channels=256, feat_channels=256, anchor_generator=dict( type='AnchorGenerator', scales=[8], ratios=[0.5, 1.0, 2.0], strides=[4, 8, 16, 32, 64]), bbox_coder=dict( type='DeltaXYWHBBoxCoder', target_means=[0.0, 0.0, 0.0, 0.0], target_stds=[1.0, 1.0, 1.0, 1.0]), loss_cls=dict( type='CrossEntropyLoss', use_sigmoid=True, loss_weight=1.0), loss_bbox=dict(type='L1Loss', loss_weight=1.0)),
    roi_head=dict( type='StandardRoIHead', bbox_roi_extractor=dict( type='SingleRoIExtractor', roi_layer=dict(type='RoIAlign', output_size=7, sampling_ratio=0), out_channels=256, featmap_strides=[4, 8, 16, 32]), bbox_head=dict( type='Shared2FCBBoxHead', in_channels=256, fc_out_channels=1024, roi_feat_size=7, num_classes=1, bbox_coder=dict( type='DeltaXYWHBBoxCoder', target_means=[0.0, 0.0, 0.0, 0.0], target_stds=[0.1, 0.1, 0.2, 0.2]), reg_class_agnostic=False, loss_cls=dict( type='CrossEntropyLoss', use_sigmoid=False, loss_weight=1.0), loss_bbox=dict(type='L1Loss', loss_weight=1.0)), mask_roi_extractor=dict( type='SingleRoIExtractor', roi_layer=dict(type='RoIAlign', output_size=14, sampling_ratio=0), out_channels=256, featmap_strides=[4, 8, 16, 32]), mask_head=dict( type='FCNMaskHead', num_convs=4, in_channels=256, conv_out_channels=256, num_classes=1, loss_mask=dict( type='CrossEntropyLoss', use_mask=True, loss_weight=1.0))),
    train_cfg=dict( rpn=dict( assigner=dict( type='MaxIoUAssigner', pos_iou_thr=0.7, neg_iou_thr=0.3, min_pos_iou=0.3, match_low_quality=True, ignore_iof_thr=-1), sampler=dict( type='RandomSampler', num=256, pos_fraction=0.5, neg_pos_ub=-1, add_gt_as_proposals=False), allowed_border=-1, pos_weight=-1, debug=False), rpn_proposal=dict( nms_pre=2000, max_per_img=1000, nms=dict(type='nms', iou_threshold=0.7), min_bbox_size=0), rcnn=dict( assigner=dict( type='MaxIoUAssigner', pos_iou_thr=0.5, neg_iou_thr=0.5, min_pos_iou=0.5, match_low_quality=True, ignore_iof_thr=-1), sampler=dict( type='RandomSampler', num=512, pos_fraction=0.25, neg_pos_ub=-1, add_gt_as_proposals=True), mask_size=28, pos_weight=-1, debug=False)),
    test_cfg=dict( rpn=dict( nms_pre=1000, max_per_img=1000, nms=dict(type='nms', iou_threshold=0.7), min_bbox_size=0), rcnn=dict( score_thr=0.05, nms=dict(type='nms', iou_threshold=0.5), max_per_img=100, mask_thr_binary=0.5)))
dataset_type = 'CocoDataset'
data_root = 'data/coco/'
img_norm_cfg = dict( mean=[123.675, 116.28, 103.53], std=[58.395, 57.12, 57.375], to_rgb=True)
train_pipeline = [ dict(type='LoadImageFromFile'), dict(type='LoadAnnotations', with_bbox=True, with_mask=True), dict(type='Resize', img_scale=(1333, 800), keep_ratio=True), dict(type='RandomFlip', flip_ratio=0.5), dict( type='Normalize', mean=[123.675, 116.28, 103.53], std=[58.395, 57.12, 57.375], to_rgb=True), dict(type='Pad', size_divisor=32), dict(type='DefaultFormatBundle'), dict(type='Collect', keys=['img', 'gt_bboxes', 'gt_labels', 'gt_masks']) ]
test_pipeline = [ dict(type='LoadImageFromFile'), dict( type='MultiScaleFlipAug', img_scale=(1333, 800), flip=False, transforms=[ dict(type='Resize', keep_ratio=True), dict(type='RandomFlip'), dict( type='Normalize', mean=[123.675, 116.28, 103.53], std=[58.395, 57.12, 57.375], to_rgb=True), dict(type='Pad', size_divisor=32), dict(type='ImageToTensor', keys=['img']), dict(type='Collect', keys=['img']) ]) ]
data = dict(
    samples_per_gpu=2,
    workers_per_gpu=2,
    train=dict( type='CocoDataset', ann_file= '/localscratch/felipe.26491647.0/datasets/salmons/COCO_annotations/Train_1.json', img_prefix= '/localscratch/felipe.26491647.0/datasets/salmons/images/All', pipeline=[ dict(type='LoadImageFromFile'), dict(type='LoadAnnotations', with_bbox=True, with_mask=True), dict(type='Resize', img_scale=(1333, 800), keep_ratio=True), dict(type='RandomFlip', flip_ratio=0.5), dict( type='Normalize', mean=[123.675, 116.28, 103.53], std=[58.395, 57.12, 57.375], to_rgb=True), dict(type='Pad', size_divisor=32), dict(type='DefaultFormatBundle'), dict( type='Collect', keys=['img', 'gt_bboxes', 'gt_labels', 'gt_masks']) ], classes=['salmon']),
    val=dict( type='CocoDataset', ann_file= '/localscratch/felipe.26491647.0/datasets/salmons/COCO_annotations/Val_3.json', img_prefix= '/localscratch/felipe.26491647.0/datasets/salmons/images/All', pipeline=[ dict(type='LoadImageFromFile'), dict( type='MultiScaleFlipAug', img_scale=(1333, 800), flip=False, transforms=[ dict(type='Resize', keep_ratio=True), dict(type='RandomFlip'), dict( type='Normalize', mean=[123.675, 116.28, 103.53], std=[58.395, 57.12, 57.375], to_rgb=True), dict(type='Pad', size_divisor=32), dict(type='ImageToTensor', keys=['img']), dict(type='Collect', keys=['img']) ]) ], classes=['salmon']),
    test=dict( type='CocoDataset', ann_file= '/localscratch/felipe.26491647.0/datasets/salmons/COCO_annotations/Val_3.json', img_prefix= '/localscratch/felipe.26491647.0/datasets/salmons/images/All', pipeline=[ dict(type='LoadImageFromFile'), dict( type='MultiScaleFlipAug', img_scale=(1333, 800), flip=False, transforms=[ dict(type='Resize', keep_ratio=True), dict(type='RandomFlip'), dict( type='Normalize', mean=[123.675, 116.28, 103.53], std=[58.395, 57.12, 57.375], to_rgb=True), dict(type='Pad', size_divisor=32), dict(type='ImageToTensor', keys=['img']), dict(type='Collect', keys=['img']) ]) ], classes=['salmon']))
evaluation = dict(interval=1, metric=['bbox', 'segm'])
optimizer = dict(type='SGD', lr=0.02, momentum=0.9, weight_decay=0.0001)
optimizer_config = dict(grad_clip=None)
lr_config = None
runner = dict(type='EpochBasedRunner', max_epochs=6)
checkpoint_config = dict(interval=1)
log_config = dict(interval=1, hooks=[dict(type='TensorboardLoggerHook')])
custom_hooks = [dict(type='NumClassCheckHook')]
dist_params = dict(backend='nccl')
log_level = 'INFO'
load_from = '/home/felipe/projects/def-akhloufi/felipe/thesis/probando-funcionalidad/checkpoints/mask_rcnn_r50_fpn_1x_coco_pretrained.pth'
resume_from = None
workflow = [('train', 1), ('val', 1)]
work_dir = '/localscratch/felipe.26491647.0/datasets/Coco/checkpoints/Experiment 1/MaskRCNN/Salmons/Resnet50/hyper_1/Checkpoint 2022-02-09@17:36:53'
gpu_ids = range(0, 1)

2022-02-09 17:36:57,786 - mmdet - INFO - Set random seed to 1950499736, deterministic: False
2022-02-09 17:36:58,673 - mmdet - INFO - initialize ResNet with init_cfg {'type': 'Pretrained', 'checkpoint': 'torchvision://resnet50'}
2022-02-09 17:36:58,674 - mmcv - INFO - load model from: torchvision://resnet50
2022-02-09 17:36:58,674 - mmcv - INFO - load checkpoint from torchvision path: torchvision://resnet50
2022-02-09 17:36:59,232 - mmcv - WARNING - The model and loaded state dict do not match exactly

unexpected key in source state_dict: fc.weight, fc.bias

2022-02-09 17:36:59,266 - mmdet - INFO - initialize FPN with init_cfg {'type': 'Xavier', 'layer': 'Conv2d', 'distribution': 'uniform'}
2022-02-09 17:36:59,291 - mmdet - INFO - initialize RPNHead with init_cfg {'type': 'Normal', 'layer': 'Conv2d', 'std': 0.01}
2022-02-09 17:36:59,296 - mmdet - INFO - initialize Shared2FCBBoxHead with init_cfg [{'type': 'Normal', 'std': 0.01, 'override': {'name': 'fc_cls'}}, {'type': 'Normal', 'std': 0.001, 'override': {'name': 'fc_reg'}}, {'type': 'Xavier', 'distribution': 'uniform', 'override': [{'name': 'shared_fcs'}, {'name': 'cls_fcs'}, {'name': 'reg_fcs'}]}]
loading annotations into memory... Done (t=0.00s) creating index... index created!
loading annotations into memory... Done (t=0.00s) creating index... index created!
fatal: not a git repository (or any parent up to mount point /) Stopping at filesystem boundary (GIT_DISCOVERY_ACROSS_FILESYSTEM not set).
/project/6005433/felipe/thesis/env-mmdet/lib/python3.8/site-packages/torch/utils/data/dataloader.py:478: UserWarning: This DataLoader will create 2 worker processes in total. Our suggested max number of worker in current system is 1, which is smaller than what this DataLoader is going to create. Please be aware that excessive worker creation might get DataLoader running slow or even freeze, lower the worker number to avoid potential slowness/freeze if necessary. warnings.warn(_create_warning_msg(
loading annotations into memory... Done (t=0.01s) creating index... index created!
2022-02-09 17:37:11,856 - mmdet - INFO - load checkpoint from local path: /home/felipe/projects/def-akhloufi/felipe/thesis/probando-funcionalidad/checkpoints/mask_rcnn_r50_fpn_1x_coco_pretrained.pth
2022-02-09 17:37:12,296 - mmdet - WARNING - The model and loaded state dict do not match exactly

size mismatch for roi_head.bbox_head.fc_cls.weight: copying a param with shape torch.Size([81, 1024]) from checkpoint, the shape in current model is torch.Size([2, 1024]).
size mismatch for roi_head.bbox_head.fc_cls.bias: copying a param with shape torch.Size([81]) from checkpoint, the shape in current model is torch.Size([2]).
size mismatch for roi_head.bbox_head.fc_reg.weight: copying a param with shape torch.Size([320, 1024]) from checkpoint, the shape in current model is torch.Size([4, 1024]).
size mismatch for roi_head.bbox_head.fc_reg.bias: copying a param with shape torch.Size([320]) from checkpoint, the shape in current model is torch.Size([4]).
size mismatch for roi_head.mask_head.conv_logits.weight: copying a param with shape torch.Size([80, 256, 1, 1]) from checkpoint, the shape in current model is torch.Size([1, 256, 1, 1]).
size mismatch for roi_head.mask_head.conv_logits.bias: copying a param with shape torch.Size([80]) from checkpoint, the shape in current model is torch.Size([1]).
2022-02-09 17:37:12,302 - mmdet - INFO - Start running, host: felipe@cdr352.int.cedar.computecanada.ca, work_dir: /localscratch/felipe.26491647.0/datasets/Coco/checkpoints/Experiment 1/MaskRCNN/Salmons/Resnet50/hyper_1/Checkpoint 2022-02-09@17:36:53
2022-02-09 17:37:12,303 - mmdet - INFO - Hooks will be executed in the following order:
before_run: (NORMAL ) CheckpointHook (LOW ) EvalHook (VERY_LOW ) TensorboardLoggerHook

before_train_epoch: (NORMAL ) NumClassCheckHook (LOW ) IterTimerHook (LOW ) EvalHook (VERY_LOW ) TensorboardLoggerHook

before_train_iter: (LOW ) IterTimerHook (LOW ) EvalHook

after_train_iter: (ABOVE_NORMAL) OptimizerHook (NORMAL ) CheckpointHook (LOW ) IterTimerHook (LOW ) EvalHook (VERY_LOW ) TensorboardLoggerHook

after_train_epoch: (NORMAL ) CheckpointHook (LOW ) EvalHook (VERY_LOW ) TensorboardLoggerHook

before_val_epoch: (NORMAL ) NumClassCheckHook (LOW ) IterTimerHook (VERY_LOW ) TensorboardLoggerHook

before_val_iter: (LOW ) IterTimerHook

after_val_iter: (LOW ) IterTimerHook

after_val_epoch: (VERY_LOW ) TensorboardLoggerHook

after_run: (VERY_LOW ) TensorboardLoggerHook

2022-02-09 17:37:12,306 - mmdet - INFO - workflow: [('train', 1), ('val', 1)], max: 6 epochs
2022-02-09 17:37:12,306 - mmdet - INFO - Checkpoints will be saved to /localscratch/felipe.26491647.0/datasets/Coco/checkpoints/Experiment 1/MaskRCNN/Salmons/Resnet50/hyper_1/Checkpoint 2022-02-09@17:36:53 by HardDiskBackend.
/project/6005433/felipe/thesis/env-mmdet/lib/python3.8/site-packages/torch/nn/functional.py:718: UserWarning: Named tensors and all their associated APIs are an experimental feature and subject to change. Please do not use them for anything important until they are released as stable. (Triggered internally at /pytorch/c10/core/TensorImpl.h:1156.) return torch.max_pool2d(input, kernel_size, stride, padding, dilation, ceil_mode)
2022-02-09 17:37:19,526 - mmdet - INFO - Saving checkpoint at 1 epochs
[>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>] 86/86, 10.1 task/s, elapsed: 8s, ETA: 0s
2022-02-09 17:37:28,753 - mmdet - INFO - Evaluating bbox... Loading and preparing results...
2022-02-09 17:37:28,754 - mmdet - ERROR - The testing results of the whole dataset is empty.
2022-02-09 17:37:45,671 - mmdet - INFO - Saving checkpoint at 2 epochs
[>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>] 86/86, 9.8 task/s, elapsed: 9s, ETA: 0s
2022-02-09 17:37:55,223 - mmdet - INFO - Evaluating bbox... Loading and preparing results...
2022-02-09 17:37:55,224 - mmdet - ERROR - The testing results of the whole dataset is empty.
2022-02-09 17:38:12,625 - mmdet - INFO - Saving checkpoint at 3 epochs
[>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>] 86/86, 10.2 task/s, elapsed: 8s, ETA: 0s
2022-02-09 17:38:21,799 - mmdet - INFO - Evaluating bbox... Loading and preparing results...
2022-02-09 17:38:21,800 - mmdet - ERROR - The testing results of the whole dataset is empty.
2022-02-09 17:38:38,849 - mmdet - INFO - Saving checkpoint at 4 epochs
[>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>] 86/86, 10.4 task/s, elapsed: 8s, ETA: 0s
2022-02-09 17:38:47,868 - mmdet - INFO - Evaluating bbox... Loading and preparing results...
2022-02-09 17:38:47,869 - mmdet - ERROR - The testing results of the whole dataset is empty.
2022-02-09 17:39:05,898 - mmdet - INFO - Saving checkpoint at 5 epochs
[>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>] 86/86, 10.3 task/s, elapsed: 8s, ETA: 0s
2022-02-09 17:39:15,076 - mmdet - INFO - Evaluating bbox... Loading and preparing results...
2022-02-09 17:39:15,078 - mmdet - ERROR - The testing results of the whole dataset is empty.
2022-02-09 17:39:32,140 - mmdet - INFO - Saving checkpoint at 6 epochs
[>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>] 86/86, 10.3 task/s, elapsed: 8s, ETA: 0s
2022-02-09 17:39:41,219 - mmdet - INFO - Evaluating bbox... Loading and preparing results...
2022-02-09 17:39:41,220 - mmdet - ERROR - The testing results of the whole dataset is empty.
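When every epoch ends with empty results like this, it can help to look at the raw detections for one validation image at the default 0.05 score threshold before suspecting the evaluator. A hedged sketch using the standard inference API (config, checkpoint, and image paths are placeholders, not taken from this log):

# Inspect raw predictions from a saved checkpoint; all paths are placeholders.
from mmdet.apis import init_detector, inference_detector

model = init_detector('mask_rcnn_config.py', 'work_dir/epoch_6.pth', device='cuda:0')
result = inference_detector(model, 'some_val_image.jpg')
bbox_result = result[0] if isinstance(result, tuple) else result  # Mask R-CNN returns (bbox, segm)
print([len(cls_dets) for cls_dets in bbox_result])  # predicted boxes per class

If this prints only zeros, the problem is most likely on the training side (classes, labels, learning rate) rather than in the evaluation hook.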

mahilaMoghadami commented 8 months ago

Evaluating bbox... Loading and preparing results... The testing results of the whole dataset is empty.

How can I solve it?

lzx101 commented 7 months ago

Evaluating bbox... Loading and preparing results... The testing results of the whole dataset is empty.

How can I solve it?

Hello, has this problem been solved?

w577658669 commented 2 months ago

Evaluating bbox... Loading and preparing results... The testing results of the whole dataset is empty. How can I solve it?

Hello, has this problem been solved?

Did you manage to solve it?