open-mmlab / mmrotate

OpenMMLab Rotated Object Detection Toolbox and Benchmark
https://mmrotate.readthedocs.io/en/latest/
Apache License 2.0

[Common Issues] Unable to reproduce the training of the official example model #911

Open qinmengyuan opened 1 year ago

qinmengyuan commented 1 year ago

Prerequisite

Task

I'm using the official example scripts/configs for the officially supported tasks/models/datasets.

Branch

master branch https://github.com/open-mmlab/mmrotate

Environment

I use the official Docker image (details are shown in the log below).

Reproduces the problem - code sample

Contents of the config file configs/official/oriented_rcnn_r50_fpn_fp16_1x_dota_le90.py:

dataset_type = 'DOTADataset'
data_root = 'data/split_ss_dota/'
img_norm_cfg = dict(
    mean=[123.675, 116.28, 103.53], std=[58.395, 57.12, 57.375], to_rgb=True)
train_pipeline = [
    dict(type='LoadImageFromFile'),
    dict(type='LoadAnnotations', with_bbox=True),
    dict(type='RResize', img_scale=(1024, 1024)),
    dict(
        type='RRandomFlip',
        flip_ratio=[0.25, 0.25, 0.25],
        direction=['horizontal', 'vertical', 'diagonal'],
        version='le90'),
    dict(
        type='Normalize',
        mean=[123.675, 116.28, 103.53],
        std=[58.395, 57.12, 57.375],
        to_rgb=True),
    dict(type='Pad', size_divisor=32),
    dict(type='DefaultFormatBundle'),
    dict(type='Collect', keys=['img', 'gt_bboxes', 'gt_labels'])
]
test_pipeline = [
    dict(type='LoadImageFromFile'),
    dict(
        type='MultiScaleFlipAug',
        img_scale=(1024, 1024),
        flip=False,
        transforms=[
            dict(type='RResize'),
            dict(
                type='Normalize',
                mean=[123.675, 116.28, 103.53],
                std=[58.395, 57.12, 57.375],
                to_rgb=True),
            dict(type='Pad', size_divisor=32),
            dict(type='DefaultFormatBundle'),
            dict(type='Collect', keys=['img'])
        ])
]
data = dict(
    samples_per_gpu=2,
    workers_per_gpu=2,
    train=dict(
        type='DOTADataset',
        ann_file='data/split_ss_dota/train/annfiles/',
        img_prefix='data/split_ss_dota/train/images/',
        pipeline=[
            dict(type='LoadImageFromFile'),
            dict(type='LoadAnnotations', with_bbox=True),
            dict(type='RResize', img_scale=(1024, 1024)),
            dict(
                type='RRandomFlip',
                flip_ratio=[0.25, 0.25, 0.25],
                direction=['horizontal', 'vertical', 'diagonal'],
                version='le90'),
            dict(
                type='Normalize',
                mean=[123.675, 116.28, 103.53],
                std=[58.395, 57.12, 57.375],
                to_rgb=True),
            dict(type='Pad', size_divisor=32),
            dict(type='DefaultFormatBundle'),
            dict(type='Collect', keys=['img', 'gt_bboxes', 'gt_labels'])
        ],
        version='le90'),
    val=dict(
        type='DOTADataset',
        ann_file='data/split_ss_dota/trainval/annfiles/',
        img_prefix='data/split_ss_dota/trainval/images/',
        pipeline=[
            dict(type='LoadImageFromFile'),
            dict(
                type='MultiScaleFlipAug',
                img_scale=(1024, 1024),
                flip=False,
                transforms=[
                    dict(type='RResize'),
                    dict(
                        type='Normalize',
                        mean=[123.675, 116.28, 103.53],
                        std=[58.395, 57.12, 57.375],
                        to_rgb=True),
                    dict(type='Pad', size_divisor=32),
                    dict(type='DefaultFormatBundle'),
                    dict(type='Collect', keys=['img'])
                ])
        ],
        version='le90'),
    test=dict(
        type='DOTADataset',
        ann_file='data/split_ss_dota/trainval/annfiles/',
        img_prefix='data/split_ss_dota/trainval/images/',
        pipeline=[
            dict(type='LoadImageFromFile'),
            dict(
                type='MultiScaleFlipAug',
                img_scale=(1024, 1024),
                flip=False,
                transforms=[
                    dict(type='RResize'),
                    dict(
                        type='Normalize',
                        mean=[123.675, 116.28, 103.53],
                        std=[58.395, 57.12, 57.375],
                        to_rgb=True),
                    dict(type='Pad', size_divisor=32),
                    dict(type='DefaultFormatBundle'),
                    dict(type='Collect', keys=['img'])
                ])
        ],
        version='le90'))
evaluation = dict(interval=12, metric='mAP')
optimizer = dict(type='SGD', lr=0.005, momentum=0.9, weight_decay=0.0001)
optimizer_config = dict(grad_clip=dict(max_norm=35, norm_type=2))
lr_config = dict(
    policy='step',
    warmup='linear',
    warmup_iters=500,
    warmup_ratio=0.3333333333333333,
    step=[8, 11])
runner = dict(type='EpochBasedRunner', max_epochs=1200)
checkpoint_config = dict(interval=12)
log_config = dict(interval=50, hooks=[dict(type='TextLoggerHook')])
dist_params = dict(backend='nccl')
log_level = 'INFO'
load_from = None
resume_from = None
workflow = [('train', 1)]
angle_version = 'le90'
model = dict(
    type='OrientedRCNN',
    backbone=dict(
        type='ResNet',
        depth=50,
        num_stages=4,
        out_indices=(0, 1, 2, 3),
        frozen_stages=1,
        norm_cfg=dict(type='BN', requires_grad=True),
        norm_eval=True,
        style='pytorch',
        init_cfg=dict(type='Pretrained', checkpoint='torchvision://resnet50')),
    neck=dict(
        type='FPN',
        in_channels=[256, 512, 1024, 2048],
        out_channels=256,
        num_outs=5),
    rpn_head=dict(
        type='OrientedRPNHead',
        in_channels=256,
        feat_channels=256,
        version='le90',
        anchor_generator=dict(
            type='AnchorGenerator',
            scales=[8],
            ratios=[0.5, 1.0, 2.0],
            strides=[4, 8, 16, 32, 64]),
        bbox_coder=dict(
            type='MidpointOffsetCoder',
            angle_range='le90',
            target_means=[0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
            target_stds=[1.0, 1.0, 1.0, 1.0, 0.5, 0.5]),
        loss_cls=dict(
            type='CrossEntropyLoss', use_sigmoid=True, loss_weight=1.0),
        loss_bbox=dict(
            type='SmoothL1Loss', beta=0.1111111111111111, loss_weight=1.0)),
    roi_head=dict(
        type='OrientedStandardRoIHead',
        bbox_roi_extractor=dict(
            type='RotatedSingleRoIExtractor',
            roi_layer=dict(
                type='RoIAlignRotated',
                out_size=7,
                sample_num=2,
                clockwise=True),
            out_channels=256,
            featmap_strides=[4, 8, 16, 32]),
        bbox_head=dict(
            type='RotatedShared2FCBBoxHead',
            in_channels=256,
            fc_out_channels=1024,
            roi_feat_size=7,
            num_classes=15,
            bbox_coder=dict(
                type='DeltaXYWHAOBBoxCoder',
                angle_range='le90',
                norm_factor=None,
                edge_swap=True,
                proj_xy=True,
                target_means=(0.0, 0.0, 0.0, 0.0, 0.0),
                target_stds=(0.1, 0.1, 0.2, 0.2, 0.1)),
            reg_class_agnostic=True,
            loss_cls=dict(
                type='CrossEntropyLoss', use_sigmoid=False, loss_weight=1.0),
            loss_bbox=dict(type='SmoothL1Loss', beta=1.0, loss_weight=1.0))),
    train_cfg=dict(
        rpn=dict(
            assigner=dict(
                type='MaxIoUAssigner',
                pos_iou_thr=0.7,
                neg_iou_thr=0.3,
                min_pos_iou=0.3,
                match_low_quality=True,
                ignore_iof_thr=-1),
            sampler=dict(
                type='RandomSampler',
                num=256,
                pos_fraction=0.5,
                neg_pos_ub=-1,
                add_gt_as_proposals=False),
            allowed_border=0,
            pos_weight=-1,
            debug=False),
        rpn_proposal=dict(
            nms_pre=2000,
            max_per_img=2000,
            nms=dict(type='nms', iou_threshold=0.8),
            min_bbox_size=0),
        rcnn=dict(
            assigner=dict(
                type='MaxIoUAssigner',
                pos_iou_thr=0.5,
                neg_iou_thr=0.5,
                min_pos_iou=0.5,
                match_low_quality=False,
                iou_calculator=dict(type='RBboxOverlaps2D'),
                ignore_iof_thr=-1),
            sampler=dict(
                type='RRandomSampler',
                num=512,
                pos_fraction=0.25,
                neg_pos_ub=-1,
                add_gt_as_proposals=True),
            pos_weight=-1,
            debug=False)),
    test_cfg=dict(
        rpn=dict(
            nms_pre=2000,
            max_per_img=2000,
            nms=dict(type='nms', iou_threshold=0.8),
            min_bbox_size=0),
        rcnn=dict(
            nms_pre=2000,
            min_bbox_size=0,
            score_thr=0.05,
            nms=dict(iou_thr=0.1),
            max_per_img=2000)))
fp16 = dict(loss_scale='dynamic')
work_dir = 'tutorial_exps/6/'
auto_resume = False

Reproduces the problem - command or script

./tools/dist_train.sh configs/official/oriented_rcnn_r50_fpn_fp16_1x_dota_le90.py 8 --work-dir tutorial_exps/6 --seed 1002054415
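
For context on what the --seed flag above is trying to pin down, here is a minimal standalone sketch of the usual RNG seeding (this is not mmrotate's actual implementation, and seed_everything is a made-up name; it only illustrates which generators are involved when trying to reproduce a run). If your tools/train.py also exposes a --deterministic flag, as mmdetection-style scripts usually do, it toggles the cuDNN settings shown below.

import random

import numpy as np
import torch


def seed_everything(seed: int, deterministic: bool = False) -> None:
    # Seed the Python, NumPy and PyTorch RNGs that affect data sampling,
    # augmentation and weight initialization.
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)
    torch.cuda.manual_seed_all(seed)
    if deterministic:
        # Non-deterministic cuDNN kernels and autotuning are a common source
        # of run-to-run variation even when the seed is fixed.
        torch.backends.cudnn.deterministic = True
        torch.backends.cudnn.benchmark = False


seed_everything(1002054415)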

Reproduces the problem - error message

It's too long to include in one issue, so I'll add it below.

Additional information

First of all, sorry for using the bug-report template to submit this common issue; the jump link for common issues (https://github.com/open-mmlab/mmdetection/blob/master/.github/ISSUE_TEMPLATE/error-report.md/) is not available.

Even though I use the exact same configuration as the official example (https://github.com/open-mmlab/mmrotate/tree/main/configs/oriented_rcnn), I still can't train a model that performs as well as the official one ([oriented_rcnn_r50_fpn_fp16_1x_dota_le90](https://github.com/open-mmlab/mmrotate/blob/main/configs/oriented_rcnn/oriented_rcnn_r50_fpn_fp16_1x_dota_le90.py)). How can I reproduce the official example model? Attached is my training log; the configuration in it looks exactly the same as the one in the official example logs.
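
To rule out silent config drift on my side, one option is to diff the fully resolved configs programmatically. A minimal sketch, assuming mmcv 1.x (as in the environment log) and that the two paths below point at my local config and the official one in the repo (adjust as needed):

import difflib

from mmcv import Config

# Fully resolve both configs; pretty_text renders everything merged in
# from _base_ files, so the diff shows what actually differs.
mine = Config.fromfile(
    'configs/official/oriented_rcnn_r50_fpn_fp16_1x_dota_le90.py')
official = Config.fromfile(
    'configs/oriented_rcnn/oriented_rcnn_r50_fpn_fp16_1x_dota_le90.py')

diff = difflib.unified_diff(
    official.pretty_text.splitlines(keepends=True),
    mine.pretty_text.splitlines(keepends=True),
    fromfile='official', tofile='mine')
print(''.join(diff))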

I would be grateful if you could reply. Thanks to the authors for providing such an excellent rotated object detection framework.

qinmengyuan commented 1 year ago

Contents of the log file tutorial_exps/6/20230731_101831.log.json:

{"env_info": "sys.platform: linux\nPython: 3.7.7 (default, May  7 2020, 21:25:33) [GCC 7.3.0]\nCUDA available: True\nGPU 0,1,2,3,4,5,6,7: NVIDIA GeForce RTX 2080 Ti\nCUDA_HOME: /usr/local/cuda\nNVCC: Cuda compilation tools, release 10.1, V10.1.24\nGCC: gcc (Ubuntu 7.5.0-3ubuntu1~18.04) 7.5.0\nPyTorch: 1.6.0\nPyTorch compiling details: PyTorch built with:\n  - GCC 7.3\n  - C++ Version: 201402\n  - Intel(R) Math Kernel Library Version 2020.0.1 Product Build 20200208 for Intel(R) 64 architecture applications\n  - Intel(R) MKL-DNN v1.5.0 (Git Hash e2ac1fac44c5078ca927cb9b90e1b3066a0b2ed0)\n  - OpenMP 201511 (a.k.a. OpenMP 4.5)\n  - NNPACK is enabled\n  - CPU capability usage: AVX2\n  - CUDA Runtime 10.1\n  - NVCC architecture flags: -gencode;arch=compute_37,code=sm_37;-gencode;arch=compute_50,code=sm_50;-gencode;arch=compute_60,code=sm_60;-gencode;arch=compute_61,code=sm_61;-gencode;arch=compute_70,code=sm_70;-gencode;arch=compute_75,code=sm_75;-gencode;arch=compute_37,code=compute_37\n  - CuDNN 7.6.3\n  - Magma 2.5.2\n  - Build settings: BLAS=MKL, BUILD_TYPE=Release, CXX_FLAGS= -Wno-deprecated -fvisibility-inlines-hidden -DUSE_PTHREADPOOL -fopenmp -DNDEBUG -DUSE_FBGEMM -DUSE_QNNPACK -DUSE_PYTORCH_QNNPACK -DUSE_XNNPACK -DUSE_VULKAN_WRAPPER -O2 -fPIC -Wno-narrowing -Wall -Wextra -Werror=return-type -Wno-missing-field-initializers -Wno-type-limits -Wno-array-bounds -Wno-unknown-pragmas -Wno-sign-compare -Wno-unused-parameter -Wno-unused-variable -Wno-unused-function -Wno-unused-result -Wno-unused-local-typedefs -Wno-strict-overflow -Wno-strict-aliasing -Wno-error=deprecated-declarations -Wno-stringop-overflow -Wno-error=pedantic -Wno-error=redundant-decls -Wno-error=old-style-cast -fdiagnostics-color=always -faligned-new -Wno-unused-but-set-variable -Wno-maybe-uninitialized -fno-math-errno -fno-trapping-math -Werror=format -Wno-stringop-overflow, PERF_WITH_AVX=1, PERF_WITH_AVX2=1, PERF_WITH_AVX512=1, USE_CUDA=ON, USE_EXCEPTION_PTR=1, USE_GFLAGS=OFF, USE_GLOG=OFF, USE_MKL=ON, USE_MKLDNN=ON, USE_MPI=OFF, USE_NCCL=ON, USE_NNPACK=ON, USE_OPENMP=ON, USE_STATIC_DISPATCH=OFF, \n\nTorchVision: 0.7.0\nOpenCV: 4.7.0\nMMCV: 1.7.1\nMMCV Compiler: GCC 7.3\nMMCV CUDA Compiler: 10.1\nMMRotate: 0.3.4+fe36494", "config": "dataset_type = 'DOTADataset'\ndata_root = 'data/split_ss_dota/'\nimg_norm_cfg = dict(\n    mean=[123.675, 116.28, 103.53], std=[58.395, 57.12, 57.375], to_rgb=True)\ntrain_pipeline = [\n    dict(type='LoadImageFromFile'),\n    dict(type='LoadAnnotations', with_bbox=True),\n    dict(type='RResize', img_scale=(1024, 1024)),\n    dict(\n        type='RRandomFlip',\n        flip_ratio=[0.25, 0.25, 0.25],\n        direction=['horizontal', 'vertical', 'diagonal'],\n        version='le90'),\n    dict(\n        type='Normalize',\n        mean=[123.675, 116.28, 103.53],\n        std=[58.395, 57.12, 57.375],\n        to_rgb=True),\n    dict(type='Pad', size_divisor=32),\n    dict(type='DefaultFormatBundle'),\n    dict(type='Collect', keys=['img', 'gt_bboxes', 'gt_labels'])\n]\ntest_pipeline = [\n    dict(type='LoadImageFromFile'),\n    dict(\n        type='MultiScaleFlipAug',\n        img_scale=(1024, 1024),\n        flip=False,\n        transforms=[\n            dict(type='RResize'),\n            dict(\n                type='Normalize',\n                mean=[123.675, 116.28, 103.53],\n                std=[58.395, 57.12, 57.375],\n                to_rgb=True),\n            dict(type='Pad', size_divisor=32),\n            dict(type='DefaultFormatBundle'),\n            dict(type='Collect', keys=['img'])\n  
      ])\n]\ndata = dict(\n    samples_per_gpu=2,\n    workers_per_gpu=2,\n    train=dict(\n        type='DOTADataset',\n        ann_file='data/split_ss_dota/train/annfiles/',\n        img_prefix='data/split_ss_dota/train/images/',\n        pipeline=[\n            dict(type='LoadImageFromFile'),\n            dict(type='LoadAnnotations', with_bbox=True),\n            dict(type='RResize', img_scale=(1024, 1024)),\n            dict(\n                type='RRandomFlip',\n                flip_ratio=[0.25, 0.25, 0.25],\n                direction=['horizontal', 'vertical', 'diagonal'],\n                version='le90'),\n            dict(\n                type='Normalize',\n                mean=[123.675, 116.28, 103.53],\n                std=[58.395, 57.12, 57.375],\n                to_rgb=True),\n            dict(type='Pad', size_divisor=32),\n            dict(type='DefaultFormatBundle'),\n            dict(type='Collect', keys=['img', 'gt_bboxes', 'gt_labels'])\n        ],\n        version='le90'),\n    val=dict(\n        type='DOTADataset',\n        ann_file='data/split_ss_dota/trainval/annfiles/',\n        img_prefix='data/split_ss_dota/trainval/images/',\n        pipeline=[\n            dict(type='LoadImageFromFile'),\n            dict(\n                type='MultiScaleFlipAug',\n                img_scale=(1024, 1024),\n                flip=False,\n                transforms=[\n                    dict(type='RResize'),\n                    dict(\n                        type='Normalize',\n                        mean=[123.675, 116.28, 103.53],\n                        std=[58.395, 57.12, 57.375],\n                        to_rgb=True),\n                    dict(type='Pad', size_divisor=32),\n                    dict(type='DefaultFormatBundle'),\n                    dict(type='Collect', keys=['img'])\n                ])\n        ],\n        version='le90'),\n    test=dict(\n        type='DOTADataset',\n        ann_file='data/split_ss_dota/trainval/annfiles/',\n        img_prefix='data/split_ss_dota/trainval/images/',\n        pipeline=[\n            dict(type='LoadImageFromFile'),\n            dict(\n                type='MultiScaleFlipAug',\n                img_scale=(1024, 1024),\n                flip=False,\n                transforms=[\n                    dict(type='RResize'),\n                    dict(\n                        type='Normalize',\n                        mean=[123.675, 116.28, 103.53],\n                        std=[58.395, 57.12, 57.375],\n                        to_rgb=True),\n                    dict(type='Pad', size_divisor=32),\n                    dict(type='DefaultFormatBundle'),\n                    dict(type='Collect', keys=['img'])\n                ])\n        ],\n        version='le90'))\nevaluation = dict(interval=12, metric='mAP')\noptimizer = dict(type='SGD', lr=0.005, momentum=0.9, weight_decay=0.0001)\noptimizer_config = dict(grad_clip=dict(max_norm=35, norm_type=2))\nlr_config = dict(\n    policy='step',\n    warmup='linear',\n    warmup_iters=500,\n    warmup_ratio=0.3333333333333333,\n    step=[8, 11])\nrunner = dict(type='EpochBasedRunner', max_epochs=1200)\ncheckpoint_config = dict(interval=12)\nlog_config = dict(interval=50, hooks=[dict(type='TextLoggerHook')])\ndist_params = dict(backend='nccl')\nlog_level = 'INFO'\nload_from = None\nresume_from = None\nworkflow = [('train', 1)]\nangle_version = 'le90'\nmodel = dict(\n    type='OrientedRCNN',\n    backbone=dict(\n        type='ResNet',\n        depth=50,\n        num_stages=4,\n        out_indices=(0, 
1, 2, 3),\n        frozen_stages=1,\n        norm_cfg=dict(type='BN', requires_grad=True),\n        norm_eval=True,\n        style='pytorch',\n        init_cfg=dict(type='Pretrained', checkpoint='torchvision://resnet50')),\n    neck=dict(\n        type='FPN',\n        in_channels=[256, 512, 1024, 2048],\n        out_channels=256,\n        num_outs=5),\n    rpn_head=dict(\n        type='OrientedRPNHead',\n        in_channels=256,\n        feat_channels=256,\n        version='le90',\n        anchor_generator=dict(\n            type='AnchorGenerator',\n            scales=[8],\n            ratios=[0.5, 1.0, 2.0],\n            strides=[4, 8, 16, 32, 64]),\n        bbox_coder=dict(\n            type='MidpointOffsetCoder',\n            angle_range='le90',\n            target_means=[0.0, 0.0, 0.0, 0.0, 0.0, 0.0],\n            target_stds=[1.0, 1.0, 1.0, 1.0, 0.5, 0.5]),\n        loss_cls=dict(\n            type='CrossEntropyLoss', use_sigmoid=True, loss_weight=1.0),\n        loss_bbox=dict(\n            type='SmoothL1Loss', beta=0.1111111111111111, loss_weight=1.0)),\n    roi_head=dict(\n        type='OrientedStandardRoIHead',\n        bbox_roi_extractor=dict(\n            type='RotatedSingleRoIExtractor',\n            roi_layer=dict(\n                type='RoIAlignRotated',\n                out_size=7,\n                sample_num=2,\n                clockwise=True),\n            out_channels=256,\n            featmap_strides=[4, 8, 16, 32]),\n        bbox_head=dict(\n            type='RotatedShared2FCBBoxHead',\n            in_channels=256,\n            fc_out_channels=1024,\n            roi_feat_size=7,\n            num_classes=15,\n            bbox_coder=dict(\n                type='DeltaXYWHAOBBoxCoder',\n                angle_range='le90',\n                norm_factor=None,\n                edge_swap=True,\n                proj_xy=True,\n                target_means=(0.0, 0.0, 0.0, 0.0, 0.0),\n                target_stds=(0.1, 0.1, 0.2, 0.2, 0.1)),\n            reg_class_agnostic=True,\n            loss_cls=dict(\n                type='CrossEntropyLoss', use_sigmoid=False, loss_weight=1.0),\n            loss_bbox=dict(type='SmoothL1Loss', beta=1.0, loss_weight=1.0))),\n    train_cfg=dict(\n        rpn=dict(\n            assigner=dict(\n                type='MaxIoUAssigner',\n                pos_iou_thr=0.7,\n                neg_iou_thr=0.3,\n                min_pos_iou=0.3,\n                match_low_quality=True,\n                ignore_iof_thr=-1),\n            sampler=dict(\n                type='RandomSampler',\n                num=256,\n                pos_fraction=0.5,\n                neg_pos_ub=-1,\n                add_gt_as_proposals=False),\n            allowed_border=0,\n            pos_weight=-1,\n            debug=False),\n        rpn_proposal=dict(\n            nms_pre=2000,\n            max_per_img=2000,\n            nms=dict(type='nms', iou_threshold=0.8),\n            min_bbox_size=0),\n        rcnn=dict(\n            assigner=dict(\n                type='MaxIoUAssigner',\n                pos_iou_thr=0.5,\n                neg_iou_thr=0.5,\n                min_pos_iou=0.5,\n                match_low_quality=False,\n                iou_calculator=dict(type='RBboxOverlaps2D'),\n                ignore_iof_thr=-1),\n            sampler=dict(\n                type='RRandomSampler',\n                num=512,\n                pos_fraction=0.25,\n                neg_pos_ub=-1,\n                add_gt_as_proposals=True),\n            pos_weight=-1,\n            debug=False)),\n    
test_cfg=dict(\n        rpn=dict(\n            nms_pre=2000,\n            max_per_img=2000,\n            nms=dict(type='nms', iou_threshold=0.8),\n            min_bbox_size=0),\n        rcnn=dict(\n            nms_pre=2000,\n            min_bbox_size=0,\n            score_thr=0.05,\n            nms=dict(iou_thr=0.1),\n            max_per_img=2000)))\nfp16 = dict(loss_scale='dynamic')\nwork_dir = 'tutorial_exps/6'\nauto_resume = False\ngpu_ids = range(0, 8)\n", "seed": 1002054415, "exp_name": "oriented_rcnn_r50_fpn_fp16_1x_dota_le90.py"}
{"mode": "train", "epoch": 1, "iter": 50, "lr": 0.00199, "memory": 5225, "data_time": 0.06591, "loss_rpn_cls": 0.49008, "loss_rpn_bbox": 0.24451, "loss_cls": 0.5884, "acc": 89.26196, "loss_bbox": 0.07441, "loss": 1.3974, "grad_norm": Infinity, "time": 0.39103}
{"mode": "train", "epoch": 1, "iter": 100, "lr": 0.00233, "memory": 5827, "data_time": 0.01341, "loss_rpn_cls": 0.25922, "loss_rpn_bbox": 0.27324, "loss_cls": 0.3099, "acc": 93.64111, "loss_bbox": 0.19406, "loss": 1.03642, "grad_norm": 2.8591, "time": 0.32763}
{"mode": "train", "epoch": 1, "iter": 150, "lr": 0.00266, "memory": 5972, "data_time": 0.01375, "loss_rpn_cls": 0.21154, "loss_rpn_bbox": 0.2416, "loss_cls": 0.24554, "acc": 94.07812, "loss_bbox": 0.22503, "loss": 0.92371, "grad_norm": 3.08655, "time": 0.31599}
{"mode": "train", "epoch": 1, "iter": 200, "lr": 0.00299, "memory": 5972, "data_time": 0.01507, "loss_rpn_cls": 0.17166, "loss_rpn_bbox": 0.23847, "loss_cls": 0.24146, "acc": 93.6853, "loss_bbox": 0.25926, "loss": 0.91086, "grad_norm": 3.90545, "time": 0.31348}
{"mode": "train", "epoch": 1, "iter": 250, "lr": 0.00333, "memory": 5972, "data_time": 0.01522, "loss_rpn_cls": 0.14846, "loss_rpn_bbox": 0.24486, "loss_cls": 0.24624, "acc": 93.05493, "loss_bbox": 0.29654, "loss": 0.9361, "grad_norm": 3.75738, "time": 0.30953}
{"mode": "train", "epoch": 1, "iter": 300, "lr": 0.00366, "memory": 5972, "data_time": 0.01396, "loss_rpn_cls": 0.12617, "loss_rpn_bbox": 0.22107, "loss_cls": 0.23617, "acc": 93.15283, "loss_bbox": 0.28501, "loss": 0.86842, "grad_norm": 3.64423, "time": 0.30898}
{"mode": "train", "epoch": 1, "iter": 350, "lr": 0.00399, "memory": 5972, "data_time": 0.01454, "loss_rpn_cls": 0.11642, "loss_rpn_bbox": 0.22439, "loss_cls": 0.26844, "acc": 91.93848, "loss_bbox": 0.32938, "loss": 0.93863, "grad_norm": 4.13674, "time": 0.30617}
{"mode": "train", "epoch": 1, "iter": 400, "lr": 0.00433, "memory": 5972, "data_time": 0.0125, "loss_rpn_cls": 0.09355, "loss_rpn_bbox": 0.20377, "loss_cls": 0.26567, "acc": 91.57178, "loss_bbox": 0.31137, "loss": 0.87435, "grad_norm": 3.78112, "time": 0.30788}
{"mode": "train", "epoch": 1, "iter": 450, "lr": 0.00466, "memory": 5972, "data_time": 0.01373, "loss_rpn_cls": 0.08562, "loss_rpn_bbox": 0.16761, "loss_cls": 0.25621, "acc": 91.83545, "loss_bbox": 0.29948, "loss": 0.80892, "grad_norm": 3.3404, "time": 0.31039}
{"mode": "train", "epoch": 1, "iter": 500, "lr": 0.00499, "memory": 5972, "data_time": 0.01322, "loss_rpn_cls": 0.0892, "loss_rpn_bbox": 0.14871, "loss_cls": 0.25657, "acc": 91.66187, "loss_bbox": 0.29561, "loss": 0.79009, "grad_norm": 3.5171, "time": 0.30781}
{"mode": "train", "epoch": 1, "iter": 550, "lr": 0.005, "memory": 5972, "data_time": 0.01299, "loss_rpn_cls": 0.08214, "loss_rpn_bbox": 0.15301, "loss_cls": 0.25519, "acc": 91.47144, "loss_bbox": 0.30947, "loss": 0.79981, "grad_norm": 3.53342, "time": 0.30582}
{"mode": "train", "epoch": 1, "iter": 600, "lr": 0.005, "memory": 7297, "data_time": 0.01262, "loss_rpn_cls": 0.08469, "loss_rpn_bbox": 0.148, "loss_cls": 0.26125, "acc": 91.22021, "loss_bbox": 0.28039, "loss": 0.77434, "grad_norm": 3.43214, "time": 0.32085}
{"mode": "train", "epoch": 2, "iter": 50, "lr": 0.005, "memory": 7302, "data_time": 0.06201, "loss_rpn_cls": 0.07263, "loss_rpn_bbox": 0.13418, "loss_cls": 0.2542, "acc": 91.04639, "loss_bbox": 0.29143, "loss": 0.75244, "grad_norm": 3.06555, "time": 0.36229}
{"mode": "train", "epoch": 2, "iter": 100, "lr": 0.005, "memory": 7302, "data_time": 0.01481, "loss_rpn_cls": 0.07452, "loss_rpn_bbox": 0.13444, "loss_cls": 0.2615, "acc": 90.88672, "loss_bbox": 0.28135, "loss": 0.7518, "grad_norm": 3.36351, "time": 0.30876}
{"mode": "train", "epoch": 2, "iter": 150, "lr": 0.005, "memory": 7302, "data_time": 0.01429, "loss_rpn_cls": 0.0627, "loss_rpn_bbox": 0.13066, "loss_cls": 0.25449, "acc": 91.04077, "loss_bbox": 0.2799, "loss": 0.72775, "grad_norm": 3.10981, "time": 0.30872}
{"mode": "train", "epoch": 2, "iter": 200, "lr": 0.005, "memory": 7302, "data_time": 0.01328, "loss_rpn_cls": 0.06462, "loss_rpn_bbox": 0.12688, "loss_cls": 0.2448, "acc": 91.33496, "loss_bbox": 0.28028, "loss": 0.71658, "grad_norm": 3.13727, "time": 0.31271}
{"mode": "train", "epoch": 2, "iter": 250, "lr": 0.005, "memory": 7302, "data_time": 0.01391, "loss_rpn_cls": 0.06489, "loss_rpn_bbox": 0.12779, "loss_cls": 0.24462, "acc": 91.43579, "loss_bbox": 0.28111, "loss": 0.71841, "grad_norm": 3.12275, "time": 0.3149}
{"mode": "train", "epoch": 2, "iter": 300, "lr": 0.005, "memory": 7302, "data_time": 0.01447, "loss_rpn_cls": 0.06365, "loss_rpn_bbox": 0.12144, "loss_cls": 0.24944, "acc": 91.16821, "loss_bbox": 0.2757, "loss": 0.71023, "grad_norm": 3.1206, "time": 0.3088}
{"mode": "train", "epoch": 2, "iter": 350, "lr": 0.005, "memory": 7302, "data_time": 0.01416, "loss_rpn_cls": 0.06299, "loss_rpn_bbox": 0.10959, "loss_cls": 0.23042, "acc": 91.89478, "loss_bbox": 0.24875, "loss": 0.65174, "grad_norm": 2.74787, "time": 0.30823}
{"mode": "train", "epoch": 2, "iter": 400, "lr": 0.005, "memory": 7312, "data_time": 0.01394, "loss_rpn_cls": 0.06043, "loss_rpn_bbox": 0.10851, "loss_cls": 0.23645, "acc": 91.31006, "loss_bbox": 0.25611, "loss": 0.6615, "grad_norm": 3.11993, "time": 0.31424}
{"mode": "train", "epoch": 2, "iter": 450, "lr": 0.005, "memory": 7312, "data_time": 0.01252, "loss_rpn_cls": 0.05776, "loss_rpn_bbox": 0.10813, "loss_cls": 0.22053, "acc": 92.05542, "loss_bbox": 0.23997, "loss": 0.62639, "grad_norm": 2.71383, "time": 0.3125}
{"mode": "train", "epoch": 2, "iter": 500, "lr": 0.005, "memory": 7312, "data_time": 0.01385, "loss_rpn_cls": 0.0577, "loss_rpn_bbox": 0.10813, "loss_cls": 0.23127, "acc": 91.60962, "loss_bbox": 0.24847, "loss": 0.64556, "grad_norm": 2.9239, "time": 0.3109}
{"mode": "train", "epoch": 2, "iter": 550, "lr": 0.005, "memory": 7312, "data_time": 0.01292, "loss_rpn_cls": 0.05479, "loss_rpn_bbox": 0.10286, "loss_cls": 0.22517, "acc": 91.80444, "loss_bbox": 0.23906, "loss": 0.62188, "grad_norm": 2.66494, "time": 0.31704}
{"mode": "train", "epoch": 2, "iter": 600, "lr": 0.005, "memory": 7312, "data_time": 0.01349, "loss_rpn_cls": 0.05428, "loss_rpn_bbox": 0.1064, "loss_cls": 0.22532, "acc": 91.81494, "loss_bbox": 0.23804, "loss": 0.62405, "grad_norm": 2.75621, "time": 0.31111}
{"mode": "train", "epoch": 3, "iter": 50, "lr": 0.005, "memory": 7312, "data_time": 0.06007, "loss_rpn_cls": 0.05233, "loss_rpn_bbox": 0.10132, "loss_cls": 0.22508, "acc": 91.5918, "loss_bbox": 0.23478, "loss": 0.61351, "grad_norm": 2.69264, "time": 0.36143}
{"mode": "train", "epoch": 3, "iter": 100, "lr": 0.005, "memory": 7312, "data_time": 0.01315, "loss_rpn_cls": 0.04968, "loss_rpn_bbox": 0.10358, "loss_cls": 0.21309, "acc": 92.06812, "loss_bbox": 0.24289, "loss": 0.60924, "grad_norm": 2.78402, "time": 0.31405}
{"mode": "train", "epoch": 3, "iter": 150, "lr": 0.005, "memory": 7312, "data_time": 0.01367, "loss_rpn_cls": 0.04873, "loss_rpn_bbox": 0.10553, "loss_cls": 0.22101, "acc": 91.66113, "loss_bbox": 0.24174, "loss": 0.61701, "grad_norm": 2.70125, "time": 0.30781}
{"mode": "train", "epoch": 3, "iter": 200, "lr": 0.005, "memory": 7312, "data_time": 0.01418, "loss_rpn_cls": 0.04183, "loss_rpn_bbox": 0.09339, "loss_cls": 0.2214, "acc": 91.82031, "loss_bbox": 0.23976, "loss": 0.59637, "grad_norm": 2.62679, "time": 0.31188}
{"mode": "train", "epoch": 3, "iter": 250, "lr": 0.005, "memory": 7312, "data_time": 0.01428, "loss_rpn_cls": 0.05342, "loss_rpn_bbox": 0.09869, "loss_cls": 0.21171, "acc": 92.16797, "loss_bbox": 0.22656, "loss": 0.59038, "grad_norm": 2.87926, "time": 0.31004}
{"mode": "train", "epoch": 3, "iter": 300, "lr": 0.005, "memory": 7312, "data_time": 0.01386, "loss_rpn_cls": 0.0471, "loss_rpn_bbox": 0.09685, "loss_cls": 0.21024, "acc": 92.17139, "loss_bbox": 0.22463, "loss": 0.57882, "grad_norm": 2.80212, "time": 0.31132}
{"mode": "train", "epoch": 3, "iter": 350, "lr": 0.005, "memory": 7312, "data_time": 0.01409, "loss_rpn_cls": 0.04459, "loss_rpn_bbox": 0.09252, "loss_cls": 0.20088, "acc": 92.50146, "loss_bbox": 0.21729, "loss": 0.55528, "grad_norm": 2.49329, "time": 0.31192}
{"mode": "train", "epoch": 3, "iter": 400, "lr": 0.005, "memory": 7312, "data_time": 0.01386, "loss_rpn_cls": 0.04395, "loss_rpn_bbox": 0.09071, "loss_cls": 0.20239, "acc": 92.55347, "loss_bbox": 0.21332, "loss": 0.55037, "grad_norm": 2.55151, "time": 0.31458}
{"mode": "train", "epoch": 3, "iter": 450, "lr": 0.005, "memory": 7312, "data_time": 0.01465, "loss_rpn_cls": 0.04597, "loss_rpn_bbox": 0.09003, "loss_cls": 0.20191, "acc": 92.49658, "loss_bbox": 0.22008, "loss": 0.55798, "grad_norm": 2.51531, "time": 0.31776}
{"mode": "train", "epoch": 3, "iter": 500, "lr": 0.005, "memory": 7312, "data_time": 0.01451, "loss_rpn_cls": 0.04595, "loss_rpn_bbox": 0.10073, "loss_cls": 0.20615, "acc": 92.2561, "loss_bbox": 0.21531, "loss": 0.56814, "grad_norm": 2.77899, "time": 0.30753}
{"mode": "train", "epoch": 3, "iter": 550, "lr": 0.005, "memory": 7312, "data_time": 0.01453, "loss_rpn_cls": 0.04731, "loss_rpn_bbox": 0.08981, "loss_cls": 0.20043, "acc": 92.58643, "loss_bbox": 0.20307, "loss": 0.54061, "grad_norm": 2.63092, "time": 0.31465}
{"mode": "train", "epoch": 3, "iter": 600, "lr": 0.005, "memory": 7312, "data_time": 0.01491, "loss_rpn_cls": 0.04281, "loss_rpn_bbox": 0.08573, "loss_cls": 0.20061, "acc": 92.34961, "loss_bbox": 0.20997, "loss": 0.53912, "grad_norm": 2.5258, "time": 0.31727}
{"mode": "train", "epoch": 4, "iter": 50, "lr": 0.005, "memory": 7312, "data_time": 0.06159, "loss_rpn_cls": 0.04014, "loss_rpn_bbox": 0.09228, "loss_cls": 0.20265, "acc": 92.37842, "loss_bbox": 0.22107, "loss": 0.55615, "grad_norm": 2.49584, "time": 0.36054}
{"mode": "train", "epoch": 4, "iter": 100, "lr": 0.005, "memory": 7312, "data_time": 0.0145, "loss_rpn_cls": 0.04111, "loss_rpn_bbox": 0.0916, "loss_cls": 0.19791, "acc": 92.50684, "loss_bbox": 0.21272, "loss": 0.54334, "grad_norm": 2.53911, "time": 0.31153}
{"mode": "train", "epoch": 4, "iter": 150, "lr": 0.005, "memory": 7312, "data_time": 0.01387, "loss_rpn_cls": 0.04112, "loss_rpn_bbox": 0.0852, "loss_cls": 0.19452, "acc": 92.68945, "loss_bbox": 0.21114, "loss": 0.53197, "grad_norm": 2.64917, "time": 0.31116}
{"mode": "train", "epoch": 4, "iter": 200, "lr": 0.005, "memory": 7312, "data_time": 0.01401, "loss_rpn_cls": 0.04283, "loss_rpn_bbox": 0.09205, "loss_cls": 0.20543, "acc": 92.25439, "loss_bbox": 0.21132, "loss": 0.55163, "grad_norm": 2.73124, "time": 0.31287}
{"mode": "train", "epoch": 4, "iter": 250, "lr": 0.005, "memory": 7312, "data_time": 0.01419, "loss_rpn_cls": 0.04001, "loss_rpn_bbox": 0.07857, "loss_cls": 0.19259, "acc": 92.66724, "loss_bbox": 0.19585, "loss": 0.50702, "grad_norm": 2.34148, "time": 0.31155}
{"mode": "train", "epoch": 4, "iter": 300, "lr": 0.005, "memory": 7312, "data_time": 0.01361, "loss_rpn_cls": 0.03641, "loss_rpn_bbox": 0.07927, "loss_cls": 0.18487, "acc": 93.10425, "loss_bbox": 0.19072, "loss": 0.49127, "grad_norm": 2.4257, "time": 0.30573}
{"mode": "train", "epoch": 4, "iter": 350, "lr": 0.005, "memory": 7312, "data_time": 0.01461, "loss_rpn_cls": 0.03734, "loss_rpn_bbox": 0.08723, "loss_cls": 0.1898, "acc": 92.88086, "loss_bbox": 0.20116, "loss": 0.51552, "grad_norm": 2.53326, "time": 0.31718}
{"mode": "train", "epoch": 4, "iter": 400, "lr": 0.005, "memory": 7312, "data_time": 0.01406, "loss_rpn_cls": 0.03761, "loss_rpn_bbox": 0.07493, "loss_cls": 0.18336, "acc": 93.01709, "loss_bbox": 0.19075, "loss": 0.48665, "grad_norm": 2.32669, "time": 0.31124}
{"mode": "train", "epoch": 4, "iter": 450, "lr": 0.005, "memory": 7312, "data_time": 0.01363, "loss_rpn_cls": 0.03812, "loss_rpn_bbox": 0.07534, "loss_cls": 0.18923, "acc": 92.771, "loss_bbox": 0.19254, "loss": 0.49523, "grad_norm": 2.44344, "time": 0.31058}
{"mode": "train", "epoch": 4, "iter": 500, "lr": 0.005, "memory": 7312, "data_time": 0.01344, "loss_rpn_cls": 0.03534, "loss_rpn_bbox": 0.08069, "loss_cls": 0.18534, "acc": 92.85107, "loss_bbox": 0.20247, "loss": 0.50384, "grad_norm": 2.39246, "time": 0.31021}
{"mode": "train", "epoch": 4, "iter": 550, "lr": 0.005, "memory": 7312, "data_time": 0.01411, "loss_rpn_cls": 0.03876, "loss_rpn_bbox": 0.09405, "loss_cls": 0.19607, "acc": 92.47803, "loss_bbox": 0.20755, "loss": 0.53643, "grad_norm": 2.36415, "time": 0.31305}
{"mode": "train", "epoch": 4, "iter": 600, "lr": 0.005, "memory": 7313, "data_time": 0.0132, "loss_rpn_cls": 0.03804, "loss_rpn_bbox": 0.09053, "loss_cls": 0.19229, "acc": 92.61841, "loss_bbox": 0.20123, "loss": 0.52209, "grad_norm": 2.31333, "time": 0.31802}
{"mode": "train", "epoch": 5, "iter": 50, "lr": 0.005, "memory": 7313, "data_time": 0.06095, "loss_rpn_cls": 0.03379, "loss_rpn_bbox": 0.08313, "loss_cls": 0.18539, "acc": 92.75269, "loss_bbox": 0.19516, "loss": 0.49747, "grad_norm": 2.30116, "time": 0.36295}
{"mode": "train", "epoch": 5, "iter": 100, "lr": 0.005, "memory": 7313, "data_time": 0.01404, "loss_rpn_cls": 0.03469, "loss_rpn_bbox": 0.08453, "loss_cls": 0.17725, "acc": 93.15894, "loss_bbox": 0.19032, "loss": 0.48679, "grad_norm": 2.20512, "time": 0.31054}
{"mode": "train", "epoch": 5, "iter": 150, "lr": 0.005, "memory": 7313, "data_time": 0.01498, "loss_rpn_cls": 0.0353, "loss_rpn_bbox": 0.08337, "loss_cls": 0.1794, "acc": 92.93628, "loss_bbox": 0.19327, "loss": 0.49134, "grad_norm": 2.32082, "time": 0.30632}
{"mode": "train", "epoch": 5, "iter": 200, "lr": 0.005, "memory": 7313, "data_time": 0.01451, "loss_rpn_cls": 0.03388, "loss_rpn_bbox": 0.07641, "loss_cls": 0.17562, "acc": 93.23145, "loss_bbox": 0.18344, "loss": 0.46935, "grad_norm": 2.38215, "time": 0.31452}
{"mode": "train", "epoch": 5, "iter": 250, "lr": 0.005, "memory": 7313, "data_time": 0.01382, "loss_rpn_cls": 0.03194, "loss_rpn_bbox": 0.07727, "loss_cls": 0.18474, "acc": 92.84668, "loss_bbox": 0.19159, "loss": 0.48554, "grad_norm": 2.26403, "time": 0.31116}
{"mode": "train", "epoch": 5, "iter": 300, "lr": 0.005, "memory": 7313, "data_time": 0.01481, "loss_rpn_cls": 0.03346, "loss_rpn_bbox": 0.08042, "loss_cls": 0.18061, "acc": 93.13745, "loss_bbox": 0.19731, "loss": 0.49181, "grad_norm": 2.55265, "time": 0.31005}
{"mode": "train", "epoch": 5, "iter": 350, "lr": 0.005, "memory": 7313, "data_time": 0.01453, "loss_rpn_cls": 0.03336, "loss_rpn_bbox": 0.08538, "loss_cls": 0.17584, "acc": 93.24805, "loss_bbox": 0.18881, "loss": 0.48339, "grad_norm": 2.37044, "time": 0.31255}
{"mode": "train", "epoch": 5, "iter": 400, "lr": 0.005, "memory": 7313, "data_time": 0.01475, "loss_rpn_cls": 0.03343, "loss_rpn_bbox": 0.08086, "loss_cls": 0.1788, "acc": 93.10376, "loss_bbox": 0.18745, "loss": 0.48053, "grad_norm": 2.26766, "time": 0.30894}
{"mode": "train", "epoch": 5, "iter": 450, "lr": 0.005, "memory": 7313, "data_time": 0.01428, "loss_rpn_cls": 0.03249, "loss_rpn_bbox": 0.07307, "loss_cls": 0.17886, "acc": 93.05396, "loss_bbox": 0.18366, "loss": 0.46808, "grad_norm": 2.19324, "time": 0.31336}
{"mode": "train", "epoch": 5, "iter": 500, "lr": 0.005, "memory": 7313, "data_time": 0.01468, "loss_rpn_cls": 0.03114, "loss_rpn_bbox": 0.07528, "loss_cls": 0.17163, "acc": 93.26343, "loss_bbox": 0.17732, "loss": 0.45537, "grad_norm": 2.13121, "time": 0.30904}
{"mode": "train", "epoch": 5, "iter": 550, "lr": 0.005, "memory": 7313, "data_time": 0.01444, "loss_rpn_cls": 0.03199, "loss_rpn_bbox": 0.06925, "loss_cls": 0.17958, "acc": 93.0874, "loss_bbox": 0.18071, "loss": 0.46153, "grad_norm": 2.38434, "time": 0.31644}
{"mode": "train", "epoch": 5, "iter": 600, "lr": 0.005, "memory": 7313, "data_time": 0.0143, "loss_rpn_cls": 0.03297, "loss_rpn_bbox": 0.08784, "loss_cls": 0.17233, "acc": 93.35352, "loss_bbox": 0.18075, "loss": 0.47389, "grad_norm": 2.25789, "time": 0.31783}
{"mode": "train", "epoch": 6, "iter": 50, "lr": 0.005, "memory": 7313, "data_time": 0.06146, "loss_rpn_cls": 0.02954, "loss_rpn_bbox": 0.07941, "loss_cls": 0.17918, "acc": 93.06152, "loss_bbox": 0.19024, "loss": 0.47836, "grad_norm": 2.37168, "time": 0.35847}
{"mode": "train", "epoch": 6, "iter": 100, "lr": 0.005, "memory": 7313, "data_time": 0.01511, "loss_rpn_cls": 0.03239, "loss_rpn_bbox": 0.07837, "loss_cls": 0.1779, "acc": 93.16699, "loss_bbox": 0.18747, "loss": 0.47614, "grad_norm": 2.39204, "time": 0.31202}
{"mode": "train", "epoch": 6, "iter": 150, "lr": 0.005, "memory": 7313, "data_time": 0.01464, "loss_rpn_cls": 0.02903, "loss_rpn_bbox": 0.07646, "loss_cls": 0.17412, "acc": 93.18896, "loss_bbox": 0.18167, "loss": 0.46129, "grad_norm": 2.11462, "time": 0.3101}
{"mode": "train", "epoch": 6, "iter": 200, "lr": 0.005, "memory": 7313, "data_time": 0.01437, "loss_rpn_cls": 0.0296, "loss_rpn_bbox": 0.08031, "loss_cls": 0.17118, "acc": 93.24585, "loss_bbox": 0.17889, "loss": 0.45999, "grad_norm": 2.29988, "time": 0.31385}
{"mode": "train", "epoch": 6, "iter": 250, "lr": 0.005, "memory": 7313, "data_time": 0.01476, "loss_rpn_cls": 0.02782, "loss_rpn_bbox": 0.0733, "loss_cls": 0.16798, "acc": 93.43359, "loss_bbox": 0.17842, "loss": 0.44753, "grad_norm": 2.14625, "time": 0.31419}
{"mode": "train", "epoch": 6, "iter": 300, "lr": 0.005, "memory": 7313, "data_time": 0.01415, "loss_rpn_cls": 0.03186, "loss_rpn_bbox": 0.08136, "loss_cls": 0.17695, "acc": 93.03833, "loss_bbox": 0.18658, "loss": 0.47675, "grad_norm": 2.45252, "time": 0.31426}
{"mode": "train", "epoch": 6, "iter": 350, "lr": 0.005, "memory": 7313, "data_time": 0.01396, "loss_rpn_cls": 0.02763, "loss_rpn_bbox": 0.06762, "loss_cls": 0.16542, "acc": 93.56372, "loss_bbox": 0.17913, "loss": 0.43979, "grad_norm": 2.18137, "time": 0.30965}
{"mode": "train", "epoch": 6, "iter": 400, "lr": 0.005, "memory": 7313, "data_time": 0.01441, "loss_rpn_cls": 0.02881, "loss_rpn_bbox": 0.07452, "loss_cls": 0.16753, "acc": 93.51831, "loss_bbox": 0.17418, "loss": 0.44504, "grad_norm": 2.25841, "time": 0.31463}
{"mode": "train", "epoch": 6, "iter": 450, "lr": 0.005, "memory": 7313, "data_time": 0.01374, "loss_rpn_cls": 0.02786, "loss_rpn_bbox": 0.07493, "loss_cls": 0.16427, "acc": 93.52905, "loss_bbox": 0.17334, "loss": 0.4404, "grad_norm": 2.14399, "time": 0.30558}
{"mode": "train", "epoch": 6, "iter": 500, "lr": 0.005, "memory": 7313, "data_time": 0.01463, "loss_rpn_cls": 0.029, "loss_rpn_bbox": 0.08372, "loss_cls": 0.17277, "acc": 93.25439, "loss_bbox": 0.18627, "loss": 0.47177, "grad_norm": 2.32484, "time": 0.31076}
{"mode": "train", "epoch": 6, "iter": 550, "lr": 0.005, "memory": 7313, "data_time": 0.01445, "loss_rpn_cls": 0.02822, "loss_rpn_bbox": 0.07152, "loss_cls": 0.16706, "acc": 93.40674, "loss_bbox": 0.17395, "loss": 0.44075, "grad_norm": 2.13454, "time": 0.30917}
{"mode": "train", "epoch": 6, "iter": 600, "lr": 0.005, "memory": 7313, "data_time": 0.01479, "loss_rpn_cls": 0.03212, "loss_rpn_bbox": 0.07697, "loss_cls": 0.1595, "acc": 93.77539, "loss_bbox": 0.17143, "loss": 0.44002, "grad_norm": 2.10196, "time": 0.31597}
{"mode": "train", "epoch": 7, "iter": 50, "lr": 0.005, "memory": 7313, "data_time": 0.0611, "loss_rpn_cls": 0.02693, "loss_rpn_bbox": 0.07014, "loss_cls": 0.16082, "acc": 93.71948, "loss_bbox": 0.16799, "loss": 0.42587, "grad_norm": 2.17101, "time": 0.36067}
{"mode": "train", "epoch": 7, "iter": 100, "lr": 0.005, "memory": 7313, "data_time": 0.0149, "loss_rpn_cls": 0.02821, "loss_rpn_bbox": 0.07734, "loss_cls": 0.16917, "acc": 93.34497, "loss_bbox": 0.1799, "loss": 0.45462, "grad_norm": 2.29488, "time": 0.30968}
{"mode": "train", "epoch": 7, "iter": 150, "lr": 0.005, "memory": 7313, "data_time": 0.01433, "loss_rpn_cls": 0.02542, "loss_rpn_bbox": 0.07637, "loss_cls": 0.16321, "acc": 93.51294, "loss_bbox": 0.17546, "loss": 0.44046, "grad_norm": 2.24608, "time": 0.30882}
{"mode": "train", "epoch": 7, "iter": 200, "lr": 0.005, "memory": 7313, "data_time": 0.01481, "loss_rpn_cls": 0.02734, "loss_rpn_bbox": 0.07285, "loss_cls": 0.15937, "acc": 93.84204, "loss_bbox": 0.17377, "loss": 0.43332, "grad_norm": 2.28271, "time": 0.31246}
{"mode": "train", "epoch": 7, "iter": 250, "lr": 0.005, "memory": 7313, "data_time": 0.01354, "loss_rpn_cls": 0.02956, "loss_rpn_bbox": 0.07564, "loss_cls": 0.16403, "acc": 93.59766, "loss_bbox": 0.17184, "loss": 0.44107, "grad_norm": 2.22358, "time": 0.31582}
{"mode": "train", "epoch": 7, "iter": 300, "lr": 0.005, "memory": 7313, "data_time": 0.014, "loss_rpn_cls": 0.02664, "loss_rpn_bbox": 0.08083, "loss_cls": 0.16915, "acc": 93.31055, "loss_bbox": 0.17329, "loss": 0.44992, "grad_norm": 2.15659, "time": 0.30934}
{"mode": "train", "epoch": 7, "iter": 350, "lr": 0.005, "memory": 7313, "data_time": 0.0137, "loss_rpn_cls": 0.02401, "loss_rpn_bbox": 0.06939, "loss_cls": 0.1594, "acc": 93.69141, "loss_bbox": 0.16416, "loss": 0.41696, "grad_norm": 2.09128, "time": 0.3084}
{"mode": "train", "epoch": 7, "iter": 400, "lr": 0.005, "memory": 7313, "data_time": 0.01459, "loss_rpn_cls": 0.02546, "loss_rpn_bbox": 0.07119, "loss_cls": 0.16399, "acc": 93.4978, "loss_bbox": 0.16933, "loss": 0.42997, "grad_norm": 2.07036, "time": 0.3072}
{"mode": "train", "epoch": 7, "iter": 450, "lr": 0.005, "memory": 7313, "data_time": 0.01262, "loss_rpn_cls": 0.02698, "loss_rpn_bbox": 0.07773, "loss_cls": 0.16177, "acc": 93.6355, "loss_bbox": 0.17049, "loss": 0.43697, "grad_norm": Infinity, "time": 0.31227}
{"mode": "train", "epoch": 7, "iter": 500, "lr": 0.005, "memory": 7313, "data_time": 0.0126, "loss_rpn_cls": 0.0247, "loss_rpn_bbox": 0.07727, "loss_cls": 0.15663, "acc": 93.74268, "loss_bbox": 0.17805, "loss": 0.43665, "grad_norm": 2.25696, "time": 0.31292}
{"mode": "train", "epoch": 7, "iter": 550, "lr": 0.005, "memory": 7313, "data_time": 0.01273, "loss_rpn_cls": 0.02553, "loss_rpn_bbox": 0.07469, "loss_cls": 0.1605, "acc": 93.70557, "loss_bbox": 0.17148, "loss": 0.43219, "grad_norm": 2.24083, "time": 0.31151}
{"mode": "train", "epoch": 7, "iter": 600, "lr": 0.005, "memory": 7313, "data_time": 0.01328, "loss_rpn_cls": 0.02568, "loss_rpn_bbox": 0.06938, "loss_cls": 0.1583, "acc": 93.77808, "loss_bbox": 0.1704, "loss": 0.42375, "grad_norm": 2.0953, "time": 0.31171}
{"mode": "train", "epoch": 8, "iter": 50, "lr": 0.005, "memory": 7313, "data_time": 0.06064, "loss_rpn_cls": 0.02812, "loss_rpn_bbox": 0.0785, "loss_cls": 0.16256, "acc": 93.5686, "loss_bbox": 0.17591, "loss": 0.44509, "grad_norm": 2.33617, "time": 0.35966}
{"mode": "train", "epoch": 8, "iter": 100, "lr": 0.005, "memory": 7313, "data_time": 0.01442, "loss_rpn_cls": 0.02208, "loss_rpn_bbox": 0.07757, "loss_cls": 0.15539, "acc": 93.88232, "loss_bbox": 0.17151, "loss": 0.42656, "grad_norm": 2.11713, "time": 0.31481}
{"mode": "train", "epoch": 8, "iter": 150, "lr": 0.005, "memory": 7313, "data_time": 0.01441, "loss_rpn_cls": 0.02335, "loss_rpn_bbox": 0.07125, "loss_cls": 0.15516, "acc": 93.85571, "loss_bbox": 0.17244, "loss": 0.4222, "grad_norm": 2.18155, "time": 0.31175}
{"mode": "train", "epoch": 8, "iter": 200, "lr": 0.005, "memory": 7313, "data_time": 0.01416, "loss_rpn_cls": 0.02382, "loss_rpn_bbox": 0.06559, "loss_cls": 0.16096, "acc": 93.61865, "loss_bbox": 0.16771, "loss": 0.41808, "grad_norm": 2.03908, "time": 0.3091}
{"mode": "train", "epoch": 8, "iter": 250, "lr": 0.005, "memory": 7313, "data_time": 0.01463, "loss_rpn_cls": 0.02473, "loss_rpn_bbox": 0.07049, "loss_cls": 0.15361, "acc": 93.90576, "loss_bbox": 0.16591, "loss": 0.41475, "grad_norm": 2.10348, "time": 0.31601}
{"mode": "train", "epoch": 8, "iter": 300, "lr": 0.005, "memory": 7313, "data_time": 0.01331, "loss_rpn_cls": 0.02459, "loss_rpn_bbox": 0.0799, "loss_cls": 0.16039, "acc": 93.69312, "loss_bbox": 0.16626, "loss": 0.43114, "grad_norm": 2.14169, "time": 0.31713}
{"mode": "train", "epoch": 8, "iter": 350, "lr": 0.005, "memory": 7313, "data_time": 0.01416, "loss_rpn_cls": 0.02285, "loss_rpn_bbox": 0.06958, "loss_cls": 0.15422, "acc": 93.98218, "loss_bbox": 0.16386, "loss": 0.41051, "grad_norm": 2.06042, "time": 0.31523}
{"mode": "train", "epoch": 8, "iter": 400, "lr": 0.005, "memory": 7313, "data_time": 0.01458, "loss_rpn_cls": 0.02606, "loss_rpn_bbox": 0.0706, "loss_cls": 0.15806, "acc": 93.83447, "loss_bbox": 0.16004, "loss": 0.41476, "grad_norm": 2.09746, "time": 0.31189}
{"mode": "train", "epoch": 8, "iter": 450, "lr": 0.005, "memory": 7313, "data_time": 0.0145, "loss_rpn_cls": 0.02456, "loss_rpn_bbox": 0.06934, "loss_cls": 0.15397, "acc": 93.96973, "loss_bbox": 0.16603, "loss": 0.41391, "grad_norm": 2.1831, "time": 0.31656}
{"mode": "train", "epoch": 8, "iter": 500, "lr": 0.005, "memory": 7313, "data_time": 0.01593, "loss_rpn_cls": 0.02404, "loss_rpn_bbox": 0.07383, "loss_cls": 0.15636, "acc": 93.8418, "loss_bbox": 0.16659, "loss": 0.42083, "grad_norm": 2.19792, "time": 0.31881}
{"mode": "train", "epoch": 8, "iter": 550, "lr": 0.005, "memory": 7313, "data_time": 0.01498, "loss_rpn_cls": 0.02348, "loss_rpn_bbox": 0.06787, "loss_cls": 0.14917, "acc": 94.09253, "loss_bbox": 0.16846, "loss": 0.40899, "grad_norm": 2.09245, "time": 0.30738}
{"mode": "train", "epoch": 8, "iter": 600, "lr": 0.005, "memory": 7313, "data_time": 0.01471, "loss_rpn_cls": 0.02094, "loss_rpn_bbox": 0.06632, "loss_cls": 0.15636, "acc": 93.77515, "loss_bbox": 0.16864, "loss": 0.41226, "grad_norm": 2.10976, "time": 0.31301}
{"mode": "train", "epoch": 9, "iter": 50, "lr": 0.0005, "memory": 7313, "data_time": 0.06221, "loss_rpn_cls": 0.02075, "loss_rpn_bbox": 0.06499, "loss_cls": 0.14423, "acc": 94.25098, "loss_bbox": 0.15458, "loss": 0.38455, "grad_norm": 1.69776, "time": 0.36615}
{"mode": "train", "epoch": 9, "iter": 100, "lr": 0.0005, "memory": 7313, "data_time": 0.01492, "loss_rpn_cls": 0.01938, "loss_rpn_bbox": 0.06572, "loss_cls": 0.14807, "acc": 94.14624, "loss_bbox": 0.15034, "loss": 0.38352, "grad_norm": 1.70021, "time": 0.31029}
{"mode": "train", "epoch": 9, "iter": 150, "lr": 0.0005, "memory": 7313, "data_time": 0.01541, "loss_rpn_cls": 0.01884, "loss_rpn_bbox": 0.05553, "loss_cls": 0.1417, "acc": 94.41797, "loss_bbox": 0.14541, "loss": 0.36148, "grad_norm": 1.63026, "time": 0.31095}
{"mode": "train", "epoch": 9, "iter": 200, "lr": 0.0005, "memory": 7313, "data_time": 0.01554, "loss_rpn_cls": 0.01882, "loss_rpn_bbox": 0.05564, "loss_cls": 0.13558, "acc": 94.60181, "loss_bbox": 0.14752, "loss": 0.35757, "grad_norm": 1.65712, "time": 0.31143}
{"mode": "train", "epoch": 9, "iter": 250, "lr": 0.0005, "memory": 7313, "data_time": 0.01499, "loss_rpn_cls": 0.0199, "loss_rpn_bbox": 0.06004, "loss_cls": 0.14321, "acc": 94.25928, "loss_bbox": 0.15421, "loss": 0.37735, "grad_norm": 1.69668, "time": 0.31209}
{"mode": "train", "epoch": 9, "iter": 300, "lr": 0.0005, "memory": 7313, "data_time": 0.01487, "loss_rpn_cls": 0.02019, "loss_rpn_bbox": 0.0619, "loss_cls": 0.13946, "acc": 94.41382, "loss_bbox": 0.14506, "loss": 0.36661, "grad_norm": 1.69229, "time": 0.31445}
{"mode": "train", "epoch": 9, "iter": 350, "lr": 0.0005, "memory": 7313, "data_time": 0.01441, "loss_rpn_cls": 0.01854, "loss_rpn_bbox": 0.059, "loss_cls": 0.13904, "acc": 94.42065, "loss_bbox": 0.15023, "loss": 0.36681, "grad_norm": 1.74397, "time": 0.30896}
{"mode": "train", "epoch": 9, "iter": 400, "lr": 0.0005, "memory": 7313, "data_time": 0.015, "loss_rpn_cls": 0.0196, "loss_rpn_bbox": 0.06042, "loss_cls": 0.13881, "acc": 94.45898, "loss_bbox": 0.14764, "loss": 0.36648, "grad_norm": 1.67291, "time": 0.31278}
{"mode": "train", "epoch": 9, "iter": 450, "lr": 0.0005, "memory": 7313, "data_time": 0.01568, "loss_rpn_cls": 0.01871, "loss_rpn_bbox": 0.06556, "loss_cls": 0.14822, "acc": 94.02417, "loss_bbox": 0.15517, "loss": 0.38765, "grad_norm": 1.74661, "time": 0.31829}
{"mode": "train", "epoch": 9, "iter": 500, "lr": 0.0005, "memory": 7313, "data_time": 0.01536, "loss_rpn_cls": 0.02078, "loss_rpn_bbox": 0.06408, "loss_cls": 0.14372, "acc": 94.28687, "loss_bbox": 0.1532, "loss": 0.38178, "grad_norm": 1.71049, "time": 0.31833}
{"mode": "train", "epoch": 9, "iter": 550, "lr": 0.0005, "memory": 7313, "data_time": 0.01523, "loss_rpn_cls": 0.02115, "loss_rpn_bbox": 0.05762, "loss_cls": 0.14714, "acc": 94.16504, "loss_bbox": 0.15361, "loss": 0.37952, "grad_norm": 1.72701, "time": 0.30793}
{"mode": "train", "epoch": 9, "iter": 600, "lr": 0.0005, "memory": 7313, "data_time": 0.01453, "loss_rpn_cls": 0.01883, "loss_rpn_bbox": 0.06205, "loss_cls": 0.14241, "acc": 94.27417, "loss_bbox": 0.15384, "loss": 0.37712, "grad_norm": 1.75762, "time": 0.31536}
{"mode": "train", "epoch": 10, "iter": 50, "lr": 0.0005, "memory": 7313, "data_time": 0.06277, "loss_rpn_cls": 0.01745, "loss_rpn_bbox": 0.05541, "loss_cls": 0.14045, "acc": 94.38159, "loss_bbox": 0.15084, "loss": 0.36415, "grad_norm": 1.6777, "time": 0.35428}
{"mode": "train", "epoch": 10, "iter": 100, "lr": 0.0005, "memory": 7313, "data_time": 0.01439, "loss_rpn_cls": 0.01876, "loss_rpn_bbox": 0.06658, "loss_cls": 0.1411, "acc": 94.33472, "loss_bbox": 0.15342, "loss": 0.37986, "grad_norm": 1.75149, "time": 0.3127}
{"mode": "train", "epoch": 10, "iter": 150, "lr": 0.0005, "memory": 7313, "data_time": 0.01486, "loss_rpn_cls": 0.01846, "loss_rpn_bbox": 0.05933, "loss_cls": 0.13914, "acc": 94.42554, "loss_bbox": 0.14945, "loss": 0.36638, "grad_norm": 1.67852, "time": 0.3129}
{"mode": "train", "epoch": 10, "iter": 200, "lr": 0.0005, "memory": 7313, "data_time": 0.01447, "loss_rpn_cls": 0.01851, "loss_rpn_bbox": 0.0596, "loss_cls": 0.13278, "acc": 94.65723, "loss_bbox": 0.13899, "loss": 0.34988, "grad_norm": 1.74557, "time": 0.30957}
{"mode": "train", "epoch": 10, "iter": 250, "lr": 0.0005, "memory": 7313, "data_time": 0.01421, "loss_rpn_cls": 0.01889, "loss_rpn_bbox": 0.05989, "loss_cls": 0.14123, "acc": 94.3584, "loss_bbox": 0.15242, "loss": 0.37242, "grad_norm": 1.73603, "time": 0.31111}
{"mode": "train", "epoch": 10, "iter": 300, "lr": 0.0005, "memory": 7313, "data_time": 0.0145, "loss_rpn_cls": 0.01908, "loss_rpn_bbox": 0.06328, "loss_cls": 0.14472, "acc": 94.20361, "loss_bbox": 0.15231, "loss": 0.37939, "grad_norm": 1.72936, "time": 0.30653}
{"mode": "train", "epoch": 10, "iter": 350, "lr": 0.0005, "memory": 7313, "data_time": 0.01555, "loss_rpn_cls": 0.01798, "loss_rpn_bbox": 0.05822, "loss_cls": 0.13819, "acc": 94.48975, "loss_bbox": 0.14984, "loss": 0.36423, "grad_norm": 1.68749, "time": 0.31404}
{"mode": "train", "epoch": 10, "iter": 400, "lr": 0.0005, "memory": 7313, "data_time": 0.0138, "loss_rpn_cls": 0.01789, "loss_rpn_bbox": 0.0587, "loss_cls": 0.14044, "acc": 94.33911, "loss_bbox": 0.14691, "loss": 0.36394, "grad_norm": 1.69943, "time": 0.30966}
{"mode": "train", "epoch": 10, "iter": 450, "lr": 0.0005, "memory": 7313, "data_time": 0.01318, "loss_rpn_cls": 0.01979, "loss_rpn_bbox": 0.06118, "loss_cls": 0.14117, "acc": 94.39917, "loss_bbox": 0.15219, "loss": 0.37433, "grad_norm": 1.79924, "time": 0.31224}
{"mode": "train", "epoch": 10, "iter": 500, "lr": 0.0005, "memory": 7313, "data_time": 0.01435, "loss_rpn_cls": 0.01804, "loss_rpn_bbox": 0.06213, "loss_cls": 0.13887, "acc": 94.5332, "loss_bbox": 0.1451, "loss": 0.36414, "grad_norm": 1.71161, "time": 0.31389}
{"mode": "train", "epoch": 10, "iter": 550, "lr": 0.0005, "memory": 7313, "data_time": 0.01468, "loss_rpn_cls": 0.01684, "loss_rpn_bbox": 0.06169, "loss_cls": 0.13813, "acc": 94.45312, "loss_bbox": 0.14684, "loss": 0.36351, "grad_norm": 1.69689, "time": 0.31035}
{"mode": "train", "epoch": 10, "iter": 600, "lr": 0.0005, "memory": 7313, "data_time": 0.01484, "loss_rpn_cls": 0.01959, "loss_rpn_bbox": 0.06354, "loss_cls": 0.14206, "acc": 94.27026, "loss_bbox": 0.14872, "loss": 0.3739, "grad_norm": 1.72743, "time": 0.31639}
{"mode": "train", "epoch": 11, "iter": 50, "lr": 0.0005, "memory": 7313, "data_time": 0.06049, "loss_rpn_cls": 0.01772, "loss_rpn_bbox": 0.05691, "loss_cls": 0.13755, "acc": 94.46899, "loss_bbox": 0.1493, "loss": 0.36148, "grad_norm": 1.69198, "time": 0.36497}
{"mode": "train", "epoch": 11, "iter": 100, "lr": 0.0005, "memory": 7313, "data_time": 0.01474, "loss_rpn_cls": 0.01852, "loss_rpn_bbox": 0.06347, "loss_cls": 0.13469, "acc": 94.57202, "loss_bbox": 0.14904, "loss": 0.36572, "grad_norm": 1.70533, "time": 0.31679}
{"mode": "train", "epoch": 11, "iter": 150, "lr": 0.0005, "memory": 7313, "data_time": 0.01526, "loss_rpn_cls": 0.0175, "loss_rpn_bbox": 0.06002, "loss_cls": 0.13693, "acc": 94.56812, "loss_bbox": 0.14313, "loss": 0.35757, "grad_norm": 1.76658, "time": 0.31253}
{"mode": "train", "epoch": 11, "iter": 200, "lr": 0.0005, "memory": 7313, "data_time": 0.01517, "loss_rpn_cls": 0.01849, "loss_rpn_bbox": 0.06311, "loss_cls": 0.14064, "acc": 94.34961, "loss_bbox": 0.15083, "loss": 0.37308, "grad_norm": 1.81965, "time": 0.3092}
{"mode": "train", "epoch": 11, "iter": 250, "lr": 0.0005, "memory": 7313, "data_time": 0.01229, "loss_rpn_cls": 0.01916, "loss_rpn_bbox": 0.05908, "loss_cls": 0.13861, "acc": 94.51758, "loss_bbox": 0.146, "loss": 0.36285, "grad_norm": 1.71436, "time": 0.3103}
{"mode": "train", "epoch": 11, "iter": 300, "lr": 0.0005, "memory": 7313, "data_time": 0.01418, "loss_rpn_cls": 0.01966, "loss_rpn_bbox": 0.06162, "loss_cls": 0.14146, "acc": 94.30225, "loss_bbox": 0.14982, "loss": 0.37256, "grad_norm": 1.7506, "time": 0.31276}
{"mode": "train", "epoch": 11, "iter": 350, "lr": 0.0005, "memory": 7313, "data_time": 0.01548, "loss_rpn_cls": 0.01824, "loss_rpn_bbox": 0.06113, "loss_cls": 0.13888, "acc": 94.40601, "loss_bbox": 0.15354, "loss": 0.37179, "grad_norm": 1.76402, "time": 0.30962}
{"mode": "train", "epoch": 11, "iter": 400, "lr": 0.0005, "memory": 7313, "data_time": 0.01471, "loss_rpn_cls": 0.01744, "loss_rpn_bbox": 0.06175, "loss_cls": 0.1347, "acc": 94.51123, "loss_bbox": 0.14869, "loss": 0.36258, "grad_norm": 1.72716, "time": 0.30919}
{"mode": "train", "epoch": 11, "iter": 450, "lr": 0.0005, "memory": 7313, "data_time": 0.01526, "loss_rpn_cls": 0.01831, "loss_rpn_bbox": 0.05458, "loss_cls": 0.13744, "acc": 94.58569, "loss_bbox": 0.1437, "loss": 0.35403, "grad_norm": 1.76165, "time": 0.31}
{"mode": "train", "epoch": 11, "iter": 500, "lr": 0.0005, "memory": 7313, "data_time": 0.01485, "loss_rpn_cls": 0.01879, "loss_rpn_bbox": 0.0606, "loss_cls": 0.13808, "acc": 94.49902, "loss_bbox": 0.14958, "loss": 0.36705, "grad_norm": 1.75726, "time": 0.30969}
{"mode": "train", "epoch": 11, "iter": 550, "lr": 0.0005, "memory": 7313, "data_time": 0.01637, "loss_rpn_cls": 0.01781, "loss_rpn_bbox": 0.05641, "loss_cls": 0.138, "acc": 94.4436, "loss_bbox": 0.14501, "loss": 0.35722, "grad_norm": 1.74624, "time": 0.3157}
{"mode": "train", "epoch": 11, "iter": 600, "lr": 0.0005, "memory": 7313, "data_time": 0.01489, "loss_rpn_cls": 0.01751, "loss_rpn_bbox": 0.0583, "loss_cls": 0.1394, "acc": 94.37305, "loss_bbox": 0.15177, "loss": 0.36698, "grad_norm": 1.72964, "time": 0.31123}
{"mode": "train", "epoch": 12, "iter": 50, "lr": 5e-05, "memory": 7313, "data_time": 0.06148, "loss_rpn_cls": 0.01837, "loss_rpn_bbox": 0.06605, "loss_cls": 0.13662, "acc": 94.54126, "loss_bbox": 0.14617, "loss": 0.36721, "grad_norm": 1.71705, "time": 0.35697}
{"mode": "train", "epoch": 12, "iter": 100, "lr": 5e-05, "memory": 7313, "data_time": 0.01526, "loss_rpn_cls": 0.0175, "loss_rpn_bbox": 0.05514, "loss_cls": 0.13364, "acc": 94.6875, "loss_bbox": 0.14034, "loss": 0.34662, "grad_norm": Infinity, "time": 0.31688}
{"mode": "train", "epoch": 12, "iter": 150, "lr": 5e-05, "memory": 7313, "data_time": 0.01457, "loss_rpn_cls": 0.01833, "loss_rpn_bbox": 0.0583, "loss_cls": 0.1367, "acc": 94.52051, "loss_bbox": 0.14669, "loss": 0.36002, "grad_norm": 1.75122, "time": 0.3127}
{"mode": "train", "epoch": 12, "iter": 200, "lr": 5e-05, "memory": 7313, "data_time": 0.01463, "loss_rpn_cls": 0.01878, "loss_rpn_bbox": 0.06293, "loss_cls": 0.14046, "acc": 94.3479, "loss_bbox": 0.15025, "loss": 0.37242, "grad_norm": 1.76831, "time": 0.31381}
{"mode": "train", "epoch": 12, "iter": 250, "lr": 5e-05, "memory": 7313, "data_time": 0.01472, "loss_rpn_cls": 0.01641, "loss_rpn_bbox": 0.05748, "loss_cls": 0.13463, "acc": 94.58154, "loss_bbox": 0.14437, "loss": 0.35288, "grad_norm": 1.64531, "time": 0.31433}
{"mode": "train", "epoch": 12, "iter": 300, "lr": 5e-05, "memory": 7313, "data_time": 0.01427, "loss_rpn_cls": 0.01647, "loss_rpn_bbox": 0.05725, "loss_cls": 0.1377, "acc": 94.52368, "loss_bbox": 0.1465, "loss": 0.35791, "grad_norm": 1.68576, "time": 0.30987}
{"mode": "train", "epoch": 12, "iter": 350, "lr": 5e-05, "memory": 7313, "data_time": 0.01463, "loss_rpn_cls": 0.01789, "loss_rpn_bbox": 0.06142, "loss_cls": 0.13718, "acc": 94.5105, "loss_bbox": 0.14815, "loss": 0.36465, "grad_norm": 1.70723, "time": 0.31219}
{"mode": "train", "epoch": 12, "iter": 400, "lr": 5e-05, "memory": 7313, "data_time": 0.01403, "loss_rpn_cls": 0.01669, "loss_rpn_bbox": 0.05459, "loss_cls": 0.13911, "acc": 94.43726, "loss_bbox": 0.14909, "loss": 0.35948, "grad_norm": 1.76735, "time": 0.3101}
{"mode": "train", "epoch": 12, "iter": 450, "lr": 5e-05, "memory": 7313, "data_time": 0.01404, "loss_rpn_cls": 0.01668, "loss_rpn_bbox": 0.05402, "loss_cls": 0.13558, "acc": 94.58423, "loss_bbox": 0.14675, "loss": 0.35303, "grad_norm": 1.68917, "time": 0.30829}
{"mode": "train", "epoch": 12, "iter": 500, "lr": 5e-05, "memory": 7313, "data_time": 0.01385, "loss_rpn_cls": 0.01888, "loss_rpn_bbox": 0.06378, "loss_cls": 0.13996, "acc": 94.40747, "loss_bbox": 0.14547, "loss": 0.36808, "grad_norm": 1.73619, "time": 0.31841}
{"mode": "train", "epoch": 12, "iter": 550, "lr": 5e-05, "memory": 7313, "data_time": 0.01388, "loss_rpn_cls": 0.01788, "loss_rpn_bbox": 0.05709, "loss_cls": 0.13766, "acc": 94.5, "loss_bbox": 0.1458, "loss": 0.35843, "grad_norm": 1.75452, "time": 0.3105}
{"mode": "train", "epoch": 12, "iter": 600, "lr": 5e-05, "memory": 7313, "data_time": 0.01372, "loss_rpn_cls": 0.01985, "loss_rpn_bbox": 0.06297, "loss_cls": 0.14479, "acc": 94.18164, "loss_bbox": 0.15355, "loss": 0.38116, "grad_norm": 1.84367, "time": 0.31359}
{"mode": "val", "epoch": 12, "iter": 8541, "lr": 5e-05, "mAP": 0.66536}

The full log is too long, so I will only show the subsequent mAP changes (a short parsing sketch follows the grep output below):

# grep "mAP" tutorial_exps/6/20230731_1
01831.log     
evaluation = dict(interval=12, metric='mAP')
| mAP                |        |        |        | 0.665 |
2023-07-31 11:29:11,760 - mmrotate - INFO - Epoch(val) [12][8541]       mAP: 0.6654
| mAP                |        |        |        | 0.668 |
2023-07-31 12:19:05,220 - mmrotate - INFO - Epoch(val) [24][8541]       mAP: 0.6675
| mAP                |        |        |        | 0.668 |
2023-07-31 13:08:50,656 - mmrotate - INFO - Epoch(val) [36][8541]       mAP: 0.6682
| mAP                |        |        |        | 0.670 |
2023-07-31 13:59:27,904 - mmrotate - INFO - Epoch(val) [48][8541]       mAP: 0.6699
| mAP                |        |        |        | 0.671 |
2023-07-31 14:49:44,282 - mmrotate - INFO - Epoch(val) [60][8541]       mAP: 0.6709
| mAP                |        |        |        | 0.672 |
2023-07-31 15:40:11,998 - mmrotate - INFO - Epoch(val) [72][8541]       mAP: 0.6720
| mAP                |        |        |        | 0.673 |
2023-07-31 16:30:12,368 - mmrotate - INFO - Epoch(val) [84][8541]       mAP: 0.6726
| mAP                |        |        |        | 0.673 |
2023-07-31 17:20:18,145 - mmrotate - INFO - Epoch(val) [96][8541]       mAP: 0.6733
| mAP                |        |        |        | 0.674 |
2023-07-31 18:10:21,347 - mmrotate - INFO - Epoch(val) [108][8541]      mAP: 0.6743
| mAP                |        |        |        | 0.675 |
2023-07-31 19:00:17,892 - mmrotate - INFO - Epoch(val) [120][8541]      mAP: 0.6751
| mAP                |        |        |        | 0.676 |
2023-07-31 19:50:30,967 - mmrotate - INFO - Epoch(val) [132][8541]      mAP: 0.6758
| mAP                |        |        |        | 0.677 |
2023-07-31 20:40:40,306 - mmrotate - INFO - Epoch(val) [144][8541]      mAP: 0.6768
| mAP                |        |        |        | 0.678 |
2023-07-31 21:30:40,664 - mmrotate - INFO - Epoch(val) [156][8541]      mAP: 0.6777
| mAP                |        |        |        | 0.678 |
2023-07-31 22:20:39,693 - mmrotate - INFO - Epoch(val) [168][8541]      mAP: 0.6778
| mAP                |        |        |        | 0.679 |
2023-07-31 23:10:36,160 - mmrotate - INFO - Epoch(val) [180][8541]      mAP: 0.6787
| mAP                |        |        |        | 0.679 |
2023-08-01 00:00:31,284 - mmrotate - INFO - Epoch(val) [192][8541]      mAP: 0.6795
| mAP                |        |        |        | 0.680 |
2023-08-01 00:50:29,689 - mmrotate - INFO - Epoch(val) [204][8541]      mAP: 0.6797
| mAP                |        |        |        | 0.680 |
2023-08-01 01:40:17,485 - mmrotate - INFO - Epoch(val) [216][8541]      mAP: 0.6804
| mAP                |        |        |        | 0.681 |
2023-08-01 02:30:05,030 - mmrotate - INFO - Epoch(val) [228][8541]      mAP: 0.6814
| mAP                |        |        |        | 0.682 |
2023-08-01 03:19:05,696 - mmrotate - INFO - Epoch(val) [240][8541]      mAP: 0.6816
| mAP                |        |        |        | 0.682 |
2023-08-01 04:08:08,600 - mmrotate - INFO - Epoch(val) [252][8541]      mAP: 0.6822
| mAP                |        |        |        | 0.683 |
2023-08-01 04:57:24,387 - mmrotate - INFO - Epoch(val) [264][8541]      mAP: 0.6830
| mAP                |        |        |        | 0.683 |
2023-08-01 05:46:30,490 - mmrotate - INFO - Epoch(val) [276][8541]      mAP: 0.6834
| mAP                |        |        |        | 0.683 |
2023-08-01 06:35:37,152 - mmrotate - INFO - Epoch(val) [288][8541]      mAP: 0.6829
| mAP                |        |        |        | 0.684 |
2023-08-01 07:24:51,862 - mmrotate - INFO - Epoch(val) [300][8541]      mAP: 0.6840
| mAP                |        |        |        | 0.684 |
2023-08-01 08:14:15,776 - mmrotate - INFO - Epoch(val) [312][8541]      mAP: 0.6835
| mAP                |        |        |        | 0.685 |
2023-08-01 09:03:35,380 - mmrotate - INFO - Epoch(val) [324][8541]      mAP: 0.6847
| mAP                |        |        |        | 0.685 |
2023-08-01 09:52:50,364 - mmrotate - INFO - Epoch(val) [336][8541]      mAP: 0.6846
| mAP                |        |        |        | 0.683 |
2023-08-01 10:42:07,869 - mmrotate - INFO - Epoch(val) [348][8541]      mAP: 0.6832
| mAP                |        |        |        | 0.684 |
2023-08-01 11:31:20,057 - mmrotate - INFO - Epoch(val) [360][8541]      mAP: 0.6839
| mAP                |        |        |        | 0.686 |
2023-08-01 12:20:27,998 - mmrotate - INFO - Epoch(val) [372][8541]      mAP: 0.6856
| mAP                |        |        |        | 0.686 |
2023-08-01 13:09:37,734 - mmrotate - INFO - Epoch(val) [384][8541]      mAP: 0.6865
| mAP                |        |        |        | 0.687 |
2023-08-01 13:58:48,836 - mmrotate - INFO - Epoch(val) [396][8541]      mAP: 0.6865
| mAP                |        |        |        | 0.683 |
2023-08-01 14:48:13,449 - mmrotate - INFO - Epoch(val) [408][8541]      mAP: 0.6831
| mAP                |        |        |        | 0.685 |
2023-08-01 15:37:20,166 - mmrotate - INFO - Epoch(val) [420][8541]      mAP: 0.6852
| mAP                |        |        |        | 0.685 |
2023-08-01 16:26:37,679 - mmrotate - INFO - Epoch(val) [432][8541]      mAP: 0.6850
| mAP                |        |        |        | 0.685 |
2023-08-01 17:15:49,519 - mmrotate - INFO - Epoch(val) [444][8541]      mAP: 0.6851
| mAP                |        |        |        | 0.687 |
2023-08-01 18:04:56,135 - mmrotate - INFO - Epoch(val) [456][8541]      mAP: 0.6866
| mAP                |        |        |        | 0.686 |
2023-08-01 18:54:17,240 - mmrotate - INFO - Epoch(val) [468][8541]      mAP: 0.6865
| mAP                |        |        |        | 0.687 |
2023-08-01 19:43:47,853 - mmrotate - INFO - Epoch(val) [480][8541]      mAP: 0.6870
| mAP                |        |        |        | 0.687 |
2023-08-01 20:33:11,036 - mmrotate - INFO - Epoch(val) [492][8541]      mAP: 0.6873
| mAP                |        |        |        | 0.687 |
2023-08-01 21:22:32,373 - mmrotate - INFO - Epoch(val) [504][8541]      mAP: 0.6872
| mAP                |        |        |        | 0.687 |
2023-08-01 22:12:05,530 - mmrotate - INFO - Epoch(val) [516][8541]      mAP: 0.6872
| mAP                |        |        |        | 0.688 |
2023-08-01 23:01:31,712 - mmrotate - INFO - Epoch(val) [528][8541]      mAP: 0.6876
| mAP                |        |        |        | 0.688 |
2023-08-01 23:50:51,220 - mmrotate - INFO - Epoch(val) [540][8541]      mAP: 0.6881
| mAP                |        |        |        | 0.688 |
2023-08-02 00:40:22,192 - mmrotate - INFO - Epoch(val) [552][8541]      mAP: 0.6879
| mAP                |        |        |        | 0.688 |
2023-08-02 01:29:36,078 - mmrotate - INFO - Epoch(val) [564][8541]      mAP: 0.6881
| mAP                |        |        |        | 0.685 |
2023-08-02 02:19:13,104 - mmrotate - INFO - Epoch(val) [576][8541]      mAP: 0.6850
| mAP                |        |        |        | 0.690 |
2023-08-02 03:08:36,319 - mmrotate - INFO - Epoch(val) [588][8541]      mAP: 0.6895
| mAP                |        |        |        | 0.685 |
2023-08-02 03:58:00,523 - mmrotate - INFO - Epoch(val) [600][8541]      mAP: 0.6854
| mAP                |        |        |        | 0.690 |
2023-08-02 04:47:33,776 - mmrotate - INFO - Epoch(val) [612][8541]      mAP: 0.6896
| mAP                |        |        |        | 0.689 |
2023-08-02 05:36:50,538 - mmrotate - INFO - Epoch(val) [624][8541]      mAP: 0.6893
| mAP                |        |        |        | 0.689 |
2023-08-02 06:26:04,417 - mmrotate - INFO - Epoch(val) [636][8541]      mAP: 0.6891
| mAP                |        |        |        | 0.690 |
2023-08-02 07:15:12,725 - mmrotate - INFO - Epoch(val) [648][8541]      mAP: 0.6902
| mAP                |        |        |        | 0.686 |
2023-08-02 08:04:08,799 - mmrotate - INFO - Epoch(val) [660][8541]      mAP: 0.6859
| mAP                |        |        |        | 0.690 |
2023-08-02 08:54:15,594 - mmrotate - INFO - Epoch(val) [672][8541]      mAP: 0.6896
| mAP                |        |        |        | 0.690 |
2023-08-02 09:43:23,146 - mmrotate - INFO - Epoch(val) [684][8541]      mAP: 0.6898
| mAP                |        |        |        | 0.686 |
2023-08-02 10:33:07,083 - mmrotate - INFO - Epoch(val) [696][8541]      mAP: 0.6861
| mAP                |        |        |        | 0.690 |
2023-08-02 11:23:02,084 - mmrotate - INFO - Epoch(val) [708][8541]      mAP: 0.6898
| mAP                |        |        |        | 0.690 |
2023-08-02 12:12:48,204 - mmrotate - INFO - Epoch(val) [720][8541]      mAP: 0.6896
| mAP                |        |        |        | 0.685 |
2023-08-02 13:02:27,879 - mmrotate - INFO - Epoch(val) [732][8541]      mAP: 0.6851
| mAP                |        |        |        | 0.689 |
2023-08-02 13:52:06,785 - mmrotate - INFO - Epoch(val) [744][8541]      mAP: 0.6895
| mAP                |        |        |        | 0.687 |
2023-08-02 14:41:48,040 - mmrotate - INFO - Epoch(val) [756][8541]      mAP: 0.6868
| mAP                |        |        |        | 0.685 |
2023-08-02 15:31:33,748 - mmrotate - INFO - Epoch(val) [768][8541]      mAP: 0.6853
| mAP                |        |        |        | 0.684 |
2023-08-02 16:21:19,817 - mmrotate - INFO - Epoch(val) [780][8541]      mAP: 0.6836
| mAP                |        |        |        | 0.686 |
2023-08-02 17:11:09,557 - mmrotate - INFO - Epoch(val) [792][8541]      mAP: 0.6864
| mAP                |        |        |        | 0.687 |
2023-08-02 18:00:54,260 - mmrotate - INFO - Epoch(val) [804][8541]      mAP: 0.6872
| mAP                |        |        |        | 0.685 |
2023-08-02 18:50:50,314 - mmrotate - INFO - Epoch(val) [816][8541]      mAP: 0.6850
| mAP                |        |        |        | 0.684 |
2023-08-02 19:40:39,120 - mmrotate - INFO - Epoch(val) [828][8541]      mAP: 0.6836
| mAP                |        |        |        | 0.684 |
2023-08-02 20:30:39,360 - mmrotate - INFO - Epoch(val) [840][8541]      mAP: 0.6836
| mAP                |        |        |        | 0.681 |
2023-08-02 21:20:26,515 - mmrotate - INFO - Epoch(val) [852][8541]      mAP: 0.6813
| mAP                |        |        |        | 0.685 |
2023-08-02 22:10:21,591 - mmrotate - INFO - Epoch(val) [864][8541]      mAP: 0.6853
| mAP                |        |        |        | 0.684 |
2023-08-02 23:00:10,578 - mmrotate - INFO - Epoch(val) [876][8541]      mAP: 0.6838
| mAP                |        |        |        | 0.683 |
2023-08-02 23:50:06,990 - mmrotate - INFO - Epoch(val) [888][8541]      mAP: 0.6833
| mAP                |        |        |        | 0.681 |
2023-08-03 00:40:04,047 - mmrotate - INFO - Epoch(val) [900][8541]      mAP: 0.6812
| mAP                |        |        |        | 0.682 |
2023-08-03 01:29:53,060 - mmrotate - INFO - Epoch(val) [912][8541]      mAP: 0.6815
| mAP                |        |        |        | 0.681 |
2023-08-03 02:19:42,982 - mmrotate - INFO - Epoch(val) [924][8541]      mAP: 0.6810
| mAP                |        |        |        | 0.682 |
2023-08-03 03:09:26,042 - mmrotate - INFO - Epoch(val) [936][8541]      mAP: 0.6817
| mAP                |        |        |        | 0.682 |
2023-08-03 03:59:15,195 - mmrotate - INFO - Epoch(val) [948][8541]      mAP: 0.6817
| mAP                |        |        |        | 0.682 |
2023-08-03 04:49:13,036 - mmrotate - INFO - Epoch(val) [960][8541]      mAP: 0.6818
| mAP                |        |        |        | 0.682 |
2023-08-03 05:38:56,656 - mmrotate - INFO - Epoch(val) [972][8541]      mAP: 0.6816
| mAP                |        |        |        | 0.681 |
2023-08-03 06:28:41,580 - mmrotate - INFO - Epoch(val) [984][8541]      mAP: 0.6812
| mAP                |        |        |        | 0.681 |
2023-08-03 07:18:26,913 - mmrotate - INFO - Epoch(val) [996][8541]      mAP: 0.6811
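
In case it is useful, here is a minimal sketch of how the same mAP curve can be read out of the JSON log instead of grepping the text log. It assumes the tutorial_exps/6/20230731_101831.log.json file posted further down in this thread, where each line is one JSON record and validation records carry "mode": "val" and an "mAP" key (as in the val line at the end of the excerpt above).

import json

# Minimal sketch: list the per-epoch validation mAP from the JSON-lines log.
# LOG_JSON points at the log file posted further down in this thread.
LOG_JSON = 'tutorial_exps/6/20230731_101831.log.json'

with open(LOG_JSON) as f:
    for line in f:
        line = line.strip()
        if not line:
            continue
        record = json.loads(line)
        # Validation records look like:
        # {"mode": "val", "epoch": 12, "iter": 8541, "lr": 5e-05, "mAP": 0.66536}
        if record.get('mode') == 'val' and 'mAP' in record:
            print(f"epoch {record['epoch']:4d}  mAP {record['mAP']:.4f}")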
timresink commented 11 months ago

I see that the bbox regression loss is increasing here, and I am experiencing the same issue. Could anyone help with this?
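
(To check that trend, a minimal sketch along the same lines as the mAP snippet above: it averages loss_bbox per epoch from the training records, assuming the same JSON-lines log file posted below, so you can see whether loss_bbox keeps rising across epochs or only fluctuates within them.)

import json
from collections import defaultdict

# Minimal sketch: average loss_bbox per epoch from the training records.
# Assumes the JSON-lines log file posted below in this thread.
LOG_JSON = 'tutorial_exps/6/20230731_101831.log.json'

per_epoch = defaultdict(list)
with open(LOG_JSON) as f:
    for line in f:
        line = line.strip()
        if not line:
            continue
        record = json.loads(line)
        if record.get('mode') == 'train' and 'loss_bbox' in record:
            per_epoch[record['epoch']].append(record['loss_bbox'])

for epoch in sorted(per_epoch):
    values = per_epoch[epoch]
    print(f"epoch {epoch:4d}  mean loss_bbox {sum(values) / len(values):.4f}")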

xpLF1314 commented 3 months ago

Contents of document tutorial_exps/6/20230731_101831.log.json

{"env_info": "sys.platform: linux\nPython: 3.7.7 (default, May  7 2020, 21:25:33) [GCC 7.3.0]\nCUDA available: True\nGPU 0,1,2,3,4,5,6,7: NVIDIA GeForce RTX 2080 Ti\nCUDA_HOME: /usr/local/cuda\nNVCC: Cuda compilation tools, release 10.1, V10.1.24\nGCC: gcc (Ubuntu 7.5.0-3ubuntu1~18.04) 7.5.0\nPyTorch: 1.6.0\nPyTorch compiling details: PyTorch built with:\n  - GCC 7.3\n  - C++ Version: 201402\n  - Intel(R) Math Kernel Library Version 2020.0.1 Product Build 20200208 for Intel(R) 64 architecture applications\n  - Intel(R) MKL-DNN v1.5.0 (Git Hash e2ac1fac44c5078ca927cb9b90e1b3066a0b2ed0)\n  - OpenMP 201511 (a.k.a. OpenMP 4.5)\n  - NNPACK is enabled\n  - CPU capability usage: AVX2\n  - CUDA Runtime 10.1\n  - NVCC architecture flags: -gencode;arch=compute_37,code=sm_37;-gencode;arch=compute_50,code=sm_50;-gencode;arch=compute_60,code=sm_60;-gencode;arch=compute_61,code=sm_61;-gencode;arch=compute_70,code=sm_70;-gencode;arch=compute_75,code=sm_75;-gencode;arch=compute_37,code=compute_37\n  - CuDNN 7.6.3\n  - Magma 2.5.2\n  - Build settings: BLAS=MKL, BUILD_TYPE=Release, CXX_FLAGS= -Wno-deprecated -fvisibility-inlines-hidden -DUSE_PTHREADPOOL -fopenmp -DNDEBUG -DUSE_FBGEMM -DUSE_QNNPACK -DUSE_PYTORCH_QNNPACK -DUSE_XNNPACK -DUSE_VULKAN_WRAPPER -O2 -fPIC -Wno-narrowing -Wall -Wextra -Werror=return-type -Wno-missing-field-initializers -Wno-type-limits -Wno-array-bounds -Wno-unknown-pragmas -Wno-sign-compare -Wno-unused-parameter -Wno-unused-variable -Wno-unused-function -Wno-unused-result -Wno-unused-local-typedefs -Wno-strict-overflow -Wno-strict-aliasing -Wno-error=deprecated-declarations -Wno-stringop-overflow -Wno-error=pedantic -Wno-error=redundant-decls -Wno-error=old-style-cast -fdiagnostics-color=always -faligned-new -Wno-unused-but-set-variable -Wno-maybe-uninitialized -fno-math-errno -fno-trapping-math -Werror=format -Wno-stringop-overflow, PERF_WITH_AVX=1, PERF_WITH_AVX2=1, PERF_WITH_AVX512=1, USE_CUDA=ON, USE_EXCEPTION_PTR=1, USE_GFLAGS=OFF, USE_GLOG=OFF, USE_MKL=ON, USE_MKLDNN=ON, USE_MPI=OFF, USE_NCCL=ON, USE_NNPACK=ON, USE_OPENMP=ON, USE_STATIC_DISPATCH=OFF, \n\nTorchVision: 0.7.0\nOpenCV: 4.7.0\nMMCV: 1.7.1\nMMCV Compiler: GCC 7.3\nMMCV CUDA Compiler: 10.1\nMMRotate: 0.3.4+fe36494", "config": "dataset_type = 'DOTADataset'\ndata_root = 'data/split_ss_dota/'\nimg_norm_cfg = dict(\n    mean=[123.675, 116.28, 103.53], std=[58.395, 57.12, 57.375], to_rgb=True)\ntrain_pipeline = [\n    dict(type='LoadImageFromFile'),\n    dict(type='LoadAnnotations', with_bbox=True),\n    dict(type='RResize', img_scale=(1024, 1024)),\n    dict(\n        type='RRandomFlip',\n        flip_ratio=[0.25, 0.25, 0.25],\n        direction=['horizontal', 'vertical', 'diagonal'],\n        version='le90'),\n    dict(\n        type='Normalize',\n        mean=[123.675, 116.28, 103.53],\n        std=[58.395, 57.12, 57.375],\n        to_rgb=True),\n    dict(type='Pad', size_divisor=32),\n    dict(type='DefaultFormatBundle'),\n    dict(type='Collect', keys=['img', 'gt_bboxes', 'gt_labels'])\n]\ntest_pipeline = [\n    dict(type='LoadImageFromFile'),\n    dict(\n        type='MultiScaleFlipAug',\n        img_scale=(1024, 1024),\n        flip=False,\n        transforms=[\n            dict(type='RResize'),\n            dict(\n                type='Normalize',\n                mean=[123.675, 116.28, 103.53],\n                std=[58.395, 57.12, 57.375],\n                to_rgb=True),\n            dict(type='Pad', size_divisor=32),\n            dict(type='DefaultFormatBundle'),\n            dict(type='Collect', keys=['img'])\n  
      ])\n]\ndata = dict(\n    samples_per_gpu=2,\n    workers_per_gpu=2,\n    train=dict(\n        type='DOTADataset',\n        ann_file='data/split_ss_dota/train/annfiles/',\n        img_prefix='data/split_ss_dota/train/images/',\n        pipeline=[\n            dict(type='LoadImageFromFile'),\n            dict(type='LoadAnnotations', with_bbox=True),\n            dict(type='RResize', img_scale=(1024, 1024)),\n            dict(\n                type='RRandomFlip',\n                flip_ratio=[0.25, 0.25, 0.25],\n                direction=['horizontal', 'vertical', 'diagonal'],\n                version='le90'),\n            dict(\n                type='Normalize',\n                mean=[123.675, 116.28, 103.53],\n                std=[58.395, 57.12, 57.375],\n                to_rgb=True),\n            dict(type='Pad', size_divisor=32),\n            dict(type='DefaultFormatBundle'),\n            dict(type='Collect', keys=['img', 'gt_bboxes', 'gt_labels'])\n        ],\n        version='le90'),\n    val=dict(\n        type='DOTADataset',\n        ann_file='data/split_ss_dota/trainval/annfiles/',\n        img_prefix='data/split_ss_dota/trainval/images/',\n        pipeline=[\n            dict(type='LoadImageFromFile'),\n            dict(\n                type='MultiScaleFlipAug',\n                img_scale=(1024, 1024),\n                flip=False,\n                transforms=[\n                    dict(type='RResize'),\n                    dict(\n                        type='Normalize',\n                        mean=[123.675, 116.28, 103.53],\n                        std=[58.395, 57.12, 57.375],\n                        to_rgb=True),\n                    dict(type='Pad', size_divisor=32),\n                    dict(type='DefaultFormatBundle'),\n                    dict(type='Collect', keys=['img'])\n                ])\n        ],\n        version='le90'),\n    test=dict(\n        type='DOTADataset',\n        ann_file='data/split_ss_dota/trainval/annfiles/',\n        img_prefix='data/split_ss_dota/trainval/images/',\n        pipeline=[\n            dict(type='LoadImageFromFile'),\n            dict(\n                type='MultiScaleFlipAug',\n                img_scale=(1024, 1024),\n                flip=False,\n                transforms=[\n                    dict(type='RResize'),\n                    dict(\n                        type='Normalize',\n                        mean=[123.675, 116.28, 103.53],\n                        std=[58.395, 57.12, 57.375],\n                        to_rgb=True),\n                    dict(type='Pad', size_divisor=32),\n                    dict(type='DefaultFormatBundle'),\n                    dict(type='Collect', keys=['img'])\n                ])\n        ],\n        version='le90'))\nevaluation = dict(interval=12, metric='mAP')\noptimizer = dict(type='SGD', lr=0.005, momentum=0.9, weight_decay=0.0001)\noptimizer_config = dict(grad_clip=dict(max_norm=35, norm_type=2))\nlr_config = dict(\n    policy='step',\n    warmup='linear',\n    warmup_iters=500,\n    warmup_ratio=0.3333333333333333,\n    step=[8, 11])\nrunner = dict(type='EpochBasedRunner', max_epochs=1200)\ncheckpoint_config = dict(interval=12)\nlog_config = dict(interval=50, hooks=[dict(type='TextLoggerHook')])\ndist_params = dict(backend='nccl')\nlog_level = 'INFO'\nload_from = None\nresume_from = None\nworkflow = [('train', 1)]\nangle_version = 'le90'\nmodel = dict(\n    type='OrientedRCNN',\n    backbone=dict(\n        type='ResNet',\n        depth=50,\n        num_stages=4,\n        out_indices=(0, 
1, 2, 3),\n        frozen_stages=1,\n        norm_cfg=dict(type='BN', requires_grad=True),\n        norm_eval=True,\n        style='pytorch',\n        init_cfg=dict(type='Pretrained', checkpoint='torchvision://resnet50')),\n    neck=dict(\n        type='FPN',\n        in_channels=[256, 512, 1024, 2048],\n        out_channels=256,\n        num_outs=5),\n    rpn_head=dict(\n        type='OrientedRPNHead',\n        in_channels=256,\n        feat_channels=256,\n        version='le90',\n        anchor_generator=dict(\n            type='AnchorGenerator',\n            scales=[8],\n            ratios=[0.5, 1.0, 2.0],\n            strides=[4, 8, 16, 32, 64]),\n        bbox_coder=dict(\n            type='MidpointOffsetCoder',\n            angle_range='le90',\n            target_means=[0.0, 0.0, 0.0, 0.0, 0.0, 0.0],\n            target_stds=[1.0, 1.0, 1.0, 1.0, 0.5, 0.5]),\n        loss_cls=dict(\n            type='CrossEntropyLoss', use_sigmoid=True, loss_weight=1.0),\n        loss_bbox=dict(\n            type='SmoothL1Loss', beta=0.1111111111111111, loss_weight=1.0)),\n    roi_head=dict(\n        type='OrientedStandardRoIHead',\n        bbox_roi_extractor=dict(\n            type='RotatedSingleRoIExtractor',\n            roi_layer=dict(\n                type='RoIAlignRotated',\n                out_size=7,\n                sample_num=2,\n                clockwise=True),\n            out_channels=256,\n            featmap_strides=[4, 8, 16, 32]),\n        bbox_head=dict(\n            type='RotatedShared2FCBBoxHead',\n            in_channels=256,\n            fc_out_channels=1024,\n            roi_feat_size=7,\n            num_classes=15,\n            bbox_coder=dict(\n                type='DeltaXYWHAOBBoxCoder',\n                angle_range='le90',\n                norm_factor=None,\n                edge_swap=True,\n                proj_xy=True,\n                target_means=(0.0, 0.0, 0.0, 0.0, 0.0),\n                target_stds=(0.1, 0.1, 0.2, 0.2, 0.1)),\n            reg_class_agnostic=True,\n            loss_cls=dict(\n                type='CrossEntropyLoss', use_sigmoid=False, loss_weight=1.0),\n            loss_bbox=dict(type='SmoothL1Loss', beta=1.0, loss_weight=1.0))),\n    train_cfg=dict(\n        rpn=dict(\n            assigner=dict(\n                type='MaxIoUAssigner',\n                pos_iou_thr=0.7,\n                neg_iou_thr=0.3,\n                min_pos_iou=0.3,\n                match_low_quality=True,\n                ignore_iof_thr=-1),\n            sampler=dict(\n                type='RandomSampler',\n                num=256,\n                pos_fraction=0.5,\n                neg_pos_ub=-1,\n                add_gt_as_proposals=False),\n            allowed_border=0,\n            pos_weight=-1,\n            debug=False),\n        rpn_proposal=dict(\n            nms_pre=2000,\n            max_per_img=2000,\n            nms=dict(type='nms', iou_threshold=0.8),\n            min_bbox_size=0),\n        rcnn=dict(\n            assigner=dict(\n                type='MaxIoUAssigner',\n                pos_iou_thr=0.5,\n                neg_iou_thr=0.5,\n                min_pos_iou=0.5,\n                match_low_quality=False,\n                iou_calculator=dict(type='RBboxOverlaps2D'),\n                ignore_iof_thr=-1),\n            sampler=dict(\n                type='RRandomSampler',\n                num=512,\n                pos_fraction=0.25,\n                neg_pos_ub=-1,\n                add_gt_as_proposals=True),\n            pos_weight=-1,\n            debug=False)),\n    
test_cfg=dict(\n        rpn=dict(\n            nms_pre=2000,\n            max_per_img=2000,\n            nms=dict(type='nms', iou_threshold=0.8),\n            min_bbox_size=0),\n        rcnn=dict(\n            nms_pre=2000,\n            min_bbox_size=0,\n            score_thr=0.05,\n            nms=dict(iou_thr=0.1),\n            max_per_img=2000)))\nfp16 = dict(loss_scale='dynamic')\nwork_dir = 'tutorial_exps/6'\nauto_resume = False\ngpu_ids = range(0, 8)\n", "seed": 1002054415, "exp_name": "oriented_rcnn_r50_fpn_fp16_1x_dota_le90.py"}
{"mode": "train", "epoch": 1, "iter": 50, "lr": 0.00199, "memory": 5225, "data_time": 0.06591, "loss_rpn_cls": 0.49008, "loss_rpn_bbox": 0.24451, "loss_cls": 0.5884, "acc": 89.26196, "loss_bbox": 0.07441, "loss": 1.3974, "grad_norm": Infinity, "time": 0.39103}
{"mode": "train", "epoch": 1, "iter": 100, "lr": 0.00233, "memory": 5827, "data_time": 0.01341, "loss_rpn_cls": 0.25922, "loss_rpn_bbox": 0.27324, "loss_cls": 0.3099, "acc": 93.64111, "loss_bbox": 0.19406, "loss": 1.03642, "grad_norm": 2.8591, "time": 0.32763}
{"mode": "train", "epoch": 1, "iter": 150, "lr": 0.00266, "memory": 5972, "data_time": 0.01375, "loss_rpn_cls": 0.21154, "loss_rpn_bbox": 0.2416, "loss_cls": 0.24554, "acc": 94.07812, "loss_bbox": 0.22503, "loss": 0.92371, "grad_norm": 3.08655, "time": 0.31599}
{"mode": "train", "epoch": 1, "iter": 200, "lr": 0.00299, "memory": 5972, "data_time": 0.01507, "loss_rpn_cls": 0.17166, "loss_rpn_bbox": 0.23847, "loss_cls": 0.24146, "acc": 93.6853, "loss_bbox": 0.25926, "loss": 0.91086, "grad_norm": 3.90545, "time": 0.31348}
{"mode": "train", "epoch": 1, "iter": 250, "lr": 0.00333, "memory": 5972, "data_time": 0.01522, "loss_rpn_cls": 0.14846, "loss_rpn_bbox": 0.24486, "loss_cls": 0.24624, "acc": 93.05493, "loss_bbox": 0.29654, "loss": 0.9361, "grad_norm": 3.75738, "time": 0.30953}
{"mode": "train", "epoch": 1, "iter": 300, "lr": 0.00366, "memory": 5972, "data_time": 0.01396, "loss_rpn_cls": 0.12617, "loss_rpn_bbox": 0.22107, "loss_cls": 0.23617, "acc": 93.15283, "loss_bbox": 0.28501, "loss": 0.86842, "grad_norm": 3.64423, "time": 0.30898}
{"mode": "train", "epoch": 1, "iter": 350, "lr": 0.00399, "memory": 5972, "data_time": 0.01454, "loss_rpn_cls": 0.11642, "loss_rpn_bbox": 0.22439, "loss_cls": 0.26844, "acc": 91.93848, "loss_bbox": 0.32938, "loss": 0.93863, "grad_norm": 4.13674, "time": 0.30617}
{"mode": "train", "epoch": 1, "iter": 400, "lr": 0.00433, "memory": 5972, "data_time": 0.0125, "loss_rpn_cls": 0.09355, "loss_rpn_bbox": 0.20377, "loss_cls": 0.26567, "acc": 91.57178, "loss_bbox": 0.31137, "loss": 0.87435, "grad_norm": 3.78112, "time": 0.30788}
{"mode": "train", "epoch": 1, "iter": 450, "lr": 0.00466, "memory": 5972, "data_time": 0.01373, "loss_rpn_cls": 0.08562, "loss_rpn_bbox": 0.16761, "loss_cls": 0.25621, "acc": 91.83545, "loss_bbox": 0.29948, "loss": 0.80892, "grad_norm": 3.3404, "time": 0.31039}
{"mode": "train", "epoch": 1, "iter": 500, "lr": 0.00499, "memory": 5972, "data_time": 0.01322, "loss_rpn_cls": 0.0892, "loss_rpn_bbox": 0.14871, "loss_cls": 0.25657, "acc": 91.66187, "loss_bbox": 0.29561, "loss": 0.79009, "grad_norm": 3.5171, "time": 0.30781}
{"mode": "train", "epoch": 1, "iter": 550, "lr": 0.005, "memory": 5972, "data_time": 0.01299, "loss_rpn_cls": 0.08214, "loss_rpn_bbox": 0.15301, "loss_cls": 0.25519, "acc": 91.47144, "loss_bbox": 0.30947, "loss": 0.79981, "grad_norm": 3.53342, "time": 0.30582}
{"mode": "train", "epoch": 1, "iter": 600, "lr": 0.005, "memory": 7297, "data_time": 0.01262, "loss_rpn_cls": 0.08469, "loss_rpn_bbox": 0.148, "loss_cls": 0.26125, "acc": 91.22021, "loss_bbox": 0.28039, "loss": 0.77434, "grad_norm": 3.43214, "time": 0.32085}
{"mode": "train", "epoch": 2, "iter": 50, "lr": 0.005, "memory": 7302, "data_time": 0.06201, "loss_rpn_cls": 0.07263, "loss_rpn_bbox": 0.13418, "loss_cls": 0.2542, "acc": 91.04639, "loss_bbox": 0.29143, "loss": 0.75244, "grad_norm": 3.06555, "time": 0.36229}
{"mode": "train", "epoch": 2, "iter": 100, "lr": 0.005, "memory": 7302, "data_time": 0.01481, "loss_rpn_cls": 0.07452, "loss_rpn_bbox": 0.13444, "loss_cls": 0.2615, "acc": 90.88672, "loss_bbox": 0.28135, "loss": 0.7518, "grad_norm": 3.36351, "time": 0.30876}
{"mode": "train", "epoch": 2, "iter": 150, "lr": 0.005, "memory": 7302, "data_time": 0.01429, "loss_rpn_cls": 0.0627, "loss_rpn_bbox": 0.13066, "loss_cls": 0.25449, "acc": 91.04077, "loss_bbox": 0.2799, "loss": 0.72775, "grad_norm": 3.10981, "time": 0.30872}
{"mode": "train", "epoch": 2, "iter": 200, "lr": 0.005, "memory": 7302, "data_time": 0.01328, "loss_rpn_cls": 0.06462, "loss_rpn_bbox": 0.12688, "loss_cls": 0.2448, "acc": 91.33496, "loss_bbox": 0.28028, "loss": 0.71658, "grad_norm": 3.13727, "time": 0.31271}
{"mode": "train", "epoch": 2, "iter": 250, "lr": 0.005, "memory": 7302, "data_time": 0.01391, "loss_rpn_cls": 0.06489, "loss_rpn_bbox": 0.12779, "loss_cls": 0.24462, "acc": 91.43579, "loss_bbox": 0.28111, "loss": 0.71841, "grad_norm": 3.12275, "time": 0.3149}
{"mode": "train", "epoch": 2, "iter": 300, "lr": 0.005, "memory": 7302, "data_time": 0.01447, "loss_rpn_cls": 0.06365, "loss_rpn_bbox": 0.12144, "loss_cls": 0.24944, "acc": 91.16821, "loss_bbox": 0.2757, "loss": 0.71023, "grad_norm": 3.1206, "time": 0.3088}
{"mode": "train", "epoch": 2, "iter": 350, "lr": 0.005, "memory": 7302, "data_time": 0.01416, "loss_rpn_cls": 0.06299, "loss_rpn_bbox": 0.10959, "loss_cls": 0.23042, "acc": 91.89478, "loss_bbox": 0.24875, "loss": 0.65174, "grad_norm": 2.74787, "time": 0.30823}
{"mode": "train", "epoch": 2, "iter": 400, "lr": 0.005, "memory": 7312, "data_time": 0.01394, "loss_rpn_cls": 0.06043, "loss_rpn_bbox": 0.10851, "loss_cls": 0.23645, "acc": 91.31006, "loss_bbox": 0.25611, "loss": 0.6615, "grad_norm": 3.11993, "time": 0.31424}
{"mode": "train", "epoch": 2, "iter": 450, "lr": 0.005, "memory": 7312, "data_time": 0.01252, "loss_rpn_cls": 0.05776, "loss_rpn_bbox": 0.10813, "loss_cls": 0.22053, "acc": 92.05542, "loss_bbox": 0.23997, "loss": 0.62639, "grad_norm": 2.71383, "time": 0.3125}
{"mode": "train", "epoch": 2, "iter": 500, "lr": 0.005, "memory": 7312, "data_time": 0.01385, "loss_rpn_cls": 0.0577, "loss_rpn_bbox": 0.10813, "loss_cls": 0.23127, "acc": 91.60962, "loss_bbox": 0.24847, "loss": 0.64556, "grad_norm": 2.9239, "time": 0.3109}
{"mode": "train", "epoch": 2, "iter": 550, "lr": 0.005, "memory": 7312, "data_time": 0.01292, "loss_rpn_cls": 0.05479, "loss_rpn_bbox": 0.10286, "loss_cls": 0.22517, "acc": 91.80444, "loss_bbox": 0.23906, "loss": 0.62188, "grad_norm": 2.66494, "time": 0.31704}
{"mode": "train", "epoch": 2, "iter": 600, "lr": 0.005, "memory": 7312, "data_time": 0.01349, "loss_rpn_cls": 0.05428, "loss_rpn_bbox": 0.1064, "loss_cls": 0.22532, "acc": 91.81494, "loss_bbox": 0.23804, "loss": 0.62405, "grad_norm": 2.75621, "time": 0.31111}
{"mode": "train", "epoch": 3, "iter": 50, "lr": 0.005, "memory": 7312, "data_time": 0.06007, "loss_rpn_cls": 0.05233, "loss_rpn_bbox": 0.10132, "loss_cls": 0.22508, "acc": 91.5918, "loss_bbox": 0.23478, "loss": 0.61351, "grad_norm": 2.69264, "time": 0.36143}
{"mode": "train", "epoch": 3, "iter": 100, "lr": 0.005, "memory": 7312, "data_time": 0.01315, "loss_rpn_cls": 0.04968, "loss_rpn_bbox": 0.10358, "loss_cls": 0.21309, "acc": 92.06812, "loss_bbox": 0.24289, "loss": 0.60924, "grad_norm": 2.78402, "time": 0.31405}
{"mode": "train", "epoch": 3, "iter": 150, "lr": 0.005, "memory": 7312, "data_time": 0.01367, "loss_rpn_cls": 0.04873, "loss_rpn_bbox": 0.10553, "loss_cls": 0.22101, "acc": 91.66113, "loss_bbox": 0.24174, "loss": 0.61701, "grad_norm": 2.70125, "time": 0.30781}
{"mode": "train", "epoch": 3, "iter": 200, "lr": 0.005, "memory": 7312, "data_time": 0.01418, "loss_rpn_cls": 0.04183, "loss_rpn_bbox": 0.09339, "loss_cls": 0.2214, "acc": 91.82031, "loss_bbox": 0.23976, "loss": 0.59637, "grad_norm": 2.62679, "time": 0.31188}
{"mode": "train", "epoch": 3, "iter": 250, "lr": 0.005, "memory": 7312, "data_time": 0.01428, "loss_rpn_cls": 0.05342, "loss_rpn_bbox": 0.09869, "loss_cls": 0.21171, "acc": 92.16797, "loss_bbox": 0.22656, "loss": 0.59038, "grad_norm": 2.87926, "time": 0.31004}
{"mode": "train", "epoch": 3, "iter": 300, "lr": 0.005, "memory": 7312, "data_time": 0.01386, "loss_rpn_cls": 0.0471, "loss_rpn_bbox": 0.09685, "loss_cls": 0.21024, "acc": 92.17139, "loss_bbox": 0.22463, "loss": 0.57882, "grad_norm": 2.80212, "time": 0.31132}
{"mode": "train", "epoch": 3, "iter": 350, "lr": 0.005, "memory": 7312, "data_time": 0.01409, "loss_rpn_cls": 0.04459, "loss_rpn_bbox": 0.09252, "loss_cls": 0.20088, "acc": 92.50146, "loss_bbox": 0.21729, "loss": 0.55528, "grad_norm": 2.49329, "time": 0.31192}
{"mode": "train", "epoch": 3, "iter": 400, "lr": 0.005, "memory": 7312, "data_time": 0.01386, "loss_rpn_cls": 0.04395, "loss_rpn_bbox": 0.09071, "loss_cls": 0.20239, "acc": 92.55347, "loss_bbox": 0.21332, "loss": 0.55037, "grad_norm": 2.55151, "time": 0.31458}
{"mode": "train", "epoch": 3, "iter": 450, "lr": 0.005, "memory": 7312, "data_time": 0.01465, "loss_rpn_cls": 0.04597, "loss_rpn_bbox": 0.09003, "loss_cls": 0.20191, "acc": 92.49658, "loss_bbox": 0.22008, "loss": 0.55798, "grad_norm": 2.51531, "time": 0.31776}
{"mode": "train", "epoch": 3, "iter": 500, "lr": 0.005, "memory": 7312, "data_time": 0.01451, "loss_rpn_cls": 0.04595, "loss_rpn_bbox": 0.10073, "loss_cls": 0.20615, "acc": 92.2561, "loss_bbox": 0.21531, "loss": 0.56814, "grad_norm": 2.77899, "time": 0.30753}
{"mode": "train", "epoch": 3, "iter": 550, "lr": 0.005, "memory": 7312, "data_time": 0.01453, "loss_rpn_cls": 0.04731, "loss_rpn_bbox": 0.08981, "loss_cls": 0.20043, "acc": 92.58643, "loss_bbox": 0.20307, "loss": 0.54061, "grad_norm": 2.63092, "time": 0.31465}
{"mode": "train", "epoch": 3, "iter": 600, "lr": 0.005, "memory": 7312, "data_time": 0.01491, "loss_rpn_cls": 0.04281, "loss_rpn_bbox": 0.08573, "loss_cls": 0.20061, "acc": 92.34961, "loss_bbox": 0.20997, "loss": 0.53912, "grad_norm": 2.5258, "time": 0.31727}
{"mode": "train", "epoch": 4, "iter": 50, "lr": 0.005, "memory": 7312, "data_time": 0.06159, "loss_rpn_cls": 0.04014, "loss_rpn_bbox": 0.09228, "loss_cls": 0.20265, "acc": 92.37842, "loss_bbox": 0.22107, "loss": 0.55615, "grad_norm": 2.49584, "time": 0.36054}
{"mode": "train", "epoch": 4, "iter": 100, "lr": 0.005, "memory": 7312, "data_time": 0.0145, "loss_rpn_cls": 0.04111, "loss_rpn_bbox": 0.0916, "loss_cls": 0.19791, "acc": 92.50684, "loss_bbox": 0.21272, "loss": 0.54334, "grad_norm": 2.53911, "time": 0.31153}
{"mode": "train", "epoch": 4, "iter": 150, "lr": 0.005, "memory": 7312, "data_time": 0.01387, "loss_rpn_cls": 0.04112, "loss_rpn_bbox": 0.0852, "loss_cls": 0.19452, "acc": 92.68945, "loss_bbox": 0.21114, "loss": 0.53197, "grad_norm": 2.64917, "time": 0.31116}
{"mode": "train", "epoch": 4, "iter": 200, "lr": 0.005, "memory": 7312, "data_time": 0.01401, "loss_rpn_cls": 0.04283, "loss_rpn_bbox": 0.09205, "loss_cls": 0.20543, "acc": 92.25439, "loss_bbox": 0.21132, "loss": 0.55163, "grad_norm": 2.73124, "time": 0.31287}
{"mode": "train", "epoch": 4, "iter": 250, "lr": 0.005, "memory": 7312, "data_time": 0.01419, "loss_rpn_cls": 0.04001, "loss_rpn_bbox": 0.07857, "loss_cls": 0.19259, "acc": 92.66724, "loss_bbox": 0.19585, "loss": 0.50702, "grad_norm": 2.34148, "time": 0.31155}
{"mode": "train", "epoch": 4, "iter": 300, "lr": 0.005, "memory": 7312, "data_time": 0.01361, "loss_rpn_cls": 0.03641, "loss_rpn_bbox": 0.07927, "loss_cls": 0.18487, "acc": 93.10425, "loss_bbox": 0.19072, "loss": 0.49127, "grad_norm": 2.4257, "time": 0.30573}
{"mode": "train", "epoch": 4, "iter": 350, "lr": 0.005, "memory": 7312, "data_time": 0.01461, "loss_rpn_cls": 0.03734, "loss_rpn_bbox": 0.08723, "loss_cls": 0.1898, "acc": 92.88086, "loss_bbox": 0.20116, "loss": 0.51552, "grad_norm": 2.53326, "time": 0.31718}
{"mode": "train", "epoch": 4, "iter": 400, "lr": 0.005, "memory": 7312, "data_time": 0.01406, "loss_rpn_cls": 0.03761, "loss_rpn_bbox": 0.07493, "loss_cls": 0.18336, "acc": 93.01709, "loss_bbox": 0.19075, "loss": 0.48665, "grad_norm": 2.32669, "time": 0.31124}
{"mode": "train", "epoch": 4, "iter": 450, "lr": 0.005, "memory": 7312, "data_time": 0.01363, "loss_rpn_cls": 0.03812, "loss_rpn_bbox": 0.07534, "loss_cls": 0.18923, "acc": 92.771, "loss_bbox": 0.19254, "loss": 0.49523, "grad_norm": 2.44344, "time": 0.31058}
{"mode": "train", "epoch": 4, "iter": 500, "lr": 0.005, "memory": 7312, "data_time": 0.01344, "loss_rpn_cls": 0.03534, "loss_rpn_bbox": 0.08069, "loss_cls": 0.18534, "acc": 92.85107, "loss_bbox": 0.20247, "loss": 0.50384, "grad_norm": 2.39246, "time": 0.31021}
{"mode": "train", "epoch": 4, "iter": 550, "lr": 0.005, "memory": 7312, "data_time": 0.01411, "loss_rpn_cls": 0.03876, "loss_rpn_bbox": 0.09405, "loss_cls": 0.19607, "acc": 92.47803, "loss_bbox": 0.20755, "loss": 0.53643, "grad_norm": 2.36415, "time": 0.31305}
{"mode": "train", "epoch": 4, "iter": 600, "lr": 0.005, "memory": 7313, "data_time": 0.0132, "loss_rpn_cls": 0.03804, "loss_rpn_bbox": 0.09053, "loss_cls": 0.19229, "acc": 92.61841, "loss_bbox": 0.20123, "loss": 0.52209, "grad_norm": 2.31333, "time": 0.31802}
{"mode": "train", "epoch": 5, "iter": 50, "lr": 0.005, "memory": 7313, "data_time": 0.06095, "loss_rpn_cls": 0.03379, "loss_rpn_bbox": 0.08313, "loss_cls": 0.18539, "acc": 92.75269, "loss_bbox": 0.19516, "loss": 0.49747, "grad_norm": 2.30116, "time": 0.36295}
{"mode": "train", "epoch": 5, "iter": 100, "lr": 0.005, "memory": 7313, "data_time": 0.01404, "loss_rpn_cls": 0.03469, "loss_rpn_bbox": 0.08453, "loss_cls": 0.17725, "acc": 93.15894, "loss_bbox": 0.19032, "loss": 0.48679, "grad_norm": 2.20512, "time": 0.31054}
{"mode": "train", "epoch": 5, "iter": 150, "lr": 0.005, "memory": 7313, "data_time": 0.01498, "loss_rpn_cls": 0.0353, "loss_rpn_bbox": 0.08337, "loss_cls": 0.1794, "acc": 92.93628, "loss_bbox": 0.19327, "loss": 0.49134, "grad_norm": 2.32082, "time": 0.30632}
{"mode": "train", "epoch": 5, "iter": 200, "lr": 0.005, "memory": 7313, "data_time": 0.01451, "loss_rpn_cls": 0.03388, "loss_rpn_bbox": 0.07641, "loss_cls": 0.17562, "acc": 93.23145, "loss_bbox": 0.18344, "loss": 0.46935, "grad_norm": 2.38215, "time": 0.31452}
{"mode": "train", "epoch": 5, "iter": 250, "lr": 0.005, "memory": 7313, "data_time": 0.01382, "loss_rpn_cls": 0.03194, "loss_rpn_bbox": 0.07727, "loss_cls": 0.18474, "acc": 92.84668, "loss_bbox": 0.19159, "loss": 0.48554, "grad_norm": 2.26403, "time": 0.31116}
{"mode": "train", "epoch": 5, "iter": 300, "lr": 0.005, "memory": 7313, "data_time": 0.01481, "loss_rpn_cls": 0.03346, "loss_rpn_bbox": 0.08042, "loss_cls": 0.18061, "acc": 93.13745, "loss_bbox": 0.19731, "loss": 0.49181, "grad_norm": 2.55265, "time": 0.31005}
{"mode": "train", "epoch": 5, "iter": 350, "lr": 0.005, "memory": 7313, "data_time": 0.01453, "loss_rpn_cls": 0.03336, "loss_rpn_bbox": 0.08538, "loss_cls": 0.17584, "acc": 93.24805, "loss_bbox": 0.18881, "loss": 0.48339, "grad_norm": 2.37044, "time": 0.31255}
{"mode": "train", "epoch": 5, "iter": 400, "lr": 0.005, "memory": 7313, "data_time": 0.01475, "loss_rpn_cls": 0.03343, "loss_rpn_bbox": 0.08086, "loss_cls": 0.1788, "acc": 93.10376, "loss_bbox": 0.18745, "loss": 0.48053, "grad_norm": 2.26766, "time": 0.30894}
{"mode": "train", "epoch": 5, "iter": 450, "lr": 0.005, "memory": 7313, "data_time": 0.01428, "loss_rpn_cls": 0.03249, "loss_rpn_bbox": 0.07307, "loss_cls": 0.17886, "acc": 93.05396, "loss_bbox": 0.18366, "loss": 0.46808, "grad_norm": 2.19324, "time": 0.31336}
{"mode": "train", "epoch": 5, "iter": 500, "lr": 0.005, "memory": 7313, "data_time": 0.01468, "loss_rpn_cls": 0.03114, "loss_rpn_bbox": 0.07528, "loss_cls": 0.17163, "acc": 93.26343, "loss_bbox": 0.17732, "loss": 0.45537, "grad_norm": 2.13121, "time": 0.30904}
{"mode": "train", "epoch": 5, "iter": 550, "lr": 0.005, "memory": 7313, "data_time": 0.01444, "loss_rpn_cls": 0.03199, "loss_rpn_bbox": 0.06925, "loss_cls": 0.17958, "acc": 93.0874, "loss_bbox": 0.18071, "loss": 0.46153, "grad_norm": 2.38434, "time": 0.31644}
{"mode": "train", "epoch": 5, "iter": 600, "lr": 0.005, "memory": 7313, "data_time": 0.0143, "loss_rpn_cls": 0.03297, "loss_rpn_bbox": 0.08784, "loss_cls": 0.17233, "acc": 93.35352, "loss_bbox": 0.18075, "loss": 0.47389, "grad_norm": 2.25789, "time": 0.31783}
{"mode": "train", "epoch": 6, "iter": 50, "lr": 0.005, "memory": 7313, "data_time": 0.06146, "loss_rpn_cls": 0.02954, "loss_rpn_bbox": 0.07941, "loss_cls": 0.17918, "acc": 93.06152, "loss_bbox": 0.19024, "loss": 0.47836, "grad_norm": 2.37168, "time": 0.35847}
{"mode": "train", "epoch": 6, "iter": 100, "lr": 0.005, "memory": 7313, "data_time": 0.01511, "loss_rpn_cls": 0.03239, "loss_rpn_bbox": 0.07837, "loss_cls": 0.1779, "acc": 93.16699, "loss_bbox": 0.18747, "loss": 0.47614, "grad_norm": 2.39204, "time": 0.31202}
{"mode": "train", "epoch": 6, "iter": 150, "lr": 0.005, "memory": 7313, "data_time": 0.01464, "loss_rpn_cls": 0.02903, "loss_rpn_bbox": 0.07646, "loss_cls": 0.17412, "acc": 93.18896, "loss_bbox": 0.18167, "loss": 0.46129, "grad_norm": 2.11462, "time": 0.3101}
{"mode": "train", "epoch": 6, "iter": 200, "lr": 0.005, "memory": 7313, "data_time": 0.01437, "loss_rpn_cls": 0.0296, "loss_rpn_bbox": 0.08031, "loss_cls": 0.17118, "acc": 93.24585, "loss_bbox": 0.17889, "loss": 0.45999, "grad_norm": 2.29988, "time": 0.31385}
{"mode": "train", "epoch": 6, "iter": 250, "lr": 0.005, "memory": 7313, "data_time": 0.01476, "loss_rpn_cls": 0.02782, "loss_rpn_bbox": 0.0733, "loss_cls": 0.16798, "acc": 93.43359, "loss_bbox": 0.17842, "loss": 0.44753, "grad_norm": 2.14625, "time": 0.31419}
{"mode": "train", "epoch": 6, "iter": 300, "lr": 0.005, "memory": 7313, "data_time": 0.01415, "loss_rpn_cls": 0.03186, "loss_rpn_bbox": 0.08136, "loss_cls": 0.17695, "acc": 93.03833, "loss_bbox": 0.18658, "loss": 0.47675, "grad_norm": 2.45252, "time": 0.31426}
{"mode": "train", "epoch": 6, "iter": 350, "lr": 0.005, "memory": 7313, "data_time": 0.01396, "loss_rpn_cls": 0.02763, "loss_rpn_bbox": 0.06762, "loss_cls": 0.16542, "acc": 93.56372, "loss_bbox": 0.17913, "loss": 0.43979, "grad_norm": 2.18137, "time": 0.30965}
{"mode": "train", "epoch": 6, "iter": 400, "lr": 0.005, "memory": 7313, "data_time": 0.01441, "loss_rpn_cls": 0.02881, "loss_rpn_bbox": 0.07452, "loss_cls": 0.16753, "acc": 93.51831, "loss_bbox": 0.17418, "loss": 0.44504, "grad_norm": 2.25841, "time": 0.31463}
{"mode": "train", "epoch": 6, "iter": 450, "lr": 0.005, "memory": 7313, "data_time": 0.01374, "loss_rpn_cls": 0.02786, "loss_rpn_bbox": 0.07493, "loss_cls": 0.16427, "acc": 93.52905, "loss_bbox": 0.17334, "loss": 0.4404, "grad_norm": 2.14399, "time": 0.30558}
{"mode": "train", "epoch": 6, "iter": 500, "lr": 0.005, "memory": 7313, "data_time": 0.01463, "loss_rpn_cls": 0.029, "loss_rpn_bbox": 0.08372, "loss_cls": 0.17277, "acc": 93.25439, "loss_bbox": 0.18627, "loss": 0.47177, "grad_norm": 2.32484, "time": 0.31076}
{"mode": "train", "epoch": 6, "iter": 550, "lr": 0.005, "memory": 7313, "data_time": 0.01445, "loss_rpn_cls": 0.02822, "loss_rpn_bbox": 0.07152, "loss_cls": 0.16706, "acc": 93.40674, "loss_bbox": 0.17395, "loss": 0.44075, "grad_norm": 2.13454, "time": 0.30917}
{"mode": "train", "epoch": 6, "iter": 600, "lr": 0.005, "memory": 7313, "data_time": 0.01479, "loss_rpn_cls": 0.03212, "loss_rpn_bbox": 0.07697, "loss_cls": 0.1595, "acc": 93.77539, "loss_bbox": 0.17143, "loss": 0.44002, "grad_norm": 2.10196, "time": 0.31597}
{"mode": "train", "epoch": 7, "iter": 50, "lr": 0.005, "memory": 7313, "data_time": 0.0611, "loss_rpn_cls": 0.02693, "loss_rpn_bbox": 0.07014, "loss_cls": 0.16082, "acc": 93.71948, "loss_bbox": 0.16799, "loss": 0.42587, "grad_norm": 2.17101, "time": 0.36067}
{"mode": "train", "epoch": 7, "iter": 100, "lr": 0.005, "memory": 7313, "data_time": 0.0149, "loss_rpn_cls": 0.02821, "loss_rpn_bbox": 0.07734, "loss_cls": 0.16917, "acc": 93.34497, "loss_bbox": 0.1799, "loss": 0.45462, "grad_norm": 2.29488, "time": 0.30968}
{"mode": "train", "epoch": 7, "iter": 150, "lr": 0.005, "memory": 7313, "data_time": 0.01433, "loss_rpn_cls": 0.02542, "loss_rpn_bbox": 0.07637, "loss_cls": 0.16321, "acc": 93.51294, "loss_bbox": 0.17546, "loss": 0.44046, "grad_norm": 2.24608, "time": 0.30882}
{"mode": "train", "epoch": 7, "iter": 200, "lr": 0.005, "memory": 7313, "data_time": 0.01481, "loss_rpn_cls": 0.02734, "loss_rpn_bbox": 0.07285, "loss_cls": 0.15937, "acc": 93.84204, "loss_bbox": 0.17377, "loss": 0.43332, "grad_norm": 2.28271, "time": 0.31246}
{"mode": "train", "epoch": 7, "iter": 250, "lr": 0.005, "memory": 7313, "data_time": 0.01354, "loss_rpn_cls": 0.02956, "loss_rpn_bbox": 0.07564, "loss_cls": 0.16403, "acc": 93.59766, "loss_bbox": 0.17184, "loss": 0.44107, "grad_norm": 2.22358, "time": 0.31582}
{"mode": "train", "epoch": 7, "iter": 300, "lr": 0.005, "memory": 7313, "data_time": 0.014, "loss_rpn_cls": 0.02664, "loss_rpn_bbox": 0.08083, "loss_cls": 0.16915, "acc": 93.31055, "loss_bbox": 0.17329, "loss": 0.44992, "grad_norm": 2.15659, "time": 0.30934}
{"mode": "train", "epoch": 7, "iter": 350, "lr": 0.005, "memory": 7313, "data_time": 0.0137, "loss_rpn_cls": 0.02401, "loss_rpn_bbox": 0.06939, "loss_cls": 0.1594, "acc": 93.69141, "loss_bbox": 0.16416, "loss": 0.41696, "grad_norm": 2.09128, "time": 0.3084}
{"mode": "train", "epoch": 7, "iter": 400, "lr": 0.005, "memory": 7313, "data_time": 0.01459, "loss_rpn_cls": 0.02546, "loss_rpn_bbox": 0.07119, "loss_cls": 0.16399, "acc": 93.4978, "loss_bbox": 0.16933, "loss": 0.42997, "grad_norm": 2.07036, "time": 0.3072}
{"mode": "train", "epoch": 7, "iter": 450, "lr": 0.005, "memory": 7313, "data_time": 0.01262, "loss_rpn_cls": 0.02698, "loss_rpn_bbox": 0.07773, "loss_cls": 0.16177, "acc": 93.6355, "loss_bbox": 0.17049, "loss": 0.43697, "grad_norm": Infinity, "time": 0.31227}
{"mode": "train", "epoch": 7, "iter": 500, "lr": 0.005, "memory": 7313, "data_time": 0.0126, "loss_rpn_cls": 0.0247, "loss_rpn_bbox": 0.07727, "loss_cls": 0.15663, "acc": 93.74268, "loss_bbox": 0.17805, "loss": 0.43665, "grad_norm": 2.25696, "time": 0.31292}
{"mode": "train", "epoch": 7, "iter": 550, "lr": 0.005, "memory": 7313, "data_time": 0.01273, "loss_rpn_cls": 0.02553, "loss_rpn_bbox": 0.07469, "loss_cls": 0.1605, "acc": 93.70557, "loss_bbox": 0.17148, "loss": 0.43219, "grad_norm": 2.24083, "time": 0.31151}
{"mode": "train", "epoch": 7, "iter": 600, "lr": 0.005, "memory": 7313, "data_time": 0.01328, "loss_rpn_cls": 0.02568, "loss_rpn_bbox": 0.06938, "loss_cls": 0.1583, "acc": 93.77808, "loss_bbox": 0.1704, "loss": 0.42375, "grad_norm": 2.0953, "time": 0.31171}
{"mode": "train", "epoch": 8, "iter": 50, "lr": 0.005, "memory": 7313, "data_time": 0.06064, "loss_rpn_cls": 0.02812, "loss_rpn_bbox": 0.0785, "loss_cls": 0.16256, "acc": 93.5686, "loss_bbox": 0.17591, "loss": 0.44509, "grad_norm": 2.33617, "time": 0.35966}
{"mode": "train", "epoch": 8, "iter": 100, "lr": 0.005, "memory": 7313, "data_time": 0.01442, "loss_rpn_cls": 0.02208, "loss_rpn_bbox": 0.07757, "loss_cls": 0.15539, "acc": 93.88232, "loss_bbox": 0.17151, "loss": 0.42656, "grad_norm": 2.11713, "time": 0.31481}
{"mode": "train", "epoch": 8, "iter": 150, "lr": 0.005, "memory": 7313, "data_time": 0.01441, "loss_rpn_cls": 0.02335, "loss_rpn_bbox": 0.07125, "loss_cls": 0.15516, "acc": 93.85571, "loss_bbox": 0.17244, "loss": 0.4222, "grad_norm": 2.18155, "time": 0.31175}
{"mode": "train", "epoch": 8, "iter": 200, "lr": 0.005, "memory": 7313, "data_time": 0.01416, "loss_rpn_cls": 0.02382, "loss_rpn_bbox": 0.06559, "loss_cls": 0.16096, "acc": 93.61865, "loss_bbox": 0.16771, "loss": 0.41808, "grad_norm": 2.03908, "time": 0.3091}
{"mode": "train", "epoch": 8, "iter": 250, "lr": 0.005, "memory": 7313, "data_time": 0.01463, "loss_rpn_cls": 0.02473, "loss_rpn_bbox": 0.07049, "loss_cls": 0.15361, "acc": 93.90576, "loss_bbox": 0.16591, "loss": 0.41475, "grad_norm": 2.10348, "time": 0.31601}
{"mode": "train", "epoch": 8, "iter": 300, "lr": 0.005, "memory": 7313, "data_time": 0.01331, "loss_rpn_cls": 0.02459, "loss_rpn_bbox": 0.0799, "loss_cls": 0.16039, "acc": 93.69312, "loss_bbox": 0.16626, "loss": 0.43114, "grad_norm": 2.14169, "time": 0.31713}
{"mode": "train", "epoch": 8, "iter": 350, "lr": 0.005, "memory": 7313, "data_time": 0.01416, "loss_rpn_cls": 0.02285, "loss_rpn_bbox": 0.06958, "loss_cls": 0.15422, "acc": 93.98218, "loss_bbox": 0.16386, "loss": 0.41051, "grad_norm": 2.06042, "time": 0.31523}
{"mode": "train", "epoch": 8, "iter": 400, "lr": 0.005, "memory": 7313, "data_time": 0.01458, "loss_rpn_cls": 0.02606, "loss_rpn_bbox": 0.0706, "loss_cls": 0.15806, "acc": 93.83447, "loss_bbox": 0.16004, "loss": 0.41476, "grad_norm": 2.09746, "time": 0.31189}
{"mode": "train", "epoch": 8, "iter": 450, "lr": 0.005, "memory": 7313, "data_time": 0.0145, "loss_rpn_cls": 0.02456, "loss_rpn_bbox": 0.06934, "loss_cls": 0.15397, "acc": 93.96973, "loss_bbox": 0.16603, "loss": 0.41391, "grad_norm": 2.1831, "time": 0.31656}
{"mode": "train", "epoch": 8, "iter": 500, "lr": 0.005, "memory": 7313, "data_time": 0.01593, "loss_rpn_cls": 0.02404, "loss_rpn_bbox": 0.07383, "loss_cls": 0.15636, "acc": 93.8418, "loss_bbox": 0.16659, "loss": 0.42083, "grad_norm": 2.19792, "time": 0.31881}
{"mode": "train", "epoch": 8, "iter": 550, "lr": 0.005, "memory": 7313, "data_time": 0.01498, "loss_rpn_cls": 0.02348, "loss_rpn_bbox": 0.06787, "loss_cls": 0.14917, "acc": 94.09253, "loss_bbox": 0.16846, "loss": 0.40899, "grad_norm": 2.09245, "time": 0.30738}
{"mode": "train", "epoch": 8, "iter": 600, "lr": 0.005, "memory": 7313, "data_time": 0.01471, "loss_rpn_cls": 0.02094, "loss_rpn_bbox": 0.06632, "loss_cls": 0.15636, "acc": 93.77515, "loss_bbox": 0.16864, "loss": 0.41226, "grad_norm": 2.10976, "time": 0.31301}
{"mode": "train", "epoch": 9, "iter": 50, "lr": 0.0005, "memory": 7313, "data_time": 0.06221, "loss_rpn_cls": 0.02075, "loss_rpn_bbox": 0.06499, "loss_cls": 0.14423, "acc": 94.25098, "loss_bbox": 0.15458, "loss": 0.38455, "grad_norm": 1.69776, "time": 0.36615}
{"mode": "train", "epoch": 9, "iter": 100, "lr": 0.0005, "memory": 7313, "data_time": 0.01492, "loss_rpn_cls": 0.01938, "loss_rpn_bbox": 0.06572, "loss_cls": 0.14807, "acc": 94.14624, "loss_bbox": 0.15034, "loss": 0.38352, "grad_norm": 1.70021, "time": 0.31029}
{"mode": "train", "epoch": 9, "iter": 150, "lr": 0.0005, "memory": 7313, "data_time": 0.01541, "loss_rpn_cls": 0.01884, "loss_rpn_bbox": 0.05553, "loss_cls": 0.1417, "acc": 94.41797, "loss_bbox": 0.14541, "loss": 0.36148, "grad_norm": 1.63026, "time": 0.31095}
{"mode": "train", "epoch": 9, "iter": 200, "lr": 0.0005, "memory": 7313, "data_time": 0.01554, "loss_rpn_cls": 0.01882, "loss_rpn_bbox": 0.05564, "loss_cls": 0.13558, "acc": 94.60181, "loss_bbox": 0.14752, "loss": 0.35757, "grad_norm": 1.65712, "time": 0.31143}
{"mode": "train", "epoch": 9, "iter": 250, "lr": 0.0005, "memory": 7313, "data_time": 0.01499, "loss_rpn_cls": 0.0199, "loss_rpn_bbox": 0.06004, "loss_cls": 0.14321, "acc": 94.25928, "loss_bbox": 0.15421, "loss": 0.37735, "grad_norm": 1.69668, "time": 0.31209}
{"mode": "train", "epoch": 9, "iter": 300, "lr": 0.0005, "memory": 7313, "data_time": 0.01487, "loss_rpn_cls": 0.02019, "loss_rpn_bbox": 0.0619, "loss_cls": 0.13946, "acc": 94.41382, "loss_bbox": 0.14506, "loss": 0.36661, "grad_norm": 1.69229, "time": 0.31445}
{"mode": "train", "epoch": 9, "iter": 350, "lr": 0.0005, "memory": 7313, "data_time": 0.01441, "loss_rpn_cls": 0.01854, "loss_rpn_bbox": 0.059, "loss_cls": 0.13904, "acc": 94.42065, "loss_bbox": 0.15023, "loss": 0.36681, "grad_norm": 1.74397, "time": 0.30896}
{"mode": "train", "epoch": 9, "iter": 400, "lr": 0.0005, "memory": 7313, "data_time": 0.015, "loss_rpn_cls": 0.0196, "loss_rpn_bbox": 0.06042, "loss_cls": 0.13881, "acc": 94.45898, "loss_bbox": 0.14764, "loss": 0.36648, "grad_norm": 1.67291, "time": 0.31278}
{"mode": "train", "epoch": 9, "iter": 450, "lr": 0.0005, "memory": 7313, "data_time": 0.01568, "loss_rpn_cls": 0.01871, "loss_rpn_bbox": 0.06556, "loss_cls": 0.14822, "acc": 94.02417, "loss_bbox": 0.15517, "loss": 0.38765, "grad_norm": 1.74661, "time": 0.31829}
{"mode": "train", "epoch": 9, "iter": 500, "lr": 0.0005, "memory": 7313, "data_time": 0.01536, "loss_rpn_cls": 0.02078, "loss_rpn_bbox": 0.06408, "loss_cls": 0.14372, "acc": 94.28687, "loss_bbox": 0.1532, "loss": 0.38178, "grad_norm": 1.71049, "time": 0.31833}
{"mode": "train", "epoch": 9, "iter": 550, "lr": 0.0005, "memory": 7313, "data_time": 0.01523, "loss_rpn_cls": 0.02115, "loss_rpn_bbox": 0.05762, "loss_cls": 0.14714, "acc": 94.16504, "loss_bbox": 0.15361, "loss": 0.37952, "grad_norm": 1.72701, "time": 0.30793}
{"mode": "train", "epoch": 9, "iter": 600, "lr": 0.0005, "memory": 7313, "data_time": 0.01453, "loss_rpn_cls": 0.01883, "loss_rpn_bbox": 0.06205, "loss_cls": 0.14241, "acc": 94.27417, "loss_bbox": 0.15384, "loss": 0.37712, "grad_norm": 1.75762, "time": 0.31536}
{"mode": "train", "epoch": 10, "iter": 50, "lr": 0.0005, "memory": 7313, "data_time": 0.06277, "loss_rpn_cls": 0.01745, "loss_rpn_bbox": 0.05541, "loss_cls": 0.14045, "acc": 94.38159, "loss_bbox": 0.15084, "loss": 0.36415, "grad_norm": 1.6777, "time": 0.35428}
{"mode": "train", "epoch": 10, "iter": 100, "lr": 0.0005, "memory": 7313, "data_time": 0.01439, "loss_rpn_cls": 0.01876, "loss_rpn_bbox": 0.06658, "loss_cls": 0.1411, "acc": 94.33472, "loss_bbox": 0.15342, "loss": 0.37986, "grad_norm": 1.75149, "time": 0.3127}
{"mode": "train", "epoch": 10, "iter": 150, "lr": 0.0005, "memory": 7313, "data_time": 0.01486, "loss_rpn_cls": 0.01846, "loss_rpn_bbox": 0.05933, "loss_cls": 0.13914, "acc": 94.42554, "loss_bbox": 0.14945, "loss": 0.36638, "grad_norm": 1.67852, "time": 0.3129}
{"mode": "train", "epoch": 10, "iter": 200, "lr": 0.0005, "memory": 7313, "data_time": 0.01447, "loss_rpn_cls": 0.01851, "loss_rpn_bbox": 0.0596, "loss_cls": 0.13278, "acc": 94.65723, "loss_bbox": 0.13899, "loss": 0.34988, "grad_norm": 1.74557, "time": 0.30957}
{"mode": "train", "epoch": 10, "iter": 250, "lr": 0.0005, "memory": 7313, "data_time": 0.01421, "loss_rpn_cls": 0.01889, "loss_rpn_bbox": 0.05989, "loss_cls": 0.14123, "acc": 94.3584, "loss_bbox": 0.15242, "loss": 0.37242, "grad_norm": 1.73603, "time": 0.31111}
{"mode": "train", "epoch": 10, "iter": 300, "lr": 0.0005, "memory": 7313, "data_time": 0.0145, "loss_rpn_cls": 0.01908, "loss_rpn_bbox": 0.06328, "loss_cls": 0.14472, "acc": 94.20361, "loss_bbox": 0.15231, "loss": 0.37939, "grad_norm": 1.72936, "time": 0.30653}
{"mode": "train", "epoch": 10, "iter": 350, "lr": 0.0005, "memory": 7313, "data_time": 0.01555, "loss_rpn_cls": 0.01798, "loss_rpn_bbox": 0.05822, "loss_cls": 0.13819, "acc": 94.48975, "loss_bbox": 0.14984, "loss": 0.36423, "grad_norm": 1.68749, "time": 0.31404}
{"mode": "train", "epoch": 10, "iter": 400, "lr": 0.0005, "memory": 7313, "data_time": 0.0138, "loss_rpn_cls": 0.01789, "loss_rpn_bbox": 0.0587, "loss_cls": 0.14044, "acc": 94.33911, "loss_bbox": 0.14691, "loss": 0.36394, "grad_norm": 1.69943, "time": 0.30966}
{"mode": "train", "epoch": 10, "iter": 450, "lr": 0.0005, "memory": 7313, "data_time": 0.01318, "loss_rpn_cls": 0.01979, "loss_rpn_bbox": 0.06118, "loss_cls": 0.14117, "acc": 94.39917, "loss_bbox": 0.15219, "loss": 0.37433, "grad_norm": 1.79924, "time": 0.31224}
{"mode": "train", "epoch": 10, "iter": 500, "lr": 0.0005, "memory": 7313, "data_time": 0.01435, "loss_rpn_cls": 0.01804, "loss_rpn_bbox": 0.06213, "loss_cls": 0.13887, "acc": 94.5332, "loss_bbox": 0.1451, "loss": 0.36414, "grad_norm": 1.71161, "time": 0.31389}
{"mode": "train", "epoch": 10, "iter": 550, "lr": 0.0005, "memory": 7313, "data_time": 0.01468, "loss_rpn_cls": 0.01684, "loss_rpn_bbox": 0.06169, "loss_cls": 0.13813, "acc": 94.45312, "loss_bbox": 0.14684, "loss": 0.36351, "grad_norm": 1.69689, "time": 0.31035}
{"mode": "train", "epoch": 10, "iter": 600, "lr": 0.0005, "memory": 7313, "data_time": 0.01484, "loss_rpn_cls": 0.01959, "loss_rpn_bbox": 0.06354, "loss_cls": 0.14206, "acc": 94.27026, "loss_bbox": 0.14872, "loss": 0.3739, "grad_norm": 1.72743, "time": 0.31639}
{"mode": "train", "epoch": 11, "iter": 50, "lr": 0.0005, "memory": 7313, "data_time": 0.06049, "loss_rpn_cls": 0.01772, "loss_rpn_bbox": 0.05691, "loss_cls": 0.13755, "acc": 94.46899, "loss_bbox": 0.1493, "loss": 0.36148, "grad_norm": 1.69198, "time": 0.36497}
{"mode": "train", "epoch": 11, "iter": 100, "lr": 0.0005, "memory": 7313, "data_time": 0.01474, "loss_rpn_cls": 0.01852, "loss_rpn_bbox": 0.06347, "loss_cls": 0.13469, "acc": 94.57202, "loss_bbox": 0.14904, "loss": 0.36572, "grad_norm": 1.70533, "time": 0.31679}
{"mode": "train", "epoch": 11, "iter": 150, "lr": 0.0005, "memory": 7313, "data_time": 0.01526, "loss_rpn_cls": 0.0175, "loss_rpn_bbox": 0.06002, "loss_cls": 0.13693, "acc": 94.56812, "loss_bbox": 0.14313, "loss": 0.35757, "grad_norm": 1.76658, "time": 0.31253}
{"mode": "train", "epoch": 11, "iter": 200, "lr": 0.0005, "memory": 7313, "data_time": 0.01517, "loss_rpn_cls": 0.01849, "loss_rpn_bbox": 0.06311, "loss_cls": 0.14064, "acc": 94.34961, "loss_bbox": 0.15083, "loss": 0.37308, "grad_norm": 1.81965, "time": 0.3092}
{"mode": "train", "epoch": 11, "iter": 250, "lr": 0.0005, "memory": 7313, "data_time": 0.01229, "loss_rpn_cls": 0.01916, "loss_rpn_bbox": 0.05908, "loss_cls": 0.13861, "acc": 94.51758, "loss_bbox": 0.146, "loss": 0.36285, "grad_norm": 1.71436, "time": 0.3103}
{"mode": "train", "epoch": 11, "iter": 300, "lr": 0.0005, "memory": 7313, "data_time": 0.01418, "loss_rpn_cls": 0.01966, "loss_rpn_bbox": 0.06162, "loss_cls": 0.14146, "acc": 94.30225, "loss_bbox": 0.14982, "loss": 0.37256, "grad_norm": 1.7506, "time": 0.31276}
{"mode": "train", "epoch": 11, "iter": 350, "lr": 0.0005, "memory": 7313, "data_time": 0.01548, "loss_rpn_cls": 0.01824, "loss_rpn_bbox": 0.06113, "loss_cls": 0.13888, "acc": 94.40601, "loss_bbox": 0.15354, "loss": 0.37179, "grad_norm": 1.76402, "time": 0.30962}
{"mode": "train", "epoch": 11, "iter": 400, "lr": 0.0005, "memory": 7313, "data_time": 0.01471, "loss_rpn_cls": 0.01744, "loss_rpn_bbox": 0.06175, "loss_cls": 0.1347, "acc": 94.51123, "loss_bbox": 0.14869, "loss": 0.36258, "grad_norm": 1.72716, "time": 0.30919}
{"mode": "train", "epoch": 11, "iter": 450, "lr": 0.0005, "memory": 7313, "data_time": 0.01526, "loss_rpn_cls": 0.01831, "loss_rpn_bbox": 0.05458, "loss_cls": 0.13744, "acc": 94.58569, "loss_bbox": 0.1437, "loss": 0.35403, "grad_norm": 1.76165, "time": 0.31}
{"mode": "train", "epoch": 11, "iter": 500, "lr": 0.0005, "memory": 7313, "data_time": 0.01485, "loss_rpn_cls": 0.01879, "loss_rpn_bbox": 0.0606, "loss_cls": 0.13808, "acc": 94.49902, "loss_bbox": 0.14958, "loss": 0.36705, "grad_norm": 1.75726, "time": 0.30969}
{"mode": "train", "epoch": 11, "iter": 550, "lr": 0.0005, "memory": 7313, "data_time": 0.01637, "loss_rpn_cls": 0.01781, "loss_rpn_bbox": 0.05641, "loss_cls": 0.138, "acc": 94.4436, "loss_bbox": 0.14501, "loss": 0.35722, "grad_norm": 1.74624, "time": 0.3157}
{"mode": "train", "epoch": 11, "iter": 600, "lr": 0.0005, "memory": 7313, "data_time": 0.01489, "loss_rpn_cls": 0.01751, "loss_rpn_bbox": 0.0583, "loss_cls": 0.1394, "acc": 94.37305, "loss_bbox": 0.15177, "loss": 0.36698, "grad_norm": 1.72964, "time": 0.31123}
{"mode": "train", "epoch": 12, "iter": 50, "lr": 5e-05, "memory": 7313, "data_time": 0.06148, "loss_rpn_cls": 0.01837, "loss_rpn_bbox": 0.06605, "loss_cls": 0.13662, "acc": 94.54126, "loss_bbox": 0.14617, "loss": 0.36721, "grad_norm": 1.71705, "time": 0.35697}
{"mode": "train", "epoch": 12, "iter": 100, "lr": 5e-05, "memory": 7313, "data_time": 0.01526, "loss_rpn_cls": 0.0175, "loss_rpn_bbox": 0.05514, "loss_cls": 0.13364, "acc": 94.6875, "loss_bbox": 0.14034, "loss": 0.34662, "grad_norm": Infinity, "time": 0.31688}
{"mode": "train", "epoch": 12, "iter": 150, "lr": 5e-05, "memory": 7313, "data_time": 0.01457, "loss_rpn_cls": 0.01833, "loss_rpn_bbox": 0.0583, "loss_cls": 0.1367, "acc": 94.52051, "loss_bbox": 0.14669, "loss": 0.36002, "grad_norm": 1.75122, "time": 0.3127}
{"mode": "train", "epoch": 12, "iter": 200, "lr": 5e-05, "memory": 7313, "data_time": 0.01463, "loss_rpn_cls": 0.01878, "loss_rpn_bbox": 0.06293, "loss_cls": 0.14046, "acc": 94.3479, "loss_bbox": 0.15025, "loss": 0.37242, "grad_norm": 1.76831, "time": 0.31381}
{"mode": "train", "epoch": 12, "iter": 250, "lr": 5e-05, "memory": 7313, "data_time": 0.01472, "loss_rpn_cls": 0.01641, "loss_rpn_bbox": 0.05748, "loss_cls": 0.13463, "acc": 94.58154, "loss_bbox": 0.14437, "loss": 0.35288, "grad_norm": 1.64531, "time": 0.31433}
{"mode": "train", "epoch": 12, "iter": 300, "lr": 5e-05, "memory": 7313, "data_time": 0.01427, "loss_rpn_cls": 0.01647, "loss_rpn_bbox": 0.05725, "loss_cls": 0.1377, "acc": 94.52368, "loss_bbox": 0.1465, "loss": 0.35791, "grad_norm": 1.68576, "time": 0.30987}
{"mode": "train", "epoch": 12, "iter": 350, "lr": 5e-05, "memory": 7313, "data_time": 0.01463, "loss_rpn_cls": 0.01789, "loss_rpn_bbox": 0.06142, "loss_cls": 0.13718, "acc": 94.5105, "loss_bbox": 0.14815, "loss": 0.36465, "grad_norm": 1.70723, "time": 0.31219}
{"mode": "train", "epoch": 12, "iter": 400, "lr": 5e-05, "memory": 7313, "data_time": 0.01403, "loss_rpn_cls": 0.01669, "loss_rpn_bbox": 0.05459, "loss_cls": 0.13911, "acc": 94.43726, "loss_bbox": 0.14909, "loss": 0.35948, "grad_norm": 1.76735, "time": 0.3101}
{"mode": "train", "epoch": 12, "iter": 450, "lr": 5e-05, "memory": 7313, "data_time": 0.01404, "loss_rpn_cls": 0.01668, "loss_rpn_bbox": 0.05402, "loss_cls": 0.13558, "acc": 94.58423, "loss_bbox": 0.14675, "loss": 0.35303, "grad_norm": 1.68917, "time": 0.30829}
{"mode": "train", "epoch": 12, "iter": 500, "lr": 5e-05, "memory": 7313, "data_time": 0.01385, "loss_rpn_cls": 0.01888, "loss_rpn_bbox": 0.06378, "loss_cls": 0.13996, "acc": 94.40747, "loss_bbox": 0.14547, "loss": 0.36808, "grad_norm": 1.73619, "time": 0.31841}
{"mode": "train", "epoch": 12, "iter": 550, "lr": 5e-05, "memory": 7313, "data_time": 0.01388, "loss_rpn_cls": 0.01788, "loss_rpn_bbox": 0.05709, "loss_cls": 0.13766, "acc": 94.5, "loss_bbox": 0.1458, "loss": 0.35843, "grad_norm": 1.75452, "time": 0.3105}
{"mode": "train", "epoch": 12, "iter": 600, "lr": 5e-05, "memory": 7313, "data_time": 0.01372, "loss_rpn_cls": 0.01985, "loss_rpn_bbox": 0.06297, "loss_cls": 0.14479, "acc": 94.18164, "loss_bbox": 0.15355, "loss": 0.38116, "grad_norm": 1.84367, "time": 0.31359}
{"mode": "val", "epoch": 12, "iter": 8541, "lr": 5e-05, "mAP": 0.66536}

The full log is too long, so I'll only show the subsequent mAP changes:

# grep "mAP" tutorial_exps/6/20230731_1
01831.log     
evaluation = dict(interval=12, metric='mAP')
| mAP                |        |        |        | 0.665 |
2023-07-31 11:29:11,760 - mmrotate - INFO - Epoch(val) [12][8541]       mAP: 0.6654
| mAP                |        |        |        | 0.668 |
2023-07-31 12:19:05,220 - mmrotate - INFO - Epoch(val) [24][8541]       mAP: 0.6675
| mAP                |        |        |        | 0.668 |
2023-07-31 13:08:50,656 - mmrotate - INFO - Epoch(val) [36][8541]       mAP: 0.6682
| mAP                |        |        |        | 0.670 |
2023-07-31 13:59:27,904 - mmrotate - INFO - Epoch(val) [48][8541]       mAP: 0.6699
| mAP                |        |        |        | 0.671 |
2023-07-31 14:49:44,282 - mmrotate - INFO - Epoch(val) [60][8541]       mAP: 0.6709
| mAP                |        |        |        | 0.672 |
2023-07-31 15:40:11,998 - mmrotate - INFO - Epoch(val) [72][8541]       mAP: 0.6720
| mAP                |        |        |        | 0.673 |
2023-07-31 16:30:12,368 - mmrotate - INFO - Epoch(val) [84][8541]       mAP: 0.6726
| mAP                |        |        |        | 0.673 |
2023-07-31 17:20:18,145 - mmrotate - INFO - Epoch(val) [96][8541]       mAP: 0.6733
| mAP                |        |        |        | 0.674 |
2023-07-31 18:10:21,347 - mmrotate - INFO - Epoch(val) [108][8541]      mAP: 0.6743
| mAP                |        |        |        | 0.675 |
2023-07-31 19:00:17,892 - mmrotate - INFO - Epoch(val) [120][8541]      mAP: 0.6751
| mAP                |        |        |        | 0.676 |
2023-07-31 19:50:30,967 - mmrotate - INFO - Epoch(val) [132][8541]      mAP: 0.6758
| mAP                |        |        |        | 0.677 |
2023-07-31 20:40:40,306 - mmrotate - INFO - Epoch(val) [144][8541]      mAP: 0.6768
| mAP                |        |        |        | 0.678 |
2023-07-31 21:30:40,664 - mmrotate - INFO - Epoch(val) [156][8541]      mAP: 0.6777
| mAP                |        |        |        | 0.678 |
2023-07-31 22:20:39,693 - mmrotate - INFO - Epoch(val) [168][8541]      mAP: 0.6778
| mAP                |        |        |        | 0.679 |
2023-07-31 23:10:36,160 - mmrotate - INFO - Epoch(val) [180][8541]      mAP: 0.6787
| mAP                |        |        |        | 0.679 |
2023-08-01 00:00:31,284 - mmrotate - INFO - Epoch(val) [192][8541]      mAP: 0.6795
| mAP                |        |        |        | 0.680 |
2023-08-01 00:50:29,689 - mmrotate - INFO - Epoch(val) [204][8541]      mAP: 0.6797
| mAP                |        |        |        | 0.680 |
2023-08-01 01:40:17,485 - mmrotate - INFO - Epoch(val) [216][8541]      mAP: 0.6804
| mAP                |        |        |        | 0.681 |
2023-08-01 02:30:05,030 - mmrotate - INFO - Epoch(val) [228][8541]      mAP: 0.6814
| mAP                |        |        |        | 0.682 |
2023-08-01 03:19:05,696 - mmrotate - INFO - Epoch(val) [240][8541]      mAP: 0.6816
| mAP                |        |        |        | 0.682 |
2023-08-01 04:08:08,600 - mmrotate - INFO - Epoch(val) [252][8541]      mAP: 0.6822
| mAP                |        |        |        | 0.683 |
2023-08-01 04:57:24,387 - mmrotate - INFO - Epoch(val) [264][8541]      mAP: 0.6830
| mAP                |        |        |        | 0.683 |
2023-08-01 05:46:30,490 - mmrotate - INFO - Epoch(val) [276][8541]      mAP: 0.6834
| mAP                |        |        |        | 0.683 |
2023-08-01 06:35:37,152 - mmrotate - INFO - Epoch(val) [288][8541]      mAP: 0.6829
| mAP                |        |        |        | 0.684 |
2023-08-01 07:24:51,862 - mmrotate - INFO - Epoch(val) [300][8541]      mAP: 0.6840
| mAP                |        |        |        | 0.684 |
2023-08-01 08:14:15,776 - mmrotate - INFO - Epoch(val) [312][8541]      mAP: 0.6835
| mAP                |        |        |        | 0.685 |
2023-08-01 09:03:35,380 - mmrotate - INFO - Epoch(val) [324][8541]      mAP: 0.6847
| mAP                |        |        |        | 0.685 |
2023-08-01 09:52:50,364 - mmrotate - INFO - Epoch(val) [336][8541]      mAP: 0.6846
| mAP                |        |        |        | 0.683 |
2023-08-01 10:42:07,869 - mmrotate - INFO - Epoch(val) [348][8541]      mAP: 0.6832
| mAP                |        |        |        | 0.684 |
2023-08-01 11:31:20,057 - mmrotate - INFO - Epoch(val) [360][8541]      mAP: 0.6839
| mAP                |        |        |        | 0.686 |
2023-08-01 12:20:27,998 - mmrotate - INFO - Epoch(val) [372][8541]      mAP: 0.6856
| mAP                |        |        |        | 0.686 |
2023-08-01 13:09:37,734 - mmrotate - INFO - Epoch(val) [384][8541]      mAP: 0.6865
| mAP                |        |        |        | 0.687 |
2023-08-01 13:58:48,836 - mmrotate - INFO - Epoch(val) [396][8541]      mAP: 0.6865
| mAP                |        |        |        | 0.683 |
2023-08-01 14:48:13,449 - mmrotate - INFO - Epoch(val) [408][8541]      mAP: 0.6831
| mAP                |        |        |        | 0.685 |
2023-08-01 15:37:20,166 - mmrotate - INFO - Epoch(val) [420][8541]      mAP: 0.6852
| mAP                |        |        |        | 0.685 |
2023-08-01 16:26:37,679 - mmrotate - INFO - Epoch(val) [432][8541]      mAP: 0.6850
| mAP                |        |        |        | 0.685 |
2023-08-01 17:15:49,519 - mmrotate - INFO - Epoch(val) [444][8541]      mAP: 0.6851
| mAP                |        |        |        | 0.687 |
2023-08-01 18:04:56,135 - mmrotate - INFO - Epoch(val) [456][8541]      mAP: 0.6866
| mAP                |        |        |        | 0.686 |
2023-08-01 18:54:17,240 - mmrotate - INFO - Epoch(val) [468][8541]      mAP: 0.6865
| mAP                |        |        |        | 0.687 |
2023-08-01 19:43:47,853 - mmrotate - INFO - Epoch(val) [480][8541]      mAP: 0.6870
| mAP                |        |        |        | 0.687 |
2023-08-01 20:33:11,036 - mmrotate - INFO - Epoch(val) [492][8541]      mAP: 0.6873
| mAP                |        |        |        | 0.687 |
2023-08-01 21:22:32,373 - mmrotate - INFO - Epoch(val) [504][8541]      mAP: 0.6872
| mAP                |        |        |        | 0.687 |
2023-08-01 22:12:05,530 - mmrotate - INFO - Epoch(val) [516][8541]      mAP: 0.6872
| mAP                |        |        |        | 0.688 |
2023-08-01 23:01:31,712 - mmrotate - INFO - Epoch(val) [528][8541]      mAP: 0.6876
| mAP                |        |        |        | 0.688 |
2023-08-01 23:50:51,220 - mmrotate - INFO - Epoch(val) [540][8541]      mAP: 0.6881
| mAP                |        |        |        | 0.688 |
2023-08-02 00:40:22,192 - mmrotate - INFO - Epoch(val) [552][8541]      mAP: 0.6879
| mAP                |        |        |        | 0.688 |
2023-08-02 01:29:36,078 - mmrotate - INFO - Epoch(val) [564][8541]      mAP: 0.6881
| mAP                |        |        |        | 0.685 |
2023-08-02 02:19:13,104 - mmrotate - INFO - Epoch(val) [576][8541]      mAP: 0.6850
| mAP                |        |        |        | 0.690 |
2023-08-02 03:08:36,319 - mmrotate - INFO - Epoch(val) [588][8541]      mAP: 0.6895
| mAP                |        |        |        | 0.685 |
2023-08-02 03:58:00,523 - mmrotate - INFO - Epoch(val) [600][8541]      mAP: 0.6854
| mAP                |        |        |        | 0.690 |
2023-08-02 04:47:33,776 - mmrotate - INFO - Epoch(val) [612][8541]      mAP: 0.6896
| mAP                |        |        |        | 0.689 |
2023-08-02 05:36:50,538 - mmrotate - INFO - Epoch(val) [624][8541]      mAP: 0.6893
| mAP                |        |        |        | 0.689 |
2023-08-02 06:26:04,417 - mmrotate - INFO - Epoch(val) [636][8541]      mAP: 0.6891
| mAP                |        |        |        | 0.690 |
2023-08-02 07:15:12,725 - mmrotate - INFO - Epoch(val) [648][8541]      mAP: 0.6902
| mAP                |        |        |        | 0.686 |
2023-08-02 08:04:08,799 - mmrotate - INFO - Epoch(val) [660][8541]      mAP: 0.6859
| mAP                |        |        |        | 0.690 |
2023-08-02 08:54:15,594 - mmrotate - INFO - Epoch(val) [672][8541]      mAP: 0.6896
| mAP                |        |        |        | 0.690 |
2023-08-02 09:43:23,146 - mmrotate - INFO - Epoch(val) [684][8541]      mAP: 0.6898
| mAP                |        |        |        | 0.686 |
2023-08-02 10:33:07,083 - mmrotate - INFO - Epoch(val) [696][8541]      mAP: 0.6861
| mAP                |        |        |        | 0.690 |
2023-08-02 11:23:02,084 - mmrotate - INFO - Epoch(val) [708][8541]      mAP: 0.6898
| mAP                |        |        |        | 0.690 |
2023-08-02 12:12:48,204 - mmrotate - INFO - Epoch(val) [720][8541]      mAP: 0.6896
| mAP                |        |        |        | 0.685 |
2023-08-02 13:02:27,879 - mmrotate - INFO - Epoch(val) [732][8541]      mAP: 0.6851
| mAP                |        |        |        | 0.689 |
2023-08-02 13:52:06,785 - mmrotate - INFO - Epoch(val) [744][8541]      mAP: 0.6895
| mAP                |        |        |        | 0.687 |
2023-08-02 14:41:48,040 - mmrotate - INFO - Epoch(val) [756][8541]      mAP: 0.6868
| mAP                |        |        |        | 0.685 |
2023-08-02 15:31:33,748 - mmrotate - INFO - Epoch(val) [768][8541]      mAP: 0.6853
| mAP                |        |        |        | 0.684 |
2023-08-02 16:21:19,817 - mmrotate - INFO - Epoch(val) [780][8541]      mAP: 0.6836
| mAP                |        |        |        | 0.686 |
2023-08-02 17:11:09,557 - mmrotate - INFO - Epoch(val) [792][8541]      mAP: 0.6864
| mAP                |        |        |        | 0.687 |
2023-08-02 18:00:54,260 - mmrotate - INFO - Epoch(val) [804][8541]      mAP: 0.6872
| mAP                |        |        |        | 0.685 |
2023-08-02 18:50:50,314 - mmrotate - INFO - Epoch(val) [816][8541]      mAP: 0.6850
| mAP                |        |        |        | 0.684 |
2023-08-02 19:40:39,120 - mmrotate - INFO - Epoch(val) [828][8541]      mAP: 0.6836
| mAP                |        |        |        | 0.684 |
2023-08-02 20:30:39,360 - mmrotate - INFO - Epoch(val) [840][8541]      mAP: 0.6836
| mAP                |        |        |        | 0.681 |
2023-08-02 21:20:26,515 - mmrotate - INFO - Epoch(val) [852][8541]      mAP: 0.6813
| mAP                |        |        |        | 0.685 |
2023-08-02 22:10:21,591 - mmrotate - INFO - Epoch(val) [864][8541]      mAP: 0.6853
| mAP                |        |        |        | 0.684 |
2023-08-02 23:00:10,578 - mmrotate - INFO - Epoch(val) [876][8541]      mAP: 0.6838
| mAP                |        |        |        | 0.683 |
2023-08-02 23:50:06,990 - mmrotate - INFO - Epoch(val) [888][8541]      mAP: 0.6833
| mAP                |        |        |        | 0.681 |
2023-08-03 00:40:04,047 - mmrotate - INFO - Epoch(val) [900][8541]      mAP: 0.6812
| mAP                |        |        |        | 0.682 |
2023-08-03 01:29:53,060 - mmrotate - INFO - Epoch(val) [912][8541]      mAP: 0.6815
| mAP                |        |        |        | 0.681 |
2023-08-03 02:19:42,982 - mmrotate - INFO - Epoch(val) [924][8541]      mAP: 0.6810
| mAP                |        |        |        | 0.682 |
2023-08-03 03:09:26,042 - mmrotate - INFO - Epoch(val) [936][8541]      mAP: 0.6817
| mAP                |        |        |        | 0.682 |
2023-08-03 03:59:15,195 - mmrotate - INFO - Epoch(val) [948][8541]      mAP: 0.6817
| mAP                |        |        |        | 0.682 |
2023-08-03 04:49:13,036 - mmrotate - INFO - Epoch(val) [960][8541]      mAP: 0.6818
| mAP                |        |        |        | 0.682 |
2023-08-03 05:38:56,656 - mmrotate - INFO - Epoch(val) [972][8541]      mAP: 0.6816
| mAP                |        |        |        | 0.681 |
2023-08-03 06:28:41,580 - mmrotate - INFO - Epoch(val) [984][8541]      mAP: 0.6812
| mAP                |        |        |        | 0.681 |
2023-08-03 07:18:26,913 - mmrotate - INFO - Epoch(val) [996][8541]      mAP: 0.6811

Can you tell me how this mAP is calculated? I tried using test.py to compute the mAP myself, but it kept getting stuck.
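
In case it helps to be concrete about what I'm asking: as far as I can tell, the mAP printed above comes from DOTADataset.evaluate, which calls eval_rbbox_map from mmrotate.core on the master (0.x) branch, i.e. a VOC-style AP at IoU 0.5 over rotated boxes, averaged across classes. Below is a minimal sketch of calling it directly; the arrays are dummy data made up only to show the expected input layout, not anything from my run.

import numpy as np

# Assumed import path for the mmrotate 0.x master branch; eval_rbbox_map is
# (to my understanding) the rotated-box VOC-style mAP used by DOTADataset.evaluate.
from mmrotate.core import eval_rbbox_map

# Dummy detections for 2 images and 1 class, just to show the layout:
# one (num_dets, 6) array per class per image, [cx, cy, w, h, angle, score],
# with the angle in radians under the 'le90' convention of this config.
det_results = [
    [np.array([[100., 100., 40., 20., 0.10, 0.90]], dtype=np.float32)],
    [np.array([[50., 60., 30., 15., -0.20, 0.80]], dtype=np.float32)],
]

# Dummy ground truth: one dict per image with (num_gts, 5) 'bboxes' and
# 'labels'; the *_ignore fields hold DOTA's "difficult" instances and can
# stay empty here.
annotations = [
    dict(bboxes=np.array([[100., 100., 40., 20., 0.10]], dtype=np.float32),
         labels=np.array([0], dtype=np.int64),
         bboxes_ignore=np.zeros((0, 5), dtype=np.float32),
         labels_ignore=np.zeros((0, ), dtype=np.int64)),
    dict(bboxes=np.array([[50., 60., 30., 15., -0.20]], dtype=np.float32),
         labels=np.array([0], dtype=np.int64),
         bboxes_ignore=np.zeros((0, 5), dtype=np.float32),
         labels_ignore=np.zeros((0, ), dtype=np.int64)),
]

# AP is computed per class at IoU 0.5 and then averaged over classes; for the
# perfect dummy matches above this should print 1.0.
mean_ap, per_class_results = eval_rbbox_map(det_results, annotations, iou_thr=0.5)
print('mAP:', mean_ap)

From the CLI, the same number should be obtainable on a split that actually has annfiles (train/val) with something like python tools/test.py <config> <checkpoint> --eval mAP. The DOTA test split ships without annotations, so for it the usual route is --format-only plus submitting the merged results to the DOTA evaluation server rather than a local --eval mAP; please correct me if I have misread how the evaluation works.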