open-mmlab / mmdetection

OpenMMLab Detection Toolbox and Benchmark
https://mmdetection.readthedocs.io
Apache License 2.0

I am unable to use "Adam" optimizer in my config #5907

Closed Jamestrump closed 3 years ago

Jamestrump commented 3 years ago

This is my config file:

_base_ = '/content/mmdetection/configs/res2net/cascade_mask_rcnn_r2_101_fpn_20e_coco.py'

model = dict(
    roi_head=dict(
        bbox_head=[
            dict(
                type='Shared2FCBBoxHead',
                in_channels=256,
                fc_out_channels=1024,
                roi_feat_size=7,
                num_classes=4,
                bbox_coder=dict(
                    type='DeltaXYWHBBoxCoder',
                    target_means=[0., 0., 0., 0.],
                    target_stds=[0.1, 0.1, 0.2, 0.2]),
                reg_class_agnostic=True,
                loss_cls=dict(
                    type='CrossEntropyLoss', use_sigmoid=False, loss_weight=1.0),
                loss_bbox=dict(type='SmoothL1Loss', beta=1.0, loss_weight=1.0)),
            dict(
                type='Shared2FCBBoxHead',
                in_channels=256,
                fc_out_channels=1024,
                roi_feat_size=7,
                num_classes=4,
                bbox_coder=dict(
                    type='DeltaXYWHBBoxCoder',
                    target_means=[0., 0., 0., 0.],
                    target_stds=[0.05, 0.05, 0.1, 0.1]),
                reg_class_agnostic=True,
                loss_cls=dict(
                    type='CrossEntropyLoss', use_sigmoid=False, loss_weight=1.0),
                loss_bbox=dict(type='SmoothL1Loss', beta=1.0, loss_weight=1.0)),
            dict(
                type='Shared2FCBBoxHead',
                in_channels=256,
                fc_out_channels=1024,
                roi_feat_size=7,
                num_classes=4,
                bbox_coder=dict(
                    type='DeltaXYWHBBoxCoder',
                    target_means=[0., 0., 0., 0.],
                    target_stds=[0.033, 0.033, 0.067, 0.067]),
                reg_class_agnostic=True,
                loss_cls=dict(
                    type='CrossEntropyLoss', use_sigmoid=False, loss_weight=1.0),
                loss_bbox=dict(type='SmoothL1Loss', beta=1.0, loss_weight=1.0))
        ],
        mask_head=dict(num_classes=4)))

dataset_type = 'COCODataset'
classes = ('a', 'b', 'c', 'd')  # class names must be strings
data = dict(
    samples_per_gpu=2,
    workers_per_gpu=2,
    train=dict(
        img_prefix='/content/drive/MyDrive/train2017 (1)/',
        classes=classes,
        ann_file='/content/instances_train2017_1.json'),
    val=dict(
        img_prefix='/content/drive/MyDrive/val2017 (1)/',
        classes=classes,
        ann_file='/content/instances_val2017_1.json'),
    test=dict(
        img_prefix='/content/drive/MyDrive/val2017 (1)/',
        classes=classes,
        ann_file='/content/instances_val2017_1.json'))

evaluation = dict(
    interval=1,  # Evaluation interval
    metric=['bbox', 'segm'])  # Metrics used during evaluation
optimizer = dict(
    type='Adam',
    lr=0.0025)
optimizer_config = dict(grad_clip=None)
lr_config = dict(policy='poly', power=0.9, min_lr=1e-4, by_epoch=False)
runner = dict(type='EpochBasedRunner', max_epochs=10)
checkpoint_config = dict(interval=10)

Error

When I try to run this config I get an unusual error:

Traceback (most recent call last):
  File "/content/mmdetection/tools/train.py", line 188, in <module>
    main()
  File "/content/mmdetection/tools/train.py", line 184, in main
    meta=meta)
  File "/content/mmdetection/mmdet/apis/train.py", line 88, in train_detector
    optimizer = build_optimizer(model, cfg.optimizer)
  File "/usr/local/lib/python3.7/dist-packages/mmcv/runner/optimizer/builder.py", line 43, in build_optimizer
    optimizer = optim_constructor(model)
  File "/usr/local/lib/python3.7/dist-packages/mmcv/runner/optimizer/default_constructor.py", line 242, in __call__
    return build_from_cfg(optimizer_cfg, OPTIMIZERS)
  File "/usr/local/lib/python3.7/dist-packages/mmcv/utils/registry.py", line 55, in build_from_cfg
    raise type(e)(f'{obj_cls.__name__}: {e}')
TypeError: Adam: __init__() got an unexpected keyword argument 'momentum'

Even though I am not using momentum in my config. Any help would be appreciated, thanks in advance.
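For reference, a quick way to see where the stray momentum key comes from is to load the config with mmcv and print the fully merged optimizer dict. This is just a minimal sketch; my_config.py is a placeholder for the config file shown above.

import mmcv

# Load the config; mmcv merges it with everything listed in _base_.
cfg = mmcv.Config.fromfile('my_config.py')  # placeholder path

# Print the merged optimizer settings. If the base schedule's SGD keys were
# merged in rather than replaced, this shows momentum and weight_decay
# alongside type='Adam', which is what torch.optim.Adam rejects.
print(cfg.optimizer)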

AronLin commented 3 years ago

#5875 may help you.

Jamestrump commented 3 years ago

@AronLin I read that, but it did not help. My question is: when I set optimizer = dict(type='Adam', lr=0.0025) in my config file, do I also need to change the optimizer in the _base_/schedules .py file?

AronLin commented 3 years ago

> @AronLin I read that, but it did not help. My question is: when I set optimizer = dict(type='Adam', lr=0.0025) in my config file, do I also need to change the optimizer in the _base_/schedules .py file?

The optimizer in your config inherits from the schedules .py file, so it still has the momentum argument. To avoid inheriting it, you need to set _delete_=True in your config, such as:

optimizer = dict(
    _delete_=True,
    type='Adam',
    lr=0.0025)
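For context, here is a sketch of why this happens, based on the default MMDetection 2.x schedule files rather than your exact base config: the inherited schedule typically defines an SGD optimizer, and without _delete_ your override is merged into it key by key.

# What the inherited _base_ schedule typically defines (MMDetection 2.x default):
optimizer = dict(type='SGD', lr=0.02, momentum=0.9, weight_decay=0.0001)

# Without _delete_=True, the override above is merged key by key, giving roughly:
#   dict(type='Adam', lr=0.0025, momentum=0.9, weight_decay=0.0001)
# torch.optim.Adam has no momentum argument, hence the TypeError.
# With _delete_=True, the base dict is discarded and only your keys remain:
optimizer = dict(_delete_=True, type='Adam', lr=0.0025)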
Jamestrump commented 3 years ago

Thank you so much :)