Open Xglbrilliant opened 1 year ago
Hi! Currently, we only support getting cfg files from installed packages. Maybe you can install mmcls from source:
cd /home/s316/workspace/xionggl/Experiment/mmclassification
pip install -v -e .
and then use
_base_ = [
'mmcls::_base_/datasets/imagenet_bs32.py',
'mmcls::_base_/schedules/imagenet_bs256.py',
'mmcls::_base_/default_runtime.py'
]
If it still reports an error, please let us know.
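For reference, the same scope prefix also applies to the model-level cfg_path fields that the error message points at. A minimal sketch, assuming the stock mmcls ResNet configs (swap in the relative paths of your own configs under the installed package's configs/ directory):

# Hypothetical sketch: once mmcls is installed, its configs can be referenced
# by relative path under the 'mmcls::' scope instead of an absolute path.
model = dict(
    _scope_='mmrazor',
    type='SingleTeacherDistill',
    architecture=dict(
        cfg_path='mmcls::resnet/resnet18_8xb32_in1k.py', pretrained=False),
    teacher=dict(
        cfg_path='mmcls::resnet/resnet50_8xb32_in1k.py', pretrained=True),
)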
Describe the bug
I wrote some resnet50 and resnet18 configuration files in mmclassification and experimented with their performance there. Then I wanted to distill these two configs in mmrazor, but it reports an error: only external packages can be imported, such as `mmdet::"xxx/xxx.py"`.
Traceback (most recent call last):
  File "tools/train.py", line 121, in <module>
    main()
  File "tools/train.py", line 114, in main
    runner = Runner.from_cfg(cfg)
  File "/home/s316/miniconda3/envs/razor-1x/lib/python3.8/site-packages/mmengine/runner/runner.py", line 431, in from_cfg
    runner = cls(
  File "/home/s316/miniconda3/envs/razor-1x/lib/python3.8/site-packages/mmengine/runner/runner.py", line 398, in __init__
    self.model = self.build_model(model)
  File "/home/s316/miniconda3/envs/razor-1x/lib/python3.8/site-packages/mmengine/runner/runner.py", line 800, in build_model
    model = MODELS.build(model)
  File "/home/s316/miniconda3/envs/razor-1x/lib/python3.8/site-packages/mmengine/registry/registry.py", line 521, in build
    return self.build_func(cfg, *args, **kwargs, registry=self)
  File "/home/s316/miniconda3/envs/razor-1x/lib/python3.8/site-packages/mmengine/registry/build_functions.py", line 240, in build_model_from_cfg
    return build_from_cfg(cfg, registry, default_args)
  File "/home/s316/miniconda3/envs/razor-1x/lib/python3.8/site-packages/mmengine/registry/build_functions.py", line 135, in build_from_cfg
    raise type(e)(
ValueError: class `OverhaulFeatureDistillation` in mmrazor/models/algorithms/distill/configurable/overhaul_feature_distillation.py: `_get_package_and_cfg_path` is used for get external package, please specify the package name and relative config path, just like `mmdet::faster_rcnn/faster-rcnn_r50_fpn_1x_coco.py`
To Reproduce
The command you executed.
python tools/train.py configs/distill/mmcls/r50_train_r18_abtest.py --work-dir work_dirs/test/
Post related information
- The output of
pip list | grep "mmcv\|mmrazor\|^torch"
mmcv         2.0.0rc4
mmrazor      1.0.0rc2    /home/s316/workspace/xionggl/Experiment/mmrazor
torch        1.12.0
torchaudio   0.12.0
torchvision  0.13.0
- Your config file if you modified it or created a new one.
_base_ = [
    #'mmcls::_base_/datasets/imagenet_bs32.py',
    '/home/s316/workspace/xionggl/Experiment/mmclassification/configs/_base_/datasets/dermnet_bs64.py',
    #'mmcls::_base_/schedules/imagenet_bs256.py',
    '/home/s316/workspace/xionggl/Experiment/mmclassification/configs/_base_/schedules/dermnet_bs64_warmup_coslr.py',
    #'mmcls::_base_/default_runtime.py'
    '/home/s316/workspace/xionggl/Experiment/mmclassification/configs/_base_/default_runtime.py'
]

teacher_ckpt = 'https://download.openmmlab.com/mmclassification/v0/resnet/resnet50_8xb32_in1k_20210831-ea4938fc.pth'  # noqa: E501

model = dict(
    _scope_='mmrazor',
    type='SingleTeacherDistill',
    data_preprocessor=dict(
        type='ImgDataPreprocessor',
        mean=[123.675, 116.28, 103.53],
        std=[58.395, 57.12, 57.375],
        bgr_to_rgb=True),
    architecture=dict(
        cfg_path='/home/s316/workspace/xionggl/Experiment/mmclassification/configs/resnet/resnet18_b64_dermnet.py',
        pretrained=False),
    teacher=dict(
        cfg_path='/home/s316/workspace/xionggl/Experiment/mmclassification/configs/resnet/resnet50_b64_dermnet.py',
        pretrained=True),
    teacher_ckpt=teacher_ckpt,
    #calculate_student_loss=False,
    calculate_student_loss=True,
    distiller=dict(
        type='ConfigurableDistiller',
        student_recorders=dict(
            bb_s4=dict(type='ModuleOutputs', source='backbone.layer4.1.conv2'),
            bb_s3=dict(type='ModuleOutputs', source='backbone.layer3.1.conv2'),
            bb_s2=dict(type='ModuleOutputs', source='backbone.layer2.1.conv2'),
            bb_s1=dict(type='ModuleOutputs', source='backbone.layer1.1.conv2')),
        teacher_recorders=dict(
            bb_s4=dict(type='ModuleOutputs', source='backbone.layer4.2.conv3'),
            bb_s3=dict(type='ModuleOutputs', source='backbone.layer3.5.conv3'),
            bb_s2=dict(type='ModuleOutputs', source='backbone.layer2.3.conv3'),
            bb_s1=dict(type='ModuleOutputs', source='backbone.layer1.2.conv3')),
        distill_losses=dict(
            loss_s4=dict(type='ABLoss', loss_weight=1.0),
            loss_s3=dict(type='ABLoss', loss_weight=0.5),
            loss_s2=dict(type='ABLoss', loss_weight=0.25),
            loss_s1=dict(type='ABLoss', loss_weight=0.125)),
        connectors=dict(
            loss_s4_sfeat=dict(
                type='ConvModuleConncetor',
                in_channel=512,
                out_channel=2048,
                norm_cfg=dict(type='BN'),
                act_cfg=None),
            loss_s3_sfeat=dict(
                type='ConvModuleConncetor',
                in_channel=256,
                out_channel=1024,
                norm_cfg=dict(type='BN'),
                act_cfg=None),
            loss_s2_sfeat=dict(
                type='ConvModuleConncetor',
                in_channel=128,
                out_channel=512,
                norm_cfg=dict(type='BN'),
                act_cfg=None),
            loss_s1_sfeat=dict(
                type='ConvModuleConncetor',
                in_channel=64,
                out_channel=256,
                norm_cfg=dict(type='BN'),
                act_cfg=None)),
        loss_forward_mappings=dict(
            loss_s4=dict(
                s_feature=dict(
                    from_student=True,
                    recorder='bb_s4',
                    connector='loss_s4_sfeat'),
                t_feature=dict(from_student=False, recorder='bb_s4')),
            loss_s3=dict(
                s_feature=dict(
                    from_student=True,
                    recorder='bb_s3',
                    connector='loss_s3_sfeat'),
                t_feature=dict(from_student=False, recorder='bb_s3')),
            loss_s2=dict(
                s_feature=dict(
                    from_student=True,
                    recorder='bb_s2',
                    connector='loss_s2_sfeat'),
                t_feature=dict(from_student=False, recorder='bb_s2')),
            loss_s1=dict(
                s_feature=dict(
                    from_student=True,
                    recorder='bb_s1',
                    connector='loss_s1_sfeat'),
                t_feature=dict(from_student=False, recorder='bb_s1')))))

find_unused_parameters = True

train_cfg = dict(by_epoch=True, max_epochs=20, val_interval=1)
val_cfg = dict(_delete_=True, type='mmrazor.SingleTeacherDistillValLoop')
- Your train log file if you meet the problem during training. [here]
- Other code you modified in the mmrazor folder. [here]

Additional context
Add any other context about the problem here. [here]
Did you solve the problem? I met the same issue.
It seems that mmrazor 1.0.0 and mmengine 0.7.3 support the use of local config files. I'm performing distillation of models from the mmagic package; when defining the cfg_path in the config of the SingleTeacherDistill model, put the full path to the local config file after "mmagic::" (or another similar prefix). Example:
model = dict(
    _scope_='mmrazor',
    type='SingleTeacherDistill',
    architecture=dict(
        cfg_path='mmagic::/home/user/full/path/to/config/student.py', pretrained=False),
    teacher=dict(
        cfg_path='mmagic::/home/user/full/path/to/config/teacher.py', pretrained=False),
    teacher_ckpt=teacher_ckpt,
    ......
)
Also, if a config file in the _base_ list is inherited from other packages, refer to it in the same format. 😃
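For example, a minimal sketch of such a _base_ entry (the path below is a placeholder):

_base_ = [
    # Placeholder path: a base config from your local checkout, referenced
    # with the same scope prefix as the model-level cfg_path above.
    'mmagic::/home/user/full/path/to/config/_base_/default_runtime.py',
]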
Thank you for giving me the information. I was able to run training and distillation.