Closed pranavteddu closed 4 years ago
Hi @pranavteddu,
You cannot arbitrarily replace filter pruning with channel pruning. This is explained here.
Cheers,
Neta
Hi @nzmora, can I not explicitly prune channels and apply thinning to them? Is there any way to do that?
Removing channels and filters is simple enough on sequential DNNs (e.g. VGG), but when you have more complicated data dependencies (e.g. ResNet residuals and identity concatenation) you have to be aware of these dependencies and handle them explicitly. This is all explained in the link I provided.
Cheers,
Neta
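The coupling Neta describes can be illustrated with plain shape bookkeeping (a hypothetical sketch, not Distiller code). A conv weight has shape (out_channels, in_channels, kH, kW), so removing input channels of one layer only works if the output filters of the layer feeding it are removed to match:

```python
# Hypothetical sketch of the data dependency behind channel pruning.
# Conv weight shapes are (out_channels, in_channels, kH, kW).

def prune_input_channels(shape, keep):
    """Keep only the listed input channels of a conv weight."""
    out_c, in_c, kh, kw = shape
    return (out_c, len(keep), kh, kw)

def prune_output_filters(shape, keep):
    """Keep only the listed output filters of a conv weight."""
    out_c, in_c, kh, kw = shape
    return (len(keep), in_c, kh, kw)

conv1 = (16, 16, 3, 3)        # producer: emits a 16-channel feature map
conv2 = (16, 16, 3, 3)        # consumer: expects 16 input channels
keep = list(range(12))        # suppose 4 of conv2's input channels are pruned

conv2 = prune_input_channels(conv2, keep)   # conv2 now expects 12 channels
assert conv2 == (16, 12, 3, 3)

# Without touching conv1, the shapes no longer compose:
assert conv1[0] != conv2[1]   # 16 produced vs. 12 expected -> runtime error

# Thinning must therefore also shrink the producer's output filters:
conv1 = prune_output_filters(conv1, keep)
assert conv1[0] == conv2[1]   # 12 == 12, the layers compose again
```

In a ResNet block the identity path carries the producer's channels into the residual add as well, so both branches must keep the same channel subset; this is the dependency that has to be handled explicitly.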
Thanks a lot @nzmora, now I understand how pruning the channels was affecting the previous layers. Thanks for the help.
Hi, while trying to apply channel thinning on resnet20_cifar I'm facing an input-size mismatch at layer1.conv1. It was working fine with the filter remover, but once filter thinning is replaced with channel thinning it throws:
RuntimeError: Given groups=1, weight of size 16 16 3 3, expected input[256, 12, 32, 32] to have 16 channels, but got 12 channels instead
Following is the config file used:
```yaml
version: 1

pruners:
  low_pruner:
    class: L1RankedStructureParameterPruner_AGP
    initial_sparsity: 0.10
    final_sparsity: 0.30
    group_type: Channels
    weights: [module.layer2.0.conv1.weight, module.layer2.0.conv2.weight,
              module.layer2.0.downsample.0.weight, module.layer2.1.conv2.weight,
              module.layer2.2.conv2.weight, module.layer2.1.conv1.weight,
              module.layer2.2.conv1.weight]

  fine_pruner:
    class: AutomatedGradualPruner
    initial_sparsity: 0.05
    final_sparsity: 0.70
    weights: [module.layer3.1.conv1.weight, module.layer3.1.conv2.weight,
              module.layer3.2.conv1.weight, module.layer3.2.conv2.weight]

  fc_pruner:
    class: L1RankedStructureParameterPruner_AGP
    initial_sparsity: 0.05
    final_sparsity: 0.50
    group_type: Rows
    weights: [module.fc.weight]

lr_schedulers:
  pruning_lr:
    class: StepLR
    step_size: 50
    gamma: 0.10

extensions:
  net_thinner:
    class: 'ChannelRemover'
    thinning_func_str: remove_channels
    arch: 'resnet20_cifar'
    dataset: 'cifar10'

policies:
  - pruner:
      instance_name: low_pruner
    starting_epoch: 0
    ending_epoch: 10
    frequency: 2

  - extension:
      instance_name: net_thinner
    epochs: [5]

  - pruner:
      instance_name: fine_pruner
    starting_epoch: 30
    ending_epoch: 50
    frequency: 2

  - pruner:
      instance_name: fc_pruner
    starting_epoch: 30
    ending_epoch: 50
    frequency: 2

  - lr_scheduler:
      instance_name: pruning_lr
    starting_epoch: 0
    ending_epoch: 400
    frequency: 1
```
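The traceback is consistent with the dependency explained above (a hypothetical sketch, not Distiller internals): low_pruner prunes input channels of layer2's first convs, so thinning shrinks the 16-channel feature map leaving layer1 down to 12 channels. But the residual adds inside layer1 mean that same stream also feeds layer1's own convs, whose weights were never thinned:

```python
# Hypothetical sketch of why the mismatch surfaces inside layer1.
# In resnet20_cifar, every block of layer1 reads from and adds back into
# the same 16-channel residual stream that eventually feeds layer2.

stream_channels = 16              # channels on layer1's residual stream
layer1_conv1 = (16, 16, 3, 3)     # a conv inside layer1 reading that stream
layer2_0_conv1 = (32, 16, 3, 3)   # first conv of layer2, reads it too

# ~30% channel pruning on layer2.0.conv1 keeps about 12 of 16 input channels:
keep = list(range(12))

# Thinning removes those channels from the stream feeding layer2 ...
stream_channels = len(keep)
layer2_0_conv1 = (32, len(keep), 3, 3)

# ... but through the residual adds, the very same stream feeds layer1's
# convs, and their weights still expect 16 input channels:
assert layer1_conv1[1] == 16 and stream_channels == 12
# -> "weight of size 16 16 3 3, expected input [...] to have 16 channels,
#     but got 12 channels instead"
```

Including the layer1 convs in the coupled pruning group (or leaving the layer2 channel pruning to a filter-pruning formulation, as suggested earlier in the thread) is what resolves this kind of mismatch.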