yoniaflalo / knapsack_pruning

Implementation of knapsack pruning
Apache License 2.0

Pruning all channels from layer #1

Open LsNatan opened 3 years ago

LsNatan commented 3 years ago

Hi, I ran the following command:

/home/natan/datasets/ILSVRC2012/ -b=32 --amp --model=resnet18 --lr=0.02 --sched=cosine -bp=32 --pruning_ratio=0.27 --prune --prune_skip --gamma_knowledge=20 --epochs=50

In the "redesign_module_resnet" function, the program throws an error:

Traceback (most recent call last):
  File "/home/natan/github_repos/pytorch/pruning/knapsack_pruning-master/train_pruning.py", line 727, in <module>
    main()
  File "/home/natan/github_repos/pytorch/pruning/knapsack_pruning-master/train_pruning.py", line 412, in main
    local_rank=args.local_rank, input_size=data_config['input_size'][1])
  File "/home/natan/github_repos/pytorch/pruning/knapsack_pruning-master/external/utils_pruning.py", line 801, in redesign_module_resnet
    groups=m.groups, stride=m.stride)
  File "/home/natan/miniconda3/envs/pytorch_1.6.0/lib/python3.6/site-packages/torch/nn/modules/conv.py", line 408, in __init__
    False, _pair(0), groups, bias, padding_mode)
  File "/home/natan/miniconda3/envs/pytorch_1.6.0/lib/python3.6/site-packages/torch/nn/modules/conv.py", line 83, in __init__
    self.reset_parameters()
  File "/home/natan/miniconda3/envs/pytorch_1.6.0/lib/python3.6/site-packages/torch/nn/modules/conv.py", line 86, in reset_parameters
    init.kaiming_uniform_(self.weight, a=math.sqrt(5))
  File "/home/natan/miniconda3/envs/pytorch_1.6.0/lib/python3.6/site-packages/torch/nn/init.py", line 381, in kaiming_uniform_
    fan = _calculate_correct_fan(tensor, mode)
  File "/home/natan/miniconda3/envs/pytorch_1.6.0/lib/python3.6/site-packages/torch/nn/init.py", line 350, in _calculate_correct_fan
    fan_in, fan_out = _calculate_fan_in_and_fan_out(tensor)
  File "/home/natan/miniconda3/envs/pytorch_1.6.0/lib/python3.6/site-packages/torch/nn/init.py", line 282, in _calculate_fan_in_and_fan_out
    receptive_field_size = tensor[0][0].numel()
IndexError: index 0 is out of bounds for dimension 0 with size 0

After some investigation, it seems that the pruning logic wants to prune all the channels of layer layer4.1.conv1. Does the program include some kind of logic that is supposed to prevent layer collapse?
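For context, the crash can be reproduced outside the repo by constructing a convolution with zero output channels, which is effectively what redesign_module_resnet ends up doing when every channel of a layer is pruned. A minimal sketch, assuming PyTorch 1.6 as in the traceback (newer versions compute the fan without indexing the weight tensor, so they may not raise); the channel counts are only illustrative:

```python
import torch.nn as nn

# If the knapsack solution keeps 0 channels for layer4.1.conv1, the rebuilt
# module is equivalent to a Conv2d whose weight has shape (0, C, 3, 3).
# In PyTorch 1.6, reset_parameters -> kaiming_uniform_ evaluates
# tensor[0][0].numel(), which raises the IndexError shown in the traceback.
conv = nn.Conv2d(in_channels=512, out_channels=0, kernel_size=3, padding=1)
```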

yoniaflalo commented 3 years ago

Hi, I did not implement such logic, but I definitely should, as soon as I find time in my schedule.
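For anyone hitting this in the meantime, one possible workaround is to clamp the per-layer channel budget before the modules are rebuilt, so at least one channel survives in every pruned convolution. A minimal sketch of such a guard; the function name keep_at_least_one_channel and the list-of-masks representation are illustrative assumptions, not the repo's actual data structures:

```python
import torch

def keep_at_least_one_channel(channel_masks, channel_scores):
    """Prevent layer collapse: make sure no layer keeps zero channels.

    channel_masks: list of bool tensors, one per prunable conv layer,
                   where True marks a channel the knapsack solution keeps.
    channel_scores: list of float tensors with the per-channel importance
                    scores used by the pruning criterion.
    """
    for mask, scores in zip(channel_masks, channel_scores):
        if mask.sum() == 0:
            # Layer collapse detected: re-enable the most important channel.
            mask[torch.argmax(scores)] = True
    return channel_masks
```

Running this on the masks right before redesign_module_resnet would keep the Conv2d constructor from ever being called with out_channels=0, at the cost of slightly exceeding the requested pruning ratio on the affected layer.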