jacobgil / pytorch-pruning

PyTorch Implementation of [1611.06440] Pruning Convolutional Neural Networks for Resource Efficient Inference

Runtime Error #17

Open jagadeesh09 opened 6 years ago

jagadeesh09 commented 6 years ago

Hi,

I encountered this error while running the code. After getting the information about which filters are to be pruned, this issue occurred while the filters were actually being pruned:


```
/usr/local/lib/python2.7/dist-packages/torchvision/transforms/transforms.py:156: UserWarning: The use of the transforms.Scale transform is deprecated, please use transforms.Resize instead.
  "please use transforms.Resize instead.")
/usr/local/lib/python2.7/dist-packages/torchvision/transforms/transforms.py:397: UserWarning: The use of the transforms.RandomSizedCrop transform is deprecated, please use transforms.RandomResizedCrop instead.
  "please use transforms.RandomResizedCrop instead.")
Accuracy : 0.9248
Number of prunning iterations to reduce 67% filters 5
Ranking filters..
Layers that will be prunned {0: 5, 2: 4, 5: 7, 7: 6, 10: 25, 12: 24, 14: 19, 17: 42, 19: 50, 21: 66, 24: 72, 26: 74, 28: 118}
Prunning filters..
Traceback (most recent call last):
  File "finetune.py", line 270, in <module>
    fine_tuner.prune()
  File "finetune.py", line 228, in prune
    model = prune_vgg16_conv_layer(model, layer_index, filter_index)
  File "/disk2/jagadeesh/pytorch-pruning/prune.py", line 33, in prune_vgg16_conv_layer
    bias = conv.bias)
  File "/usr/local/lib/python2.7/dist-packages/torch/nn/modules/conv.py", line 278, in __init__
    False, _pair(0), groups, bias)
  File "/usr/local/lib/python2.7/dist-packages/torch/nn/modules/conv.py", line 34, in __init__
    if bias:
  File "/usr/local/lib/python2.7/dist-packages/torch/autograd/variable.py", line 125, in __bool__
    torch.typename(self.data) + " is ambiguous")
RuntimeError: bool value of Variable objects containing non-empty torch.FloatTensor is ambiguous
```

Thanks

guangzhili commented 6 years ago

@jagadeesh09 The "bias" parameter of torch.nn.Conv2d should be a bool (check the API at http://pytorch.org/docs/master/nn.html), but conv.bias is a FloatTensor. Change the call to pass bias=conv.bias is not None, so the argument is True when the original layer has a bias and False otherwise.
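
A minimal sketch of that change (assuming the new layer is built roughly the way prune.py builds it; the stand-in conv layer and argument list here are illustrative, not the repo's exact code):

```python
import torch.nn as nn

conv = nn.Conv2d(64, 128, kernel_size=3, padding=1)  # stand-in for the layer being pruned

# The key change: pass a bool to `bias`, not the FloatTensor stored in conv.bias.
new_conv = nn.Conv2d(
    in_channels=conv.in_channels,
    out_channels=conv.out_channels - 1,   # one filter removed
    kernel_size=conv.kernel_size,
    stride=conv.stride,
    padding=conv.padding,
    dilation=conv.dilation,
    groups=conv.groups,
    bias=conv.bias is not None,           # True/False instead of the bias tensor itself
)
```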

MrLinNing commented 6 years ago

Hi, I ran into the following problem, can you help me solve it?

```
Accuracy : 0.9852
Number of prunning iterations to reduce 67% filters 5
Ranking filters..
Layers that will be prunned {0: 4, 2: 10, 5: 6, 7: 7, 10: 24, 12: 19, 14: 18, 17: 59, 19: 62, 21: 58, 24: 71, 26: 87
Prunning filters..
Traceback (most recent call last):
  File "finetune.py", line 270, in <module>
    fine_tuner.prune()
  File "finetune.py", line 228, in prune
    model = prune_vgg16_conv_layer(model, layer_index, filter_index)
  File "/home/linning/pytorch_test/pytorch-pruning/prune.py", line 14, in prune_vgg16_conv_layer
    _, conv = model._modules.items()[layer_index]
TypeError: 'odict_items' object does not support indexing
```

XUHUAKing commented 6 years ago

@MrLinNing In Python 3, model._modules.items() returns an odict_items view, which does not support indexing. Try this instead: _, conv = list(model._modules.items())[layer_index]
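
A minimal standalone illustration of why the list() wrapper is needed on Python 3 (not the repo's code, just a toy OrderedDict):

```python
from collections import OrderedDict

modules = OrderedDict([("conv1", "first layer"), ("relu1", "activation")])

# modules.items()[0]  # TypeError on Python 3: 'odict_items' object does not support indexing
name, module = list(modules.items())[0]  # works: materialize the view, then index it
print(name, module)  # -> conv1 first layer
```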

RgZhangLihao commented 6 years ago

@XUHUAKing I met this problem, but I don't know how to solve it:

```
Accuracy: 0.98
Number of prunning iterations to reduce 67% filters 5
Ranking filters..
Layers that will be prunned {28: 59, 24: 139, 26: 135, 21: 54, 17: 55, 19: 52, 14: 11, 12: 5, 10: 2}
Prunning filters..
Traceback (most recent call last):
  File "finetune.py", line 269, in <module>
    fine_tuner.prune()
  File "finetune.py", line 227, in prune
    model = prune_vgg16_conv_layer(model, layer_index, filter_index)
  File "/home/share2/zhanglihao/pytorch-pruning/prune.py", line 14, in prune_vgg16_conv_layer
    _, conv = list(model._modules.items())[layer_index]
IndexError: list index out of range
```

Can you help me? Thanks!

XUHUAKing commented 6 years ago

@RgZhangLihao I am not sure about your error, because I did not run into it with my own modified program.

However, in my program I added a guard when cutting filters: if module.weight.data.size(0) <= 1, skip pruning that layer. This makes sure pruning never deletes an entire layer, which would shift the layer indices inside the model and could cause an error like yours later on.

This is just my guess, I hope it helps.
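
A rough sketch of such a guard (the function name is mine, and I am assuming the conv layers are reachable through a features-style container; adjust to wherever your copy of prune.py indexes the layers):

```python
import torch.nn as nn

def can_prune_layer(features: nn.Module, layer_index: int) -> bool:
    """Return False when the target conv layer is down to its last filter,
    so pruning it would delete the whole layer and shift the indices of the
    layers after it (one plausible cause of 'list index out of range')."""
    _, module = list(features._modules.items())[layer_index]
    return module.weight.data.size(0) > 1
```

The pruning loop would then skip a (layer_index, filter_index) pair whenever this check returns False.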

fbiying87 commented 5 years ago

@XUHUAKing I have the same "list index out of range" issue. Can you share your prune.py script with me? I don't know exactly where to put the if condition. Thanks!

Pyramiding commented 5 years ago

> @jagadeesh09 The "bias" parameter of torch.nn.Conv2d should be a bool (check the API at http://pytorch.org/docs/master/nn.html), but conv.bias is a FloatTensor. Change the call to pass bias=conv.bias is not None, so the argument is True when the original layer has a bias and False otherwise.

If I want to keep using the trained bias parameters, how can I do that?
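
Passing bias=conv.bias is not None only tells the new layer to allocate a bias tensor; the trained values still have to be copied over, dropping the entry of the pruned filter. A hedged sketch of that copy (the layer shapes and filter_index here are illustrative):

```python
import torch
import torch.nn as nn

old_conv = nn.Conv2d(64, 128, kernel_size=3, padding=1)  # layer being pruned
filter_index = 5                                          # filter to remove
new_conv = nn.Conv2d(64, 127, kernel_size=3, padding=1,
                     bias=old_conv.bias is not None)      # one filter fewer

# Carry the trained bias values over, skipping the pruned filter's entry.
if old_conv.bias is not None:
    old_bias = old_conv.bias.data
    new_conv.bias.data = torch.cat(
        (old_bias[:filter_index], old_bias[filter_index + 1:])
    )
```

The weight tensors need the same treatment along the output-channel dimension (and along the input-channel dimension of the following conv layer).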