VainF / Torch-Pruning

[CVPR 2023] DepGraph: Towards Any Structural Pruning
https://arxiv.org/abs/2301.12900
MIT License

Object detection models from torchvision not working #110

Open yummyKnight opened 1 year ago

yummyKnight commented 1 year ago

Object detection models from torchvision are not working. If you try to prune a pretrained object detection model from torchvision, you get the following error:

    pruner = tp.pruner.MagnitudePruner(
/torch_pruning/pruner/algorithms/metapruner.py", line 62, in __init__
    self.DG = dependency.DependencyGraph().build_dependency(
torch_pruning/dependency.py", line 262, in build_dependency
    self.update_index_mapping()
/torch_pruning/dependency.py", line 632, in update_index_mapping
    self._update_split_index_mapping(node)
torch_pruning/dependency.py", line 698, in _update_split_index_mapping
    offsets.append(offsets[-1] + ch)
TypeError: unsupported operand type(s) for +: 'int' and 'NoneType'

Sample code to reproduce:

    import torch_pruning as tp
    from torchvision.io.image import read_image
    from torchvision.models.detection import retinanet_resnet50_fpn_v2, RetinaNet_ResNet50_FPN_V2_Weights

    img = read_image("mmyolo/demo/demo.jpg")
    imp = tp.importance.MagnitudeImportance(p=2)
    weights = RetinaNet_ResNet50_FPN_V2_Weights.DEFAULT
    model = retinanet_resnet50_fpn_v2(weights=weights, box_score_thresh=0.9)
    model.eval()
    ignored_layers = []
    preprocess = weights.transforms()
    input_tensor = preprocess(img)
    iterative_steps = 5  # progressive pruning
    pruner = tp.pruner.MagnitudePruner(  # raises the TypeError above while building the dependency graph
        model,
        input_tensor,
        importance=imp,
        iterative_steps=iterative_steps,
        ch_sparsity=0.5,  # remove 50% of the channels
        ignored_layers=ignored_layers,
    )

Environment: python==3.8, torch==1.10.0, torch-pruning==1.0.0, torchvision==0.11.0
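
For reference, the exception is raised while the MagnitudePruner constructor builds the dependency graph, i.e. before any pruning step runs. A complete run would otherwise continue with the usual iterative loop from the project README, sketched below (fine-tuning between steps omitted):

    # never reached in this report -- the constructor above already fails
    for i in range(iterative_steps):
        pruner.step()
        # fine-tune the model here before the next pruning step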

gkrisp98 commented 1 year ago

Hi, have you figured it out? I am facing the same problem.

yummyKnight commented 1 year ago

@gkrisp98 Hi. Unfortunately, no. I figured out that the problem may lie in the SPP layers. So either modify the architecture to avoid such constructs, or prune only the backbone (see the sketch below).
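
A minimal sketch of the "prune only the backbone" idea, using the same MagnitudePruner API as in the report above. The input size, single pruning step, and 50% sparsity are illustrative, and it assumes the dependency graph can trace the BackboneWithFPN sub-module (whose forward returns a dict of feature maps) on this torch-pruning version; treat it as a starting point rather than a verified fix:

    import torch
    import torch_pruning as tp
    from torchvision.models.detection import retinanet_resnet50_fpn_v2, RetinaNet_ResNet50_FPN_V2_Weights

    model = retinanet_resnet50_fpn_v2(weights=RetinaNet_ResNet50_FPN_V2_Weights.DEFAULT)
    model.eval()

    # Build the pruner on the backbone sub-module only, so the detection head is never traced.
    backbone = model.backbone
    example_inputs = torch.randn(1, 3, 640, 640)  # dummy input, the size is arbitrary

    pruner = tp.pruner.MagnitudePruner(
        backbone,
        example_inputs,
        importance=tp.importance.MagnitudeImportance(p=2),
        iterative_steps=1,
        ch_sparsity=0.5,
        ignored_layers=[],
    )
    pruner.step()

Note that this only avoids tracing the detection head: channels pruned at the FPN outputs still have to match what the head expects, so the head would need to be adapted (or the final FPN convolutions excluded from pruning) before the full model can be used again.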