VainF / Torch-Pruning

[CVPR 2023] Towards Any Structural Pruning; LLMs / SAM / Diffusion / Transformers / YOLOv8 / CNNs
https://arxiv.org/abs/2301.12900
MIT License

Prune Yolov8-pose failed #302

Open · J0eky opened this issue 9 months ago

J0eky commented 9 months ago

When I tried to prune yolov8m-pose.pt using yolov8_pruning.py, I encountered a bug:

Traceback (most recent call last):
  File "/home/huangjun/ultralytics/yolov8_pruning.py", line 390, in <module>
    prune(args)
  File "/home/huangjun/ultralytics/yolov8_pruning.py", line 334, in prune
    pruner.step()
  File "/home/huangjun/.conda/envs/yolov8_hj/lib/python3.8/site-packages/torch_pruning/pruner/algorithms/group_norm_pruner.py", line 86, in step
    super(GroupNormPruner, self).step(interactive=interactive)
  File "/home/huangjun/.conda/envs/yolov8_hj/lib/python3.8/site-packages/torch_pruning/pruner/algorithms/metapruner.py", line 227, in step
    for group in pruning_method():
  File "/home/huangjun/.conda/envs/yolov8_hj/lib/python3.8/site-packages/torch_pruning/pruner/algorithms/metapruner.py", line 343, in prune_local
    imp = self.estimate_importance(group, ch_groups=ch_groups)
  File "/home/huangjun/.conda/envs/yolov8_hj/lib/python3.8/site-packages/torch_pruning/pruner/algorithms/metapruner.py", line 231, in estimate_importance
    return self.importance(group, ch_groups=ch_groups)
  File "/home/huangjun/.conda/envs/yolov8_hj/lib/python3.8/site-packages/torch/autograd/grad_mode.py", line 27, in decorate_context
    return func(*args, **kwargs)
  File "/home/huangjun/.conda/envs/yolov8_hj/lib/python3.8/site-packages/torch_pruning/pruner/importance.py", line 199, in __call__
    w = layer.weight.data[idxs]
IndexError: index 576 is out of bounds for dimension 0 with size 576

I have no idea how to solve it.
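A minimal diagnostic sketch for localizing this kind of mismatch, assuming the same yolov8m-pose.pt checkpoint, a 640x640 input, and Torch-Pruning's public DependencyGraph API (get_all_groups and the (dep, idxs) pairs inside each group); it only reports candidate layers, it is not a fix:

import torch
import torch.nn as nn
import torch_pruning as tp
from ultralytics import YOLO

# Build the dependency graph for the same model and input size as in the report.
model = YOLO('yolov8m-pose.pt').model.eval()
example_inputs = torch.randn(1, 3, 640, 640)
DG = tp.DependencyGraph().build_dependency(model, example_inputs=example_inputs)

# Walk every pruning group and flag index lists that exceed dim 0 of the weight.
for group in DG.get_all_groups(root_module_types=[nn.Conv2d, nn.Linear]):
    for dep, idxs in group:
        layer = dep.target.module
        if isinstance(layer, (nn.Conv2d, nn.Linear)) and len(idxs) > 0:
            if max(idxs) >= layer.weight.shape[0]:
                # Some dependencies prune input channels (dim 1), so a hit here
                # is only a candidate for the IndexError above, not proof.
                print(dep, 'max idx', max(idxs), 'weight shape', tuple(layer.weight.shape))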

VainF commented 9 months ago

Hi @J0eky, thanks for the issue. Will check the pose model ASAP.

J0eky commented 8 months ago

@VainF Hi, after updating torch-pruning from version 1.3.1 to 1.3.3, I no longer hit the bug mentioned above and the pruning program ran without any issue. However, I noticed a new problem: the pruned model was identical to the one before pruning. In other words, the pruning operation didn't seem to have any effect.
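A quick way to verify whether a pruning step actually modified the network is to diff per-layer parameter shapes around pruner.step(); a minimal sketch, assuming model and pruner are constructed exactly as in the script posted below:

def weight_shapes(module):
    # Snapshot every parameter's shape so structural changes are easy to diff.
    return {name: tuple(p.shape) for name, p in module.named_parameters()}

before = weight_shapes(model.model)
pruner.step()
after = weight_shapes(model.model)

# Parameters whose shape changed; an empty dict means step() was a no-op.
changed = {k: (before[k], after[k]) for k in before if before[k] != after.get(k)}
print(len(changed), 'parameters changed shape')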

J0eky commented 8 months ago

Here is my code:

import torch
import torch.nn as nn
from ultralytics import YOLO
import torch_pruning as tp

from ultralytics.nn.modules import Pose


def prune():
    # load the trained yolov8m-pose model
    model = YOLO('yolov8m-pose.pt')

    # make all parameters trainable
    for name, param in model.model.named_parameters():
        param.requires_grad = True

    # pruning
    model.model.eval()
    example_inputs = torch.randn(1, 3, 640, 640).to(model.device)
    imp = tp.importance.MagnitudeImportance(p=1)

    ignored_layers = []
    unwrapped_parameters = []

    # do not prune the Pose head
    modules_list = list(model.model.modules())
    for i, m in enumerate(modules_list):
        if isinstance(m, (Pose,)):
            ignored_layers.append(m)

    iterative_steps = 1  # progressive pruning
    pruner = tp.pruner.GroupNormPruner(
        model.model,
        example_inputs,
        importance=tp.importance.GroupNormImportance(),  # L2 norm pruning
        iterative_steps=1,
        pruning_ratio=0.5,
        ignored_layers=ignored_layers,
        unwrapped_parameters=unwrapped_parameters
    )
    # pruner = tp.pruner.MagnitudePruner(
    #     model.model,
    #     example_inputs,
    #     importance=imp,
    #     iterative_steps=iterative_steps,
    #     global_pruning=True,
    #     pruning_ratio=0.5,  # remove 50% channels
    #     ignored_layers=ignored_layers,
    #     unwrapped_parameters=unwrapped_parameters
    # )
    base_macs, base_nparams = tp.utils.count_ops_and_params(model.model, example_inputs)
    pruner.step()

    pruned_macs, pruned_nparams = tp.utils.count_ops_and_params(pruner.model, example_inputs)
    print(model.model)

    print("Before Pruning: MACs=%f G, #Params=%f G" % (base_macs / 1e9, base_nparams / 1e9))
    print("After Pruning: MACs=%f G, #Params=%f G" % (pruned_macs / 1e9, pruned_nparams / 1e9))

    # print(pruner.model)
    # fine-tuning, TBD


if __name__ == "__main__":
    prune()
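A side note for the "fine-tuning, TBD" step above: once pruning does take effect, the layer shapes no longer match the original architecture definition, so the pruned model has to be saved as a whole module object rather than as a plain state_dict. A minimal sketch (the file name is just a placeholder):

# Structural pruning changes layer shapes, so persist the full module object.
model.model.zero_grad()  # drop gradient buffers before serializing
torch.save(model.model, 'yolov8m-pose-pruned.pt')

# Reload the pruned module directly later (recent PyTorch versions may require
# weights_only=False here, since this unpickles a full nn.Module).
pruned_model = torch.load('yolov8m-pose-pruned.pt')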

J0eky commented 8 months ago

@VainF Hi, after updating torch-pruning to 1.3.3, the pruner.step() in yolov8_pruning.py didn't execute.

J0eky commented 8 months ago

Debugging confirmed that pruner.step() did not actually execute the pruning.
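One way to check whether step() produces any pruning groups at all is Torch-Pruning's interactive mode, which yields each group before it is applied; a minimal sketch, assuming the same pruner constructed in the script above:

# interactive=True yields pruning groups instead of applying them silently,
# so an empty iteration means step() has nothing to prune.
num_groups = 0
for group in pruner.step(interactive=True):
    num_groups += 1
    # print(group)  # uncomment to inspect the layers and channel indices involved
    group.prune()   # apply this group before moving on; groups must be handled sequentially
print(num_groups, 'pruning groups were applied')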