VainF / Torch-Pruning

[CVPR 2023] Towards Any Structural Pruning; LLMs / SAM / Diffusion / Transformers / YOLOv8 / CNNs
https://arxiv.org/abs/2301.12900
MIT License

How to prune the yolov8m COCO model? #389

Open · valentin-phoenix opened this issue 3 months ago

valentin-phoenix commented 3 months ago

I'm not a YOLO expert, but these lines may be helpful for post-training:

# Measure the baseline cost before pruning (same utility, same example_inputs),
# then re-measure after the pruner has run:
base_macs, base_nparams = tp.utils.count_ops_and_params(model.model, example_inputs)
# ... pruner.step() runs here ...
pruned_macs, pruned_nparams = tp.utils.count_ops_and_params(pruner.model, example_inputs)
print(model.model)  # inspect the pruned architecture
print("Before Pruning: MACs=%f G, #Params=%f M" % (base_macs / 1e9, base_nparams / 1e6))
print("After Pruning: MACs=%f G, #Params=%f M" % (pruned_macs / 1e9, pruned_nparams / 1e6))

# post-training
model.train(data='coco128.yaml', epochs=100, imgsz=640)

Reference: https://docs.ultralytics.com/modes/train/

Please replace the coco128 toy dataset with the full COCO dataset and use a smaller learning rate (original lr x 0.1) for post-training.

Originally posted by @VainF in https://github.com/VainF/Torch-Pruning/issues/147#issuecomment-1510190688
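For the smaller post-training learning rate mentioned in the quote above, Ultralytics exposes the initial learning rate through the lr0 training argument (default 0.01), so a rough sketch of the fine-tuning call with a 10x reduction could look like the following; coco.yaml and the epoch count are placeholders, not tuned values:

# post-training on full COCO with a 10x smaller initial learning rate
# (lr0 defaults to 0.01 in Ultralytics; 0.001 follows the "original lr x 0.1" advice above)
model.train(data='coco.yaml', epochs=100, imgsz=640, lr0=0.001)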

Could you share how to set up the best parameters to obtain a pruned YOLOv8m COCO model? Where do I change the learning rate, and what are the best numbers of pruning iteration steps and fine-tuning epochs to choose? (This refers to the most recent version of this script: https://github.com/VainF/Torch-Pruning/blob/master/examples/yolov8/yolov8_pruning.py)
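For orientation, the iteration-step count and pruning strength are set when the pruner is constructed. Below is a minimal sketch using the generic Torch-Pruning API from the project README; the argument names may differ between versions (e.g. ch_sparsity vs. pruning_ratio) and from the CLI flags of yolov8_pruning.py, the values are placeholders rather than recommendations, and the real script additionally rewrites some YOLOv8 modules and excludes the detection head before building the pruner:

import torch
import torch_pruning as tp
from ultralytics import YOLO

model = YOLO('yolov8m.pt')                      # Ultralytics wrapper; model.model is the underlying nn.Module
example_inputs = torch.randn(1, 3, 640, 640)

imp = tp.importance.MagnitudeImportance(p=2)    # L2-norm channel importance
pruner = tp.pruner.MagnitudePruner(
    model.model,
    example_inputs,
    importance=imp,
    iterative_steps=16,                         # placeholder: number of prune/fine-tune rounds
    ch_sparsity=0.5,                            # placeholder target sparsity (pruning_ratio in newer versions)
    ignored_layers=[],                          # detection-head modules are typically listed here
)

for _ in range(16):
    pruner.step()                               # remove one increment of channels
    # fine-tune between steps, e.g. model.train(data='coco.yaml', epochs=..., lr0=...)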