aoji0606 opened this issue 1 year ago
Hi @aoji-tjut,
the parameter "global_pruning" determines how pruning ratios are applied across the layers of a neural network.
If "global_pruning" is set to False, uniform layer sparsity is used, meaning the same pruning ratio is applied to every layer. Importance ranking is then performed within each layer independently.
On the other hand, if "global_pruning" is set to True, all channels from all layers are ranked together in a single global ranking, and each layer's sparsity follows from where its channels land in that ranking.
Note that with "global_pruning" set to True, there is a risk of over-pruning: a layer whose channels all rank low may be pruned down to zero channels.
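To make the difference concrete, here is a minimal pure-Python sketch of the two ranking strategies. This is not Torch-Pruning's actual implementation; the layer names and importance scores are made up for illustration.

```python
def prune_local(layer_scores, ratio):
    """Local mode: each layer independently keeps its top (1 - ratio) channels."""
    kept = {}
    for name, scores in layer_scores.items():
        k = max(1, round(len(scores) * (1 - ratio)))
        order = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
        kept[name] = sorted(order[:k])
    return kept

def prune_global(layer_scores, ratio):
    """Global mode: rank all channels across layers together;
    per-layer sparsity emerges from the global ranking."""
    all_ch = [(s, name, i)
              for name, scores in layer_scores.items()
              for i, s in enumerate(scores)]
    all_ch.sort(reverse=True)
    k = round(len(all_ch) * (1 - ratio))
    kept = {name: [] for name in layer_scores}
    for _, name, i in all_ch[:k]:
        kept[name].append(i)
    return {name: sorted(idx) for name, idx in kept.items()}

# Hypothetical importance scores: conv1's channels all score higher than conv2's.
scores = {"conv1": [0.9, 0.8, 0.7, 0.6], "conv2": [0.3, 0.2, 0.1, 0.05]}

print(prune_local(scores, 0.5))   # {'conv1': [0, 1], 'conv2': [0, 1]}
print(prune_global(scores, 0.5))  # {'conv1': [0, 1, 2, 3], 'conv2': []}
```

With the same 50% overall ratio, local pruning removes two channels from each layer, while global pruning keeps all of conv1 and strips conv2 down to zero channels. This is exactly the over-pruning risk mentioned above, and it also explains why the two settings can produce very different accuracy.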
Ok, got it. Thank you very much~
I think it is indeed confusing. I will try to rename it. Thank you for the issue.
Hi, when I was pruning I tried the global_pruning parameter, but I saw a big difference in accuracy between the two settings, and I wasn't clear about what this parameter actually means.