RunningLeon opened this issue 1 year ago
hello @RunningLeon , sorry we don't have this feature right now, but it's on our to-do list. At which stage do you want to apply the channel constraint: in pruner simulated pruning, in pruning speedup, or both?
And could you explain why this feature is important to you, so that we can re-evaluate this feature's priority?
For now, there is a workaround that makes the number of remaining channels divisible by 8: set each layer's sparsity to a computed value. Here is an example:
```python
import math
import torch

config_list = []
target_remained = 0.3  # fraction of channels to keep
div_num = 8            # remaining channel count should be divisible by this

for name, module in model.named_modules():
    if isinstance(module, torch.nn.Conv2d):
        if module.out_channels < div_num:
            continue
        # round the kept-channel count so it is a multiple of div_num
        remained_num = math.ceil((module.out_channels // div_num) * target_remained) * div_num
        config_list.append({
            'op_names': [name],
            'sparsity': (module.out_channels - remained_num) / module.out_channels,
        })
```
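As a quick sanity check that the formula above always yields a kept-channel count that is a multiple of `div_num`, here is a self-contained sketch (the helper name `remained_channels` is my own, not part of NNI):

```python
import math

def remained_channels(out_channels, target_remained=0.3, div_num=8):
    # same arithmetic as the workaround: floor-divide into div_num-sized
    # groups, scale by the keep ratio, round up, and expand back
    return math.ceil((out_channels // div_num) * target_remained) * div_num

# e.g. a conv with 100 output channels keeps 32 channels (sparsity 0.68),
# and 32 is divisible by 8 as required
print(remained_channels(100))  # → 32
```

Because the result is always `<integer> * div_num`, every pruned conv ends up with a channel count divisible by 8, regardless of its original width.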
@J-shang Hi, thanks for your reply.
- I'd prefer both.
- The motivation is that for backends like TensorRT, performance is better if the channel number meets certain requirements. For instance, for structured sparsity in TensorRT, the channel number should be 128 * x; see this doc.
- The workaround does not work for pruners that automatically decide different sparsities for different layers or convs.
@RunningLeon , hello, thanks for your information. I believe this is a reasonable motivation for us to give this feature a higher priority. We will discuss this feature and let you know when it is done.
Thanks a lot. Looking forward to this feature.
Hi, I've looked through the docs and could not find how to set the pruning config (suppose we use L1NormPruner) so that the remaining convs have a channel number divisible by a certain number, say 8. Thanks.