-
Following this tutorial (https://github.com/quic/aimet/blob/develop/Examples/torch/compression/channel_pruning.py), a large model has been pruned. Here, compress.py is my script, and compress_model.pt is…
-
Not sure if this is intended behavior, but it looks like there might be an issue with concatenation based on the following test.
Code:
```python
class TestModule(nn.Module):
    def __init__(self, in_…
```
-
Hi,
When pruning, I tried the global-pruning parameter, but I saw a large difference in accuracy, and I am not clear about what this parameter actually means.
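For illustration, the usual distinction (sketched here with `torch.nn.utils.prune`, which may differ from this repo's own global-pruning flag) is that local pruning removes a fixed fraction from each layer, while global pruning ranks all weights together, so some layers can lose far more than others. That imbalance is a common source of accuracy differences.

```python
import torch.nn as nn
import torch.nn.utils.prune as prune

# Toy model with two linear layers of different sizes.
model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 4))
params = [(model[0], "weight"), (model[2], "weight")]

# Global pruning: rank ALL weights across both layers by |w| and zero the
# smallest 50% overall, so per-layer sparsity can be very uneven.
prune.global_unstructured(params, pruning_method=prune.L1Unstructured, amount=0.5)

for m, _ in params:
    frac = (m.weight == 0).float().mean().item()
    print(f"layer sparsity: {frac:.2f}")  # typically differs per layer
```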
-
Pruning a model with GLU results in an error when computing importance. GLU has no parameters but halves the input along the given dimension. This is not accounted for during tracing, assigning in…
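A minimal reproduction of the shape behavior described above: `nn.GLU` halves the chosen dimension, so the traced output channel count no longer matches the input channel count, which is what can confuse channel-dependency tracking.

```python
import torch
import torch.nn as nn

# GLU splits the input in half along `dim` and computes a * sigmoid(b),
# so it has no parameters but halves that dimension.
glu = nn.GLU(dim=1)
x = torch.randn(2, 8, 16)  # 8 channels in
y = glu(x)
print(y.shape)             # channels halved to 4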
-
For better performance, we can destroy multiple snapshots with one ioctl, e.g. from the CLI `zfs destroy pool/fs@a,b` (where `a` and `b` are both snapshots in `pool/fs`). From libzfs_core (or channel…
-
```
Prune ratio: {1-remain_num/len(sorted_bn):.3f}
mAP of the pruned model is {mAP:.4f}
layer index: {idx:>3d} total channel: {mask.shape[0]:>4d} remaining channel: {remain:>4d}
layer index: {idx:>…
```
-
After reading the paper, it still isn't clear to me what structured pruning actually prunes: parameters? weights? I find it a bit abstract; please answer my questions, thanks!
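As a rough sketch of the distinction (not this paper's exact method): unstructured pruning zeroes individual weights in place, while structured pruning removes whole structures, such as entire output channels of a convolution, which physically shrinks the tensor.

```python
import torch

w = torch.randn(8, 4, 3, 3)  # conv weight: 8 output channels

# Unstructured: zero the 50% smallest-magnitude weights element-wise.
# The tensor keeps its shape; it just becomes sparse.
thresh = w.abs().flatten().kthvalue(w.numel() // 2).values
unstructured = torch.where(w.abs() > thresh, w, torch.zeros_like(w))

# Structured: score each output channel (e.g. by L2 norm) and keep the
# top 4, producing a genuinely smaller layer.
scores = w.flatten(1).norm(dim=1)  # one score per output channel
keep = scores.topk(4).indices
structured = w[keep]               # shape: (4, 4, 3, 3)
```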
-
Hi, thanks for this great repo. I wonder how I can load the pruned model with YOLOv8?
When I try to run the command below to get the validation results, GFLOPs, and the number of layers, I get this error…
-
Is there a way to force the pruning to remove the same number of parameters from every layer?
This would make the resulting model compatible with the HF implementation (loadable via from_pretrained).
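A hypothetical sketch of what uniform pruning could look like (the helper name and approach are assumptions, not this library's API): remove the same fraction of output units from every linear layer, so all layers keep matching dimensions and the model can still be described by a single config, as from_pretrained expects.

```python
import torch
import torch.nn as nn

def uniform_prune_linear(layer: nn.Linear, keep_ratio: float) -> nn.Linear:
    """Keep the same fraction of output units in every layer (hypothetical helper)."""
    keep = int(layer.out_features * keep_ratio)
    scores = layer.weight.norm(dim=1)             # one score per output unit
    idx = scores.topk(keep).indices.sort().values
    pruned = nn.Linear(layer.in_features, keep, bias=layer.bias is not None)
    pruned.weight.data = layer.weight.data[idx]
    if layer.bias is not None:
        pruned.bias.data = layer.bias.data[idx]
    return pruned

# Every layer is pruned by the same ratio, so all end up the same size.
layers = [nn.Linear(64, 64) for _ in range(4)]
pruned = [uniform_prune_linear(l, 0.5) for l in layers]
```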
-
```
source 1_train_ofa_model.sh
Traceback (most recent call last):
  File "mynet_ofa_pruning.py", line 358, in <module>
    ofa_model = ofa_pruner.ofa_model(args.expand_ratio, args.channel_divisible, args.ex…
```