NVIDIA / apex

A PyTorch Extension: Tools for easy mixed precision and distributed training in PyTorch
BSD 3-Clause "New" or "Revised" License

Can ASP be used to form N:M sparsity other than 2:4? #1283

Open BNAadministrator3 opened 2 years ago

BNAadministrator3 commented 2 years ago

I have read the paper "Channel Permutations for N:M Sparsity" and it looks like great work. However, I am unsure whether the publicly available code can be used to prune a network into N:M sparsity patterns other than 2:4, e.g. 1:4 or 1:8. If the code can do this, either directly or with only minor modification, please let me know. Thank you.
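For context, N:M sparsity keeps N nonzero values in every group of M consecutive weights, so 2:4 keeps 2 out of every 4. Below is a minimal sketch of a generic magnitude-based N:M mask in plain PyTorch; this is an illustration of the pattern, not ASP's actual API, and the helper name `nm_mask` is hypothetical:

```python
import torch

def nm_mask(weight: torch.Tensor, n: int, m: int) -> torch.Tensor:
    """Binary mask keeping the n largest-magnitude weights in each
    contiguous group of m (row-major order). Assumes weight.numel()
    is divisible by m."""
    groups = weight.abs().reshape(-1, m)           # (num_groups, m)
    idx = groups.topk(n, dim=1).indices            # top-n per group
    mask = torch.zeros_like(groups).scatter_(1, idx, 1.0)
    return mask.reshape(weight.shape)

# example: 1:4 sparsity (keep 1 weight out of every 4)
w = torch.randn(8, 16)
w_pruned = w * nm_mask(w, n=1, m=4)
```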

BNAadministrator3 commented 2 years ago

After a thorough review of the code, I have found that the paper "Channel Permutations for N:M Sparsity" did not actually publish its code. The URL given in the paper only points to the code of the basic ASP, which lacks the claimed channel permutation technique.

aojunzz commented 2 years ago

@BNAadministrator3
Hi, I am the first author of the paper "Learning N:M Fine-grained Structured Sparse Neural Networks From Scratch". We have released the source code for pruning models with arbitrary N:M sparsity; see https://github.com/NM-sparsity/NM-sparsity.
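For readers landing here: the approach in that paper trains with the N:M mask applied on the fly in the forward pass while gradients flow to all weights, so pruned weights can be revived during training (the full SR-STE method additionally regularizes the pruned weights). A minimal sketch of the straight-through idea in plain PyTorch, as an illustration rather than the repository's actual code:

```python
import torch

def nm_mask(weight: torch.Tensor, n: int, m: int) -> torch.Tensor:
    # same helper as the earlier sketch: keep the n largest-magnitude
    # entries in each contiguous group of m
    groups = weight.abs().reshape(-1, m)
    idx = groups.topk(n, dim=1).indices
    mask = torch.zeros_like(groups).scatter_(1, idx, 1.0)
    return mask.reshape(weight.shape)

class NMSparsify(torch.autograd.Function):
    """Sparse forward, dense (straight-through) backward."""

    @staticmethod
    def forward(ctx, weight, n, m):
        return weight * nm_mask(weight, n, m)

    @staticmethod
    def backward(ctx, grad_output):
        # straight-through estimator: the gradient reaches all weights,
        # not just the unpruned ones; None for the int args n and m
        return grad_output, None, None

# usage inside a layer's forward pass (1:4 sparsity):
w = torch.randn(8, 16, requires_grad=True)
NMSparsify.apply(w, 1, 4).sum().backward()   # w.grad is dense
```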

BNAadministrator3 commented 2 years ago

Thank you very much. I have noted your great work! Moreover, the code at https://github.com/NM-sparsity/NM-sparsity is very clear and inspiring. Keep in touch @NM-sparsity. PS: Shortly after this issue was posted, the code described in the paper "Channel Permutations for N:M Sparsity" was uploaded to the repository.