calgaryml / condensed-sparsity

[ICLR 2024] Dynamic Sparse Training with Structured Sparsity
https://openreview.net/forum?id=kOBkxFRKTA
MIT License

How to use SRigL to adapt "N:M sparsity" #64

Closed erwangccc closed 8 months ago

erwangccc commented 1 year ago

Hi, @evcu @yanii

Thanks for your great work first. The paper said that:

This structure is a particular case of “N:M sparsity” which requires N out of M consecutive weights to be non-zero

So I want to know how to use SRigL to support "N:M sparsity" (such as 2:4 / 4:8 structured sparsity).

Thanks in advance.

mklasby commented 1 year ago

Hi @erwangccc, currently we have not implemented arbitrary N:M support, only constant fan-in. If you're willing to migrate to JAX, I suggest checking out the jaxpruner library, which does support arbitrary N:M sparsity. We are also working on migrating methods from this repo to that library.

Jaxpruner: https://github.com/google-research/jaxpruner
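
For context on what an N:M mask means, here is a minimal numpy sketch of magnitude-based N:M pruning (keep the N largest-magnitude weights in each consecutive group of M). This is an illustrative sketch only, not SRigL's or jaxpruner's actual implementation; the function name `nm_prune` is made up for this example:

```python
import numpy as np

def nm_prune(weights, n=2, m=4):
    """Zero all but the n largest-magnitude weights in each consecutive
    group of m along the flattened array (e.g. n=2, m=4 -> 2:4 sparsity).
    Assumes the total number of weights is divisible by m."""
    w = np.asarray(weights, dtype=float)
    groups = w.reshape(-1, m)                      # consecutive groups of m
    # indices of the (m - n) smallest-magnitude entries in each group
    drop = np.argsort(np.abs(groups), axis=1)[:, : m - n]
    mask = np.ones_like(groups)
    np.put_along_axis(mask, drop, 0.0, axis=1)     # zero the dropped entries
    return (groups * mask).reshape(w.shape)

w = np.array([0.1, -0.9, 0.3, 0.05, 0.7, -0.2, 0.4, 0.6])
print(nm_prune(w, n=2, m=4))
# -> [ 0.  -0.9  0.3  0.   0.7  0.   0.   0.6]
```

Each group of 4 consecutive weights keeps exactly 2 non-zeros, which is the 2:4 pattern that NVIDIA sparse tensor cores accelerate. Constant fan-in (what SRigL implements) is the special case where the group spans a whole row's inputs rather than M consecutive weights.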