google-research / jaxpruner

Apache License 2.0

Roadmap and experimental.sparse #8

Closed: stepp1 closed this issue 11 months ago

stepp1 commented 1 year ago

Hi everyone!

First, I wanted to say thanks for such an easy-to-use library! I've been using Jaxpruner for some weeks now, resulting in highly pruned convolutional models (mostly with unstructured pruners).

I wanted to ask whether there is a roadmap for the library, and whether you have an expected release date for the integration with jax.experimental.sparse.

Thanks in advance

evcu commented 1 year ago

Hi Stefano,

We have a colab which converts a given checkpoint to jax.experimental.sparse, we plan to release this relatively soon. Apart from that, we are closely tracking the developments on the jax side. Let us know if you have any feature requests.
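The colab mentioned here hasn't been released yet at this point in the thread, but the general idea of converting a pruned checkpoint can be sketched with public JAX APIs. The sketch below is an illustration, not jaxpruner's actual conversion code: it assumes a pruned checkpoint is a pytree of dense arrays whose pruned weights are exactly zero, and maps `jax.experimental.sparse.BCOO.fromdense` over the leaves (the `params` pytree and its layer names are made up for the example).

```python
import jax
import jax.numpy as jnp
from jax.experimental import sparse

# Hypothetical pruned checkpoint: a pytree of dense arrays where
# pruned entries are exactly zero.
params = {
    "dense1": {
        "kernel": jnp.array([[0.0, 1.5], [0.0, 0.0]]),
        "bias": jnp.array([0.1, 0.0]),
    },
}

# Convert every leaf to the BCOO sparse format; only nonzeros are stored.
sparse_params = jax.tree_util.tree_map(sparse.BCOO.fromdense, params)

# Round-trip back to dense. BCOO is itself a pytree, so is_leaf is needed
# to stop tree_map from recursing into its data/indices children.
dense_again = jax.tree_util.tree_map(
    lambda m: m.todense(),
    sparse_params,
    is_leaf=lambda x: isinstance(x, sparse.BCOO),
)
```

`BCOO.fromdense` counts the nonzeros itself, so a 90%-pruned kernel stores roughly 10% of its entries (plus their indices).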

Best

stepp1 commented 1 year ago

Hi there, Utku! Sorry for such a late reply.

Here are some ideas off the top of my head that might go a long way toward improving Jaxpruner's adoption:

I'll be happy to help with any of these (if it's internally possible).

Cheers!

yuyuzh commented 1 year ago

> Hi Stefano,
>
> We have a colab which converts a given checkpoint to jax.experimental.sparse, we plan to release this relatively soon. Apart from that, we are closely tracking the developments on the jax side. Let us know if you have any feature requests.
>
> Best

Hi Utku,

I have the same question: how do I convert pruned dense model parameters to the sparse BCOO format with jax.experimental.sparse? The paper says there is an example, but I'm not sure where to find it. Could you give me some guidance on this?
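While the official example is not out yet at this point in the thread, a minimal sketch of the conversion for a single pruned weight matrix looks like the following. This is an assumption about the workflow, not jaxpruner's released recipe: the kernel values here are illustrative, and it uses the documented `jax.experimental.sparse` APIs `BCOO.fromdense` and `sparsify`.

```python
import jax.numpy as jnp
from jax.experimental import sparse

# Illustrative pruned kernel: zeros are the pruned entries.
kernel = jnp.array([[0.0, 2.0, 0.0],
                    [1.0, 0.0, 0.0]])

# BCOO stores only the nonzero values and their integer indices.
kernel_sp = sparse.BCOO.fromdense(kernel)

# sparsify lets ordinary dense-style jnp code accept BCOO operands.
matmul = sparse.sparsify(lambda w, x: x @ w)

x = jnp.array([[1.0, 1.0]])
y = matmul(kernel_sp, x)  # same result as x @ kernel
```

The appeal of `sparsify` is that the forward pass does not need to be rewritten: the same callable works whether `w` is a dense array or a BCOO matrix.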

Thanks a lot!

evcu commented 11 months ago

Sorry for being extremely slow with this. We finally found the time to push the colab out; commit 93bc832 solves this. Soon I'm hoping to share some checkpoints and update the colab with real checkpoints.