RAIVNLab / supsup

Code for "Supermasks in Superposition"

DenseNets supermask #2

Open bhack opened 4 years ago

bhack commented 4 years ago

Have you ever tried to find a supermask over DenseNets?
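
For context, a minimal sketch of what "finding a supermask" means at the layer level, in the edge-popup style of "What's Hidden in a Randomly Weighted Neural Network?": weights stay frozen at their random initialization, a per-weight score is trained instead, and the forward pass keeps the top-k fraction of weights by score. Names and hyperparameters here are illustrative, not the repo's actual API:

```python
import torch
import torch.autograd as autograd
import torch.nn as nn
import torch.nn.functional as F


class GetSubnet(autograd.Function):
    """Select the top-k fraction of weights by score magnitude."""

    @staticmethod
    def forward(ctx, scores, k):
        mask = torch.zeros_like(scores)
        _, idx = scores.flatten().abs().sort()
        j = int((1 - k) * scores.numel())
        mask.flatten()[idx[j:]] = 1.0  # keep the highest-scoring weights
        return mask

    @staticmethod
    def backward(ctx, g):
        # Straight-through estimator: pass gradients to the scores unchanged.
        return g, None


class MaskedConv2d(nn.Conv2d):
    """Conv layer with frozen random weights and a learned supermask."""

    def __init__(self, *args, k=0.5, **kwargs):
        super().__init__(*args, **kwargs)
        self.k = k  # fraction of weights kept by the mask
        self.scores = nn.Parameter(torch.randn_like(self.weight) * 0.01)
        self.weight.requires_grad = False  # the weights are never trained

    def forward(self, x):
        mask = GetSubnet.apply(self.scores, self.k)
        return F.conv2d(x, self.weight * mask, self.bias, self.stride,
                        self.padding, self.dilation, self.groups)
```

Swapping a layer like this into a DenseNet's convolutions is essentially the experiment the question is asking about.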

mitchellnw commented 4 years ago

This seems like more of a question for

bhack commented 4 years ago

I was interested in your specific context :wink: and the comments and FAQ section at https://mitchellnw.github.io/blog/2020/supsup/ were pointing to this repo :smile_cat:

bhack commented 4 years ago

P.S. I got this vague idea from reading the conclusions of https://arxiv.org/abs/2006.12156.

If they are wondering about skip connections, why not about dense connections?

mitchellnw commented 4 years ago

Oops! Sorry about that :)

We tried skip-connections with ResNets here, which worked well.

I believe dense-connections have not been explored with supermasks and it seems like a really interesting direction!
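
To make that direction concrete, here is a hedged sketch of a DenseNet-style block built from the `MaskedConv2d` sketched above (batch norm omitted for brevity). The structural difference from a residual add is that every masked feature map is concatenated and reused by all later layers, so each mask decision propagates through the whole rest of the block:

```python
import torch
import torch.nn as nn


class MaskedDenseBlock(nn.Module):
    """DenseNet-style block where every conv is a supermasked layer."""

    def __init__(self, in_channels, growth_rate, num_layers, k=0.5):
        super().__init__()
        self.layers = nn.ModuleList(
            [MaskedConv2d(in_channels + i * growth_rate, growth_rate,
                          kernel_size=3, padding=1, bias=False, k=k)
             for i in range(num_layers)])

    def forward(self, x):
        features = [x]
        for layer in self.layers:
            # Each layer sees the concatenation of all earlier outputs.
            features.append(torch.relu(layer(torch.cat(features, dim=1))))
        return torch.cat(features, dim=1)
```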

bhack commented 4 years ago

Yes, I know, but I meant that in the mentioned work the conclusion was more related to their strong claim that finding a subnetwork "only needs a logarithmic factor (in all variables but depth) number of neurons per weight of the target subnetwork".

So the open question was more about the impact of convolutional and batch-norm layers, skip-connections (DenseNet-like connections?), and LSTMs on the number of sampled neurons required to maintain good accuracy.

bhack commented 4 years ago

I also meant that this claim could have an interesting impact on your specific continual-learning setup: if you can free up "more resources", that is useful when you need to expand to new tasks.
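
As a hedged illustration of that point (illustrative names, reusing `GetSubnet` from the sketch above; not necessarily the repo's actual classes): in a SupSup-style setup there is one frozen backbone and one learned score tensor per task, so any capacity not claimed by one task's mask remains available for the next:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiTaskMaskedLinear(nn.Linear):
    """Shared frozen weights; one supermask (score tensor) per task."""

    def __init__(self, in_features, out_features, num_tasks, k=0.5):
        super().__init__(in_features, out_features, bias=False)
        self.k = k
        self.weight.requires_grad = False  # backbone shared across tasks
        self.scores = nn.ParameterList(
            [nn.Parameter(torch.randn_like(self.weight) * 0.01)
             for _ in range(num_tasks)])

    def forward(self, x, task_id):
        mask = GetSubnet.apply(self.scores[task_id], self.k)
        return F.linear(x, self.weight * mask)


# e.g. layer = MultiTaskMaskedLinear(784, 256, num_tasks=5)
#      y = layer(torch.randn(32, 784), task_id=0)
```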

mitchellnw commented 4 years ago

Thanks, that could definitely help!

bhack commented 4 years ago

If you are interested in this, see also "Optimal Lottery Tickets via SubsetSum: Logarithmic Over-Parameterization is Sufficient".
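
As a toy, brute-force illustration of the SubsetSum idea (my own sketch, not code from the paper): a single target weight is approximated by the sum of a subset of uniform random samples, and the paper's result is that O(log(1/eps)) samples per target weight suffice, which is where the logarithmic over-parameterization bound comes from:

```python
import itertools
import random


def best_subset_sum(target, n_samples=12, seed=0):
    """Brute-force the subset of random samples whose sum best matches target."""
    rng = random.Random(seed)
    samples = [rng.uniform(-1, 1) for _ in range(n_samples)]
    best, best_err = (), abs(target)  # start from the empty subset (sum 0)
    for r in range(1, n_samples + 1):
        for subset in itertools.combinations(samples, r):
            err = abs(target - sum(subset))
            if err < best_err:
                best, best_err = subset, err
    return best, best_err


subset, err = best_subset_sum(0.731)
print(f"approximated 0.731 with {len(subset)} of 12 samples, error {err:.2e}")
```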

mitchellnw commented 4 years ago

Thank you, we have seen this but haven't taken a close look! Hopefully we can soon; it seems awesome.

bhack commented 4 years ago

Other than DenseNets, another interesting direction is Transformers. Some early exploration efforts were made in:

- https://arxiv.org/abs/2005.00561
- https://arxiv.org/abs/2005.03454