torch / nn


How to freeze a layer? #394

Closed arashno closed 8 years ago

arashno commented 8 years ago

Hi everyone. I am wondering how I can freeze one or more layers of a network during training? Thanks

soumith commented 8 years ago

You can set that layer's `accGradParameters` function to an empty function.
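A minimal sketch of this approach (Torch7/Lua; the network and layer names are made up for illustration). Replacing `accGradParameters` with a no-op means backprop still flows *through* the layer, but its weight gradients are never accumulated:

```lua
require 'nn'

local net = nn.Sequential()
net:add(nn.Linear(10, 20))   -- layer we want to freeze
net:add(nn.ReLU())
net:add(nn.Linear(20, 2))

-- freeze the first Linear layer: its gradWeight/gradBias are never updated
local frozen = net:get(1)
frozen.accGradParameters = function() end

net:zeroGradParameters()
local input = torch.randn(4, 10)
net:forward(input)
net:backward(input, torch.randn(4, 2))
-- frozen.gradWeight remains all zeros; gradInput is still computed,
-- so any layers before the frozen one keep training normally
```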

SunnyWay commented 7 years ago

Hi @soumith. Is it possible to implement a `Freeze` container that freezes the modules inside it? I think it would be more convenient than setting each layer's `accGradParameters` function to an empty function.

dno89 commented 7 years ago

Hi everyone, does the proposed solution still work if you flatten the parameters into a single tensor (via `getParameters()`) for training with optim?

joeyhng commented 7 years ago

It won't work if you have weight decay: optim applies the decay to every entry of the flattened parameter tensor, even when its gradient is zero, so the "frozen" weights would still shrink. I would set the layer's `parameters()` function to an empty function instead, so its weights are excluded from the flattened tensor entirely.
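A sketch of this alternative (Torch7/Lua; names are illustrative). When a child module's `parameters()` returns nil, `nn.Container:parameters()` skips it, so `getParameters()` excludes its weights and the optimizer (including weight decay) never touches them. Note the override must happen *before* calling `getParameters()`:

```lua
require 'nn'

local net = nn.Sequential()
local frozen = nn.Linear(10, 20)
net:add(frozen)
net:add(nn.Linear(20, 2))

frozen.parameters = function() end        -- exclude from the flattened tensor
frozen.accGradParameters = function() end -- and skip gradient accumulation

-- params/gradParams now cover only the second Linear layer:
-- 20*2 weights + 2 biases = 42 elements
local params, gradParams = net:getParameters()
print(params:nElement())
```

One caveat of this approach: any other code that calls `frozen:parameters()` directly (e.g. for serialization or inspection) will now see no parameters.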

SunnyWay commented 7 years ago

@dno89 I implemented a Container based on @joeyhng's idea. You can find it on the torch forum: https://groups.google.com/forum/#!topic/torch7/NMb4DcUaKdg
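The exact code is in the linked thread; below is an assumed reconstruction of the idea as a sketch (Torch7/Lua). The container wraps any module, forwards and backpropagates normally, but hides the wrapped module's parameters from `getParameters()` and skips gradient accumulation:

```lua
require 'nn'

local Freeze, parent = torch.class('nn.Freeze', 'nn.Container')

function Freeze:__init(module)
   parent.__init(self)
   self.modules = {module}
end

function Freeze:updateOutput(input)
   self.output = self.modules[1]:updateOutput(input)
   return self.output
end

function Freeze:updateGradInput(input, gradOutput)
   self.gradInput = self.modules[1]:updateGradInput(input, gradOutput)
   return self.gradInput
end

function Freeze:accGradParameters() end  -- never accumulate gradients

function Freeze:parameters() end         -- hide weights from getParameters()

-- usage: wrap the layer you want frozen
-- net:add(nn.Freeze(nn.Linear(10, 20)))
```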

dno89 commented 7 years ago

@SunnyWay nice solution, thank you!