You can set that layer's accGradParameters function to an empty function.
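A minimal sketch of what that looks like, assuming a toy nn.Sequential model whose first layer should be frozen (the model itself is just for illustration):

```lua
require 'nn'

local model = nn.Sequential()
model:add(nn.Linear(10, 5))  -- layer to freeze
model:add(nn.Linear(5, 2))

-- Overriding accGradParameters with a no-op means gradWeight/gradBias
-- for this layer are never accumulated, so a plain SGD step leaves it
-- unchanged.
model:get(1).accGradParameters = function() end
```

Note that gradients still flow *through* the frozen layer to the layers below it, since updateGradInput is untouched.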
Hi @soumith. Is it possible to implement a Freeze container which freezes the modules inside? I think it would be more convenient than setting each layer's accGradParameters function to an empty function.
Hi everyone, does the proposed solution still work if you flatten the parameters into a single tensor for training with optim?
It won't work if you have weight decay: the decay term shrinks every entry of the flattened parameter tensor even when its gradient is zero. I would set the layer's parameters() function to an empty function as well, so the frozen parameters are excluded from the flattened tensor entirely.
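For illustration, a hedged sketch of that combination, again assuming the same toy two-layer model as above:

```lua
require 'nn'

local model = nn.Sequential()
model:add(nn.Linear(10, 5))
model:add(nn.Linear(5, 2))

local frozen = model:get(1)
frozen.accGradParameters = function() end  -- never accumulate gradients
frozen.parameters = function() end         -- nn.Container:parameters()
                                           -- skips children returning nil

-- The flattened tensors now hold only the trainable layers' parameters,
-- so optim's weight decay cannot touch the frozen ones. The override
-- must happen before getParameters() is called.
local params, gradParams = model:getParameters()
```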
@dno89 I implemented a Container based on @joeyhng's idea. You can find it on the Torch forum: https://groups.google.com/forum/#!topic/torch7/NMb4DcUaKdg
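The forum post has the actual code; for readers who can't reach it, here is a rough reconstruction of what such a Freeze container might look like (my own sketch of the idea, not necessarily @SunnyWay's implementation):

```lua
require 'nn'

local Freeze, parent = torch.class('nn.Freeze', 'nn.Container')

function Freeze:__init(module)
   parent.__init(self)
   self:add(module)
end

function Freeze:updateOutput(input)
   self.output = self.modules[1]:updateOutput(input)
   return self.output
end

function Freeze:updateGradInput(input, gradOutput)
   -- gradients still flow through the frozen module to earlier layers
   self.gradInput = self.modules[1]:updateGradInput(input, gradOutput)
   return self.gradInput
end

function Freeze:accGradParameters() end  -- never accumulate gradients

function Freeze:parameters() end         -- hide params from getParameters()
```

Usage would then be as simple as wrapping the layer, e.g. `model:add(nn.Freeze(nn.Linear(10, 5)))`.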
@SunnyWay nice solution, thank you!
Hi everyone. I am wondering how I can freeze one or more layers of a network during training? Thanks