Closed pluskid closed 9 years ago
Part of #29. Provides the ability to freeze some layers in the net. This is useful for bottom-up, layer-wise pre-training: the user can freeze the lower, already-trained layers and train only the top layer.
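The idea can be sketched as follows. This is a minimal, hypothetical Python illustration of the mechanism (not this library's actual API): a frozen flag per layer, and a solver step that skips parameter updates for frozen layers. All names (`Layer`, `freeze`, `sgd_step`) are invented for illustration.

```python
# Hypothetical sketch of layer freezing for layer-wise pre-training.
# Not the library's real API; names here are made up for illustration.

class Layer:
    def __init__(self, name, weight):
        self.name = name
        self.weight = weight
        self.frozen = False  # frozen layers are excluded from updates

def freeze(net, names):
    # Mark the named layers so the solver leaves their parameters untouched.
    for layer in net:
        if layer.name in names:
            layer.frozen = True

def sgd_step(net, grads, lr=0.1):
    # Update only the layers that are not frozen.
    for layer, g in zip(net, grads):
        if not layer.frozen:
            layer.weight -= lr * g

net = [Layer("encoder1", 1.0), Layer("encoder2", 2.0), Layer("top", 3.0)]
freeze(net, {"encoder1", "encoder2"})  # pre-trained lower layers stay fixed
sgd_step(net, grads=[1.0, 1.0, 1.0])
print([layer.weight for layer in net])  # → [1.0, 2.0, 2.9]
```

Only the top layer moves; the frozen lower layers keep the parameters they learned during pre-training.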
Coverage increased (+0.0%) when pulling 18e5f6db796a3adced7388984960ce734968dcfc on dA into 9b4bd0c6d42279857a635b133d5ab0feb664e875 on master.