Closed funan-jhc-lee closed 3 years ago
Iterate over `model.named_parameters()` and set `requires_grad=False` for the params you want to fix:
```python
model = NeuralNetwork(...)
...
model.load(model_saved_path)
for name, p in model.named_parameters():
    if keyword in name:  # keyword is some substring of the layer name you want to fix, e.g. "layer1"
        p.requires_grad = False
```
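A runnable sketch of the pattern above, using a hypothetical two-layer `nn.Sequential` model in place of `NeuralNetwork(...)`. After freezing, it is common in plain PyTorch to pass only the still-trainable parameters to the optimizer:

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for NeuralNetwork(...): two linear layers
model = nn.Sequential(nn.Linear(4, 8), nn.Linear(8, 2))

# Freeze every parameter whose name contains the keyword
# (here "0." matches the first layer of the Sequential)
keyword = "0."
for name, p in model.named_parameters():
    if keyword in name:
        p.requires_grad = False

# Hand only the trainable parameters to the optimizer
trainable = [p for p in model.parameters() if p.requires_grad]
optimizer = torch.optim.SGD(trainable, lr=0.01)
```

Frozen parameters then receive no gradient and are never updated, even if they appear in the model.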
- A different lr per layer requires multiple optimizers; unfortunately that is not supported out of the box at the moment.
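For reference, a plain `torch.optim` optimizer does accept per-layer learning rates through parameter groups; whether the framework discussed in this issue exposes the underlying optimizer this way is an assumption. A minimal sketch with a hypothetical two-layer model:

```python
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(4, 8), nn.Linear(8, 2))

# One parameter group per layer, each with its own learning rate
optimizer = torch.optim.SGD([
    {"params": model[0].parameters(), "lr": 1e-3},
    {"params": model[1].parameters(), "lr": 1e-4},
])
```

Each group's lr can also be adjusted later through `optimizer.param_groups`.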
Hello! As stated in the title, is it possible to specify a different lr for different layers, or to fix some layers, when retraining a model? I've read the documentation, but it seems this is not possible. Thanks a lot for taking the time to answer this question!