dmlc/mxnet-notebooks (Notebooks for MXNet)
Apache License 2.0 · 615 stars · 325 forks
Some question about finetuning #13
Closed · wangxianliang closed this 7 years ago

wangxianliang commented 7 years ago
During fine-tuning, is it possible to set some layers' learning rate to 0? If so, how?
In multi-task fine-tuning, how can the existing losses be replaced with new ones?
If I want to remove the last several layers and replace them with new layers, how do I implement that? Thanks a lot!