dmlc / mxnet-notebooks

Notebooks for MXNet
Apache License 2.0

Some questions about fine-tuning #13

Closed wangxianliang closed 7 years ago

wangxianliang commented 7 years ago
  1. During fine-tuning, is it possible to set some layers' learning rates to 0? If so, how?
  2. In multi-task fine-tuning, how do I replace the original loss with multiple new losses?
  3. If I want to remove the last several layers and replace them with new ones, how do I implement that? Thanks a lot!