Open bmsookim opened 7 years ago
Is it possible to fine-tune the pretrained model with a different learning rate for each layer?
For example, using a rate of 0.001 for the first convolution, 0.0001 for the second convolution, 0.00001 for the third convolution, etc.
You can set your own learning-rate policy in the train.lua script: modify the Trainer:learningRate function at the end of the script.
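Besides editing the schedule in Trainer:learningRate, Torch's optim.sgd also accepts a learningRates tensor of per-parameter multipliers applied on top of the base rate, which gives layer-wise rates directly. A minimal sketch, assuming a model built with nn whose parameters are flattened via getParameters (the layer boundary indices below are hypothetical and depend on your architecture):

```lua
require 'nn'
require 'optim'

-- Flatten all parameters into one vector, as train.lua-style trainers do.
local params, gradParams = model:getParameters()

-- Per-parameter multipliers: optim.sgd scales each parameter's update by
-- learningRate * learningRates[i]. Indices here are placeholders; compute
-- the real ranges from your layers' parameter sizes.
local lrs = torch.Tensor(params:size(1)):fill(1.0)
-- lrs[{{1, firstConvEnd}}]:fill(0.01)            -- first conv: 0.001 * 0.01
-- lrs[{{firstConvEnd + 1, secondConvEnd}}]:fill(0.1)  -- second conv: 0.001 * 0.1

local optimState = {
   learningRate  = 0.001,  -- base rate; multipliers above scale it per layer
   learningRates = lrs,
}

-- feval should return loss and gradParams for the current minibatch.
optim.sgd(feval, params, optimState)
```

This keeps the schedule logic in one place: Trainer:learningRate can still decay the base learningRate per epoch while the learningRates tensor fixes the relative ratio between layers.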