Hi @emipasat,
Sorry for not being clear. My question was whether it's worth freezing the backbone layers and training only the ones you added at the end. Would that work, too? Thank you
You could give it a try. However, in my experience, fine-tuning generally gives better results than transfer learning.
When training the RPN model, for example, wouldn't it be more suitable to reuse the 'imagenet' weights instead of retraining the whole model from scratch? Did you notice better results when training the whole model rather than only the last layers? Thank you
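For reference, here is a minimal Keras sketch of the two options being discussed — loading ImageNet weights into a backbone and either freezing it (training only the new head) or keeping it trainable (fine-tuning). This is not this repo's exact API; the ResNet50 backbone, head layers, and class count are placeholder assumptions.

```python
from tensorflow.keras.applications import ResNet50
from tensorflow.keras import layers, models

# Backbone initialized from ImageNet weights instead of training from scratch
backbone = ResNet50(weights="imagenet", include_top=False, input_shape=(224, 224, 3))

# Option A: freeze the backbone and train only the newly added layers
backbone.trainable = False

# Option B (fine-tuning): keep the backbone trainable, usually with a lower
# learning rate, so the pretrained weights are only gently adjusted
# backbone.trainable = True

x = layers.GlobalAveragePooling2D()(backbone.output)
x = layers.Dense(256, activation="relu")(x)          # example new head layer
outputs = layers.Dense(10, activation="softmax")(x)  # hypothetical 10-class task

model = models.Model(inputs=backbone.input, outputs=outputs)
model.compile(optimizer="adam", loss="categorical_crossentropy")
model.summary()
```

A common middle ground is to start with Option A for a few epochs and then unfreeze the backbone for fine-tuning, which often matches the observation above that fine-tuning tends to beat pure transfer learning.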