FurkanOM / tf-faster-rcnn

TensorFlow 2 Faster R-CNN implementation from scratch, supporting batch processing, with MobileNetV2 and VGG16 backbones
Apache License 2.0
94 stars · 61 forks

Beginner question about frozen layers and transfer learning #1

Closed · emipasat closed this issue 4 years ago

emipasat commented 4 years ago

When training the RPN model, for example, wouldn't it be more suitable to reuse the 'imagenet' weights instead of retraining the whole model from scratch? Did you notice better results when training the whole model instead of only the last layers? Thank you

FurkanOM commented 4 years ago

Hi @emipasat,

  1. The model already uses ImageNet weights. Have a look at the default parameters of the Keras applications, e.g. VGG16 (see the sketch below).
  2. The model already trains not only the last layers but all of the layers.
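
For illustration only (this is not code from the repo): a minimal sketch of how `tf.keras.applications.VGG16` loads pretrained ImageNet weights by default and leaves every layer trainable, which is the behavior described above. The input shape is a hypothetical value chosen for the example.

```python
# Minimal sketch (not from this repo): the Keras application backbones
# default to weights="imagenet", so the backbone starts from pretrained
# ImageNet weights rather than a random initialization.
import tensorflow as tf

backbone = tf.keras.applications.VGG16(
    include_top=False,          # drop the ImageNet classification head
    weights="imagenet",         # the default: pretrained ImageNet weights
    input_shape=(500, 500, 3),  # hypothetical input size for illustration
)

# Every layer is trainable by default, so the whole backbone is updated
# together with any detection layers added on top of it.
print(all(layer.trainable for layer in backbone.layers))  # prints True
```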
emipasat commented 4 years ago

Sorry for not being clear. My question was whether it is worth freezing the backbone layers and training only the layers you added, the last ones. Would that work, too? Thank you

FurkanOM commented 4 years ago

You could give it a try. However, in my experience, fine-tuning the whole network generally gives better results than transfer learning with a frozen backbone.
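
For anyone who wants to try the frozen-backbone variant discussed in this thread, here is a hypothetical sketch (not code from this repo) of freezing the pretrained VGG16 layers so that only newly added layers are trained. The RPN-style layers shown are illustrative placeholders, not the repo's actual heads.

```python
# Hypothetical sketch (not from this repo): freeze the pretrained backbone
# and train only the layers added on top of it.
import tensorflow as tf

backbone = tf.keras.applications.VGG16(include_top=False, weights="imagenet")
backbone.trainable = False  # freeze all pretrained convolutional layers

# Illustrative new layers on top of the frozen feature map; only these
# receive gradient updates during training.
feature_map = backbone.output
rpn_conv = tf.keras.layers.Conv2D(
    512, 3, padding="same", activation="relu", name="rpn_conv")(feature_map)
rpn_cls = tf.keras.layers.Conv2D(
    9, 1, activation="sigmoid", name="rpn_cls")(rpn_conv)

model = tf.keras.Model(inputs=backbone.input, outputs=rpn_cls)
# Compile and fit as usual; the frozen backbone weights stay at their
# ImageNet values while the new layers are learned.
```

Whether this matches full fine-tuning in accuracy is exactly the empirical question raised in this thread.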