aurora95 / Keras-FCN

Keras/TensorFlow implementation of Fully Convolutional Networks for Semantic Segmentation (unfinished)
MIT License
650 stars · 268 forks

locking BatchNormalization Layers during training #15

Open ahundt opened 7 years ago

ahundt commented 7 years ago

Okay, I spoke with the author of https://github.com/warmspringwinds/tf-image-segmentation, and he mentioned that another key aspect of maximizing performance when training ResNet with pretrained weights for segmentation tasks is to lock the batch normalization layers so they never change during segmentation training. I'm curious whether there is an easy way to do that in Keras.

aurora95 commented 7 years ago

Hmm, a tricky way could be setting momentum to 1 and trainable=False, I think... But I don't know if there is a better way without changing the Keras source code.
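
For reference, the two suggestions above can be sketched as follows. This is a minimal illustration, not code from this repository: the model and helper function are hypothetical stand-ins. In modern tf.keras, setting `trainable = False` on a `BatchNormalization` layer freezes gamma/beta and (since TF 2.0) also runs the layer in inference mode, so the moving statistics stop updating too; in the older Keras contemporaneous with this thread, `trainable = False` alone did not freeze the moving statistics, which is why aurora95 suggests `momentum=1` (the moving-average update `new = momentum * old + (1 - momentum) * batch` then leaves the stored statistics unchanged).

```python
from tensorflow import keras
from tensorflow.keras import layers

def freeze_batchnorm(model):
    """Set every BatchNormalization layer in `model` to non-trainable.

    Hypothetical helper: freezes gamma/beta, and in TF 2.x also makes
    the layer use its stored moving mean/variance during training.
    """
    for layer in model.layers:
        if isinstance(layer, layers.BatchNormalization):
            layer.trainable = False
    return model

# Tiny illustrative model standing in for a ResNet segmentation backbone.
model = keras.Sequential([
    layers.Input(shape=(32, 32, 3)),
    layers.Conv2D(8, 3, padding="same"),
    layers.BatchNormalization(),
    layers.Activation("relu"),
])
freeze_batchnorm(model)
# Recompile after changing trainable flags so the change takes effect.
model.compile(optimizer="adam", loss="mse")
```

Note that `model.compile(...)` must be called again after toggling `trainable`, otherwise the previously compiled training function keeps the old trainable-weight set.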