This PR adds the ability to freeze BatchNorm layers, which is often done when finetuning algorithms (such as Faster RCNN). The freezing is implemented in a custom BatchNorm layer which calls Keras' BatchNorm function, while setting the two different `training` and `trainable` flags. The reason for a custom layer is that the alternative would look something like this:
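Roughly the following (a sketch only; the `conv_bn_block` helper and `freeze_bn` flag are illustrative, not code from this PR):

```python
import keras

def conv_bn_block(x, filters, freeze_bn=False):
    # a conv + BN + ReLU block where the BatchNorm is frozen by hand
    x = keras.layers.Conv2D(filters, 3, padding='same')(x)

    bn = keras.layers.BatchNormalization()
    if freeze_bn:
        # both flags have to be handled at every call site:
        bn.trainable = False       # gamma/beta are no longer updated by the optimizer
        x = bn(x, training=False)  # the stored running statistics are used, never updated
    else:
        x = bn(x)
    return keras.layers.Activation('relu')(x)
```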
The custom layer changes this to:
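Something along these lines, assuming the custom layer exposes a `freeze` argument (a sketch of its implementation is included further below):

```python
def conv_bn_block(x, filters, freeze_bn=False):
    # the same block; the custom layer handles trainable/training internally
    x = keras.layers.Conv2D(filters, 3, padding='same')(x)
    x = BatchNormalization(freeze=freeze_bn)(x)
    return keras.layers.Activation('relu')(x)
```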
which saves lines and complexity.
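For reference, the custom layer is essentially a thin wrapper around `keras.layers.BatchNormalization` that forces both flags; a minimal sketch of the idea (not necessarily the exact code in this PR):

```python
import keras

class BatchNormalization(keras.layers.BatchNormalization):
    """keras.layers.BatchNormalization with an extra `freeze` flag."""

    def __init__(self, freeze=False, *args, **kwargs):
        super(BatchNormalization, self).__init__(*args, **kwargs)
        self.freeze = freeze
        # a frozen layer exposes no trainable weights
        self.trainable = not self.freeze

    def call(self, inputs, training=None, **kwargs):
        if self.freeze:
            # force inference mode so the running mean/variance are used and never updated
            training = False
        return super(BatchNormalization, self).call(inputs, training=training, **kwargs)
```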