Keras has a `trainable` param for layers. I would like to set this to `False` when I load predefined weights for fine-tuning, so that only the top layers I define are trained. I could probably do this myself with a PR, but is there another mechanism to achieve this?