Closed: linetor closed this issue 6 years ago.
I have the same problem. When I use a pre-trained model and freeze every layer with

    for layer in pretrain_model.layers: layer.trainable = False

the Bidirectional wrapper layer can still be trained.
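For reference, a minimal sketch of the pattern described above (the model, layer sizes, and the "blstm" name are hypothetical stand-ins, not from the original report); at the time of this report the wrapper could still expose trainable weights after the loop:

```python
from keras.models import Sequential
from keras.layers import Bidirectional, LSTM, Dense

# Hypothetical stand-in for the pre-trained model mentioned above.
pretrain_model = Sequential([
    Bidirectional(LSTM(8), input_shape=(10, 4), name="blstm"),
    Dense(1),
])

# Freeze every layer, as in the comment above.
for layer in pretrain_model.layers:
    layer.trainable = False

# If the freeze propagated into the Bidirectional wrapper, this should be 0.
print(len(pretrain_model.trainable_weights))
```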
Same issue. Even if I set

    model.get_layer("blstm").trainable = False
    model.get_layer("blstm").layer.trainable = False
    model.get_layer("blstm").forward_layer.trainable = False
    model.get_layer("blstm").backward_layer.trainable = False

the Bidirectional layer is still trainable.
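For comparison, here is a hedged sketch of setting those same flags and then verifying the trainable-parameter count; the model below is a hypothetical stand-in, and the exact behavior may differ across Keras versions:

```python
from keras.models import Sequential
from keras.layers import Bidirectional, LSTM, Dense

model = Sequential([
    Bidirectional(LSTM(8), input_shape=(10, 4), name="blstm"),
    Dense(1),
])

blstm = model.get_layer("blstm")
blstm.trainable = False                  # wrapper flag
blstm.layer.trainable = False            # the wrapped LSTM template
blstm.forward_layer.trainable = False    # forward copy
blstm.backward_layer.trainable = False   # backward copy

# Recompile after changing trainable flags, then check the counts.
model.compile(optimizer="adam", loss="mse")
model.summary()  # "Trainable params" should now cover only the Dense layer
```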
Looking into it, thanks for the report.
I need to freeze an LSTM layer (wrapped in a Bidirectional wrapper), so I freeze it with

    singleModel.layers[x].trainable = False

but it does not seem to be frozen. After inspecting the layer (by checking the number of trainable params), I found that the forward_layer and backward_layer flags also need to be set, like below.
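The snippet that "like below" refers to appears to have been lost from this thread; presumably it set the inner layers' flags directly, roughly along these lines (reusing singleModel and x from above):

```python
# Assumed reconstruction of the missing snippet, not the original code.
singleModel.layers[x].trainable = False
singleModel.layers[x].forward_layer.trainable = False
singleModel.layers[x].backward_layer.trainable = False
```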
Is this intended? If not, I think that setting layer.trainable = False on the wrapper should also freeze forward_layer and backward_layer.
Thanks