keras-team / keras

Deep Learning for humans
http://keras.io/
Apache License 2.0

Bidirectional LSTM freeze(trainable=False) need to be checked. #8754

Closed linetor closed 6 years ago

linetor commented 6 years ago

I need to freeze an LSTM layer (wrapped in Bidirectional), so I set singleModel.layers[x].trainable = False, but the layer does not seem to be frozen.

After inspecting the layer (by checking the number of trainable params), I found that the forward_layer and backward_layer flags also need to be set explicitly, like below:

from keras.layers import Bidirectional

# Freeze every layer except the last two; for Bidirectional wrappers,
# the inner forward/backward layers must also be frozen explicitly.
for x in range(len(classificationWithString.layers) - 2):
    layer = classificationWithString.layers[x]
    if isinstance(layer, Bidirectional):
        layer.forward_layer.trainable = False
        layer.backward_layer.trainable = False
    layer.trainable = False

Is this intended? If not, I think setting layer.trainable = False on the wrapper should also freeze forward_layer and backward_layer.
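The proposed behaviour can be sketched in plain Python (these are hypothetical stand-in classes, not the real Keras Layer/Bidirectional): the wrapper's trainable attribute becomes a property whose setter propagates to both sub-layers.

```python
class SubLayer:
    """Stand-in for an inner LSTM layer with its own trainable flag."""
    def __init__(self, name):
        self.name = name
        self.trainable = True


class BidirectionalSketch:
    """Hypothetical wrapper: setting `trainable` also freezes sub-layers."""
    def __init__(self, forward_layer, backward_layer):
        self.forward_layer = forward_layer
        self.backward_layer = backward_layer
        self._trainable = True

    @property
    def trainable(self):
        return self._trainable

    @trainable.setter
    def trainable(self, value):
        # Propagate the flag so freezing the wrapper freezes everything inside.
        self._trainable = value
        self.forward_layer.trainable = value
        self.backward_layer.trainable = value


wrapper = BidirectionalSketch(SubLayer("forward_lstm"), SubLayer("backward_lstm"))
wrapper.trainable = False
print(wrapper.forward_layer.trainable, wrapper.backward_layer.trainable)
# → False False
```

With this setter in place, the per-layer workaround above would no longer be needed.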

Thanks

ouzhi commented 6 years ago

I have hit the same problem. When I use a pre-trained model and set

for layer in pretrain_model.layers:
    layer.trainable = False

the Bidirectional wrapper layer can still be trained.

s6juncheng commented 6 years ago

Same issue here. Even after I set

model.get_layer("blstm").trainable = False
model.get_layer("blstm").layer.trainable = False
model.get_layer("blstm").forward_layer.trainable = False
model.get_layer("blstm").backward_layer.trainable = False

the Bidirectional layer is still trainable.
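One way to confirm whether a freeze took effect is to compare trainable-parameter counts, as the original report did. A minimal sketch of the reported behaviour, using hypothetical plain-Python stand-ins (FakeLayer, FakeBidirectional) rather than the real Keras classes:

```python
class FakeLayer:
    """Stand-in layer holding a parameter count and a trainable flag."""
    def __init__(self, n_params, trainable=True):
        self.n_params = n_params
        self.trainable = trainable


class FakeBidirectional:
    """Mimics the reported bug: the wrapper's flag is ignored for sub-layers."""
    def __init__(self, forward_layer, backward_layer):
        self.forward_layer = forward_layer
        self.backward_layer = backward_layer
        self.trainable = True


def trainable_param_count(layers):
    # Sum parameters of layers whose trainable flag is set; for the buggy
    # wrapper, only the sub-layers' own flags are consulted.
    total = 0
    for layer in layers:
        if isinstance(layer, FakeBidirectional):
            for sub in (layer.forward_layer, layer.backward_layer):
                if sub.trainable:
                    total += sub.n_params
        elif layer.trainable:
            total += layer.n_params
    return total


blstm = FakeBidirectional(FakeLayer(100), FakeLayer(100))
layers = [blstm, FakeLayer(10)]

blstm.trainable = False                  # setting only the wrapper flag...
print(trainable_param_count(layers))     # → 210: sub-layers still counted

blstm.forward_layer.trainable = False    # ...freezing the sub-layers directly
blstm.backward_layer.trainable = False
print(trainable_param_count(layers))     # → 10: now actually frozen
```

This mirrors what the commenters observed: the trainable-parameter count only drops once the inner layers themselves are frozen.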

fchollet commented 6 years ago

Looking into it, thanks for the report.