Open mingbocui opened 5 years ago
mingbocui:
Could I kindly ask what the meaning of `key[12:]: value` is when you load a pretrained model? Is it just to keep the last layers? Thanks, hope for your reply.

dhlee347:
It is because I wanted to load only the transformer part of the saved model, not the whole model.

mingbocui:
@dhlee347 thanks for your reply. I have one more question: if I change the number of BERT layers from 12 to 6, should I change `key[12:]` to `key[6:]`?
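For context, here is a minimal sketch of the kind of prefix-stripping load that `key[12:]: value` suggests, assuming the saved full model stores its transformer parameters under keys prefixed with `transformer.` (a 12-character string). The function name `load_transformer_only`, the prefix value, and the `model.transformer` attribute are assumptions for illustration, not necessarily this repo's exact code. Under that assumption, the slice operates on the key string, so the 12 corresponds to the prefix length rather than the number of encoder layers.

```python
import torch
from torch import nn

def load_transformer_only(model: nn.Module, pretrain_file: str,
                          prefix: str = 'transformer.') -> None:
    """Load only the transformer sub-module from a saved full model.

    `prefix` is the key prefix under which the transformer parameters are
    assumed to be saved; 'transformer.' has length 12, which is what a
    slice like key[12:] would remove from each key string.
    """
    # Full state dict of the saved model (transformer + task head, etc.).
    full_state = torch.load(pretrain_file, map_location='cpu')

    # Keep only transformer parameters and strip the prefix so the keys
    # match the bare transformer module's own state dict.
    transformer_state = {
        key[len(prefix):]: value   # e.g. 'transformer.embed.norm.bias' -> 'embed.norm.bias'
        for key, value in full_state.items()
        if key.startswith(prefix)  # drop the classifier head and other non-transformer parts
    }

    # model.transformer is assumed to be the encoder sub-module of the full model.
    model.transformer.load_state_dict(transformer_state)
```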