Open · Z-Jeff opened this issue 4 years ago
> It seems that the word embeddings are kept static during training. How can I make the embeddings update during backpropagation?

I know this is an old issue, but just set the weight's `requires_grad` attribute to `True` (the default value is already `True`), like this:
# create the embedding layer and load the pretrained weights
self.embedding = nn.Embedding(vocab_size, embedding_dim, padding_idx=0)
self.embedding.load_state_dict({'weight': embedding_weights})
self.embedding.weight.requires_grad = True  # make the embeddings trainable
In this case, I am loading pretrained embedding weights into an embedding layer and keeping them trainable during training.
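For what it's worth, `nn.Embedding.from_pretrained` with `freeze=False` should achieve the same thing in one call. Below is a minimal sketch assuming a recent PyTorch version; the sizes and the random weight tensor are placeholders, not values from this issue:

```python
# Minimal sketch: trainable embedding layer initialized from pretrained weights.
# Assumes a recent PyTorch version and a weight tensor of shape
# [vocab_size, embedding_dim]; the values below are placeholders.
import torch
import torch.nn as nn

vocab_size, embedding_dim = 1000, 50
embedding_weights = torch.randn(vocab_size, embedding_dim)  # stand-in for real vectors

# freeze=False copies the weights in as a trainable parameter
# (requires_grad=True), so gradients flow into the embeddings.
embedding = nn.Embedding.from_pretrained(embedding_weights, freeze=False, padding_idx=0)
print(embedding.weight.requires_grad)  # True
```

Note that `freeze=True` is the default for `from_pretrained`, and it is exactly what keeps embeddings static, so that flag is worth double-checking if you go this route.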
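If the embeddings still look static after this, a quick sanity check is to compare the weight tensor before and after a single optimizer step. A small sketch, reusing the `embedding` layer from the example above (an assumption on my part):

```python
# Quick check that the embedding weights actually change after one update;
# reuses the `embedding` layer from the sketch above.
optimizer = torch.optim.SGD(embedding.parameters(), lr=0.1)

before = embedding.weight.detach().clone()
token_ids = torch.tensor([1, 2, 3])   # arbitrary example indices
loss = embedding(token_ids).sum()     # dummy loss just to produce gradients
loss.backward()
optimizer.step()

print(torch.allclose(before, embedding.weight))  # False: the selected rows moved
```

A common cause of embeddings that never move is an optimizer built without the embedding parameters, for example one constructed from a filtered parameter list before the layer was unfrozen.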