Shawn1993 / cnn-text-classification-pytorch

CNNs for Sentence Classification in PyTorch
Apache License 2.0

How to make the embedding changeable in backpropagation #24


Z-Jeff commented 4 years ago

It seems that the word embeddings are kept static during training. How can I make the embeddings trainable, so they are updated during backpropagation?

rafaelgreca commented 2 years ago

I know this is an old issue, but just set the requires_grad attribute of the embedding weights to True (the default value is already True), like this:

# create the embedding layer and load the pretrained weights into it
self.embedding = nn.Embedding(vocab_size, embedding_dim, padding_idx=0)
self.embedding.load_state_dict({'weight': embedding_weights})
self.embedding.weight.requires_grad = True  # keep the weights trainable

In this case, I am loading pretrained embedding weights into an embedding layer and leaving it trainable, so the embeddings are updated during the training process.
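
For reference, here is a minimal, self-contained sketch of the same idea outside a model class. The vocabulary size, embedding dimension, and the random pretrained tensor (standing in for real pretrained vectors such as word2vec or GloVe) are all placeholder assumptions:

import torch
import torch.nn as nn

vocab_size, embedding_dim = 100, 50  # placeholder sizes for illustration
pretrained = torch.randn(vocab_size, embedding_dim)  # stands in for real pretrained vectors

# build the layer, load the pretrained weights, and leave them trainable
embedding = nn.Embedding(vocab_size, embedding_dim, padding_idx=0)
embedding.load_state_dict({'weight': pretrained})
embedding.weight.requires_grad = True

# dummy forward/backward pass to confirm gradients reach the embedding weights
tokens = torch.tensor([[1, 2, 3]])
loss = embedding(tokens).sum()
loss.backward()
print(embedding.weight.grad is not None)  # True: the optimizer will update the embeddings

Equivalently, nn.Embedding.from_pretrained(pretrained, freeze=False, padding_idx=0) loads the weights and leaves them trainable in one call; freeze=True would reproduce the static behavior the original question describes.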