IBM / pytorch-seq2seq

An open source framework for seq2seq models in PyTorch.
https://ibm.github.io/pytorch-seq2seq/public/index.html
Apache License 2.0

some question about embedding matrix in v0.1.4 #155

Closed reeuq closed 6 years ago

reeuq commented 6 years ago

In v0.1.4, does the framework support using pre-trained word vectors to initialize the embedding matrix? If so: in the initialization method of the EncoderRNN class, the nn.Embedding object is created without any pre-trained word vectors, so how can pre-trained vectors be used in the embedding layer?

pskrunner14 commented 6 years ago

No @reeuq, v0.1.4 does not support initializing embedding layers with pre-trained embeddings; that feature was added in v0.1.6. However, if you want to initialize the embedding layer manually, you could use:

self.embedding = nn.Embedding(vocab_size, hidden_size)
# Replace the randomly initialized weights with the pre-trained matrix
# (embedding_matrix must have shape (vocab_size, hidden_size)).
self.embedding.weight = nn.Parameter(embedding_matrix)
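For context, a minimal self-contained sketch of what that assignment does. The sizes and the random `embedding_matrix` here are placeholders standing in for a real pre-trained matrix (e.g. GloVe rows aligned to your vocabulary):

```python
import torch
import torch.nn as nn

# Placeholder sizes; in practice vocab_size comes from your vocabulary
# and hidden_size from your model configuration.
vocab_size, hidden_size = 10, 4

# Stand-in for a real pre-trained embedding matrix.
embedding_matrix = torch.randn(vocab_size, hidden_size)

embedding = nn.Embedding(vocab_size, hidden_size)
# Overwrite the randomly initialized weights with the pre-trained ones.
embedding.weight = nn.Parameter(embedding_matrix)

# Looking up token id 3 now returns row 3 of the pre-trained matrix.
out = embedding(torch.tensor([3]))
```

If you want the pre-trained vectors to stay fixed during training, you can additionally set `embedding.weight.requires_grad = False`.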