graykode / nlp-tutorial

Natural Language Processing Tutorial for Deep Learning Researchers
https://www.reddit.com/r/MachineLearning/comments/amfinl/project_nlptutoral_repository_who_is_studying/
MIT License

different Embedding way #23

Closed lhbrichard closed 5 years ago

lhbrichard commented 5 years ago

In the code 'Seq2seq-torch.py', I saw you use np.eye, i.e. a one-hot representation, as the embedding. I changed it to the more usual approach, nn.Embedding(dict_length, embedding_dim), and it runs, but the loss I get is very high. I want to ask what the difference between these two approaches is. Here are my code and the result.

(screenshots of the modified code and the resulting loss)
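For context, a minimal sketch of the two approaches being compared; names such as n_class, emb_dim, and idx are illustrative and not taken from the original code:

```python
import numpy as np
import torch
import torch.nn as nn

n_class = 29   # hypothetical vocabulary size
emb_dim = 16   # hypothetical embedding dimension

# Approach 1 (as in 'Seq2seq-torch.py'): fixed one-hot vectors built with np.eye
one_hot_table = np.eye(n_class)                          # [n_class, n_class]
idx = [3, 7, 1]                                          # some token indices
one_hot_inputs = torch.FloatTensor(one_hot_table[idx])   # [3, n_class]

# Approach 2 (the questioner's change): learned dense embeddings
embedding = nn.Embedding(n_class, emb_dim)
dense_inputs = embedding(torch.LongTensor(idx))          # [3, emb_dim]
```

The key difference is that the one-hot table is a fixed encoding, while nn.Embedding holds a weight matrix that is supposed to be learned during training.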

graykode commented 5 years ago

I think the nn.Embedding is not a trainable Parameter because it's not created inside the Model's __init__(); also, nn.Embedding is initialized randomly...
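To illustrate graykode's point: an nn.Embedding created outside the module is not returned by model.parameters(), so an optimizer built from those parameters never updates its randomly initialized weights. A hedged sketch of registering it inside the module, loosely following the tutorial's encoder/decoder RNN layout (class and attribute names here are illustrative):

```python
import torch.nn as nn

class Seq2SeqWithEmbedding(nn.Module):
    def __init__(self, n_class, emb_dim, n_hidden):
        super().__init__()
        # Assigning nn.Embedding as an attribute in __init__ registers its weight
        # in model.parameters(), so the optimizer can actually train it.
        self.embedding = nn.Embedding(n_class, emb_dim)
        self.enc_cell = nn.RNN(input_size=emb_dim, hidden_size=n_hidden)
        self.dec_cell = nn.RNN(input_size=emb_dim, hidden_size=n_hidden)
        self.fc = nn.Linear(n_hidden, n_class)

    def forward(self, enc_input, enc_hidden, dec_input):
        # enc_input/dec_input: [batch, seq_len] token indices
        enc_emb = self.embedding(enc_input).transpose(0, 1)  # [seq_len, batch, emb_dim]
        dec_emb = self.embedding(dec_input).transpose(0, 1)
        _, enc_state = self.enc_cell(enc_emb, enc_hidden)
        outputs, _ = self.dec_cell(dec_emb, enc_state)
        return self.fc(outputs)  # [seq_len, batch, n_class]
```

If the embedding is instead created in the training script and only used inside forward(), its weights stay at their random initialization, which would be consistent with the high loss reported above.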