HLTCHKUST / Mem2Seq

Mem2Seq: Effectively Incorporating Knowledge Bases into End-to-End Task-Oriented Dialog Systems
MIT License

Why use nn.Embedding rather than something pre-trained #17

Closed Abbyyan closed 5 years ago

Abbyyan commented 5 years ago

Sorry to bother you, but I really want to ask: when encoding the words, why do you use nn.Embedding rather than a pre-trained embedding such as GloVe? Hope you can help me with this question. Thank you very much!

andreamad8 commented 5 years ago

Hi @Abbyyan,

we did not use any pre-trained word embeddings because this task contains a lot of named entities that are not present in the GloVe vocabulary.

Anyway, it would be interesting to try the newest contextualized word embeddings (e.g. CLOVA, ELMo, BERT, GPT-2, etc.), since they use sub-word and character embeddings.
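For illustration, a minimal sketch of why sub-word models sidestep the OOV problem, assuming the HuggingFace transformers library and the bert-base-uncased checkpoint (neither is part of this repo, and the entity name is made up): a rare KB entity is split into known word pieces instead of mapping to an unknown token.

```python
# Sketch only: contextualized sub-word embeddings via HuggingFace transformers.
# Model name and example sentence are assumptions for illustration.
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")

# A rare named entity gets split into sub-word pieces the model knows.
inputs = tokenizer("book a table at cafe_venetia for four", return_tensors="pt")
print(tokenizer.convert_ids_to_tokens(inputs["input_ids"][0]))

with torch.no_grad():
    outputs = model(**inputs)
contextual_vectors = outputs.last_hidden_state  # (1, seq_len, hidden_size)
```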

I hope this helps.

Best

Andrea

P.S.: nn.Embedding is also needed for pre-trained word embeddings; you would simply initialize its weights with the pre-trained vectors.
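For anyone landing here, a minimal sketch of that P.S.: initializing an nn.Embedding layer with GloVe vectors. The file path, vocabulary dict, and helper name below are hypothetical and not part of the Mem2Seq code.

```python
# Sketch only: load pre-trained GloVe vectors into an nn.Embedding layer.
import numpy as np
import torch
import torch.nn as nn

def load_glove_embedding(vocab, glove_path="glove.6B.300d.txt", dim=300):
    # Start from small random vectors so OOV words (e.g. KB entities)
    # still receive a trainable embedding.
    weights = np.random.uniform(-0.1, 0.1, (len(vocab), dim)).astype("float32")
    with open(glove_path, encoding="utf-8") as f:
        for line in f:
            parts = line.rstrip().split(" ")
            word, vec = parts[0], parts[1:]
            if word in vocab:
                weights[vocab[word]] = np.asarray(vec, dtype="float32")
    # freeze=False keeps the vectors trainable, like a plain nn.Embedding.
    return nn.Embedding.from_pretrained(torch.from_numpy(weights), freeze=False)

# Usage: vocab maps word -> index, as built by the data loader.
# embedding = load_glove_embedding(vocab)
```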