Closed Abbyyan closed 5 years ago
Hi @Abbyyan,
we did not use any pre-trained word embeddings because this task involves many named entities that are not present in the GloVe vocabulary.
Anyway, it would be interesting to try the newest contextualized word embeddings (e.g. CLOVA, ELMo, BERT, GPT-2, etc.), since they use sub-word and character embeddings.
I hope this helps.
Best
Andrea
PS: nn.Embedding is also needed for pre-trained word embeddings.
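To make the PS concrete, here is a minimal sketch of how pre-trained vectors still go through an `nn.Embedding` layer in PyTorch, via `nn.Embedding.from_pretrained`. The tiny weight matrix below is a hypothetical stand-in for vectors you would actually load from a GloVe file:

```python
import torch
import torch.nn as nn

# Hypothetical tiny "pre-trained" matrix standing in for GloVe vectors
# (in practice, load these rows from a GloVe file): one row per
# vocabulary word, one column per embedding dimension.
pretrained = torch.tensor([
    [0.1, 0.2, 0.3],   # row 0: e.g. "<pad>"
    [0.4, 0.5, 0.6],   # row 1: e.g. "the"
    [0.7, 0.8, 0.9],   # row 2: e.g. "cat"
])

# nn.Embedding still provides the lookup layer; freeze=True keeps the
# pre-trained vectors fixed, freeze=False lets training fine-tune them.
emb = nn.Embedding.from_pretrained(pretrained, freeze=True)

token_ids = torch.tensor([1, 2])   # indices for "the cat"
vectors = emb(token_ids)           # shape: (2, 3), rows 1 and 2 above
```

So even with GloVe (or any other pre-trained vectors), the lookup itself is an `nn.Embedding`; the only difference is where the weight matrix comes from and whether it is frozen.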
Sorry to bother you, but I really want to ask: when encoding the words, why do you use nn.Embedding rather than a pre-trained embedding such as GloVe? I hope you can help me with this question. Thank you very much!