At the moment, the model works at the character level. It would be better if it also used information about the meaning of words, based on word embeddings. This could be incorporated with some character-level RNN…
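For concreteness, here is a minimal PyTorch sketch of one way to combine the two: concatenate a pre-trained word embedding with a character-level RNN summary of the same word. The module name, dimensions, and GRU choice are all illustrative, not a proposal for this codebase's actual architecture:

```python
# Hypothetical sketch (PyTorch): combine a pre-trained word embedding with a
# character-level RNN encoding of the same word. All names and sizes are
# illustrative.
import torch
import torch.nn as nn

class WordAndCharEncoder(nn.Module):
    def __init__(self, num_words, word_dim, num_chars, char_dim, char_hidden):
        super().__init__()
        # Word-level lookup; in practice initialized from pre-trained vectors.
        self.word_emb = nn.Embedding(num_words, word_dim)
        self.char_emb = nn.Embedding(num_chars, char_dim)
        # Character-level RNN; its last hidden state summarizes spelling.
        self.char_rnn = nn.GRU(char_dim, char_hidden, batch_first=True)

    def forward(self, word_ids, char_ids):
        # word_ids: (batch,), char_ids: (batch, max_word_len)
        w = self.word_emb(word_ids)                    # (batch, word_dim)
        _, h = self.char_rnn(self.char_emb(char_ids))  # h: (1, batch, char_hidden)
        # Concatenate semantic (word) and orthographic (char) features.
        return torch.cat([w, h.squeeze(0)], dim=-1)

enc = WordAndCharEncoder(10000, 100, 128, 16, 32)
out = enc(torch.tensor([3, 7]), torch.randint(0, 128, (2, 12)))
print(out.shape)  # torch.Size([2, 132])
```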
-
Have you thought of adding ELMo word embeddings to the selection? They outperform both GloVe and Word2Vec embeddings on many NLP benchmarks.
Link to ELMo: https://github.com/allenai/allennlp/blob/master/tutorials/how_to/e…
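For reference, the usage from the linked tutorial looks roughly like this (older AllenNLP releases; the API has since moved around):

```python
from allennlp.commands.elmo import ElmoEmbedder

elmo = ElmoEmbedder()  # downloads the default pre-trained ELMo weights
tokens = ["I", "ate", "an", "apple", "for", "breakfast"]
# One 1024-dim vector per token for each of ELMo's 3 layers.
vectors = elmo.embed_sentence(tokens)
print(vectors.shape)  # (3, 6, 1024)
```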
-
…by Stanford, https://nlp.stanford.edu/projects/histwords/
*We released pre-trained historical word embeddings (spanning all decades from 1800 to 2000) for multiple languages (English, French, Germ…
-
Is it possible to download the trained word embeddings? I would like to explore them in detail with the embedding explorer.
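In case they do become downloadable: assuming the vectors are exported in word2vec text format, loading them for exploration could look like this (`vectors.txt` is a placeholder filename):

```python
# Sketch for poking at downloaded vectors, assuming word2vec text format.
from gensim.models import KeyedVectors

kv = KeyedVectors.load_word2vec_format("vectors.txt", binary=False)
print(kv.most_similar("computer", topn=5))  # nearest neighbours
print(kv.similarity("computer", "laptop"))  # cosine similarity
```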
-
Given a directory of articles in the desired input format (see #2), generate the word embeddings to be used in document ranking.
- [ ] Decide which software package to use (one candidate is sketched below).
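As a starting point, here is a minimal sketch with gensim's Word2Vec, assuming one plain-text article per file; gensim is just one candidate package, and `articles/` / `embeddings.txt` are placeholder paths:

```python
# Minimal sketch with gensim's Word2Vec (one candidate package, not a
# decision): train on a directory of plain-text articles, one per file.
import os
from gensim.models import Word2Vec
from gensim.utils import simple_preprocess

def iter_articles(directory):
    for name in os.listdir(directory):
        with open(os.path.join(directory, name), encoding="utf-8") as f:
            for line in f:
                yield simple_preprocess(line)  # lowercased, tokenized

sentences = list(iter_articles("articles/"))  # placeholder input directory
model = Word2Vec(sentences, vector_size=100, window=5, min_count=5, workers=4)
model.wv.save_word2vec_format("embeddings.txt")  # for the ranking step
```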
-
@angelo337 what about Spanish?
-
Hi @alex-tifrea,
I'm training word embeddings on the [RCV1 corpus](http://www.ai.mit.edu/projects/jmlr/papers/volume5/lewis04a/lyrl2004_rcv1v2_README.htm). I've generated the vocab (7MB) and co-oc…
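Vocab and co-occurrence counts are the first two steps of a GloVe-style pipeline. In case it helps others following along, here is a toy Python sketch of the counting step, assuming an already-tokenized corpus; the symmetric window and 1/distance weighting mirror the GloVe reference code, everything else is illustrative:

```python
# Toy sketch of the co-occurrence counting step (GloVe-style pipeline),
# assuming a tokenized corpus; parameters are illustrative.
from collections import Counter

def cooccurrence_counts(sentences, vocab, window=10):
    counts = Counter()
    for tokens in sentences:
        for i, w in enumerate(tokens):
            if w not in vocab:
                continue
            for j in range(max(0, i - window), i):
                c = tokens[j]
                if c in vocab:
                    # Weight by 1/distance, as in the GloVe reference code.
                    counts[(w, c)] += 1.0 / (i - j)
                    counts[(c, w)] += 1.0 / (i - j)
    return counts

vocab = {"the", "cat", "sat", "mat"}
print(cooccurrence_counts([["the", "cat", "sat", "on", "the", "mat"]], vocab))
```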
-
Thank you for your work on SGG.
I placed the BLIP pre-trained weights in `cache/ckpts`. However, when I start training, it still shows "Missing keys ['text_decoder.bert.embeddings.word_embeddings.wei…
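For what it's worth, a "Missing keys" message usually comes from PyTorch's `load_state_dict(strict=False)` when the checkpoint's key names don't match the model's parameter names. A self-contained toy reproduction (not this repo's actual loading code):

```python
# Generic PyTorch sketch of how "Missing keys" warnings arise; the model and
# state dict are stand-ins, not this repo's code.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(4, 4))  # expects keys '0.weight', '0.bias'

# Checkpoint with a deliberately mismatched key to trigger the warning.
state = {"0.weight": torch.zeros(4, 4), "wrong.bias": torch.zeros(4)}

result = model.load_state_dict(state, strict=False)
print("missing:", result.missing_keys)        # ['0.bias']
print("unexpected:", result.unexpected_keys)  # ['wrong.bias']
```

If the real missing keys all share a common prefix, the checkpoint may simply store its weights under a different prefix (or nested under a key such as `"model"`), which renaming the keys before loading would fix.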