-
Hello
I need to download "lexsub_context_embeddings.txt" and "lexsub_word_embeddings.txt" for lexical substitution ranking, but the link is not working:
www.cs.biu.ac.il/nlp/resources/download…
-
Hello, I have trained a BERT model with vocab_size 21128, and I noticed that in BLIP the vocab_size should be 21130 (including 2 additional tokens: DEC, ENC). However, this difference caused a shape conflict …
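For reference, this is roughly how I would expect the checkpoint to be adapted before loading; a minimal sketch only, since the key name and the BERT-style init for the two extra rows are my assumptions, not BLIP's actual loading code:

```python
import torch

def expand_word_embeddings(state_dict, key, num_new_tokens=2, std=0.02):
    """Grow the word-embedding matrix in a checkpoint by num_new_tokens rows."""
    old = state_dict[key]                                   # shape: (old_vocab, hidden)
    new = torch.empty(old.size(0) + num_new_tokens, old.size(1), dtype=old.dtype)
    new[: old.size(0)] = old                                # keep the trained rows
    new[old.size(0):].normal_(mean=0.0, std=std)            # BERT-style init for the DEC/ENC rows
    state_dict[key] = new
    return state_dict

# hypothetical usage: the key name depends on how BLIP names its text encoder
ckpt = torch.load("my_bert.pth", map_location="cpu")
ckpt = expand_word_embeddings(ckpt, "bert.embeddings.word_embeddings.weight", num_new_tokens=2)
```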
-
Hi, and thanks for the great series about transformers!
I noticed that you initialize the nn.Embedding layer for the word embeddings with an input size equal to the vocabulary size.
As we …
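For anyone else reading, here is a tiny self-contained illustration of what that input size means (the sizes below are made up, not the tutorial's):

```python
import torch
import torch.nn as nn

vocab_size, d_model = 32000, 512                 # illustrative sizes
embedding = nn.Embedding(vocab_size, d_model)    # one learnable row per vocabulary id

token_ids = torch.tensor([[5, 102, 7]])          # every id must lie in [0, vocab_size)
vectors = embedding(token_ids)                   # shape: (1, 3, d_model)
print(vectors.shape)
```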
-
Is it possible to start model training (main.py) from existing word embeddings like fastText?
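Something like the following is what I have in mind; a sketch only, since I don't know how main.py builds its vocabulary (the file path, vocab, and dimensions below are placeholders):

```python
import numpy as np
import torch
import torch.nn as nn

def load_fasttext(path, vocab, dim=300):
    """Build an init matrix for the model vocabulary from a fastText .vec file."""
    matrix = np.random.normal(0, 0.1, (len(vocab), dim)).astype(np.float32)
    with open(path, encoding="utf-8") as f:
        next(f)                                   # skip the "count dim" header line
        for line in f:
            parts = line.rstrip().split(" ")
            word = parts[0]
            if word in vocab:
                matrix[vocab[word]] = np.asarray(parts[1:], dtype=np.float32)
    return matrix

vocab = {"the": 0, "cat": 1, "sat": 2}                    # hypothetical vocabulary
weights = load_fasttext("cc.en.300.vec", vocab, dim=300)  # path is a placeholder
embedding = nn.Embedding.from_pretrained(torch.from_numpy(weights), freeze=False)
```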
-
I see word embeddings as potentially low-hanging fruit for a more robust product. Namely, word embeddings such as GloVe (A) are additive and (B) can quantify the similarity between words/phrase…
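Concretely, with made-up 3-d stand-ins in place of real 300-d GloVe rows, the two properties look like this:

```python
import numpy as np

# illustrative only: tiny stand-in vectors instead of real GloVe rows
glove = {
    "fast": np.array([0.9, 0.1, 0.0]),
    "quick": np.array([0.85, 0.15, 0.05]),
    "delivery": np.array([0.1, 0.9, 0.2]),
}

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# (A) additivity: a phrase vector as the sum (or mean) of its word vectors
phrase = glove["fast"] + glove["delivery"]

# (B) similarity: cosine between words or phrases
print(cosine(glove["fast"], glove["quick"]))                  # near-synonyms score high
print(cosine(phrase, glove["quick"] + glove["delivery"]))     # similar phrases score high
```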
-
@dennybritz
Hi,
First of all, many thanks for sharing your code. I am trying to use pretrained word embeddings instead of randomly initialized word embeddings based on the vocabulary size.
My pre…
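To be concrete, this is the kind of thing I'm attempting; a sketch with a toy vocabulary and stand-in vectors, not the repo's actual code:

```python
import numpy as np

def build_init_matrix(vocab, pretrained, dim):
    """Seed rows with pretrained vectors; leave unseen words randomly initialized."""
    matrix = np.random.uniform(-0.25, 0.25, (len(vocab), dim)).astype(np.float32)
    hits = 0
    for word, idx in vocab.items():
        vec = pretrained.get(word)
        if vec is not None:
            matrix[idx] = vec
            hits += 1
    print(f"covered {hits}/{len(vocab)} vocabulary words with pretrained vectors")
    return matrix

vocab = {"<pad>": 0, "good": 1, "movie": 2}                  # hypothetical vocabulary
pretrained = {"good": np.full(300, 0.1, dtype=np.float32)}   # stand-in for word2vec/GloVe rows
initW = build_init_matrix(vocab, pretrained, dim=300)
# initW would then be assigned to the embedding variable (cnn.W in this repo, if I read it right).
```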
-
If the GloVe word embeddings are assigned before tf.global_variables_initializer() is run, the embedding ends up randomly initialized and the GloVe vectors are not used; the assign op has to run after the initializer.
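A minimal TF 1.x sketch of the working order (variable names and sizes are illustrative): run the initializer first, then the assign op, so GloVe overwrites the random values rather than the other way around:

```python
import numpy as np
import tensorflow as tf  # TF 1.x API assumed

vocab_size, embedding_dim = 10000, 300
glove_matrix = np.random.randn(vocab_size, embedding_dim).astype(np.float32)  # stand-in for real GloVe

W = tf.Variable(tf.random_uniform([vocab_size, embedding_dim], -1.0, 1.0), name="W")
embedding_placeholder = tf.placeholder(tf.float32, [vocab_size, embedding_dim])
embedding_init = W.assign(embedding_placeholder)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())                                   # random init runs first
    sess.run(embedding_init, feed_dict={embedding_placeholder: glove_matrix})     # then overwrite with GloVe
```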
-
Does the framework support the use of pre-trained word embeddings (Word2Vec or GloVe)?
-
I'm encountering a dimension mismatch error while loading the weights for the teacher model (tmodel). The issue specifically occurs in the transformer.embeddings.word_embeddings.weight and mlp.layers.…
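In case it helps with debugging, this is how I'm checking which keys actually mismatch; a sketch, assuming a plain PyTorch state_dict checkpoint:

```python
import torch

def report_shape_mismatches(model, ckpt_path):
    """Print every checkpoint tensor whose shape disagrees with the model's."""
    ckpt = torch.load(ckpt_path, map_location="cpu")
    state = ckpt.get("state_dict", ckpt)        # unwrap if the weights are nested
    model_state = model.state_dict()
    for key, tensor in state.items():
        if key in model_state and model_state[key].shape != tensor.shape:
            print(f"{key}: checkpoint {tuple(tensor.shape)} vs model {tuple(model_state[key].shape)}")
    # keys present on only one side are also worth listing
    for key in set(state) ^ set(model_state):
        print(f"only on one side: {key}")

# hypothetical usage with the teacher model from this issue
# report_shape_mismatches(tmodel, "teacher_checkpoint.pth")
```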
-
When trying to run aligner.py (after the preprocessing code to get the SNLI data, vocab, and word embeddings) I get the following error. Could you please provide the precomputed word embeddings with the correct numbe…