-
Hello, I would like to ask why, in the code, the GloVe embeddings are trained on Common Crawl while the AGWE embeddings are trained on LibriSpeech. Shouldn't the GloVe embeddings also be trained on LibriSpeech?
-
Hello Mr. Jia-Chen Gu,
I would like to request access to the code that creates the word2vec+GloVe word embeddings for the UDC V2 training data.
Could you kindly share it with me?
-
Hi,
I think https://github.com/MinishLab/model2vec might be a good fit for Embetter. It's a static subword embedder that outperforms both GloVe (300d) and BPEmb (50k, 300d) while being much smaller…
-
### ML-Crate Repository (Proposing new issue)
:red_circle: **Project Title** : Glove Embeddings
:red_circle: **Aim** : This project will create vector representations of words.
:red_circle:…
-
The embedding training did not converge.
-
This is related to #283, which was closed. I found seemingly working code that can load pretrained GloVe vectors into text2vec: https://gist.github.com/tjvananne/8b0e7df7dcad414e8e6d5bf3947439a9
I have n…
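(The gist above targets text2vec, which is an R library, but the underlying GloVe plain-text format is the same everywhere: one word per line followed by its space-separated vector components. A minimal Python sketch of parsing that format, with made-up toy values:)

```python
import io
import numpy as np

def load_glove(file_obj):
    """Parse GloVe's plain-text format: each line is a word
    followed by its vector components, space-separated."""
    vectors = {}
    for line in file_obj:
        parts = line.rstrip().split(" ")
        vectors[parts[0]] = np.asarray(parts[1:], dtype=np.float32)
    return vectors

# Tiny in-memory stand-in for a real glove.6B.50d.txt file.
sample = io.StringIO("the 0.1 0.2 0.3\ncat 0.4 0.5 0.6\n")
glove = load_glove(sample)
```

For real files, open them with `encoding="utf-8"`; some GloVe releases contain non-ASCII tokens.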
-
If the GloVe word embeddings are assigned before `tf.global_variables_initializer()` is run, the embedding will be randomly re-initialized and the GloVe values will not be used.
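A minimal sketch of the correct ordering, using the TF1-style API via `tf.compat.v1` (the variable names and the random stand-in matrix are illustrative, not from the original code): run the initializer first, then assign the GloVe matrix so the initializer cannot overwrite it.

```python
import numpy as np
import tensorflow as tf

tf.compat.v1.disable_eager_execution()

glove = np.random.rand(10, 5).astype(np.float32)  # stand-in for a real GloVe matrix

emb = tf.compat.v1.get_variable("embedding", shape=glove.shape)
assign_glove = emb.assign(glove)

with tf.compat.v1.Session() as sess:
    sess.run(tf.compat.v1.global_variables_initializer())  # randomizes emb
    sess.run(assign_glove)                                 # load GloVe AFTER init
    loaded = sess.run(emb)
```

Alternatively, pass the matrix through `tf.compat.v1.constant_initializer(glove)` when creating the variable, in which case the ordering no longer matters.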
-
Hi Ruotian Luo, thank you for your excellent code.
But I have a problem. I want to use pretrained GloVe vectors as initial embeddings and set input_encoding_size=300 (the GloVe vector dimension), and …
-
```python
df = pd.merge(
    base_df, emb_df, left_index=True, right_index=True
)
```
Doesn't seem to work correctly for GloVe embeddings, because the indexes of base_df and emb_df are not the sam…
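(When the two frames are keyed by word rather than sharing an index, merging on an explicit column avoids the silent index mismatch. A sketch with made-up column names, not the original data:)

```python
import pandas as pd

base_df = pd.DataFrame({"word": ["cat", "dog"], "count": [3, 1]})
emb_df = pd.DataFrame({"word": ["dog", "cat"], "dim0": [0.5, 0.1]})

# Merge on the shared "word" column instead of relying on index alignment;
# how="left" keeps every row of base_df even if an embedding is missing.
df = pd.merge(base_df, emb_df, on="word", how="left")
```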