bheinzerling / bpemb

Pre-trained subword embeddings in 275 languages, based on Byte-Pair Encoding (BPE)
https://nlp.h-its.org/bpemb
MIT License

Are the word embeddings GloVe or word2vec? #55

Closed · YoadTew closed this 3 years ago

YoadTew commented 3 years ago

Hey,

Were the word embedding vectors trained with GloVe or with word2vec?

Thanks

bheinzerling commented 3 years ago

The embeddings were trained with GloVe.
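
For anyone landing here who wants to use these GloVe-trained subword embeddings, a minimal sketch with the bpemb package follows; the vocabulary size (vs) and dimension (dim) shown are illustrative choices among the pre-trained options, not the only ones available.

```python
# Minimal sketch: loading the pre-trained (GloVe-based) BPE subword
# embeddings via the bpemb package. vs=10000 and dim=100 are
# illustrative; other vocab sizes and dimensions are also provided.
from bpemb import BPEmb

# Downloads the English BPE model and embeddings on first use.
bpemb_en = BPEmb(lang="en", vs=10000, dim=100)

# Segment a word into BPE subwords, then look up their GloVe vectors.
print(bpemb_en.encode("embeddings"))       # BPE subword tokens
print(bpemb_en.embed("embeddings").shape)  # (num_subwords, 100)
```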