facebookresearch / MUSE

A library for Multilingual Unsupervised or Supervised word Embeddings

The English embedding is too big #115

Closed 15091444119 closed 5 years ago

15091444119 commented 5 years ago

Did you use the full English embedding (6.2 GB) to train the cross-lingual embeddings, or only the top 200k words?
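
For context (this is my own assumption, not stated in the issue): MUSE's training scripts take a maximum-vocabulary option that defaults to 200000, so only the most frequent words of the .vec file are loaded regardless of its size on disk. If you want a smaller file to begin with, a minimal sketch for truncating a fastText .vec file to its top-N entries could look like the following; the file names and the 200000 cut-off are illustrative, and it relies on .vec files listing words in descending frequency order.

```python
# Sketch: keep only the top-N words of a fastText .vec file.
# Words in .vec files are ordered by frequency, so the first N lines
# after the header are the N most frequent words.
# File names and max_vocab are illustrative assumptions.

def truncate_vec(src_path, dst_path, max_vocab=200000):
    with open(src_path, encoding="utf-8") as src, \
         open(dst_path, "w", encoding="utf-8") as dst:
        n_words, dim = src.readline().split()   # original header: "<count> <dim>"
        n_kept = min(int(n_words), max_vocab)
        dst.write(f"{n_kept} {dim}\n")           # rewritten header for the truncated file
        for i, line in enumerate(src):
            if i >= n_kept:
                break
            dst.write(line)

if __name__ == "__main__":
    truncate_vec("wiki.en.vec", "wiki.en.200k.vec", max_vocab=200000)
```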