bheinzerling / bpemb

Pre-trained subword embeddings in 275 languages, based on Byte-Pair Encoding (BPE)
https://nlp.h-its.org/bpemb
MIT License
1.18k stars 101 forks

en.wiki.bpe.op or en.wiki.bpe.vs #59

Closed zhenpingli closed 1 year ago

zhenpingli commented 3 years ago

Noting this here as a tip: bpemb has changed versions, so the old en.wiki.bpe.op files can now be obtained from: https://github.com/Katherinnan/bpemb
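If the goal is just to use the embeddings rather than the old op-named files, the bpemb Python package downloads the current (vs-named) files on its own. A minimal sketch, assuming the package's BPEmb class with its lang/vs/dim parameters (the example word and outputs are illustrative):

```python
# pip install bpemb
from bpemb import BPEmb

# Downloads and caches the SentencePiece model and embeddings under the
# current naming scheme (roughly en.wiki.bpe.vs10000.d100.*), so the old
# "op" filenames are not needed here.
bpemb_en = BPEmb(lang="en", vs=10000, dim=100)

pieces = bpemb_en.encode("Stratford")   # subword pieces for the input string
vectors = bpemb_en.embed("Stratford")   # embedding matrix, one row per piece

print(pieces)
print(vectors.shape)  # (number_of_pieces, 100)
```

Anyone who specifically needs the legacy op-named files would still have to fetch them from the mirror linked above.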