Closed: bernhard2202 closed this issue 8 years ago.
With a locally trained word2vec model, every word should have an embedding, but the output of preprocessing still reports:
Used 56008 pre-trained word2vec vectors and 32896 new random vectors.
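For context, a minimal sketch of how such an embedding matrix is typically built (this is not the repository's actual preprocessing code; `build_embeddings`, its parameters, and the gensim-based loading are assumptions for illustration). Words found in the word2vec model receive their pre-trained vector; anything else falls back to a random vector, which is why a nonzero "new random vectors" count indicates vocabulary entries missing from the trained model (e.g. tokenization or case mismatches, or words dropped by a min_count threshold during training):

```python
# Illustrative sketch only, not the project's preprocessing code.
import numpy as np
from gensim.models import KeyedVectors

def build_embeddings(vocab, w2v_path, binary=True):
    # Load the locally trained word2vec model.
    w2v = KeyedVectors.load_word2vec_format(w2v_path, binary=binary)
    dim = w2v.vector_size
    matrix = np.zeros((len(vocab), dim), dtype=np.float32)
    n_pretrained, n_random = 0, 0
    for i, word in enumerate(vocab):
        if word in w2v:
            # Vocabulary word present in the trained model: use its vector.
            matrix[i] = w2v[word]
            n_pretrained += 1
        else:
            # Missing from the model: fall back to a random vector.
            matrix[i] = np.random.uniform(-0.25, 0.25, dim)
            n_random += 1
    print("Used %d pre-trained word2vec vectors and %d new random vectors."
          % (n_pretrained, n_random))
    return matrix
```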
Should be fixed in PR #12
Agreed, closed with #12.