Project GloVe vectors (https://nlp.stanford.edu/projects/glove/) onto word2vec vectors (models/w2v_100d.pickle)

Open · ghost opened this issue 7 years ago

The GloVe vocabulary I'm using is 400K words and my word2vec vocabulary is about 40K, so W would have to be 400K × 40K, which (as 64-bit floats) is 400,000 × 40,000 × 8 bytes ≈ 128 GB to store. So unless I'm missing something, I don't know how feasible this really is.
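If the projection is meant as a linear map between the two embedding spaces rather than between the two vocabularies, W only needs to be d_glove × d_w2v and can be fit by least squares on the shared vocabulary, so the 400K × 40K matrix never comes up. A minimal sketch of that reading, assuming models/w2v_100d.pickle holds a dict mapping word to a 100-d numpy array and assuming the standard GloVe text format (the glove.6B.100d.txt filename and both dimensions are assumptions, not something confirmed by this repo):

```python
import pickle
import numpy as np

# Load the word2vec vectors (path from the issue); assumed to be a
# dict of word -> np.ndarray of shape (100,).
with open("models/w2v_100d.pickle", "rb") as f:
    w2v = pickle.load(f)

# Load GloVe vectors from the standard text format: "word v1 v2 ...".
# The filename here is an assumption.
glove = {}
with open("glove.6B.100d.txt", encoding="utf-8") as f:
    for line in f:
        parts = line.rstrip().split(" ")
        glove[parts[0]] = np.asarray(parts[1:], dtype=np.float64)

# Fit W on the intersection of the two vocabularies (at most ~40K rows).
shared = [w for w in w2v if w in glove]
X = np.stack([glove[w] for w in shared])   # (n, d_glove)
Y = np.stack([w2v[w] for w in shared])     # (n, d_w2v)

# Least-squares solution of X @ W ≈ Y; W has shape (d_glove, d_w2v).
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Project every GloVe vector into the word2vec space.
projected = {w: v @ W for w, v in glove.items()}
```

With 100-dimensional vectors on both sides, W is only 100 × 100 floats (about 80 KB), so under this interpretation storage isn't the bottleneck.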