bdlacree opened this issue 7 years ago
Word2vec learns two matrices--call them the embedding matrix and the context matrix (although this terminology may not fit every variant of the method). The embedding matrix is the "model" returned by this package; it can be extracted with `model <- train_word2vec(...)` or `read.vectors(...)`. The context matrix is, as far as I can tell, simply discarded after training, which is a pity because it can be useful in some situations.
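For concreteness, here is a minimal sketch of getting at the embedding matrix with the calls mentioned above (the `vectors` argument and the corpus/output file names are illustrative assumptions, not taken from this thread):

```r
library(wordVectors)  # the package under discussion

# Train a model; the returned object wraps only the embedding matrix.
# "corpus.txt" and "vectors.bin" are placeholder file names.
model <- train_word2vec("corpus.txt", "vectors.bin", vectors = 100)

# Or load a previously trained model from disk:
model <- read.vectors("vectors.bin")

# The embedding itself is just a (vocabulary size) x (vectors) matrix:
emb <- as.matrix(model)
dim(emb)
```

The context matrix has no counterpart here: nothing in the returned object holds it, which is the gap this issue is about.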
I'm curious whether you really want both, or just the embedding matrix. I'd be willing to leave this issue open for the context vectors if there's a use for them.
I'd certainly be interested in recovering the context vectors as well, if that's feasible. This package is pretty awesome, btw.
Hey---great work here. Sorry if this is obtuse, but is there a way to extract the neural network weight matrices after training?
Thanks!