bmschmidt / wordVectors

An R package for creating and exploring word2vec and other word embedding models

Extract Network Weights #36

Open bdlacree opened 7 years ago

bdlacree commented 7 years ago

Hey---great work here. Sorry if this is obtuse, but is there a way to extract the neural network weight matrices after training?

Thanks!

bmschmidt commented 7 years ago

Word2vec learns two weight matrices: call them the embedding matrix and the context matrix (although this framing may not be appropriate for every method). The embedding matrix is the "model" that this package returns; you get it with model <- train_word2vec(...) or read.vectors(...). The context matrix is (I think) just thrown away after training, which is too bad, because it can be useful in some situations.
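For anyone who lands here later, a minimal sketch of that extraction in R. The file names, the vectors argument, and the example word are illustrative; it also assumes the returned VectorSpaceModel is an S4 object wrapping a plain numeric matrix (so the raw weights live in the @.Data slot), which may differ across package versions:

library(wordVectors)

# Train a model: this writes the binary vectors file to disk and returns
# the embedding matrix as a VectorSpaceModel object.
model <- train_word2vec("corpus.txt", "corpus_vectors.bin", vectors = 100)

# Or reload a previously trained model from disk.
model <- read.vectors("corpus_vectors.bin")

dim(model)              # one row per vocabulary word, one column per dimension
weights <- model@.Data  # underlying numeric matrix of learned embedding weights
weights["some_word", ]  # embedding for a single (hypothetical) vocabulary word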

I'm curious whether you really want both, or just the embedding matrix. I'd be willing to leave this issue open for the context vectors if there's a use for them.

adamlauretig commented 7 years ago

I'd certainly be interested in recovering the context vectors as well, if this is feasible. This package is pretty awesome, btw.