Closed chunyuma closed 3 months ago
Hi! SkipGram & CBOW models produce two embeddings: one associated with the central tokens and one with the context tokens. Conventionally, SkipGram takes the central-token embedding as its output, while CBOW takes the context-token embedding. Since this choice is arguably arbitrary, you are free to use either one (or combine both).
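To make this concrete, here is a minimal, self-contained sketch of skip-gram with negative sampling in NumPy (not grape's actual implementation; all names such as W_central and W_context are illustrative). It shows why training naturally produces two parameter matrices of identical shape, one updated for central tokens and one for context tokens:

```python
import numpy as np

rng = np.random.default_rng(0)
vocab_size, emb_dim = 10, 4

# Two independent parameter matrices, both of shape (vocab_size, emb_dim):
# one row per token as a CENTRAL word, one row per token as a CONTEXT word.
W_central = rng.normal(scale=0.1, size=(vocab_size, emb_dim))
W_context = rng.normal(scale=0.1, size=(vocab_size, emb_dim))

def sgns_step(center, context, label, lr=0.1):
    """One SGD step of skip-gram with negative sampling.

    label=1 for an observed (center, context) pair, 0 for a negative sample.
    Returns the sigmoid score before the update.
    """
    v = W_central[center]            # central-token vector
    u = W_context[context]           # context-token vector
    score = 1.0 / (1.0 + np.exp(-v @ u))  # sigmoid of the dot product
    grad = score - label             # gradient of the logistic loss w.r.t. v @ u
    W_central[center] -= lr * grad * u
    W_context[context] -= lr * grad * v
    return score

# Repeatedly train (2, 5) as a positive pair: the score rises toward 1.
before = sgns_step(2, 5, label=1)
for _ in range(50):
    after = sgns_step(2, 5, label=1)
assert after > before

# After training, you can keep W_central alone (the common SkipGram choice),
# W_context alone, or combine them, e.g. by averaging.
combined = (W_central + W_context) / 2
print(W_central.shape, W_context.shape, combined.shape)
```

Both matrices live in the same space and have the same shape, which is why an embedder that returns both hands you two equally sized node-embedding matrices.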
In SkipGram, the first matrix returned is the central-token embedding and the second is the context-token embedding, if I recall correctly.
I see. Thanks so much for the responses. @LucaCappelletti94
Hello, I am using Node2VecSkipGramEnsmallen from grape.embedders to generate node embeddings for the Cora graph. Please see my code below:

After training is done, I get two embedding matrices for the nodes. Could you please explain why there are two embedding matrices, and what the differences between them are? I assumed there would be only one embedding per node. Thanks!