cynricfu / MAGNN

Metapath Aggregated Graph Neural Network for Heterogeneous Graph Embedding
398 stars 69 forks source link

How to project feature vectors into the same latent factor space in Equation 1 of the paper ? #35

Closed skiusp closed 1 year ago

skiusp commented 2 years ago

Thank you so much for sharing such a wonderful paper. But I don't understand how to apply a type-specific linear transformation for each node type so that the feature vectors are projected into the same latent factor space, as in Equation 1 of the paper. I would appreciate it if you could give me a hint or tell me the answer. Thank you very much.

cynricfu commented 2 years ago

"Type-specific linear transformation" is self-explanatory. Let [64] be the dimension of the hidden states: if type-A nodes are associated with feature vectors of length [8], then the weight matrix that linearly projects type-A nodes has shape [8, 64].

After this projection, every node's vector representation (regardless of node type) lives in the same latent vector space of dimension [64].
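A minimal numpy sketch of this idea (the dimensions `8`/`16` and the node types `A`/`B` are made-up examples, not values from the MAGNN code):

```python
import numpy as np

rng = np.random.default_rng(0)

hidden_dim = 64
# Hypothetical raw feature dimensions, one per node type.
feat_dims = {"A": 8, "B": 16}

# One weight matrix per node type: shape (feat_dim, hidden_dim).
W = {t: rng.standard_normal((d, hidden_dim)) for t, d in feat_dims.items()}

# Raw features: 5 nodes of type A, 3 nodes of type B.
x = {"A": rng.standard_normal((5, 8)),
     "B": rng.standard_normal((3, 16))}

# Type-specific projection (Equation 1): each node's feature vector is
# multiplied by its type's weight matrix, landing in the shared R^64 space.
h = {t: x[t] @ W[t] for t in x}

print(h["A"].shape)  # (5, 64)
print(h["B"].shape)  # (3, 64)
```

In the actual implementation these weight matrices would be learnable parameters (e.g. one `nn.Linear(feat_dim, hidden_dim)` per node type in PyTorch), trained jointly with the rest of the model.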