We went through Knowledge Graph Embeddings and Random Walk in previous graph neural network stories. Knowledge graph embeddings train entity embeddings for downstream tasks, while several other neural network models apply random-walk theory to train entity embeddings.
In this story, we will focus on four graph neural network models:
Shervashidze et al. (2011) introduced a way to measure graph similarity based on the Weisfeiler-Lehman (WL) test. Note that passing the WL test does not prove two graphs are isomorphic; it only means the test cannot prove they are non-isomorphic, whereas failing the test proves the graphs are not isomorphic.
In the WL test, you can define the height $h$ of the algorithm, which is the number of refinement iterations. The following procedure is the WL test with $h=1$:
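One WL refinement iteration can be sketched as follows. This is a minimal illustration, not any particular library's implementation; the dictionary-based graph representation and the function name `wl_iteration` are assumptions for the example.

```python
def wl_iteration(adj, labels):
    """One WL color-refinement step: each node's new label is built from its
    own label plus the sorted multiset of its neighbors' labels."""
    signatures = {}
    for node, neighbors in adj.items():
        signatures[node] = (labels[node],
                            tuple(sorted(labels[n] for n in neighbors)))
    # Compress each distinct signature into a fresh integer label.
    compressed = {sig: i for i, sig in enumerate(sorted(set(signatures.values())))}
    return {node: compressed[sig] for node, sig in signatures.items()}

# A triangle: every node sees the same neighborhood, so after refinement
# all nodes still share one label.
g1 = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
labels = {n: 1 for n in g1}
refined = wl_iteration(g1, labels)
```

Running the iteration $h$ times and comparing the resulting label multisets of two graphs is the core of the test: different multisets prove non-isomorphism.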
Kipf and Welling introduced the Graph Convolutional Network (GCN) in 2017. The basic idea of GCN is to update each node's representation by aggregating features from the node itself and its neighbors, then passing the result through a shared weight matrix and a nonlinearity.
The figure below shows the overall architecture of GCN. Taking node $X_{2}$ as an example, it absorbs the features of $X_{2}$ itself together with those of its neighbors $X_{1}$, $X_{3}$ and $X_{4}$, and feeds them into a neural network to train a model.
A node's features can be randomly initialized or taken from numeric attributes. Taking paper citation as an example, a document is a node while a citation is an edge. Node (document) features can be one-hot encoded word counts, year of publication, and authors. Another example is a social network: a person is a node while a relationship (e.g. friendship) is an edge, and features can include the city of birth and the number of friends.
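Given node features and an adjacency matrix, a single GCN propagation step can be sketched in NumPy. This is a hedged illustration of the layer rule $H' = \sigma(\hat{D}^{-1/2}\hat{A}\hat{D}^{-1/2}HW)$ from Kipf and Welling; the function name and the example shapes are assumptions.

```python
import numpy as np

def gcn_layer(A, H, W):
    """One GCN layer: normalize the adjacency (with self-loops),
    aggregate neighbor features, apply weights and ReLU."""
    A_hat = A + np.eye(A.shape[0])                       # add self-loops
    D_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    return np.maximum(0, D_inv_sqrt @ A_hat @ D_inv_sqrt @ H @ W)

# Mirroring the figure: node 1 (X2) is connected to nodes 0, 2 and 3.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 1],
              [0, 1, 0, 0],
              [0, 1, 0, 0]], dtype=float)
H = np.random.rand(4, 3)   # 4 nodes, 3 input features each
W = np.random.rand(3, 2)   # project to 2 output features
out = gcn_layer(A, H, W)   # shape (4, 2)
```

Because of the self-loop in $\hat{A}$, each row of `out` mixes the node's own features with its neighbors' features, which is exactly the aggregation described above.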
Inspired by the attention mechanism, Veličković et al. proposed applying it to graph neural networks. Like GCN (Kipf and Welling, 2017), Graph Attention Networks (GAT) (Veličković et al., 2017) leverage the node's own features and its neighbors' features to train a model. As with BERT in the natural language processing (NLP) field, the authors use multi-head attention to learn several different attention patterns.
Using attention comes with several advantages: the computation is efficient and parallelizable across edges, different importances can be assigned to different neighbors, and the model applies directly to inductive problems, including graphs unseen during training.
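The single-head attention coefficients can be sketched as follows. This is an illustrative NumPy version of $e_{ij} = \text{LeakyReLU}(\mathbf{a}^{T}[W h_i \| W h_j])$ followed by a softmax over the neighborhood; the function name `gat_scores` and all shapes are assumptions, not the authors' code.

```python
import numpy as np

def gat_scores(h, W, a, i, neighbors):
    """Attention coefficients of node i over its neighbors (single head)."""
    def leaky_relu(x, slope=0.2):
        return np.where(x > 0, x, slope * x)
    Wh = h @ W.T                         # transform all node features
    e = np.array([leaky_relu(a @ np.concatenate([Wh[i], Wh[j]]))
                  for j in neighbors])   # unnormalized scores e_ij
    alpha = np.exp(e - e.max())          # numerically stable softmax
    return alpha / alpha.sum()

h = np.random.rand(4, 3)                 # 4 nodes, 3 features each
W = np.random.rand(2, 3)                 # project to 2 features
a = np.random.rand(4)                    # attention vector, size 2 * out_dim
alpha = gat_scores(h, W, a, i=0, neighbors=[0, 1, 2, 3])
```

Each entry of `alpha` weights one neighbor's transformed features in the aggregation, which is how GAT assigns different importances to different neighbors.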
Topological identity is one way to measure graph similarity, and traditionally we can use the Weisfeiler-Lehman (WL) test for this. Xu et al. proved that the Graph Isomorphism Network (GIN) is as powerful as the WL test.
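The GIN node update that achieves this power, $h_v' = \text{MLP}\big((1+\epsilon)\, h_v + \sum_{u \in \mathcal{N}(v)} h_u\big)$, can be sketched as follows. The one-layer MLP and the 3-node example graph are illustrative assumptions.

```python
import numpy as np

def gin_layer(A, H, mlp, eps=0.0):
    """GIN update: sum-aggregate neighbor features, reweight the node's
    own features by (1 + eps), then apply an MLP."""
    return mlp((1.0 + eps) * H + A @ H)

# Illustrative MLP: one linear layer with ReLU.
rng = np.random.default_rng(0)
W = rng.random((3, 2))
mlp = lambda X: np.maximum(0, X @ W)

A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)   # 3-node path graph
H = rng.random((3, 3))                   # 3 nodes, 3 features each
out = gin_layer(A, H, mlp)               # shape (3, 2)
```

The sum aggregator (rather than mean or max) is what lets GIN distinguish neighborhood multisets, matching the label multisets the WL test compares.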
Edward Ma. 4 Graph Neural Networks you Need to Know (WLG, GCN, GAT, GIN).