NorbertZheng / read-papers

My paper reading notes.

Edward Ma | 4 Graph Neural Networks you Need to Know (WLG, GCN, GAT, GIN). #48

Closed NorbertZheng closed 1 year ago

NorbertZheng commented 1 year ago

Edward Ma. 4 Graph Neural Networks you Need to Know (WLG, GCN, GAT, GIN).

NorbertZheng commented 1 year ago

Overview

We went through Knowledge Graph Embeddings and Random Walk in previous graph neural network stories. Knowledge graph embedding methods train entity embeddings for downstream tasks. On the other hand, several neural network models apply random-walk theory to train entity embeddings.

In this story, we will focus on 4 graph neural network models:

- Weisfeiler-Lehman (WL) Graph Kernel
- Graph Convolutional Network (GCN)
- Graph Attention Network (GAT)
- Graph Isomorphism Network (GIN)

NorbertZheng commented 1 year ago

Weisfeiler-Lehman (WL) Graph Kernel

Shervashidze et al. (2011) introduced the WL graph kernel, which measures graph similarity via the Weisfeiler-Lehman (WL) test of isomorphism. Passing the WL test means that the two graphs either are isomorphic, or the test cannot prove that they are not isomorphic.

In the WL test, you can define the height $h$, which is the number of relabeling iterations. The following procedure is the WL test with $h=1$:

[Figure: WL test procedure with $h=1$]
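A minimal sketch of the relabeling iteration and the multiset comparison (the adjacency dicts, integer labels, and helper names `wl_relabel` / `wl_test` are illustrative assumptions, not from the paper):

```python
from collections import Counter

def wl_relabel(adj, labels, h=1):
    """Run h iterations of Weisfeiler-Lehman relabeling.

    adj: dict mapping node -> list of neighbors.
    labels: dict mapping node -> initial label (e.g. its degree).
    """
    for _ in range(h):
        labels = {
            # New label compresses "own label + sorted multiset of
            # neighbor labels"; a lookup table would keep labels compact.
            v: hash((labels[v], tuple(sorted(labels[u] for u in adj[v]))))
            for v in adj
        }
    return labels

def wl_test(adj1, labels1, adj2, labels2, h=1):
    """Two graphs pass the test iff their final label multisets match."""
    return (Counter(wl_relabel(adj1, labels1, h).values())
            == Counter(wl_relabel(adj2, labels2, h).values()))

# A triangle compared with itself passes the test, as expected.
tri = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
lab = {0: 1, 1: 1, 2: 1}
print(wl_test(tri, lab, tri, dict(lab), h=1))  # True
```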

NorbertZheng commented 1 year ago

Graph Convolutional Network (GCN)

Kipf and Welling introduced the Graph Convolutional Network (GCN) in 2017. The basic idea of GCN is to update each node's representation by aggregating the features of the node itself and of its neighbors, and feeding the aggregate through a neural network layer.

The figure below shows the overall architecture of GCN. Taking node $X_{2}$ as an example, it absorbs its own features ($X_{2}$, self) together with the features of $X_{1}$, $X_{3}$, and $X_{4}$ (neighbors), and feeds them into a neural network to train the model.

[Figure: overall GCN architecture]

Regarding node features, they can be randomly initialized or taken from numeric attributes. Taking paper citation as an example, a document is a node and a citation is an edge; node (document) features can be one-hot encoded word counts, the year of publication, and the authors. Another example is a social network: a person is a node and a relationship (e.g., friendship) is an edge; features can include the city of birth and the number of friends.
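Concretely, Kipf and Welling's layer-wise propagation rule is $H^{(l+1)} = \sigma\big(\tilde{D}^{-1/2} \tilde{A} \tilde{D}^{-1/2} H^{(l)} W^{(l)}\big)$, where $\tilde{A} = A + I$ adds self-loops and $\tilde{D}$ is its degree matrix. A minimal NumPy sketch of one such layer (the function name and toy graph are illustrative):

```python
import numpy as np

def gcn_layer(A, H, W):
    """One GCN step: H' = ReLU(D^-1/2 (A + I) D^-1/2 H W)."""
    A_hat = A + np.eye(A.shape[0])                 # add self-loops
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))  # D^-1/2 diagonal
    A_norm = A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    return np.maximum(A_norm @ H @ W, 0.0)         # aggregate + ReLU

# 4 nodes; node X2 (index 1) aggregates itself plus X1, X3, X4.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 1],
              [0, 1, 0, 0],
              [0, 1, 0, 0]], dtype=float)
H = np.random.randn(4, 8)    # randomly initialized node features
W = np.random.randn(8, 4)    # learnable layer weights
print(gcn_layer(A, H, W).shape)  # (4, 4)
```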

NorbertZheng commented 1 year ago

Graph Attention Network (GAT)

Inspired by the attention mechanism, Veličković et al. proposed applying it to graph neural networks. Like GCN (Kipf and Welling, 2017), the Graph Attention Network (GAT) (Veličković et al., 2017) leverages a node's own features and its neighbors' features to train a model. As with BERT in the natural language processing (NLP) field, the authors use multi-head attention so that different heads can learn different attention patterns.

Using attention comes with several advantages (as stated in the GAT paper):

- The computation of attention coefficients is efficient and parallelizable across node pairs.
- Different importances can be assigned to different neighbors, rather than weighting all neighbors equally.
- The model applies directly to inductive problems, including generalizing to completely unseen graphs.

[Figure: GAT attention mechanism and multi-head attention]
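For a single head, GAT scores each edge with $e_{ij} = \mathrm{LeakyReLU}\big(\vec{a}^{T} [W h_{i} \,\|\, W h_{j}]\big)$ and normalizes the scores with a softmax over each node's neighborhood. A minimal single-head NumPy sketch (names, shapes, and the toy graph are illustrative):

```python
import numpy as np

def gat_layer(A, H, W, a):
    """One single-head GAT layer: attention-weighted neighbor average."""
    Wh = H @ W                                    # projected features (N, F')
    N = Wh.shape[0]
    e = np.empty((N, N))
    for i in range(N):
        for j in range(N):
            s = a @ np.concatenate([Wh[i], Wh[j]])  # a^T [Wh_i || Wh_j]
            e[i, j] = s if s > 0 else 0.2 * s       # LeakyReLU, slope 0.2
    mask = (A + np.eye(N)) > 0                    # neighbors + self-loop
    e[~mask] = -np.inf                            # drop non-edges
    alpha = np.exp(e - e.max(axis=1, keepdims=True))
    alpha /= alpha.sum(axis=1, keepdims=True)     # softmax per node
    return np.maximum(alpha @ Wh, 0.0)            # aggregate + ReLU

rng = np.random.default_rng(0)
A = np.array([[0, 1, 1], [1, 0, 0], [1, 0, 0]], dtype=float)
H = rng.standard_normal((3, 4))
W = rng.standard_normal((4, 8))
a = rng.standard_normal(16)                       # 2 * F' attention vector
print(gat_layer(A, H, W, a).shape)  # (3, 8)
```

Multi-head attention repeats this with independent $W$ and $\vec{a}$ per head, concatenating (or averaging, at the final layer) the heads' outputs.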

NorbertZheng commented 1 year ago

Graph Isomorphism Network (GIN)

Topological identity (isomorphism) is one way to measure graph similarity. Traditionally, we can check it with the Weisfeiler-Lehman (WL) test. Xu et al. proved that the Graph Isomorphism Network (GIN) is as powerful as the WL test:

[Figure: GIN node update rule]
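The GIN update from Xu et al. is $h_{v}^{(k)} = \mathrm{MLP}^{(k)}\big((1 + \epsilon^{(k)}) \cdot h_{v}^{(k-1)} + \sum_{u \in \mathcal{N}(v)} h_{u}^{(k-1)}\big)$: a sum over neighbors (an injective aggregator, mirroring the WL multiset relabeling) plus a weighted self term. A minimal NumPy sketch, assuming a 2-layer MLP (names and shapes are illustrative):

```python
import numpy as np

def gin_layer(A, H, eps, W1, W2):
    """One GIN layer: h_v = MLP((1 + eps) * h_v + sum over neighbors)."""
    agg = (1.0 + eps) * H + A @ H           # self term + neighbor sum
    return np.maximum(agg @ W1, 0.0) @ W2   # 2-layer MLP with ReLU

rng = np.random.default_rng(0)
A = np.array([[0, 1, 1], [1, 0, 0], [1, 0, 0]], dtype=float)
H = rng.standard_normal((3, 4))
out = gin_layer(A, H, 0.1,
                rng.standard_normal((4, 8)),
                rng.standard_normal((8, 8)))
print(out.shape)  # (3, 8)
```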

NorbertZheng commented 1 year ago

Take Away

NorbertZheng commented 1 year ago

Extension Reading

NorbertZheng commented 1 year ago

Reference