🚀 Feature
Request to add DeepGCNs (ICCV 2019 Oral) to PyTorch Geometric.
Journal extension: https://arxiv.org/abs/1910.06849
ICCV paper: http://openaccess.thecvf.com/content_ICCV_2019/papers/Li_DeepGCNs_Can_GCNs_Go_As_Deep_As_CNNs_ICCV_2019_paper.pdf
PyTorch (300+ stars): https://github.com/lightaime/deep_gcns_torch (we implemented the Deep GCN models on top of PyTorch Geometric for the sparse data format and plain PyTorch for the dense data format)
TensorFlow (450+ stars): https://github.com/lightaime/deep_gcns
Motivation
Residual/dense graph connections and dilated graph convolutions are important components for training deep GCNs. In addition, MRGCN is a memory-efficient GCN operator for graph learning tasks.
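To make the residual-connection idea concrete, here is a minimal sketch of a pre-activation residual GCN block built on PyTorch Geometric's GCNConv. The class name `ResGCNBlock` and the exact norm/activation ordering are illustrative assumptions for this sketch, not the DeepGCNs implementation itself.

```python
# Minimal sketch of a pre-activation residual GCN block.
# Assumes PyTorch Geometric's GCNConv; names are illustrative only.
import torch
import torch.nn.functional as F
from torch_geometric.nn import GCNConv


class ResGCNBlock(torch.nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.norm = torch.nn.BatchNorm1d(channels)
        self.conv = GCNConv(channels, channels)

    def forward(self, x, edge_index):
        # norm -> ReLU -> graph conv, then add the skip connection so that
        # gradients flow directly through deep stacks of such blocks.
        h = self.conv(F.relu(self.norm(x)), edge_index)
        return h + x
```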
Additional context
Our work transfers concepts such as residual/dense connections and dilated convolutions from CNNs to GCNs in order to successfully train very deep GCNs. We experimentally show the benefit of deep GCNs with as many as 112 layers across various datasets and tasks. Specifically, we achieve state-of-the-art performance in part segmentation and semantic segmentation on point clouds, and in node classification of protein functions on biological protein-protein interaction (PPI) graphs.
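As a rough illustration of the dilated-convolution analogue on graphs, below is a sketch of dilated k-NN graph construction, assuming torch_cluster's knn_graph (re-exported by PyTorch Geometric). The helper name `dilated_knn_graph` and its defaults are hypothetical and only meant to convey the idea.

```python
# Minimal sketch of a dilated k-NN graph: query k * dilation neighbours per
# point, then keep every `dilation`-th edge, which enlarges the receptive
# field without adding parameters. Assumes torch_cluster's knn_graph.
import torch
from torch_cluster import knn_graph


def dilated_knn_graph(pos, k=16, dilation=4, batch=None):
    # knn_graph groups the k * dilation edges of each centre node
    # contiguously, so slicing with step `dilation` keeps k edges per node.
    edge_index = knn_graph(pos, k * dilation, batch=batch, loop=False)
    return edge_index[:, ::dilation]


# Usage on a toy point cloud:
pos = torch.rand(1024, 3)
edge_index = dilated_knn_graph(pos, k=16, dilation=4)
```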