ORNL / HydraGNN

Distributed PyTorch implementation of multi-headed graph convolutional neural networks
BSD 3-Clause "New" or "Revised" License

Skip connection for ResNet type of GNN #164

Open allaffa opened 1 year ago

allaffa commented 1 year ago

As the name suggests, skip connections in deep architectures bypass some layers and feed the output of one layer directly as input to a later layer. They are a standard building block and provide an alternative path for the gradient during backpropagation.

Skip connections were originally introduced to address several training difficulties and predate residual networks. In residual networks (ResNets), skip connections mitigate degradation problems such as the vanishing gradient; in dense networks (DenseNets), they enable feature reuse by concatenating earlier layer outputs.
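A minimal sketch of what a ResNet-style skip connection could look like around a graph convolution layer. This is a hypothetical illustration, not HydraGNN's actual layer API: the class name `ResidualGCNLayer` and the simple dense-adjacency aggregation are assumptions chosen to keep the example self-contained in plain PyTorch.

```python
import torch
import torch.nn as nn


class ResidualGCNLayer(nn.Module):
    """One graph-convolution layer with a ResNet-style skip connection.

    Hypothetical sketch; HydraGNN's real convolution layers differ.
    The "convolution" here is a simple dense-adjacency aggregation:
        out = act(W @ (adj @ x)) + x
    The `+ x` identity path lets gradients bypass the layer,
    mitigating vanishing gradients in deep stacks.
    """

    def __init__(self, dim: int):
        super().__init__()
        self.linear = nn.Linear(dim, dim)
        self.act = nn.ReLU()

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # adj: (n_nodes, n_nodes) normalized adjacency matrix
        out = self.act(self.linear(adj @ x))
        # Skip connection: add the layer input back to its output.
        return out + x


# Usage: 4 nodes with 8 features each; identity adjacency for illustration.
x = torch.randn(4, 8)
adj = torch.eye(4)
layer = ResidualGCNLayer(8)
y = layer(x, adj)
```

A DenseNet-style variant would instead concatenate `x` and `out` along the feature dimension (`torch.cat([out, x], dim=-1)`), growing the feature width at each layer rather than keeping it fixed.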

akhilpandey95 commented 7 months ago

Hi, I have previously worked on building neural architecture search with skip connections. Can I contribute to this and help you build it?

allaffa commented 6 months ago

@akhilpandey95
Thank you for reaching out to us. Please send me an email to lupopasinim@ornl.gov and we will coordinate