Open allaffa opened 1 year ago
Hi, I have previously worked on building neural architecture search with skip connections. Can I contribute to this and help you build it?
@akhilpandey95
Thank you for reaching out to us.
Please send an email to lupopasinim@ornl.gov and we will coordinate.
As the name suggests, skip connections in deep architectures bypass some of the layers, feeding the output of one layer directly as input to a later layer. They are a standard building block and provide an alternative path for the gradient during backpropagation.
Skip connections were originally created to tackle different difficulties in different architectures and were introduced even before residual networks. In residual networks (ResNets), skip connections address degradation problems such as the vanishing gradient, while in dense networks (DenseNets) they ensure feature reuse.
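The two flavors described above can be sketched in a few lines. This is a minimal illustration, not code from this repository: the function names (`residual_block`, `dense_block`) and the toy ReLU layer are hypothetical, and the weights are random placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

def layer(x, w):
    """One dense layer with ReLU activation; `w` is a hypothetical weight matrix."""
    return np.maximum(w @ x, 0.0)

# ResNet-style skip: the block outputs F(x) + x, so backpropagation
# has a direct identity path around the transformed layers.
def residual_block(x, w1, w2):
    return layer(layer(x, w1), w2) + x  # additive skip connection

# DenseNet-style skip: the block output concatenates the input with the
# new features, so later layers can reuse earlier feature maps.
def dense_block(x, w):
    h = layer(x, w)
    return np.concatenate([x, h])  # concatenative skip connection

x = rng.standard_normal(4)
w1 = rng.standard_normal((4, 4))
w2 = rng.standard_normal((4, 4))
w3 = rng.standard_normal((3, 4))

print(residual_block(x, w1, w2).shape)  # (4,) — same shape as the input
print(dense_block(x, w3).shape)         # (7,) — 4 input dims + 3 new features
```

Note that the additive skip requires the block output to match the input shape, while the concatenative skip grows the feature dimension with each block.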