Closed · rlqja1107 closed this issue 2 years ago
Hi!
Thanks for your interest in our paper.
Our forward pass involves many graph iterations (N graph iterations per stage over K refinement stages), which results in quite a deep GNN. To address this, we introduce a skip-connection mechanism into our GNN design, as shown in our formula.
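For illustration only, a generic residual (skip-connection) update for node $v$ at iteration $l$ can be sketched as below; this shows the general idea rather than the exact formula from the paper, and the aggregation operator `AGG` is an assumption here:

```latex
h_v^{(l+1)} = h_v^{(l)} + \sigma\!\left( W^{(l)} \cdot \operatorname{AGG}_{u \in \mathcal{N}(v)} h_u^{(l)} \right)
```

The identity term $h_v^{(l)}$ lets gradients bypass the stacked N × K iterations, which is the usual motivation for skip connections in deep networks.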
The picture above describes the message-propagation scheme in GCN (Graph Convolutional Network). In this scheme, when the neighborhood information is aggregated, the node's own message is added as well; the sum is then transformed and passed through the activation function.
However, in BGNN the node representation is updated differently: the l-th representation of the node is added only after the aggregated neighborhood message has been transformed and passed through the activation function.
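To make the contrast concrete, here is a minimal sketch of the two orderings; the shapes and the plain `adj @ h` aggregation are my own assumptions, not code from the BGNN repository:

```python
import torch
import torch.nn as nn

dim = 8
W = nn.Linear(dim, dim)                 # shared transformation for both variants
act = nn.ReLU()

h = torch.randn(5, dim)                 # node states at layer l (toy example)
adj = (torch.rand(5, 5) > 0.5).float()  # toy adjacency matrix (assumption)
neighbor_msg = adj @ h                  # aggregated neighborhood message

# GCN-style: add the node's own state *before* the transform + activation
h_gcn = act(W(h + neighbor_msg))

# BGNN-style (as described above): add the l-th state *after* the
# neighborhood message has been transformed and activated (a skip connection)
h_bgnn = h + act(W(neighbor_msg))
```

In the second form the previous representation reaches the next layer unchanged, which is what makes it behave like a residual connection.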
I wonder why BGNN adopts the latter approach.