Hi, I really like this paper, but I have a question. The paper says "f is a ReLU-activated L-layer Multi-Layer Perceptron (MLP) with W1, ..., WL as parameters". Does this model use the graph structure at all?
Yes. g is an aggregation function that aggregates the features over local neighbors, and hi(X, G) = f(gi(X, G); W1, W2, ..., WL), so f is only the "decoder" applied to the learned representations.
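To make the decomposition concrete, here is a minimal NumPy sketch of that pipeline. It assumes mean aggregation for g (with a self-loop) and a dense adjacency matrix; the paper's actual aggregation and weight shapes may differ, so treat the names and choices below as illustrative only.

```python
import numpy as np

def aggregate(X, adj):
    # g: mean-aggregate each node's features over its local neighbors
    # (self-loop added so every node also keeps its own features)
    A = adj + np.eye(adj.shape[0])
    deg = A.sum(axis=1, keepdims=True)
    return A @ X / deg

def mlp(Z, weights):
    # f: ReLU-activated L-layer MLP decoder, applied row-wise
    h = Z
    for W in weights[:-1]:
        h = np.maximum(h @ W, 0.0)
    return h @ weights[-1]

# Toy example: 3 nodes with 2 features on a path graph 0-1-2
adj = np.array([[0, 1, 0],
                [1, 0, 1],
                [0, 1, 0]], dtype=float)
X = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
rng = np.random.default_rng(0)
weights = [rng.standard_normal((2, 4)), rng.standard_normal((4, 2))]

# h(X, G) = f(g(X, G); W1, W2): the graph enters only through g
H = mlp(aggregate(X, adj), weights)
```

The point of the sketch is that the graph structure G is consumed entirely inside `aggregate` (g); the MLP f itself is graph-agnostic and just decodes the aggregated features.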