napatnicky opened 3 years ago
This is hard to say, TBH. In general, GCN might be too limited when operating on single node feature values, and you may have better luck with other GNN operators such as GATConv.
Otherwise, you may want to try to heavily overfit on your data first, e.g., by increasing the overall hidden dimensionality and the number of layers of your final classifier. Furthermore, global_mean_pool may lose meaningful features when averaging node embeddings across a larger graph. You may want to try out different pooling operators as well, such as global_add_pool or global_max_pool.
Have you solved the problem? I am facing the same issue: my loss does not decrease and only oscillates. I tried to 1) increase the hidden dimension, 2) increase the learning rate, and 3) make the input node features identical to the labels. It still does not converge.
Can you check other GNN ops as well, such as SAGEConv?
@lingchen1991 Yes, I have. In my case, the problem occurs when dealing with a larger graph and limited features (1 feature per node, and not a very informative one either). A global pooling layer may not be expressive enough to represent all nodes of such a graph in a single vector, so I decided to use a flatten layer instead of the global pooling layer. That worked for me.
However, the flatten layer throws away the structure of the graph as well ;). I am still dealing with this problem.
Hi Matthias,
Thank you for your amazing library. I am stuck training a GCNConv model on my own dataset (about 1,000 samples with a fixed 6888x6888 adjacency matrix but different signals, 1 feature per node). It's a binary graph classification task. My training loss does not seem to converge; it gets stuck at the same value.
Currently, I'm training the model with batch size = 4, Adam(lr=0.01), and hidden_dim = 16. I have tried adjusting the batch size and learning rate, but the problem persists. Do you have any suggestions to overcome this problem?
Kind regards, Napat