graph4ai / graph4nlp

Graph4nlp is a library for the easy use of Graph Neural Networks for NLP. Visit our DLG4NLP website (https://dlg4nlp.github.io/index.html) for various learning resources!

Does having more edges on a GNN help learning? #576

Closed smith-co closed 2 years ago

smith-co commented 2 years ago

I am using the Graph2Seq model for an NMT task, with GCN as the encoder.

Graph stats:

I have the following two questions:

  1. Logically, does adding more edges help a GNN model? (A minimal sketch of what I mean by adding edges is below.)

  2. I guess adding more semantic edges might help the model learn about its neighbours faster and could reduce training time, since the model could converge faster. Is this understanding correct?
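
For concreteness, here is the sketch: one GCN layer over a toy graph, with and without an extra semantic edge (plain PyTorch; the adjacency matrices, sizes, and names are made up for illustration, not graph4nlp code):

```python
import torch

# Toy graph: 4 nodes, base (e.g. dependency) edges as a dense adjacency matrix.
A_base = torch.tensor([[0., 1., 0., 0.],
                       [1., 0., 1., 0.],
                       [0., 1., 0., 1.],
                       [0., 0., 1., 0.]])

# Extra "semantic" edge (e.g. a coreference or similarity link) on top of the base graph.
A_extra = torch.zeros_like(A_base)
A_extra[0, 3] = 1.0
A_extra[3, 0] = 1.0
A_aug = A_base + A_extra

def gcn_layer(A, H, W):
    """One GCN layer: symmetrically normalized aggregation over the edges of A."""
    A_hat = A + torch.eye(A.size(0))                      # add self-loops
    d_inv_sqrt = torch.diag(A_hat.sum(dim=1).pow(-0.5))   # D^{-1/2}
    return torch.relu(d_inv_sqrt @ A_hat @ d_inv_sqrt @ H @ W)

H = torch.randn(4, 8)   # node features
W = torch.randn(8, 8)   # layer weights

out_base = gcn_layer(A_base, H, W)  # each node aggregates only its base neighbours
out_aug = gcn_layer(A_aug, H, W)    # node 0 now also aggregates from node 3 in one hop
```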

AlanSwift commented 2 years ago

Thanks for your interest. The following are some personal insights. For Q1: yes, extra edges carry more information, which usually helps the downstream task. For Q2: faster convergence is not guaranteed; even a simple graph can converge quickly, and adding edges brings additional computation cost, which can slow training down. So I suggest comparing different variants of the graph to find the answer for your specific task.
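
As a rough, hedged illustration of that extra computation cost (not a real benchmark; the sizes and densities below are made up), you can time a single sparse neighbourhood aggregation on two graph variants of different density:

```python
import time
import torch

def time_sparse_aggregate(A, H, n_runs=50):
    """Average wall-clock time for one neighbourhood aggregation A @ H with a sparse adjacency."""
    A_sp = A.to_sparse()
    start = time.perf_counter()
    for _ in range(n_runs):
        torch.sparse.mm(A_sp, H)
    return (time.perf_counter() - start) / n_runs

n, d = 2000, 128
H = torch.randn(n, d)

A_sparse = (torch.rand(n, n) < 0.01).float()   # ~1% of node pairs connected
A_denser = (torch.rand(n, n) < 0.10).float()   # ~10% of node pairs connected

print("edges:", int(A_sparse.sum().item()), "sec/agg:", time_sparse_aggregate(A_sparse, H))
print("edges:", int(A_denser.sum().item()), "sec/agg:", time_sparse_aggregate(A_denser, H))
```

Whether the richer graph is worth that cost is exactly what comparing the variants on your task will tell you.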

smith-co commented 2 years ago

Thanks for the feedback.

For a large graph, would adding edges reduce the convergence speed?

AlanSwift commented 2 years ago

I think it depends on your specific design. Adding useful edges will certainly help message passing, but if you add negative (noisy) edges, I guess they will harm performance.
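
One design that can help when some edges may be noisy is to learn a per-edge gate, so the model can down-weight uninformative edges during training. A minimal sketch of the idea (illustrative only, not the graph4nlp implementation):

```python
import torch
import torch.nn as nn

class GatedGCNLayer(nn.Module):
    """GCN-style layer with a learned scalar gate per edge, so edges that turn out
    to be uninformative can be down-weighted. Illustrative sketch only."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)
        self.gate = nn.Linear(2 * in_dim, 1)   # score an edge from its two endpoints

    def forward(self, H, edge_index):
        # edge_index: (2, E) tensor of (source, target) node ids
        src, dst = edge_index
        g = torch.sigmoid(self.gate(torch.cat([H[src], H[dst]], dim=-1)))  # (E, 1) gate
        msg = g * self.lin(H[src])                                         # gated messages
        out = msg.new_zeros(H.size(0), msg.size(-1))
        out.index_add_(0, dst, msg)             # sum gated messages per target node
        return torch.relu(out)

# Usage on a toy graph: 4 nodes, a few directed edges.
H = torch.randn(4, 8)
edge_index = torch.tensor([[0, 1, 2, 0],   # sources
                           [1, 2, 3, 3]])  # targets (0 -> 3 is the "extra" edge)
layer = GatedGCNLayer(8, 16)
out = layer(H, edge_index)
```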