Ali-Hatami opened this issue 7 months ago
Hello,
I think you can use this lib to support a graph-to-sequence translation model, but the output sequencer (the decoder) that it currently supports is LSTM-style, not a transformer. The GGNN I added to OpenNMT-py is based on the paper https://arxiv.org/abs/1511.05493 , and the OpenNMT-py connections I added support the graph neural network sequence generator shown in the center of Figure 2 of our paper at https://arxiv.org/abs/2002.06799 . I tried to keep the usage fairly generic so that others could use it for graph-to-sequence generation. You can find some usage Q&A by reading the issues for this repo: https://github.com/SteveKommrusch/OpenNMT-py-ggnn-example/issues .
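For intuition about what the GGNN encoder from that paper does, here is a minimal sketch of one propagation step, simplified to a single edge type and written with NumPy rather than the actual OpenNMT-py module (the function and parameter names here are illustrative, not the library's API): each node aggregates its neighbors' hidden states through the adjacency matrix, then updates its own state with a GRU-style gate, exactly the recurrence Li et al. describe.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def ggnn_step(h, adj, W_z, U_z, W_r, U_r, W_h, U_h):
    """One GGNN propagation step (Li et al., 2015), simplified to a
    single edge type: aggregate neighbor states via the adjacency
    matrix, then apply a GRU-style gated update to each node."""
    a = adj @ h                                  # messages: sum of source-node states
    z = sigmoid(a @ W_z + h @ U_z)               # update gate
    r = sigmoid(a @ W_r + h @ U_r)               # reset gate
    h_cand = np.tanh(a @ W_h + (r * h) @ U_h)    # candidate state
    return (1 - z) * h + z * h_cand              # gated interpolation

# Toy graph: 3 nodes in a chain 0 -> 1 -> 2, hidden size 4.
rng = np.random.default_rng(0)
d = 4
adj = np.array([[0, 0, 0],
                [1, 0, 0],
                [0, 1, 0]], dtype=float)  # row v sums over v's incoming edges
h = rng.standard_normal((3, d))
params = [rng.standard_normal((d, d)) * 0.1 for _ in range(6)]
h_next = ggnn_step(h, adj, *params)
print(h_next.shape)  # (3, 4)
```

In the real encoder this step is repeated for a fixed number of timesteps with one weight matrix per edge type, and the final node states are what the attention-based decoder reads from.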
In my own research I have found that transformer models outperform graph-to-sequence models (though my work was on computer code, not human languages). I think the transformer's key/query attention is usually better at finding relevant connections than the hand-coded edges of a graph network.
But if you are interested in trying the GGNN, I can help with your attempt. Do you have a write-up (a draft paper?) of what you are trying to do?
Regards, Steve
Hi,
I want to use a graph as an input instead of text in my translation model. In fact, it is a translation model from graph to text. Can I use this lib? It is a bit difficult to follow how to use it. Could you please help me? Thanks