Dinxin opened 2 years ago
Thanks, I will give it a try in the near future.
By the way, I have another question: could the copy mechanism also boost pre-trained models such as BART and T5?
I think that is an interesting idea, but we did not run an experiment on it.
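For anyone curious what combining a copy mechanism with a pre-trained seq2seq model would look like, here is a minimal NumPy sketch of the pointer-generator-style mixture (See et al.): a gate `p_gen` interpolates between the decoder's vocabulary distribution and attention mass scattered onto source token ids. All tensor sizes and values below are toy assumptions; a real implementation would take the hidden states and cross-attention of BART or T5.

```python
import numpy as np

# Toy sizes; a real model would use BART/T5 hidden states and cross-attention.
vocab_size = 10
src_len = 4

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Decoder's generation distribution over the vocabulary (random stand-in).
vocab_dist = softmax(rng.normal(size=vocab_size))

# Attention over the source tokens, and the source token ids (assumed values).
attn = softmax(rng.normal(size=src_len))
src_ids = np.array([2, 5, 5, 7])

# Generation gate p_gen; in a real model this is learned per decoding step.
p_gen = 0.6

# Scatter-add the attention mass onto the vocabulary slots of source tokens,
# so repeated source tokens accumulate their attention weights.
copy_dist = np.zeros(vocab_size)
np.add.at(copy_dist, src_ids, attn)

# Final mixture: generate with probability p_gen, copy with 1 - p_gen.
final_dist = p_gen * vocab_dist + (1 - p_gen) * copy_dist
assert np.isclose(final_dist.sum(), 1.0)
```

Whether this gate helps on top of large pre-trained models is exactly the open question above; the mixture itself is cheap to bolt onto any encoder-decoder that exposes cross-attention weights.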
@xunjianyin So the authors didn't intend to open-source the data and code; they only said it was open source in the paper?
Hi, Dinxin
Due to my busy schedule, I have not yet organized my code and data; I'm sorry for that.
However, you can use some open-source code on GitHub, some of which is exactly what I used:
• T5-small, T5-base, BART-base: https://github.com/UKPLab/plms-graph2text
• Hierarchical transformer encoder + conditional copy: https://github.com/KaijuML/data-to-text-hierarchical
• Neural Content Planning + conditional copy: https://github.com/ratishsp/data2text-plan-py
• Pointer-generator: https://github.com/ymfa/seq2seq-summarizer
• Bert-to-bert: https://github.com/google-research/google-research/tree/master/bertseq2seq
• Graph convolutional networks: https://github.com/diegma/graph-2-text
• Transformer: https://github.com/OpenNMT/OpenNMT-py
If you have any questions, feel free to ask here or email xjyin@pku.edu.cn.