Open: whqwill opened this issue 6 years ago
Hello, recently I have also been doing research on applying the BERT pre-trained model to seq2seq tasks. Would you mind sharing the code for using BERT as the encoder? Thanks.
Hi,
here is my code applying BERT to the keyphrase generation task: https://github.com/whqwill/seq2seq-keyphrase-bert/
I hope you can spot what I did wrong or improperly. Thanks.
I think you should not replace just the encoder part with BERT, but rather replace the entire seq2seq model with a BERT model...
Maybe you can just use a Transformer to do seq2seq.
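
For concreteness, here is a minimal sketch of what "replace the whole seq2seq model with BERT" could look like, using Hugging Face's `EncoderDecoderModel` in a bert2bert configuration. The model name, example strings, and generation settings below are illustrative assumptions, not code from this thread:

```python
# Hedged sketch: warm-start both encoder and decoder from BERT and train
# the pair as a standard seq2seq model. Names/settings are illustrative.
import torch
from transformers import BertTokenizer, EncoderDecoderModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")

# The decoder copy of BERT gets cross-attention layers added and is
# fine-tuned autoregressively on the target side.
model = EncoderDecoderModel.from_encoder_decoder_pretrained(
    "bert-base-uncased", "bert-base-uncased"
)
model.config.decoder_start_token_id = tokenizer.cls_token_id
model.config.pad_token_id = tokenizer.pad_token_id

source = tokenizer("text of the source document", return_tensors="pt")
target = tokenizer("keyphrase one ; keyphrase two", return_tensors="pt")

# Training step: the model computes cross-entropy against the labels.
outputs = model(
    input_ids=source.input_ids,
    attention_mask=source.attention_mask,
    labels=target.input_ids,
)
outputs.loss.backward()

# Inference: generate keyphrases with beam search.
generated = model.generate(
    source.input_ids,
    attention_mask=source.attention_mask,
    num_beams=4,
    max_length=32,
)
print(tokenizer.decode(generated[0], skip_special_tokens=True))
```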
Hi,
recently I have been researching keyphrase generation. Usually, people use a seq2seq-with-attention model for this problem. Specifically, I use the framework https://github.com/memray/seq2seq-keyphrase-pytorch, which is an implementation of http://memray.me/uploads/acl17-keyphrase-generation.pdf .
Now I have simply changed its encoder to BERT, but the results are not good. An experimental comparison of the two models is in the attachment.
Can you give me some advice on whether what I did is reasonable and whether BERT is suitable for this task?
Thanks. Attachment: RNN vs BERT in Keyphrase generation.pdf
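
For reference, a hedged sketch of the "BERT as encoder" setup described above: BERT encodes the source text once, and a GRU decoder with dot-product attention over the encoder states generates the keyphrases. This is not the seq2seq-keyphrase-bert code; all class names, dimensions, and wiring are illustrative assumptions:

```python
# Hedged sketch of a BERT encoder + GRU attention decoder, assuming the
# Hugging Face transformers library. All names/sizes are illustrative.
import torch
import torch.nn as nn
from transformers import BertModel

class BertSeq2Seq(nn.Module):
    def __init__(self, vocab_size, hidden=768):  # 768 matches bert-base
        super().__init__()
        self.encoder = BertModel.from_pretrained("bert-base-uncased")
        self.embed = nn.Embedding(vocab_size, hidden)
        self.decoder = nn.GRU(hidden, hidden, batch_first=True)
        self.attn_out = nn.Linear(2 * hidden, hidden)
        self.proj = nn.Linear(hidden, vocab_size)

    def forward(self, src_ids, src_mask, tgt_ids):
        # Encode once with BERT; memory is (batch, src_len, hidden).
        memory = self.encoder(
            input_ids=src_ids, attention_mask=src_mask
        ).last_hidden_state
        # Initialize the decoder from the [CLS] representation.
        h0 = memory[:, 0, :].unsqueeze(0)
        dec_out, _ = self.decoder(self.embed(tgt_ids), h0)
        # Dot-product attention over the encoder memory.
        scores = torch.bmm(dec_out, memory.transpose(1, 2))
        scores = scores.masked_fill(src_mask.unsqueeze(1) == 0, -1e9)
        ctx = torch.bmm(torch.softmax(scores, dim=-1), memory)
        fused = torch.tanh(self.attn_out(torch.cat([dec_out, ctx], dim=-1)))
        return self.proj(fused)  # logits over the target vocabulary
```

One common pitfall with this kind of hybrid is training everything with a single learning rate: the pretrained BERT encoder usually needs a much smaller learning rate (or an initial freezing period) than the randomly initialized decoder, otherwise the pretrained weights are quickly destroyed, which could explain poor results relative to the plain RNN baseline.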