google-research / bert

TensorFlow code and pre-trained models for BERT
https://arxiv.org/abs/1810.04805
Apache License 2.0

Poor results when using BERT as the encoder of a seq2seq model for keyphrase generation #192

Open whqwill opened 6 years ago

whqwill commented 6 years ago

Hi,

recently I have been researching keyphrase generation. Usually, people use a seq2seq model with attention for this problem. Specifically, I use this framework: https://github.com/memray/seq2seq-keyphrase-pytorch, which is an implementation of http://memray.me/uploads/acl17-keyphrase-generation.pdf .

Now I have just changed its encoder to BERT, but the results are not good. An experimental comparison of the two models is in the attachment. Roughly, the setup looks like the sketch below.
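
(Simplified sketch only, not the exact code in my repo: it assumes the Hugging Face transformers package and model name "bert-base-uncased", and the GRU decoder stands in for the attention decoder in seq2seq-keyphrase-pytorch.)

```python
import torch
import torch.nn as nn
from transformers import BertModel  # assumption: Hugging Face transformers is installed

class BertEncoderSeq2Seq(nn.Module):
    """BERT encoder + GRU decoder; the GRU is only meant to show the wiring,
    not to reproduce the original attention decoder."""
    def __init__(self, tgt_vocab_size, hidden_size=768):
        super().__init__()
        self.encoder = BertModel.from_pretrained("bert-base-uncased")
        self.tgt_embed = nn.Embedding(tgt_vocab_size, hidden_size)
        self.decoder = nn.GRU(hidden_size, hidden_size, batch_first=True)
        self.out = nn.Linear(hidden_size, tgt_vocab_size)

    def forward(self, input_ids, attention_mask, tgt_ids):
        # BERT replaces the original RNN encoder; here its [CLS] hidden state
        # initializes the decoder (the full hidden states could also be
        # attended over, as in the original framework).
        enc = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        h0 = enc.last_hidden_state[:, 0].unsqueeze(0)   # (1, batch, 768)
        dec_out, _ = self.decoder(self.tgt_embed(tgt_ids), h0)
        return self.out(dec_out)                        # (batch, tgt_len, vocab)
```

One thing I am unsure about in such a setup: the decoder is randomly initialized while the encoder is pre-trained, so they may need different learning rates during fine-tuning.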

Can you give me some advice on whether what I did is reasonable and whether BERT is suitable for this kind of task?

Thanks.

RNN vs BERT in Keyphrase generation.pdf

sc89703312 commented 6 years ago

Hello, recently I have also been doing research on applying the pre-trained BERT model to seq2seq tasks. Would you mind sharing your code for using BERT as the encoder? Thanks.

whqwill commented 6 years ago

> Hello, recently I have also been doing research on applying the pre-trained BERT model to seq2seq tasks. Would you mind sharing your code for using BERT as the encoder? Thanks.

Hi,

here is my code applying BERT to the keyphrase generation task: https://github.com/whqwill/seq2seq-keyphrase-bert/

I hope you can spot anything I did wrong or improperly. Thanks.

MichaelZhouwang commented 5 years ago

I think you should not just replace the encoder with BERT, but rather replace the entire seq2seq model with a BERT model...
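
For example, something like this (a rough, untested sketch of one way to read that suggestion, assuming the Hugging Face transformers package: warm-start both the encoder and the decoder of an encoder-decoder model from BERT checkpoints):

```python
from transformers import BertTokenizer, EncoderDecoderModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
# Both halves are initialized from BERT; cross-attention layers are added to
# the decoder and the whole model is then fine-tuned for generation.
model = EncoderDecoderModel.from_encoder_decoder_pretrained(
    "bert-base-uncased", "bert-base-uncased"
)
model.config.decoder_start_token_id = tokenizer.cls_token_id
model.config.pad_token_id = tokenizer.pad_token_id

# Without fine-tuning on keyphrase data, the generated output is meaningless;
# this only shows that the wiring works end to end.
inputs = tokenizer("deep learning for keyphrase extraction", return_tensors="pt")
generated = model.generate(inputs.input_ids, max_length=8)
print(tokenizer.decode(generated[0], skip_special_tokens=True))
```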

SeekPoint commented 5 years ago

Maybe you can just use a Transformer to do seq2seq.
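
Something like this minimal sketch with torch.nn.Transformer (sizes and vocab are placeholders, and positional encodings are omitted for brevity; a real model needs them):

```python
import torch
import torch.nn as nn

class TransformerS2S(nn.Module):
    def __init__(self, vocab_size=30000, d_model=512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        # Positional encodings omitted for brevity.
        self.transformer = nn.Transformer(d_model=d_model, batch_first=True)
        self.out = nn.Linear(d_model, vocab_size)

    def forward(self, src_ids, tgt_ids):
        # Causal mask stops the decoder from attending to future target tokens.
        tgt_mask = self.transformer.generate_square_subsequent_mask(tgt_ids.size(1))
        h = self.transformer(self.embed(src_ids), self.embed(tgt_ids),
                             tgt_mask=tgt_mask)
        return self.out(h)  # logits over the target vocabulary

model = TransformerS2S()
src = torch.randint(0, 30000, (2, 20))  # toy batch: 2 source sequences
tgt = torch.randint(0, 30000, (2, 8))   # shifted-right target sequences
print(model(src, tgt).shape)            # torch.Size([2, 8, 30000])
```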