[Open] xiaohy9 opened this issue 3 years ago
Hi, it seems you already got better results in the other issue you posted at transformers.
However, I hypothesise the model may have memorised something about this topic from the training corpus, which is a good indicator that PEGASUS can be further improved.
We do not intend to update this repo, and TF2 PEGASUS is already supported by Huggingface.
Hope this may help.
I had problems using PEGASUS with TensorFlow 2, so I am using the pre-trained model from transformers. The weird thing happened when I tried the pegasus-xsum model for text summarization with the example code and data: I noticed that the output describes a similar but obviously different story from the one in the input. I expected to see some description of the Eiffel Tower, but the output is all about New York's World Trade Center! For details, please see https://github.com/huggingface/transformers/issues/10837.
I posted the same question on the transformers issues page, but I haven't gotten a good answer yet. I assume the PEGASUS authors would have a better idea about this. Please let me know if you have any thoughts or suggestions.
BTW, I really think PEGASUS should support TensorFlow 2; it has become the default version for many projects now. Thank you!