-
Hello,
Thank you for this fantastic code release! I'm currently running abstractive summarization on inputs longer than 512 tokens, and I changed the max_pos argument in train.py accordingly.
But I notic…
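For reference, a common way to handle max_pos > 512 with a BERT encoder is to enlarge the learned position-embedding table, copying the pre-trained rows and repeating the last one for the new positions. A minimal sketch of that idea, with module and function names that are illustrative rather than the repo's exact code:

```python
import torch.nn as nn

def extend_position_embeddings(bert, max_pos, pretrained_len=512):
    """Grow BERT's position-embedding table from pretrained_len to max_pos.

    The first pretrained_len rows keep the pre-trained weights; the extra
    rows are initialized from the last pre-trained row so longer inputs
    still get a reasonable (if untrained) position signal.
    """
    old = bert.embeddings.position_embeddings            # nn.Embedding(pretrained_len, hidden)
    hidden = old.weight.size(1)
    new = nn.Embedding(max_pos, hidden)
    new.weight.data[:pretrained_len] = old.weight.data
    new.weight.data[pretrained_len:] = old.weight.data[-1:].repeat(max_pos - pretrained_len, 1)
    bert.embeddings.position_embeddings = new
    # Depending on the BERT implementation, a registered position_ids
    # buffer may also need to be re-created with the new length.
    return bert
```

The copied rows are only a starting point; they would need to be fine-tuned on the longer inputs.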
-
# URL
- https://arxiv.org/abs/2009.13312
# Affiliations
- Zheng Zhao, N/A
- Shay B. Cohen, N/A
- Bonnie Webber, N/A
# Abstract
- It is well-known that abstractive summaries are subject to hallu…
-
I ran into an issue following the README steps:
[0:00:07.825803][Epoch: 0][Iter: 58][Loss: 8.262327][lr: 0.000500]: 100%|████████████████████████████████████████████████████████…
-
Whenever I use the pre-trained CNN/DM BertExtAbs model, i.e. bertsumextabs_cnndm_final_model.zip (1.8G), for abstractive summarization, I get redundancy in my summary, that is, the below sente…
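A common mitigation for this kind of repetition in this line of work (following Paulus et al., 2018) is trigram blocking during beam search: a hypothesis is never extended with a token that would repeat a trigram it already contains. Below is a minimal, generic sketch of that check, not the repo's own decoder code:

```python
def would_repeat_trigram(prefix_tokens, candidate_token):
    """True if appending candidate_token creates a trigram that already
    occurs earlier in the hypothesis (the trigram-blocking test)."""
    seq = list(prefix_tokens) + [candidate_token]
    if len(seq) < 4:  # need at least two trigrams before a repeat is possible
        return False
    new_trigram = tuple(seq[-3:])
    earlier = {tuple(seq[i:i + 3]) for i in range(len(seq) - 3)}
    return new_trigram in earlier

# In beam search, extensions for which this returns True can simply be
# skipped or assigned a score of -inf.
```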
-
Socher ... reported on some recent results from his team, which I find promising in this respect. I have not yet studied these models thoroughly, so instead I recommend Socher's page to the read…
-
# title
A Reinforced Topic-Aware Convolutional Sequence-to-Sequence Model for Abstractive Text Summarization
# notes
Abstractive short-text summarization, reaching state of the art on DUC, Gigaword, and LCSTS. The method builds on ConvS2S and, during attention, attends not only to the source text's…
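A rough PyTorch sketch of the joint-attention idea in these notes, attending to both the source word states and the topic-word states with the same decoder query; all tensor names here are made up for illustration, not the authors' code:

```python
import torch
import torch.nn.functional as F

def joint_attention(dec_state, word_states, topic_states):
    """Attend over source word states and topic-word states with one query.

    dec_state:    (batch, hidden)            decoder query
    word_states:  (batch, src_len, hidden)   encoder outputs for source words
    topic_states: (batch, n_topics, hidden)  embeddings of topic words
    """
    word_scores = torch.bmm(word_states, dec_state.unsqueeze(2)).squeeze(2)
    topic_scores = torch.bmm(topic_states, dec_state.unsqueeze(2)).squeeze(2)
    word_ctx = torch.bmm(F.softmax(word_scores, dim=1).unsqueeze(1), word_states).squeeze(1)
    topic_ctx = torch.bmm(F.softmax(topic_scores, dim=1).unsqueeze(1), topic_states).squeeze(1)
    # The two contexts are concatenated and fed to the output layer.
    return torch.cat([word_ctx, topic_ctx], dim=1)
```

The paper additionally biases the generation distribution toward topic words, which this sketch omits.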
-
Foundational source for abstractive generation: https://arxiv.org/abs/1704.04368
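That link is See et al.'s pointer-generator network. Its key idea is a final distribution that mixes generating from the vocabulary with copying from the source: P(w) = p_gen · P_vocab(w) + (1 − p_gen) · Σ attention over source positions holding w. A minimal sketch of that mixture (tensor names are assumptions, and the paper's extended vocabulary for source OOV words is omitted):

```python
import torch

def pointer_generator_dist(p_gen, vocab_dist, attn_dist, src_ids):
    """Mix the generation distribution with a copy distribution.

    p_gen:      (batch, 1)          probability of generating from the vocabulary
    vocab_dist: (batch, vocab_size) softmax over the fixed vocabulary
    attn_dist:  (batch, src_len)    attention weights over source tokens
    src_ids:    (batch, src_len)    vocabulary ids of the source tokens
    """
    gen_part = p_gen * vocab_dist
    copy_part = torch.zeros_like(vocab_dist)
    # Scatter the remaining probability mass onto the ids of the source tokens.
    copy_part.scatter_add_(1, src_ids, (1.0 - p_gen) * attn_dist)
    return gen_part + copy_part
```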
-
https://dl.acm.org/citation.cfm?id=974305.974329
-
I read this paper, and after some effort I finally found the source code you wrote; it was a real lifesaver for me. I was wondering, though, whether you also have the code for the comparisons against other methods?
-
How difficult would it be to turn your code into a word-level model? That is, to select the best words to include in the summary instead of selecting whole sentences. This task would be extremely useful …
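Not speaking for the authors, but one common way to move from sentence-level to word-level selection is to score every token with a small classification head on top of the encoder and keep the top-k tokens. A minimal sketch with hypothetical names:

```python
import torch
import torch.nn as nn

class WordExtractor(nn.Module):
    """Score every input token and keep the highest-scoring ones as the summary.

    `encoder` is any model that maps token ids to per-token hidden states,
    e.g. a BERT encoder; the scoring head is new and would need training.
    """
    def __init__(self, encoder, hidden_size):
        super().__init__()
        self.encoder = encoder
        self.scorer = nn.Linear(hidden_size, 1)

    def forward(self, input_ids, attention_mask):
        hidden = self.encoder(input_ids, attention_mask=attention_mask)[0]  # (B, L, H)
        scores = self.scorer(hidden).squeeze(-1)                            # (B, L)
        return scores.masked_fill(attention_mask == 0, float("-inf"))

# At inference time, torch.topk(scores, k) gives the k word positions to keep.
# Training would require word-level "in summary" labels, which is the hard
# part compared with the sentence-level setup.
```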