-
- link: https://arxiv.org/pdf/2002.10957.pdf
- code: https://github.com/microsoft/unilm/tree/master/minilm
-
This looks like it's doing abstractive summarization, but it occasionally produces purely extractive summaries.
Can you confirm, and explain the exact methodology used in the summarization module…
-
Hi - I was able to generate an extractive summary by referring to this [link](https://medium.com/thecyphy/generating-abstractive-summaries-using-googles-pegasus-model-18eef8ae985b). But I am stuck on how to…
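
For anyone landing here later, a minimal sketch of the abstractive flow the linked post describes, assuming the `google/pegasus-xsum` checkpoint and the Hugging Face `transformers` API (both are my assumptions, not part of the original question):

```python
from transformers import PegasusForConditionalGeneration, PegasusTokenizer

# Checkpoint name is an assumption; the linked post uses a Pegasus model
# from the Hugging Face hub.
model_name = "google/pegasus-xsum"
tokenizer = PegasusTokenizer.from_pretrained(model_name)
model = PegasusForConditionalGeneration.from_pretrained(model_name)

document = "Your long input article goes here."  # placeholder document
inputs = tokenizer(document, truncation=True, return_tensors="pt")
summary_ids = model.generate(**inputs, num_beams=4, max_length=60)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```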
-
## What is this paper about? 👋
- BART uses a sequence-to-sequence architecture with an encoder (a generalized BERT) and a decoder (GPT).
- Among the several input-noising schemes tested, Text Infilling performed best (see the toy sketch below).
- Text Generation, QA, MT …
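
A toy sketch of the text-infilling noise (my own simplification, not the authors' code): contiguous spans of tokens are replaced by a single `<mask>` token, with span lengths drawn from Poisson(λ=3) as in the paper; zero-length spans (pure mask insertions) are omitted for brevity.

```python
import numpy as np

def text_infilling(tokens, mask_prob=0.3, poisson_lam=3.0, mask_token="<mask>"):
    """Replace contiguous spans of tokens with a SINGLE mask token, with
    span lengths drawn from Poisson(lambda=3) as in the BART paper.
    Zero-length spans (pure mask insertions) are simplified away here."""
    rng = np.random.default_rng(0)
    out, i = [], 0
    while i < len(tokens):
        if rng.random() < mask_prob:
            span = max(1, int(rng.poisson(poisson_lam)))  # simplification: span >= 1
            out.append(mask_token)  # one mask stands in for the whole span
            i += span
        else:
            out.append(tokens[i])
            i += 1
    return out

print(text_infilling("the quick brown fox jumps over the lazy dog".split()))
```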
-
Hi Li Dong,
It's me again ;-)
My second question is related to XLM-Roberta.
I've seen in the source code that you include the XLM-Roberta model.
Have you tried / managed to generate text using this …
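
(Not an answer, just context for other readers: a minimal sketch of what XLM-Roberta does natively, using the Hugging Face `fill-mask` pipeline; the checkpoint name is an assumption.)

```python
from transformers import pipeline

# XLM-Roberta is pretrained with a masked-LM objective only, so out of
# the box it fills in masks rather than generating free text; seq2seq
# fine-tuning is what would turn it into a generator.
fill = pipeline("fill-mask", model="xlm-roberta-base")
for pred in fill("The capital of France is <mask>."):
    print(pred["token_str"], round(pred["score"], 3))
```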
-
And if it's extractive, is it only selecting sentences, or can it form summaries at the word/token level?
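
To make the distinction concrete, here is a toy sentence-level extractive baseline (purely illustrative, nothing to do with this repo's method):

```python
from collections import Counter

def extract_top_sentences(text, k=2):
    """Toy word-frequency extractive baseline: score each sentence by the
    document-level frequency of its words, then return the top-k sentences
    in their original order. Purely sentence-level; a token-level
    extractive method would select individual words/phrases instead."""
    sentences = [s.strip() for s in text.split(".") if s.strip()]
    freqs = Counter(text.lower().split())
    ranked = sorted(range(len(sentences)),
                    key=lambda i: -sum(freqs[w] for w in sentences[i].lower().split()))
    return ". ".join(sentences[i] for i in sorted(ranked[:k])) + "."

print(extract_top_sentences(
    "The model reads the document. It scores every sentence. "
    "Low-scoring sentences are dropped. The rest form the summary."))
```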
-
Hi there
I'm trying to run the decoding for abstractive summarization (CNN/DM) in CPU mode after referring to [#23](https://github.com/microsoft/unilm/issues/23#issuecomment-549788510).
I don't hav…
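
While waiting for an answer: the generic PyTorch pattern for CPU-only inference, in case it helps (the file name and script below are placeholders, not the repo's actual interface):

```python
import torch

device = torch.device("cpu")

# map_location is the key detail: it remaps CUDA tensors stored in the
# checkpoint onto the CPU, so loading works on a machine without a GPU.
state_dict = torch.load("checkpoint.bin", map_location=device)  # placeholder path

# Then the usual: model.load_state_dict(state_dict); model.to(device); model.eval()
# Alternatively, hide all GPUs from PyTorch before launching the decode script:
#   CUDA_VISIBLE_DEVICES= python <decode script> ...
```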
-
Next Friday @ibeltagy will be with us presenting his work on:
## Longformer: The Long-Document Transformer
Iz Beltagy, Matthew E. Peters, Arman Cohan
paper: https://arxiv.org/abs/2004.05150
co…
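
For anyone who wants to poke at the model before the talk, a minimal sketch of its Hugging Face interface (the checkpoint name is an assumption); the distinguishing input is a `global_attention_mask` layered on top of sliding-window local attention:

```python
import torch
from transformers import LongformerModel, LongformerTokenizer

tok = LongformerTokenizer.from_pretrained("allenai/longformer-base-4096")
model = LongformerModel.from_pretrained("allenai/longformer-base-4096")

inputs = tok("A very long document ...", return_tensors="pt")
global_mask = torch.zeros_like(inputs["input_ids"])
global_mask[:, 0] = 1  # give the <s> token global attention
out = model(**inputs, global_attention_mask=global_mask)
print(out.last_hidden_state.shape)
```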
-
Hi,
What results did you obtain with your abstractive summarization code for CNN/DM? Also, is it implemented based on this paper: https://arxiv.org/abs/1908.08345?
It will be great if you provid…
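
For context, reported CNN/DM numbers are usually ROUGE-1/2/L F1. A minimal sketch of how such scores are computed (the `rouge_score` package is my assumption about tooling, not this repo's evaluation code):

```python
from rouge_score import rouge_scorer

scorer = rouge_scorer.RougeScorer(["rouge1", "rouge2", "rougeL"], use_stemmer=True)
scores = scorer.score(
    target="police killed the gunman",           # reference summary
    prediction="the gunman was shot by police",  # system output
)
for name, s in scores.items():
    print(name, f"F1={s.fmeasure:.4f}")
```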
-
# 🐛 Bug
## Information
Model I am using (Bert, XLNet ...): default model from pipeline("summarization")
Language I am using the model on (English, Chinese ...): English
I am using the pipe…
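
A minimal sketch of the setup described above; note that which checkpoint `pipeline("summarization")` resolves to depends on the `transformers` version installed:

```python
from transformers import pipeline

summarizer = pipeline("summarization")  # default summarization model
text = "..."  # the failing English input from the report would go here
print(summarizer(text, max_length=60, min_length=10))
```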