microsoft / unilm

Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities
https://aka.ms/GeneralAI
MIT License

XLM-Roberta text generation in S2S-FT #104

Closed. Neuronys closed this issue 4 years ago.

Neuronys commented 4 years ago

Hi Li Dong,

It's me again ;-) My second question is about XLM-Roberta. I've seen in the source code that you include the XLM-Roberta model. Have you tried, or managed, to generate text with this model? Any suggestions, advice, or clues are welcome, as I want to do question generation and abstractive summarization in French. Thanks in advance, Philippe
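
(For readers landing here: XLM-Roberta is an encoder-only model, so it cannot generate text on its own; s2s-ft exists precisely to fine-tune such encoders for generation with UniLM-style attention masks. A minimal sketch of a generic alternative while waiting for that support, using the plain Hugging Face transformers `EncoderDecoderModel` API rather than the s2s-ft code; the checkpoint names and overall recipe here are illustrative assumptions, and the resulting pair still needs seq2seq fine-tuning before it produces anything useful.)

```python
from transformers import EncoderDecoderModel, XLMRobertaTokenizer

tokenizer = XLMRobertaTokenizer.from_pretrained("xlm-roberta-base")

# Tie an XLM-R encoder to an XLM-R decoder; the decoder copy is loaded with
# cross-attention added so the pair forms a standard encoder-decoder model.
model = EncoderDecoderModel.from_encoder_decoder_pretrained(
    "xlm-roberta-base", "xlm-roberta-base"
)
model.config.decoder_start_token_id = tokenizer.bos_token_id
model.config.pad_token_id = tokenizer.pad_token_id

# Before fine-tuning on a QG/summarization dataset the output is meaningless;
# this only demonstrates that the generation plumbing works end to end.
inputs = tokenizer("Texte source en français.", return_tensors="pt")
generated = model.generate(inputs.input_ids, max_length=50)
print(tokenizer.decode(generated[0], skip_special_tokens=True))
```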

donglixp commented 4 years ago

We are going to support multilingual s2s-ft in the next release. The multilingual MiniLM (https://github.com/microsoft/unilm/tree/master/minilm#multilingual-pretrained-model ) will also be added. @addf400
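
(For reference, a minimal loading sketch for that checkpoint, assuming the `microsoft/Multilingual-MiniLM-L12-H384` name and the class pairing stated on its Hugging Face model card: the XLM-R sentencepiece tokenizer on a BERT-style body.)

```python
from transformers import BertModel, XLMRobertaTokenizer

# Multilingual MiniLM reuses the XLM-R vocabulary but a BERT architecture,
# so the tokenizer and model classes are deliberately mixed here.
tokenizer = XLMRobertaTokenizer.from_pretrained("microsoft/Multilingual-MiniLM-L12-H384")
model = BertModel.from_pretrained("microsoft/Multilingual-MiniLM-L12-H384")

inputs = tokenizer("Bonjour, ceci est un test.", return_tensors="pt")
outputs = model(**inputs)  # encoder-only forward pass; s2s-ft supplies the generation fine-tuning
print(outputs.last_hidden_state.shape)
```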

Neuronys commented 4 years ago

That's great! 😊 I've seen the XNLI example. Do you plan to also add question generation and abstractive summarization examples, to compare with the XNLG approach (https://github.com/CZWin32768/XNLG)? Looking forward to testing it. Cheers, Philippe