microsoft / unilm

Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities
https://aka.ms/GeneralAI

Cross-lingual MiniLM example for Question Generation & Abstractive Summarization? #108

Closed: Neuronys closed this issue 4 years ago

Neuronys commented 4 years ago

Hi @donglixp and @WenhuiWang0824,

Thanks a lot for releasing cross-lingual MiniLM. As a French speaker, I'm highly interested in such cross-lingual models. You gave us XNLI & MLQA examples; could you please also provide Question Generation and Abstractive Summarization examples? I would like to compare with the XNLG approach: https://github.com/CZWin32768/XNLG

Thanks in advance,
Philippe

wenhui0924 commented 4 years ago

Hi Philippe, thanks for your suggestion. We just finished the experiments on cross-lingual NLU tasks. It may take several days to conduct experiments on cross-lingual NLG tasks. We will keep you updated.

Thanks, Wenhui

Neuronys commented 4 years ago

Hi @WenhuiWang0824, have you made progress on the cross-lingual NLG tasks? Thanks

wenhui0924 commented 4 years ago

Hi @Neuronys

Sorry for the delay; we are still running the experiments and will keep you updated. In the meantime, you could also run cross-lingual NLG experiments yourself by modifying the tokenization file in the s2s-ft package so that it works with multilingual MiniLM.
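For context, a minimal sketch of what that tokenizer swap might look like, assuming multilingual MiniLM shares the XLM-R SentencePiece vocabulary and that the fine-tuning script selects its tokenizer in one place. The `build_tokenizer` helper and the `model_type` string are hypothetical illustrations, not the actual s2s-ft code.

```python
# Rough, hypothetical sketch -- not the official s2s-ft implementation.
# Idea: for multilingual MiniLM, replace the BERT-style WordPiece tokenizer
# used by the English checkpoints with the XLM-R SentencePiece tokenizer,
# since multilingual MiniLM shares XLM-R's vocabulary.

from transformers import BertTokenizer, XLMRobertaTokenizer


def build_tokenizer(model_type: str, model_name_or_path: str):
    """Pick a tokenizer matching the backbone used for seq2seq fine-tuning.

    This helper and the `model_type` values are assumptions for illustration;
    the real s2s-ft package wires its tokenizers differently.
    """
    if model_type == "multilingual-minilm":
        # SentencePiece vocabulary shared with XLM-R.
        return XLMRobertaTokenizer.from_pretrained(model_name_or_path)
    # English MiniLM/UniLM checkpoints use a BERT-style WordPiece vocabulary.
    return BertTokenizer.from_pretrained(model_name_or_path)


if __name__ == "__main__":
    tok = build_tokenizer("multilingual-minilm", "xlm-roberta-base")
    print(tok.tokenize("Générer des questions à partir d'un texte."))
```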

Thanks

Neuronys commented 4 years ago

Hi @WenhuiWang0824, have you made progress on your multilingual NLG experiments? I'm very curious to see how it works and how it performs ;-)

Cheers,
Philippe