microsoft / unilm

Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities
https://aka.ms/GeneralAI
MIT License
19.11k stars 2.44k forks

How to pretrain UniLM on an abstractive summarization task? #39

Open JasonVann opened 4 years ago

JasonVann commented 4 years ago

If I want to train UniLM from scratch on another abstractive summarization task (not in English), how do I do it?

I guess the fine-tuning and inference code from the README can be reused, but I'm not sure how to do the pretraining. Could you share the pre-training code for CNN summarization? Thanks!
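For context, the piece of UniLM pretraining that differs most from ordinary BERT-style masked-LM training is the sequence-to-sequence self-attention mask described in the UniLM paper: source tokens attend bidirectionally within the source, while target tokens attend to the whole source plus only the preceding target tokens. A minimal pure-Python sketch of that mask (function name is mine, not from the repo):

```python
def seq2seq_attention_mask(src_len, tgt_len):
    """UniLM-style seq2seq self-attention mask.

    mask[i][j] is True when position i may attend to position j.
    The first src_len positions are source tokens (bidirectional
    among themselves); the remaining tgt_len positions are target
    tokens (attend to all source tokens, causal within the target).
    """
    total = src_len + tgt_len
    mask = [[False] * total for _ in range(total)]
    for i in range(total):
        for j in range(total):
            if j < src_len:
                # every position may attend to the source
                mask[i][j] = True
            elif i >= src_len and j <= i:
                # target positions attend left-to-right within the target
                mask[i][j] = True
    return mask
```

In a real implementation this boolean matrix would be converted to additive attention scores (0 for allowed, a large negative value for disallowed) and fed into the transformer's self-attention layers.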

phucnsp commented 3 years ago

up

donglixp commented 3 years ago

https://github.com/microsoft/unilm/tree/master/s2s-ft