Closed · samru-rai closed this issue 4 years ago
RoBERTa is currently monolingual. Can we make it speak multiple languages by converting XLM-RoBERTa (https://arxiv.org/pdf/1911.02116.pdf) to a long version? There already seem to be pre-trained versions of the model: https://huggingface.co/transformers/multilingual.html#xlm-roberta.
There has been some discussion about creating a multilingual RoBERTa (https://github.com/pytorch/fairseq/issues/952), and that issue appears to have been closed by pointing to https://github.com/pytorch/fairseq/tree/master/examples/xlmr.
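In case it helps, here is a minimal, untested sketch of how the position-embedding extension step from the RoBERTa-to-Longformer conversion recipe might be applied to XLM-RoBERTa with HuggingFace `transformers`. The target length `max_pos = 4096` and the offset of 2 (RoBERTa-style models reserve the first two position indices for the padding offset) are assumptions carried over from the RoBERTa recipe; swapping the self-attention layers for Longformer's sliding-window attention would still be needed on top of this.

```python
import torch
from transformers import XLMRobertaModel, XLMRobertaTokenizer

# Load the pretrained multilingual checkpoint (name from the
# HuggingFace docs linked above).
model = XLMRobertaModel.from_pretrained('xlm-roberta-base')
tokenizer = XLMRobertaTokenizer.from_pretrained('xlm-roberta-base')

max_pos = 4096  # assumed target sequence length, as in the Longformer paper
embeddings = model.embeddings.position_embeddings.weight
current_max_pos, embed_dim = embeddings.shape

# RoBERTa-style models skip the first 2 position indices
# (padding_idx + 1 offset), hence the "+ 2" and "[2:]" below.
new_pos_embed = embeddings.new_empty(max_pos + 2, embed_dim)
k = 2
step = current_max_pos - 2
while k < max_pos + 2:
    n = min(step, max_pos + 2 - k)
    # Initialize the longer position range by tiling the pretrained
    # position embeddings, as the RoBERTa conversion recipe does.
    new_pos_embed[k:k + n] = embeddings[2:2 + n]
    k += n

model.embeddings.position_embeddings.weight.data = new_pos_embed
model.config.max_position_embeddings = max_pos + 2
# Older transformers versions called this attribute `max_len`.
tokenizer.model_max_length = max_pos
```

Tiling the pretrained embeddings (rather than random initialization) is the trick the Longformer authors report working well for RoBERTa, so it seems like a reasonable starting point for XLM-RoBERTa too, though some pretraining on long multilingual documents would presumably still be needed.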
Duplicate of https://github.com/allenai/longformer/issues/21