aaronmueller / dont-stop-pretraining

Adapting the Don't Stop Pretraining approach for multilingual applications. Modified by Aaron Mueller and Nathaniel Weir.

multilinguality #1

Open aaronmueller opened 4 years ago

aaronmueller commented 4 years ago

The original paper works with RoBERTa. We want to run multilingual experiments, so let's replace it with whichever multilingual model is easiest to work with.

nweir127 commented 4 years ago

I'm indifferent as to which. Experimental results with one almost certainly generalize to the other; go with whichever is less of a headache to work with.
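One low-friction option: XLM-R shares RoBERTa's architecture, so in HuggingFace `transformers` the swap can amount to changing the checkpoint name passed to `from_pretrained`. A minimal sketch of that idea — the `multilingual_equivalent` helper and its mapping are assumptions for illustration, not part of this codebase (the checkpoint names themselves are real hub identifiers):

```python
# Hypothetical helper mapping a RoBERTa checkpoint to a multilingual
# drop-in of comparable size. The mapping/function are illustrative
# assumptions; "xlm-roberta-*" are real HuggingFace hub model names.
MULTILINGUAL_SWAPS = {
    "roberta-base": "xlm-roberta-base",
    "roberta-large": "xlm-roberta-large",
}

def multilingual_equivalent(model_name: str) -> str:
    """Return a multilingual checkpoint comparable to the given RoBERTa one."""
    try:
        return MULTILINGUAL_SWAPS[model_name]
    except KeyError:
        raise ValueError(f"no multilingual equivalent known for {model_name!r}")

# With transformers installed, the swap would then look like:
#   from transformers import AutoModel, AutoTokenizer
#   name = multilingual_equivalent("roberta-base")
#   model = AutoModel.from_pretrained(name)
#   tokenizer = AutoTokenizer.from_pretrained(name)
```

Since both models expose the same interface through the `Auto*` classes, the rest of the training code should need little or no change.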