openai / gpt-2-output-dataset

Dataset of GPT-2 outputs for research in detection, biases, and more
MIT License

Why RoBERTa? #21

Open fatemeh-sh264 opened 4 years ago

fatemeh-sh264 commented 4 years ago

Why did you use RoBERTa instead of BERT or ELMo?

jongwook commented 4 years ago

In an unpublished ablation study, we found that RoBERTa fine-tunes into a better detector than BERT or GPT-2 itself. We expect ELECTRA would work as well.
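For reference, the fine-tuning setup described above can be sketched with the Hugging Face `transformers` library. This is a minimal illustration, not the repo's actual training code: it builds a tiny randomly initialised RoBERTa classifier from a config (so it runs without downloading weights) and takes one training step on dummy data; a real detector would instead load pretrained `roberta-large` weights and train on the human-written vs. GPT-2-generated texts in this dataset.

```python
# Hedged sketch: fine-tuning a RoBERTa sequence classifier as a GPT-2 output
# detector. The tiny config and random data below are illustrative stand-ins;
# real fine-tuning starts from pretrained weights and tokenised dataset text.
import torch
from transformers import RobertaConfig, RobertaForSequenceClassification

# Tiny config for illustration; actual fine-tuning uses the pretrained config.
config = RobertaConfig(
    vocab_size=100,
    hidden_size=32,
    num_hidden_layers=2,
    num_attention_heads=2,
    intermediate_size=64,
    max_position_embeddings=64,
    num_labels=2,  # binary: human-written vs. GPT-2-generated
)
model = RobertaForSequenceClassification(config)

# A dummy tokenised batch standing in for (text, label) pairs.
input_ids = torch.randint(0, 100, (4, 16))
labels = torch.tensor([0, 1, 0, 1])  # 0 = human, 1 = machine

# One fine-tuning step: passing labels makes the model return the
# cross-entropy loss alongside the per-class logits.
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-5)
outputs = model(input_ids=input_ids, labels=labels)
outputs.loss.backward()
optimizer.step()

print(tuple(outputs.logits.shape))  # one logit pair per example
```

Swapping in BERT or ELECTRA amounts to changing the config/model classes; the question in this thread is which encoder yields the best detector after this kind of fine-tuning.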