openai / finetune-transformer-lm

Code and model for the paper "Improving Language Understanding by Generative Pre-Training"
https://s3-us-west-2.amazonaws.com/openai-assets/research-covers/language-unsupervised/language_understanding_paper.pdf
MIT License

Did you ever try your model on the ROCStories training dataset? #28

Open Brandonnogithub opened 5 years ago

Brandonnogithub commented 5 years ago

I used the ROCStories training data to train this model (I generated the wrong endings at random) and used the test data for evaluation. The result is only about 60% accuracy, while a common embedding model can reach 65%+. I'm not sure whether I'm using this model the right way. Have you ever tried this?
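For reference, here is a minimal sketch of the "wrong ending by random" construction described above: each training story's true fifth sentence is paired with an ending sampled from a different story. The `sentence1`..`sentence5` column names and the `(context, ending1, ending2, label)` tuple format are assumptions for illustration, not the repo's exact loader interface.

```python
import csv
import random

def build_cloze_examples(train_csv_path, seed=0):
    """Build Story Cloze-style training examples from the ROCStories training
    set by pairing each story's true ending with a randomly sampled ending
    from another story (the 'wrong ending by random' strategy above)."""
    rng = random.Random(seed)
    with open(train_csv_path, newline='', encoding='utf-8') as f:
        rows = list(csv.DictReader(f))

    # Assumed column names: sentence1..sentence5, as in the ROCStories release.
    contexts = [' '.join(r['sentence%d' % i] for i in range(1, 5)) for r in rows]
    endings = [r['sentence5'] for r in rows]

    examples = []
    for idx, (ctx, right) in enumerate(zip(contexts, endings)):
        # Sample a wrong ending from any story other than the current one.
        j = rng.randrange(len(endings) - 1)
        if j >= idx:
            j += 1
        wrong = endings[j]
        # Randomize which slot holds the true ending, matching the cloze test format.
        if rng.random() < 0.5:
            examples.append((ctx, right, wrong, 0))  # label 0: first ending is correct
        else:
            examples.append((ctx, wrong, right, 1))  # label 1: second ending is correct
    return examples
```

Randomly sampled endings are usually easy negatives, which may partly explain a lower test score than models trained on the actual Story Cloze validation pairs; this is an assumption about the setup described above, not a confirmed diagnosis.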