ThilinaRajapakse / pytorch-transformers-classification

Based on the Pytorch-Transformers library by HuggingFace. To be used as a starting point for employing Transformer models in text classification tasks. Contains code to easily train BERT, XLNet, RoBERTa, and XLM models for text classification.
Apache License 2.0

Eval results outputs #31

Open Azamat25 opened 5 years ago

Azamat25 commented 5 years ago

I am trying to reproduce the code with a smaller subset: 100K train and 5K dev examples. I am getting this result:

INFO:__main__: Eval results outputs
INFO:__main__:   fn = 0
INFO:__main__:   fp = 2529
INFO:__main__:   mcc = 0.0
INFO:__main__:   tn = 0
INFO:__main__:   tp = 2471

How should I interpret this? What could have gone wrong? Thank you.
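For context, fn = 0 and tn = 0 together mean the model predicted the positive class for every evaluation example, and MCC is reported as 0 in that degenerate case. Below is a minimal sketch of how these counts relate, assuming the metrics come from sklearn.metrics as in the pytorch-transformers example scripts; the reconstructed label counts are taken from the logged numbers, not from the actual dataset:

```python
import numpy as np
from sklearn.metrics import confusion_matrix, matthews_corrcoef

# Hypothetical reconstruction: 2471 positive and 2529 negative labels,
# with the model predicting class 1 for every example.
labels = np.array([1] * 2471 + [0] * 2529)
preds = np.ones_like(labels)

tn, fp, fn, tp = confusion_matrix(labels, preds).ravel()
print(tn, fp, fn, tp)                    # 0 2529 0 2471 -- matches the logged counts
print(matthews_corrcoef(labels, preds))  # 0.0: MCC is set to 0 when an entire
                                         # row or column of the matrix is zero
```

A classifier collapsing to a single class like this often points to training not having taken effect (e.g. too few steps, a bad learning rate, or a label-mapping problem), though the thread does not confirm the cause here.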

azamatolegen commented 5 years ago

By the way, I am using the XLM model. The model and the data size are the only differences from your code.

ThilinaRajapakse commented 5 years ago

Hard to say. 100k samples should be enough to train the model. Can you try the Simple Transformers library linked in the readme, as this repo is out of date?
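For anyone finding this issue later, here is a minimal sketch of the suggested Simple Transformers workflow for an XLM classifier. The ClassificationModel arguments, the xlm-mlm-en-2048 checkpoint, and the tiny DataFrames are assumptions based on the library's documented API, not something stated in this thread:

```python
import pandas as pd
from simpletransformers.classification import ClassificationModel

# Assumed data layout: DataFrames with "text" and "labels" columns.
train_df = pd.DataFrame(
    [["example positive text", 1], ["example negative text", 0]],
    columns=["text", "labels"],
)
eval_df = pd.DataFrame(
    [["another positive text", 1], ["another negative text", 0]],
    columns=["text", "labels"],
)

# "xlm" model type with a public XLM checkpoint; set use_cuda=False to run on CPU.
model = ClassificationModel(
    "xlm", "xlm-mlm-en-2048",
    args={"num_train_epochs": 1, "overwrite_output_dir": True},
    use_cuda=False,
)

model.train_model(train_df)
result, model_outputs, wrong_predictions = model.eval_model(eval_df)
print(result)  # for binary classification this includes mcc, tp, tn, fp, fn
```

This mirrors the evaluation output of the original repo (mcc plus confusion-matrix counts), so results should be directly comparable.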