ThilinaRajapakse / pytorch-transformers-classification

Based on the Pytorch-Transformers library by HuggingFace. To be used as a starting point for employing Transformer models in text classification tasks. Contains code to easily train BERT, XLNet, RoBERTa, and XLM models for text classification.
Apache License 2.0

WARMUP_PROPORTION equivalent #14

Closed Magpi007 closed 5 years ago

Magpi007 commented 5 years ago

Hi,

In the BERT_binary_text_classification repo we used a parameter called WARMUP_PROPORTION (set to 0.1). What is the equivalent in this repo?

Thanks.

ThilinaRajapakse commented 5 years ago

You can use either warmup_steps or warmup_ratio. If warmup_steps is given, it will be used; otherwise, warmup_ratio is used.

Note that I just pushed the warmup_ratio commit, so you'll need to update to use it. warmup_steps has been there from the beginning, but it requires you to explicitly set a number of steps.
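The precedence described above can be sketched as follows. This is a minimal illustration, not the repo's actual code; `resolve_warmup_steps` is a hypothetical helper name, assuming an explicit warmup_steps takes precedence and warmup_ratio is otherwise applied to the total number of training steps:

```python
def resolve_warmup_steps(total_steps, warmup_steps=0, warmup_ratio=0.0):
    """Hypothetical sketch: pick a warmup step count from either parameter."""
    if warmup_steps > 0:
        # An explicitly given step count wins.
        return warmup_steps
    # Otherwise derive the count from the ratio, e.g. 0.1 -> 10% of all steps.
    return int(total_steps * warmup_ratio)

print(resolve_warmup_steps(1000, warmup_ratio=0.1))  # -> 100
print(resolve_warmup_steps(1000, warmup_steps=50))   # -> 50
```

With warmup_ratio=0.1 this behaves like the old WARMUP_PROPORTION=0.1 setting.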

Magpi007 commented 5 years ago

I saw that this parameter was added, so I can close this issue.