Based on the Pytorch-Transformers library by HuggingFace. To be used as a starting point for employing Transformer models in text classification tasks. Contains code to easily train BERT, XLNet, RoBERTa, and XLM models for text classification.
Apache License 2.0
UserWarning: Detected call of `lr_scheduler.step()` before `optimizer.step()`. #42

Hi, I am getting this warning:

/usr/local/lib/python3.6/dist-packages/torch/optim/lr_scheduler.py:122: UserWarning: Detected call of `lr_scheduler.step()` before `optimizer.step()`. In PyTorch 1.1.0 and later, you should call them in the opposite order: `optimizer.step()` before `lr_scheduler.step()`. Failure to do this will result in PyTorch skipping the first value of the learning rate schedule. See more details at https://pytorch.org/docs/stable/optim.html#how-to-adjust-learning-rate

I do not know if this is related to your code. Just wanted to let you know.

Thanks for the heads up! It doesn't appear to be related to the code in this library (and doesn't seem to be doing anything naughty). Do let me know if it causes any issues though.
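For reference, a minimal sketch of the call order PyTorch 1.1+ expects (model, optimizer, and scheduler here are placeholders, not code from this library): `optimizer.step()` must be called before `lr_scheduler.step()`, otherwise the first value of the learning rate schedule is skipped.

```python
import torch

# Toy setup purely to illustrate the ordering; any model/optimizer applies.
model = torch.nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=1, gamma=0.5)

for epoch in range(3):
    optimizer.zero_grad()
    loss = model(torch.randn(8, 4)).sum()
    loss.backward()
    optimizer.step()   # update the weights first...
    scheduler.step()   # ...then advance the LR schedule
```

Calling `scheduler.step()` first (the pattern the warning flags) only shifts the schedule by one step, so in most training runs the effect is negligible, which matches the reply above.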