ShannonAI / mrc-for-flat-nested-ner

Code for ACL 2020 paper `A Unified MRC Framework for Named Entity Recognition`

When running trainer.py, this UserWarning is always presented before epoch 0 #56

Closed iblahimovic closed 3 years ago

iblahimovic commented 3 years ago

the log looks like this:

```
/home/mist/.local/lib/python3.6/site-packages/pytorch_lightning/utilities/distributed.py:37: UserWarning: The dataloader, val dataloader 0, does not have many workers which may be a bottleneck. Consider increasing the value of the num_workers argument (try 96 which is the number of cpus on this machine) in the DataLoader init to improve performance.
  warnings.warn(*args, **kwargs)
bert/vocab.txt
/home/mist/.local/lib/python3.6/site-packages/pytorch_lightning/utilities/distributed.py:37: UserWarning: The dataloader, train dataloader, does not have many workers which may be a bottleneck. Consider increasing the value of the num_workers argument (try 96 which is the number of cpus on this machine) in the DataLoader init to improve performance.
  warnings.warn(*args, **kwargs)
bert/vocab.txt
Epoch 0:   0%|          | 1/2239 [00:00<17:25, 2.14it/s, loss=0.795, v_num=1]
/usr/local/lib/python3.6/dist-packages/torch/optim/lr_scheduler.py:123: UserWarning: Detected call of `lr_scheduler.step()` before `optimizer.step()`. In PyTorch 1.1.0 and later, you should call them in the opposite order: `optimizer.step()` before `lr_scheduler.step()`. Failure to do this will result in PyTorch skipping the first value of the learning rate schedule. See more details at https://pytorch.org/docs/stable/optim.html#how-to-adjust-learning-rate
  "https://pytorch.org/docs/stable/optim.html#how-to-adjust-learning-rate", UserWarning)
Epoch 0: 100%|████████▉| 2231/2239 [19:17<00:04, 1.93it/s, loss=0.025, v_num=1]
```

Looking forward to your reply. Thanks in advance.
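(For the `num_workers` warning, the message itself points at the fix: pass a larger `num_workers` when constructing the `DataLoader`. A minimal sketch, assuming a placeholder dataset and batch size rather than the repo's actual ones:)

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Placeholder dataset purely to illustrate the DataLoader arguments;
# the real project builds its own MRC-style NER dataset.
dataset = TensorDataset(torch.zeros(100, 8))

loader = DataLoader(
    dataset,
    batch_size=16,
    shuffle=True,
    num_workers=8,    # > 0 silences the "does not have many workers" warning;
                      # the log suggests up to 96 on that machine, but a handful
                      # of workers is usually enough
    pin_memory=True,  # optional, can speed up host-to-GPU transfer
)
```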

YuxianMeng commented 3 years ago

I also noticed this warning, but the learning rate in TensorBoard looks as expected, so I just ignore it. If you are still concerned, please open an issue in the pytorch-lightning repo.
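(For reference, the `lr_scheduler` warning is only about call order: since PyTorch 1.1.0, `optimizer.step()` should run before `lr_scheduler.step()`, otherwise the first value of the learning-rate schedule is skipped. PyTorch Lightning issues these calls internally, so the sketch below is plain PyTorch and only illustrates the ordering the warning checks, not this repo's code:)

```python
import torch

model = torch.nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=1, gamma=0.9)

for epoch in range(3):
    optimizer.zero_grad()
    loss = model(torch.randn(4, 10)).sum()
    loss.backward()
    optimizer.step()   # must come first since PyTorch 1.1.0 ...
    scheduler.step()   # ... then advance the learning-rate schedule
```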