Closed sjchasel closed 1 year ago
Does the example code work in your environment? https://github.com/Tony-Y/pytorch_warmup/blob/master/examples/emnist/main.py
Your code does not optimize the model parameters at all, because optimizer.zero_grad() is called after loss.backward(). If you cannot understand this, please read the following tutorial:
https://pytorch.org/tutorials/beginner/basics/optimization_tutorial.html#optimizer
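To illustrate why the ordering matters, here is a minimal plain-Python toy (not PyTorch itself; the class and function names only mirror the PyTorch API) showing that calling zero_grad() after backward() wipes the gradient, so step() applies a zero update:

```python
# Toy stand-ins for a parameter and an optimizer, illustrating why the order
# of zero_grad(), backward(), and step() matters. This is plain Python; the
# names are modeled on PyTorch but nothing here imports torch.

class ToyParam:
    def __init__(self, value):
        self.value = value
        self.grad = 0.0

class ToyOptimizer:
    def __init__(self, params, lr=0.1):
        self.params = params
        self.lr = lr

    def zero_grad(self):
        # Reset accumulated gradients to zero.
        for p in self.params:
            p.grad = 0.0

    def step(self):
        # Gradient-descent update: value <- value - lr * grad.
        for p in self.params:
            p.value -= self.lr * p.grad

def backward(p):
    # Pretend the loss is value**2, so d(loss)/d(value) = 2 * value.
    p.grad += 2 * p.value

# Broken order: zero_grad() after backward() erases the gradient,
# so step() changes nothing and the parameter never moves.
p = ToyParam(1.0)
opt = ToyOptimizer([p])
backward(p)
opt.zero_grad()
opt.step()
print(p.value)  # 1.0 (unchanged)

# Correct order: zero_grad() first, then backward(), then step().
opt.zero_grad()
backward(p)
opt.step()
print(p.value)  # 0.8
```

In real PyTorch the fix is the same: call optimizer.zero_grad() before loss.backward(), and optimizer.step() after it.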
Did you solve this issue?
In every batch, I execute
It doesn't have a warm-up process.