```
Making model...
Preparing loss function:
  1.000 * L1
/home/.../.../anaconda3/envs/py36_pytorch/lib/python3.6/site-packages/torch/optim/lr_scheduler.py:82: UserWarning: Detected call of `lr_scheduler.step()` before `optimizer.step()`. In PyTorch 1.1.0 and later, you should call them in the opposite order: `optimizer.step()` before `lr_scheduler.step()`. Failure to do this will result in PyTorch skipping the first value of the learning rate schedule. See more details at https://pytorch.org/docs/stable/optim.html#how-to-adjust-learning-rate
  "https://pytorch.org/docs/stable/optim.html#how-to-adjust-learning-rate", UserWarning)
Traceback (most recent call last):
  File "main.py", line 19, in <module>
    t.train()
  File "/home/.../.../MSRN-PyTorch-master/MSRN/Train/trainer.py", line 34, in train
    self.loss.step()
  File "/home/.../.../MSRN-PyTorch-master/MSRN/Train/loss/__init__.py", line 87, in step
    for l in self.get_loss_module():
  File "/home/.../.../MSRN-PyTorch-master/MSRN/Train/loss/__init__.py", line 123, in get_loss_module
    return self.loss_module.module
  File "/home/.../.../anaconda3/envs/py36_pytorch/lib/python3.6/site-packages/torch/nn/modules/module.py", line 591, in __getattr__
    type(self).__name__, name))
AttributeError: 'ModuleList' object has no attribute 'module'
```
My PyTorch version is 1.2.0. Is something wrong with PyTorch?
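For what it's worth, the error itself suggests the cause is in the training code rather than in PyTorch: `.module` is an attribute of `nn.DataParallel` (the wrapped model), not of a plain `nn.ModuleList`, so accessing `self.loss_module.module` only works when the loss container was wrapped for multi-GPU training. A minimal sketch of a defensive accessor (an assumed fix, not the repository's official patch — `get_loss_module` here is a standalone illustration of the method in `loss/__init__.py`):

```python
import torch.nn as nn

# On a single GPU the loss container stays a bare nn.ModuleList,
# which has no `.module` attribute -- hence the AttributeError above.
loss_module = nn.ModuleList([nn.L1Loss()])

def get_loss_module(loss_module):
    """Return the underlying ModuleList whether or not it was
    wrapped in nn.DataParallel for multi-GPU training."""
    if isinstance(loss_module, nn.DataParallel):
        return loss_module.module  # unwrap the DataParallel container
    return loss_module             # already a bare ModuleList

for l in get_loss_module(loss_module):
    print(type(l).__name__)  # -> L1Loss
```

Making `get_loss_module` check `isinstance(self.loss_module, nn.DataParallel)` before unwrapping (or only calling `.module` when `n_GPUs > 1`, mirroring how the attribute was created) would let the same code run in both single- and multi-GPU setups.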