Thank you for sharing your work. I am running on two RTX 3090 GPUs and hit the following error:
Traceback (most recent call last):
File "/openseg.pytorch-pytorch-1.7/main.py", line 524, in
model.train()
File "openseg.pytorch-pytorch-1.7/segmentor/trainer.py", line 416, in train
self.val()
File "openseg.pytorch-pytorch-1.7/segmentor/trainer.py", line 361, in val
loss = self.pixel_loss(outputs[i], targets[i].unsqueeze(0))
File "/opt/conda/lib/python3.6/site-packages/torch/nn/modules/module.py", line 727, in _call_impl
result = self.forward(*input, **kwargs)
File "/data1/Segmentation/2021Q1/openseg.pytorch-pytorch-1.7/lib/loss/loss_helper.py", line 273, in forward
aux_loss = self.ce_loss(aux_out, targets)
File "/opt/conda/lib/python3.6/site-packages/torch/nn/modules/module.py", line 727, in _call_impl
result = self.forward(*input, **kwargs)
File "/data1/Segmentation/2021Q1/openseg.pytorch-pytorch-1.7/lib/loss/loss_helper.py", line 135, in forward
target = self._scale_target(targets[0], (inputs.size(2), inputs.size(3)))
File "/data1/Segmentation/2021Q1/openseg.pytorch-pytorch-1.7/lib/loss/loss_helper.py", line 145, in _scale_target
targets = F.interpolate(targets, size=scaled_size, mode="nearest")
File "/opt/conda/lib/python3.6/site-packages/torch/nn/functional.py", line 3052, in interpolate
'Input is {}D, size is {}'.format(dim, len(size)))
ValueError: size shape must match input shape. Input is 3D, size is 2
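For context, the error happens because `F.interpolate` with a 2-element `size` expects a 4D `(N, C, H, W)` input, while the label tensor reaching `_scale_target` is only 3D `(N, H, W)`. A minimal sketch reproducing the mismatch and one possible workaround (the tensor shapes here are hypothetical, not the repo's actual values, and this is not the project's own fix):

```python
import torch
import torch.nn.functional as F

# Hypothetical 3D label map (N, H, W) -- the shape that reaches F.interpolate.
targets = torch.zeros(1, 128, 128)
scaled_size = (64, 64)

# Reproduce the failure: a 3D input with a 2-element size raises ValueError.
try:
    F.interpolate(targets, size=scaled_size, mode="nearest")
except ValueError as e:
    print("raised:", e)

# Possible workaround: add a channel dim so the input is 4D (N, 1, H, W),
# resize with nearest-neighbor (safe for label maps), then squeeze it back.
resized = (
    F.interpolate(targets.float().unsqueeze(1), size=scaled_size, mode="nearest")
    .squeeze(1)
    .long()
)
print(resized.shape)  # torch.Size([1, 64, 64])
```

Whether the missing dimension should instead be restored earlier (e.g. in how `targets[i].unsqueeze(0)` is built in `trainer.py`) depends on the intended shapes, so this is only a guess at the cause.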