Open zhaoyuanyuan2011 opened 2 weeks ago
Hi, have you fixed the error yet? Can you check the size of x_warp, x_cond?
Hi, thank you for the reply!
Yes, I printed out the shapes of x_warp and x_cond, but they look OK:
x_warp.shape: torch.Size([1, 256, 8, 6])
x_cond.shape: torch.Size([1, 256, 8, 6])
x_warp.shape: torch.Size([1, 256, 16, 12])
x_cond.shape: torch.Size([1, 256, 16, 12])
x_warp.shape: torch.Size([1, 256, 32, 24])
x_cond.shape: torch.Size([1, 256, 32, 24])
x_warp.shape: torch.Size([1, 256, 64, 48])
x_cond.shape: torch.Size([1, 256, 64, 48])
x_warp.shape: torch.Size([1, 256, 128, 96])
x_cond.shape: torch.Size([1, 256, 128, 96])
0%| | 0/1 [00:00<?, ?it/s]
Traceback (most recent call last):
File "/home/DM-VTON/test.py", line 155, in <module>
main(opt)
File "/home/DM-VTON/test.py", line 142, in main
run_test_pf(
File "/home/DM-VTON/test.py", line 50, in run_test_pf
p_tryon, warped_cloth = pipeline(real_image, clothes, edge, phase="test")
File "/home/anaconda3/envs/dm-vton/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1553, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
File "/home/anaconda3/envs/dm-vton/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1562, in _call_impl
return forward_call(*args, **kwargs)
File "/home/DM-VTON/pipelines/dmvton_pipeline.py", line 41, in forward
flow_out = self.warp_model(person, clothes, phase=phase)
File "/home/anaconda3/envs/dm-vton/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1553, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
File "/home/anaconda3/envs/dm-vton/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1562, in _call_impl
return forward_call(*args, **kwargs)
File "/home/DM-VTON/models/warp_modules/mobile_afwm.py", line 331, in forward
x_warp, last_flow = self.aflow_net(
File "/home/anaconda3/envs/dm-vton/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1553, in _wrapped_call_impl
return self._call_impl(*args, **kwargs)
File "/home/anaconda3/envs/dm-vton/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1562, in _call_impl
return forward_call(*args, **kwargs)
File "/home/DM-VTON/models/warp_modules/mobile_afwm.py", line 258, in forward
concat = torch.cat([x_warp, x_cond], 1)
RuntimeError: Sizes of tensors must match except in dimension 1. Expected size 31 but got size 28 for tensor number 1 in the list.
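The error means the two feature maps passed to torch.cat differ in a spatial dimension (height 31 vs 28), even though the printed shapes above match; the mismatch likely appears at a pyramid level that isn't printed. Below is a minimal sketch (shapes are hypothetical, chosen only to reproduce the error) showing why the concat fails and one way to make the sizes agree before concatenating:

```python
import torch
import torch.nn.functional as F

# Hypothetical feature maps whose heights disagree (31 vs 28), as in the traceback.
x_warp = torch.randn(1, 256, 31, 24)
x_cond = torch.randn(1, 256, 28, 24)

try:
    torch.cat([x_warp, x_cond], 1)
except RuntimeError as e:
    # Sizes of tensors must match except in dimension 1 ...
    print(e)

# One workaround: resize x_cond to x_warp's spatial size before concatenating.
x_cond_resized = F.interpolate(
    x_cond, size=x_warp.shape[2:], mode="bilinear", align_corners=False
)
concat = torch.cat([x_warp, x_cond_resized], 1)
print(concat.shape)  # torch.Size([1, 512, 31, 24])
```

This only papers over the symptom, though; the cleaner fix is usually to make sure the input images and masks are at the resolution the model expects, so every pyramid level downsamples to matching sizes.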
I'm trying to run inference using my own input images and masks, here's the command:
python test.py --batch_size 1 --dataroot . --pf_warp_checkpoint checkpoints/dmvton_pf_warp.pt --pf_gen_checkpoint checkpoints/dmvton_pf_gen.pt
And the error message is the traceback shown above.
I have used nn.functional.interpolate to resize the mask so it has the same h and w as the cloth image.
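For reference, here is a minimal sketch of that resizing step (the 256x192 resolution is an assumption, inferred from the coarsest 8x6 feature map doubling up to 128x96 in the log; the smaller mask size is made up for illustration):

```python
import torch
import torch.nn.functional as F

# Hypothetical cloth image at the assumed network resolution, and a mask
# loaded at a slightly different size.
cloth = torch.randn(1, 3, 256, 192)
mask = torch.rand(1, 1, 250, 190)

# Resize the mask so its H and W match the cloth image; nearest keeps the
# mask binary instead of introducing interpolated values.
mask = F.interpolate(mask, size=cloth.shape[2:], mode="nearest")
assert mask.shape[2:] == cloth.shape[2:]
```

Note that if the person image, cloth image, and mask are not all at the resolution the checkpoints were trained on, the warp module's feature pyramid can still end up with mismatched sizes even when the mask matches the cloth.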