tharindu-mathew opened 5 years ago
Is the height of your test image (after resizing with load_size) smaller than crop_size?
[updated] I did not debug the code, so I'm not sure. Based on the config options, it shouldn't be a problem?
load_size = 256, crop_size = 256. Both my train and test images are larger than that, and aligned, as shown below.

```
train$ file 0001.png
0001.png: PNG image data, 1280 x 360, 8-bit/color RGB, non-interlaced
test$ file 0300.png
0300.png: PNG image data, 1280 x 360, 8-bit/color RGB, non-interlaced
```
Here is the issue: if you resize a 1280x360 image to width 256 while keeping the aspect ratio, it becomes a 256x72 image. Applying a 256x256 crop to it can then cause the error. You may want to use `--load_size 1280 --crop_size 256`. You can also use other `--preprocess` options.
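For reference, a minimal sketch of the scale-then-crop arithmetic (my own illustration, not the repository's exact code; the resize step is assumed to mirror the `scale_width`-style behavior described above):

```python
from PIL import Image

# Assumed scale_width-style resize: width becomes load_size, aspect ratio is kept.
def scale_width(img, load_size=256):
    w, h = img.size
    new_h = int(load_size * h / w)  # 1280x360 -> 256 * 360 / 1280 = 72
    return img.resize((load_size, new_h), Image.BICUBIC)

img = Image.new("RGB", (1280, 360))  # stand-in for the real photo
scaled = scale_width(img)
print(scaled.size)                   # (256, 72)

# A 256x256 crop cannot fit inside a 256x72 image:
crop_size = 256
print(scaled.size[1] >= crop_size)   # False -> downstream shape error
```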
Since this is an aligned dataset, each 1280x360 file is actually two side-by-side images of 640x360 each. From what I see during training, the loader splits the image, then scales and crops.
At test time, if the same thing happens, each half should come out as 256x144 after scaling. Based on your explanation, this should throw an exception during training as well, but training completes without an issue. Does my reasoning make sense?
On a side note, I do understand that scaling and then randomly cropping seems odd for inference. If the image were scaled and then cropped deterministically, starting from (0, 0), that would make more sense. The arithmetic I'm assuming is sketched below.
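To make the numbers concrete, here is the calculation behind the 256x144 figure above (a hand calculation under my assumptions, not the loader's actual code):

```python
# Assumed aligned-dataset arithmetic: one AB image on disk is split in half,
# then each half is scaled so its width equals load_size.
full_w, full_h = 1280, 360            # one AB image on disk
half_w, half_h = full_w // 2, full_h  # each half: 640 x 360

load_size = 256
scaled_h = int(load_size * half_h / half_w)  # 256 * 360 / 640 = 144
print((load_size, scaled_h))  # (256, 144) -- still shorter than crop_size = 256
```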
I've trained pix2pix using `scale_width_and_crop`, and when I try to test with the same option, test.py crashes. If I use another preprocess option such as `none` or `resize_and_crop`, everything works.
E.g.:

```
$ python test.py --dataroot ./datasets/c --name c_pix2pix --model pix2pix --netG unet_256 --direction AtoB --dataset_mode aligned --norm batch --preprocess scale_width_and_crop --gpu_ids=1,2,3,4,5
----------------- Options ---------------
       aspect_ratio: 1.0
         batch_size: 1
    checkpoints_dir: ./checkpoints
          crop_size: 256
           dataroot: ./datasets/c            [default: None]
       dataset_mode: aligned
          direction: AtoB
    display_winsize: 256
              epoch: latest
               eval: False
            gpu_ids: 1,2,3,4,5               [default: 0]
          init_gain: 0.02
          init_type: normal
           input_nc: 3
            isTrain: False                   [default: None]
          load_iter: 0                       [default: 0]
          load_size: 256
   max_dataset_size: inf
              model: pix2pix                 [default: test]
         n_layers_D: 3
               name: c_pix2pix               [default: experiment_name]
                ndf: 64
               netD: basic
               netG: unet_256
                ngf: 64
         no_dropout: False
            no_flip: False
               norm: batch
              ntest: inf
           num_test: 50
        num_threads: 4
          output_nc: 3
              phase: test
         preprocess: scale_width_and_crop    [default: resize_and_crop]
        results_dir: ./results/
     serial_batches: False
             suffix:
            verbose: False
----------------- End -------------------
dataset [AlignedDataset] was created
initialize network with normal
model [Pix2PixModel] was created
loading the model from ./checkpoints/crowds_pix2pix/latest_net_G.pth
---------- Networks initialized -------------
[Network G] Total number of parameters : 54.414 M
```
```
Traceback (most recent call last):
  File "test.py", line 60, in <module>
    model.test()  # run inference
  File "/scratch2/mathewc/pytorch-CycleGAN-and-pix2pix/models/base_model.py", line 105, in test
    self.forward()
  File "/scratch2/mathewc/pytorch-CycleGAN-and-pix2pix/models/pix2pix_model.py", line 88, in forward
    self.fake_B = self.netG(self.real_A)  # G(A)
  File "/scratch2/mathewc/anaconda3/envs/pytorch-CycleGAN-and-pix2pix/lib/python3.5/site-packages/torch/nn/modules/module.py", line 477, in __call__
    result = self.forward(*input, **kwargs)
  File "/scratch2/mathewc/anaconda3/envs/pytorch-CycleGAN-and-pix2pix/lib/python3.5/site-packages/torch/nn/parallel/data_parallel.py", line 123, in forward
    outputs = self.parallel_apply(replicas, inputs, kwargs)
  File "/scratch2/mathewc/anaconda3/envs/pytorch-CycleGAN-and-pix2pix/lib/python3.5/site-packages/torch/nn/parallel/data_parallel.py", line 133, in parallel_apply
    return parallel_apply(replicas, inputs, kwargs, self.device_ids[:len(replicas)])
  File "/scratch2/mathewc/anaconda3/envs/pytorch-CycleGAN-and-pix2pix/lib/python3.5/site-packages/torch/nn/parallel/parallel_apply.py", line 77, in parallel_apply
    raise output
  File "/scratch2/mathewc/anaconda3/envs/pytorch-CycleGAN-and-pix2pix/lib/python3.5/site-packages/torch/nn/parallel/parallel_apply.py", line 53, in _worker
    output = module(*input, **kwargs)
  File "/scratch2/mathewc/anaconda3/envs/pytorch-CycleGAN-and-pix2pix/lib/python3.5/site-packages/torch/nn/modules/module.py", line 477, in __call__
    result = self.forward(*input, **kwargs)
  File "/scratch2/mathewc/pytorch-CycleGAN-and-pix2pix/models/networks.py", line 459, in forward
    return self.model(input)
  File "/scratch2/mathewc/anaconda3/envs/pytorch-CycleGAN-and-pix2pix/lib/python3.5/site-packages/torch/nn/modules/module.py", line 477, in __call__
    result = self.forward(*input, **kwargs)
  File "/scratch2/mathewc/pytorch-CycleGAN-and-pix2pix/models/networks.py", line 527, in forward
    return self.model(x)
  File "/scratch2/mathewc/anaconda3/envs/pytorch-CycleGAN-and-pix2pix/lib/python3.5/site-packages/torch/nn/modules/module.py", line 477, in __call__
    result = self.forward(*input, **kwargs)
  File "/scratch2/mathewc/anaconda3/envs/pytorch-CycleGAN-and-pix2pix/lib/python3.5/site-packages/torch/nn/modules/container.py", line 91, in forward
    input = module(input)
  File "/scratch2/mathewc/anaconda3/envs/pytorch-CycleGAN-and-pix2pix/lib/python3.5/site-packages/torch/nn/modules/module.py", line 477, in __call__
    result = self.forward(*input, **kwargs)
  File "/scratch2/mathewc/pytorch-CycleGAN-and-pix2pix/models/networks.py", line 529, in forward
    return torch.cat([x, self.model(x)], 1)
  File "/scratch2/mathewc/anaconda3/envs/pytorch-CycleGAN-and-pix2pix/lib/python3.5/site-packages/torch/nn/modules/module.py", line 477, in __call__
    result = self.forward(*input, **kwargs)
  File "/scratch2/mathewc/anaconda3/envs/pytorch-CycleGAN-and-pix2pix/lib/python3.5/site-packages/torch/nn/modules/container.py", line 91, in forward
    input = module(input)
  File "/scratch2/mathewc/anaconda3/envs/pytorch-CycleGAN-and-pix2pix/lib/python3.5/site-packages/torch/nn/modules/module.py", line 477, in __call__
    result = self.forward(*input, **kwargs)
  File "/scratch2/mathewc/pytorch-CycleGAN-and-pix2pix/models/networks.py", line 529, in forward
    return torch.cat([x, self.model(x)], 1)
  File "/scratch2/mathewc/anaconda3/envs/pytorch-CycleGAN-and-pix2pix/lib/python3.5/site-packages/torch/nn/modules/module.py", line 477, in __call__
    result = self.forward(*input, **kwargs)
  File "/scratch2/mathewc/anaconda3/envs/pytorch-CycleGAN-and-pix2pix/lib/python3.5/site-packages/torch/nn/modules/container.py", line 91, in forward
    input = module(input)
  File "/scratch2/mathewc/anaconda3/envs/pytorch-CycleGAN-and-pix2pix/lib/python3.5/site-packages/torch/nn/modules/module.py", line 477, in __call__
    result = self.forward(*input, **kwargs)
  File "/scratch2/mathewc/pytorch-CycleGAN-and-pix2pix/models/networks.py", line 529, in forward
    return torch.cat([x, self.model(x)], 1)
  File "/scratch2/mathewc/anaconda3/envs/pytorch-CycleGAN-and-pix2pix/lib/python3.5/site-packages/torch/nn/modules/module.py", line 477, in __call__
    result = self.forward(*input, **kwargs)
  File "/scratch2/mathewc/anaconda3/envs/pytorch-CycleGAN-and-pix2pix/lib/python3.5/site-packages/torch/nn/modules/container.py", line 91, in forward
    input = module(input)
  File "/scratch2/mathewc/anaconda3/envs/pytorch-CycleGAN-and-pix2pix/lib/python3.5/site-packages/torch/nn/modules/module.py", line 477, in __call__
    result = self.forward(*input, **kwargs)
  File "/scratch2/mathewc/pytorch-CycleGAN-and-pix2pix/models/networks.py", line 529, in forward
    return torch.cat([x, self.model(x)], 1)
  File "/scratch2/mathewc/anaconda3/envs/pytorch-CycleGAN-and-pix2pix/lib/python3.5/site-packages/torch/nn/modules/module.py", line 477, in __call__
    result = self.forward(*input, **kwargs)
  File "/scratch2/mathewc/anaconda3/envs/pytorch-CycleGAN-and-pix2pix/lib/python3.5/site-packages/torch/nn/modules/container.py", line 91, in forward
    input = module(input)
  File "/scratch2/mathewc/anaconda3/envs/pytorch-CycleGAN-and-pix2pix/lib/python3.5/site-packages/torch/nn/modules/module.py", line 477, in __call__
    result = self.forward(*input, **kwargs)
  File "/scratch2/mathewc/pytorch-CycleGAN-and-pix2pix/models/networks.py", line 529, in forward
    return torch.cat([x, self.model(x)], 1)
  File "/scratch2/mathewc/anaconda3/envs/pytorch-CycleGAN-and-pix2pix/lib/python3.5/site-packages/torch/nn/modules/module.py", line 477, in __call__
    result = self.forward(*input, **kwargs)
  File "/scratch2/mathewc/anaconda3/envs/pytorch-CycleGAN-and-pix2pix/lib/python3.5/site-packages/torch/nn/modules/container.py", line 91, in forward
    input = module(input)
  File "/scratch2/mathewc/anaconda3/envs/pytorch-CycleGAN-and-pix2pix/lib/python3.5/site-packages/torch/nn/modules/module.py", line 477, in __call__
    result = self.forward(*input, **kwargs)
  File "/scratch2/mathewc/pytorch-CycleGAN-and-pix2pix/models/networks.py", line 529, in forward
    return torch.cat([x, self.model(x)], 1)
  File "/scratch2/mathewc/anaconda3/envs/pytorch-CycleGAN-and-pix2pix/lib/python3.5/site-packages/torch/nn/modules/module.py", line 477, in __call__
    result = self.forward(*input, **kwargs)
  File "/scratch2/mathewc/anaconda3/envs/pytorch-CycleGAN-and-pix2pix/lib/python3.5/site-packages/torch/nn/modules/container.py", line 91, in forward
    input = module(input)
  File "/scratch2/mathewc/anaconda3/envs/pytorch-CycleGAN-and-pix2pix/lib/python3.5/site-packages/torch/nn/modules/module.py", line 477, in __call__
    result = self.forward(*input, **kwargs)
  File "/scratch2/mathewc/pytorch-CycleGAN-and-pix2pix/models/networks.py", line 529, in forward
    return torch.cat([x, self.model(x)], 1)
  File "/scratch2/mathewc/anaconda3/envs/pytorch-CycleGAN-and-pix2pix/lib/python3.5/site-packages/torch/nn/modules/module.py", line 477, in __call__
    result = self.forward(*input, **kwargs)
  File "/scratch2/mathewc/anaconda3/envs/pytorch-CycleGAN-and-pix2pix/lib/python3.5/site-packages/torch/nn/modules/container.py", line 91, in forward
    input = module(input)
  File "/scratch2/mathewc/anaconda3/envs/pytorch-CycleGAN-and-pix2pix/lib/python3.5/site-packages/torch/nn/modules/module.py", line 477, in __call__
    result = self.forward(*input, **kwargs)
  File "/scratch2/mathewc/anaconda3/envs/pytorch-CycleGAN-and-pix2pix/lib/python3.5/site-packages/torch/nn/modules/conv.py", line 301, in forward
    self.padding, self.dilation, self.groups)
RuntimeError: CuDNN error: CUDNN_STATUS_BAD_PARAM
```