Lornatang / SRGAN-PyTorch

A simple and complete implementation of the SRGAN super-resolution paper.
Apache License 2.0

about upscale==3 #83

Closed tarxs closed 1 year ago

tarxs commented 1 year ago

Hello,

The code worked well for me when the upscale factor was a multiple of 2. Thank you for providing these solutions. However, when I changed it to 3, I encountered some issues.

Firstly, in model.py, within the UpsampleBlock class, I noticed that the argument to nn.PixelShuffle needs to be changed from the hard-coded 2 to the upscale factor.
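For reference, here is a minimal sketch of such an upsample block (the names are illustrative, not the repo's exact code): the convolution must expand the channels by `upscale_factor ** 2` so that `nn.PixelShuffle(upscale_factor)` can rearrange them into spatial resolution.

```python
import torch
from torch import nn

class UpsampleBlock(nn.Module):
    """Illustrative sketch of a PixelShuffle upsample block for any scale."""

    def __init__(self, channels: int, upscale_factor: int) -> None:
        super().__init__()
        # The conv must output channels * upscale_factor**2 feature maps,
        # which PixelShuffle then folds into an upscale_factor-times-larger image.
        self.conv = nn.Conv2d(channels, channels * upscale_factor ** 2,
                              kernel_size=3, padding=1)
        self.pixel_shuffle = nn.PixelShuffle(upscale_factor)  # not a hard-coded 2
        self.prelu = nn.PReLU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.prelu(self.pixel_shuffle(self.conv(x)))

block = UpsampleBlock(channels=64, upscale_factor=3)
out = block(torch.randn(1, 64, 32, 32))
print(out.shape)  # torch.Size([1, 64, 96, 96])
```

With `upscale_factor=3`, a 32x32 feature map comes out as 96x96, matching a x3 GT patch.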

Furthermore, I ran into a problem at pixel_loss = pixel_criterion(sr, gt). The error message "The size of tensor a (96) must match the size of tensor b (95) at non-singleton dimension 2" seems to come from the image-cropping step when the upscale factor is 3: integer rounding there can introduce an off-by-one. However, the code structure is quite complex, and I'm not sure I can fix it on my own.

Could you help me make the code work seamlessly with an upscale factor of 3? Thank you.
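One plausible explanation of the 96-vs-95 mismatch, sketched with plain arithmetic (the sizes below assume the downscaler rounds 128/3 up to 43 rather than flooring to 42; this is an assumption, not verified against imgproc.py):

```python
# Assumed sizes: 128x128 GT images, x3 scale, 96-pixel GT patches.
gt_size = 128
scale = 3
lr_size = round(gt_size / scale)   # 43 if the downscaler rounds instead of flooring
gt_patch = 96
lr_patch = gt_patch // scale       # 32

# random.randint is inclusive at both ends, so the worst-case crop origin is:
lr_top = lr_size - lr_patch        # 11 -> valid for the 43-pixel LR image
gt_top = lr_top * scale            # 33 -> but 33 + 96 = 129 > 128

# Array slicing silently clips at the image border, shrinking the GT patch:
gt_rows = min(gt_top + gt_patch, gt_size) - gt_top   # 95
sr_rows = lr_patch * scale                           # 96
print(gt_rows, sr_rows)  # 95 96 -> the reported tensor-size mismatch
```

Under these assumptions the SR patch is always 96 rows, while the worst-case GT crop is clipped to 95, which matches the error message exactly.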

Lornatang commented 1 year ago

Is your original image size 96x96?

tarxs commented 1 year ago

> Is your original image size 96x96?

The images in train->original are 256x256; after running scripts/run.py, the images in train->GT are 128x128. I think imgproc.py->random_crop_torch then crops them to 96.

Lornatang commented 1 year ago

Generally speaking, if the GT image size is a multiple of 3, there is no problem.

tarxs commented 1 year ago

> Generally speaking, if the GT image size is a multiple of 3, there is no problem.

I changed

```python
lr_top = random.randint(0, lr_image_height - lr_patch_size)
lr_left = random.randint(0, lr_image_width - lr_patch_size)
```

to

```python
lr_top = random.randint(0, lr_image_height - lr_patch_size - 1)
lr_left = random.randint(0, lr_image_width - lr_patch_size - 1)
```

and it now works with scale==3, although I don't know why.
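The `- 1` likely helps because `random.randint` is inclusive at both ends, so without it the crop origin can land one step too far and the scaled GT patch overruns the image border. A more explicit fix (a hypothetical helper, not the repo's actual `random_crop_torch` signature) clamps the LR range so the GT patch always fits:

```python
import random

def paired_random_crop(lr_h, lr_w, gt_h, gt_w, gt_patch, scale):
    """Hypothetical helper: pick LR crop coordinates only where the
    corresponding scaled GT patch is guaranteed to fit inside the GT image."""
    lr_patch = gt_patch // scale
    # Clamp so that lr_top * scale + gt_patch never exceeds the GT border,
    # even when the LR size is not exactly gt_size / scale.
    max_top = min(lr_h - lr_patch, (gt_h - gt_patch) // scale)
    max_left = min(lr_w - lr_patch, (gt_w - gt_patch) // scale)
    lr_top = random.randint(0, max_top)
    lr_left = random.randint(0, max_left)
    return lr_top, lr_left, lr_top * scale, lr_left * scale

# With a 43-pixel LR image and a 128-pixel GT image (x3, 96-pixel GT patch),
# max_top becomes min(11, 10) = 10, so gt_top is at most 30 and 30 + 96 <= 128.
print(paired_random_crop(43, 43, 128, 128, 96, 3))
```

Subtracting 1 from the upper bound, as in the change above, caps the same worst case (11 becomes 10), which is why it happens to work for this size.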

Lornatang commented 1 year ago

There was a problem with the image-splitting script; it has now been fixed.