SHI-Labs / Cross-Scale-Non-Local-Attention

PyTorch code for our paper "Image Super-Resolution with Cross-Scale Non-Local Attention and Exhaustive Self-Exemplars Mining" (CVPR2020).

Question about patch_size. #19


WOWspring commented 3 years ago

I noticed that the patch size is set to 48×48 in your paper, but --patch_size is set to 96 in your README.md. Why is that?

python3 main.py --chop --batch_size 16 --model CSNLN --scale 2 --patch_size 96 --save CSNLN_x2 --n_feats 128 --depth 12 --data_train DIV2K --save_models

(The above are the command-line options.)

HolmesShuan commented 2 years ago

@WOWspring Hi, the short answer is that it is possible to get a slightly higher PSNR this way.

Please refer to https://github.com/sanghyun-son/EDSR-PyTorch/issues/181#issuecomment-500023283.
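For anyone landing here later: if the data loader follows upstream EDSR-PyTorch's get_patch convention (worth double-checking against this repo's data/common.py), --patch_size refers to the HR patch, and the LR crop is patch_size // scale. A minimal sketch of that convention, simplified and not this repo's exact code:

```python
import random
import numpy as np

def get_patch(lr, hr, patch_size=96, scale=2):
    """Crop an aligned (LR, HR) patch pair, EDSR-PyTorch style.

    Assumption: `patch_size` is the HR patch size, so the LR crop
    is patch_size // scale (e.g. 96 // 2 = 48).
    """
    lr_h, lr_w = lr.shape[:2]
    ip = patch_size // scale          # LR (input) patch size
    tp = patch_size                   # HR (target) patch size

    ix = random.randrange(0, lr_w - ip + 1)
    iy = random.randrange(0, lr_h - ip + 1)
    tx, ty = scale * ix, scale * iy   # matching HR coordinates

    lr_patch = lr[iy:iy + ip, ix:ix + ip]
    hr_patch = hr[ty:ty + tp, tx:tx + tp]
    return lr_patch, hr_patch

# Example: dummy images at scale 2.
lr = np.zeros((100, 100, 3), dtype=np.uint8)
hr = np.zeros((200, 200, 3), dtype=np.uint8)
lr_p, hr_p = get_patch(lr, hr, patch_size=96, scale=2)
print(lr_p.shape, hr_p.shape)  # (48, 48, 3) (96, 96, 3)
```

Under that convention, --patch_size 96 --scale 2 would train on 48×48 LR inputs.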