daitao / SAN

Second-order Attention Network for Single Image Super-resolution (CVPR-2019)

Mini-batch size - 8 or 16? #30

Open marijavella opened 4 years ago

marijavella commented 4 years ago

Hi,

In the paper, it is stated that 8 LR colour patches of size 48x48 are used for training. However, in the default settings, the mini-batch size is 16. What settings need to be used to match the results in the paper?
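For reference, if the training code follows the EDSR/RCAN-style `option.py` interface it appears to be built on, the paper's setting could in principle be requested with command-line flags like the ones below. Note that the flag names here are assumptions, and in that codebase `--patch_size` usually refers to the HR patch, so a 48x48 LR patch at scale 2 would correspond to `--patch_size 96`; please verify both against the repository's `option.py`.

```shell
# Hypothetical invocation, assuming EDSR/RCAN-style flags; check option.py
# for the exact names and defaults in this repository.
python main.py --model san \
    --scale 2 \
    --batch_size 8 \
    --patch_size 96 \
    --save san_x2_batch8
```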

When I reduced the batch size to 10 due to GPU memory limitations, the PSNR on Set5 (x2) was about 37.8 dB and stopped improving after 590 epochs. This deviates from the results in the paper.

I am aware that a few questions have been asked about the batch size, but I couldn't find an answer. Does anyone have any information about this? Thanks.

EchoXu98 commented 1 year ago

Hi, how many GPUs did you use to train this model?