opteroncx / SESR

SESR: Single Image Super Resolution with Recursive Squeeze and Excitation Networks

Some questions about brachloss and pre-trained checkpoint #4

Closed: Zheng222 closed this issue 6 years ago

Zheng222 commented 6 years ago

Hello, I carefully read your code and have some questions as follows: https://github.com/opteroncx/SESR/blob/157fa46a46d1e4b922446ee8adff7421bd5b09f0/main.py#L131-L137 https://github.com/opteroncx/SESR/blob/157fa46a46d1e4b922446ee8adff7421bd5b09f0/main.py#L107-L114

  1. I find that `len(imglist)` at this point is equal to 1, so the `brachloss` function reduces to the plain `criterion` function. My question is why not call the criterion directly, e.g. `loss_x2 = criterion(HR_2x[0], label_x2)` (see the sketch after this list).
  2. I evaluated this model with the checkpoint file `SESR/model/model.pth` and got results different from those reported in the readme.md file:
     | Dataset  | PSNR  | SSIM   |
     | -------- | ----- | ------ |
     | Set5     | 31.90 | 0.8920 |
     | Set14    | 28.39 | 0.7853 |
     | B100     | 27.47 | 0.7376 |
     | Urban100 | 25.74 | 0.7806 |

     I would like to know how this checkpoint was obtained. Specifically, which training dataset was used? Thank you, @opteroncx
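
To make the first question concrete, here is a minimal sketch of how `brachloss` presumably collapses when `imglist` has a single element. The body below is an assumption for illustration only; the actual implementation is at main.py L107-L114.

```python
# Hypothetical sketch of brachloss, assuming it simply sums the criterion
# over every output in imglist (see main.py L107-L114 for the real code).
def brachloss(criterion, imglist, label):
    loss = 0
    for img in imglist:
        loss = loss + criterion(img, label)
    return loss

# With a single-element list the sum has exactly one term, so
# brachloss(criterion, [HR_2x[0]], label_x2) equals
# criterion(HR_2x[0], label_x2) -- hence the question about calling
# the criterion directly.
```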

opteroncx commented 6 years ago

If you move `output = self.upsample(x)` / `list_out.append(output)` and `HR_2x.append(tmp)` / `HR_4x.append(tmp)` into the for loop and retrain the model, you will supervise each block in each branch and end up with a higher PSNR. This is an experimental feature and was not mentioned in the paper.
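
A rough sketch of that change, assuming a branch module that chains several blocks and shares one upsample layer; apart from `upsample`, `list_out`, and `output`, the class and variable names are illustrative, not the exact SESR code.

```python
import torch.nn as nn

class Branch(nn.Module):
    # Hypothetical branch module used only to illustrate the suggestion above.
    def __init__(self, make_block, n_blocks, upsample):
        super().__init__()
        self.blocks = nn.ModuleList([make_block() for _ in range(n_blocks)])
        self.upsample = upsample

    def forward(self, x):
        list_out = []
        for block in self.blocks:
            x = block(x)
            # Moved into the loop: every block's features are upsampled and
            # collected, so each block can receive its own loss term.
            output = self.upsample(x)
            list_out.append(output)
        return list_out
```

Each element of `list_out` can then be compared against the label (e.g. via `brachloss`), which is what provides the per-block supervision described above.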

Zheng222 commented 6 years ago

Hello, Fig. 3 in your paper suggests a multi-supervised style as in LapSRN. The results I obtain are much higher than those reported in the paper. Are the training datasets 291 + General100? @opteroncx

opteroncx commented 6 years ago

I have updated the checkpoint; it is now trained on DIV2K. @Zheng222