ngchc / CameraSR

Camera Lens Super-Resolution in CVPR 2019
MIT License

Why does the interpolated image (LR) vs. HR PSNR I tested always differ by about 2 dB from the results in the paper? #3

Closed ninesun127 closed 5 years ago

ninesun127 commented 5 years ago

For the R-V degradation in Figure 2(c), my test PSNR is 25.451540226734174, which is different from the paper. For the five interpolated LRs in Table 1, my test PSNR is 2 dB lower. My test code is below:

```python
import numpy as np
from math import log10

from PIL import Image
from torchvision.transforms import ToTensor
from tqdm import tqdm


def compute_psnr(im1, im2):
    if im1.size != im2.size:
        raise Exception('the shapes of two images are not equal')

    rmse = np.sqrt(((np.asfarray(im1) - np.asfarray(im2)) ** 2).mean())
    psnr = 20 * np.log10(255.0 / rmse)
    return psnr


path = '../City100/City100_NikonD5500/'  # 27.1413
path = '../City100/City100_iphoneX/'     # 24.333565211970758

# single-image check on 0-255 values
lr_path = path + '001L.png'
hr_path = path + '001H.png'
print(compute_psnr(Image.open(lr_path), Image.open(hr_path)))  # 26.544952727983393

# average over the whole dataset, computed on [0, 1] tensors
psnr_list = []
bar = tqdm(range(1, 101), desc='the bar')
for i in bar:
    lr_path = path + "{:0>3d}".format(i) + 'L.png'
    hr_path = path + "{:0>3d}".format(i) + 'H.png'
    lr = Image.open(lr_path)
    hr = Image.open(hr_path)

    hr = ToTensor()(hr)
    lr = ToTensor()(lr)
    mse = ((hr - lr) ** 2).mean()
    # print(mse)
    psnr = 10 * log10(1 / mse)

    psnr_list.append(psnr)

PSNR = np.array(psnr_list)
print('average')
print(PSNR.mean())
```
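
As a sanity check on the two PSNR conventions used above: `compute_psnr` works on 0-255 values with 20 * log10(255 / rmse), while the loop works on ToTensor outputs in [0, 1] with 10 * log10(1 / mse); scaling the pixels by 1/255 scales the MSE by 1/255^2, so both give the same number. A minimal check on synthetic arrays (not part of the original test code):

```python
import numpy as np

# synthetic 8-bit image pair, purely to confirm the two formulas agree
rng = np.random.default_rng(0)
a = rng.integers(0, 256, size=(64, 64, 3)).astype(np.float64)
b = rng.integers(0, 256, size=(64, 64, 3)).astype(np.float64)

rmse = np.sqrt(((a - b) ** 2).mean())
psnr_255 = 20 * np.log10(255.0 / rmse)       # 0-255 convention

mse_unit = (((a - b) / 255.0) ** 2).mean()
psnr_unit = 10 * np.log10(1.0 / mse_unit)    # [0, 1] convention

assert np.isclose(psnr_255, psnr_unit)
```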

ngchc commented 5 years ago

Hi. First, I think it would be better to raise the issue in English, to make it more valuable to others. For the PSNR values, could you please try to calculate them on the luminance channel (by converting RGB to YCbCr) and check again?

ninesun127 commented 5 years ago

I am sorry for using Chinese at the beginning. Well, I have already tested the Y channel, but the result is still different from the paper's:

Image 027 (NikonD5500): Y channel 25.898109804588742, RGB channel 25.451540226734174

Image 001 (NikonD5500): Y channel 27.418709510361392, RGB channel 26.544952727983393

Average PSNR of NikonD5500: RGB channel 27.14130181759771, Y channel 27.823181962351658

Average PSNR of iPhoneX: RGB channel 24.333565211970758, Y channel 24.58039436454012

| NikonD5500 image | RGB | Y |
| --- | --- | --- |
| 001 | 26.545140986846498 | 27.41867617574001 |
| 016 | 28.355940512266958 | 28.89164527298145 |
| 048 | 24.758472617093513 | 25.200213894334023 |
| 060 | 28.680681067383745 | 29.32632376536002 |
| 098 | 22.762742495607988 | 23.30119926767699 |
| Average | 26.220595535839742 | 26.8276116752185 |

My test code is below:

```python
import numpy as np
from math import log10

from PIL import Image
from torchvision.transforms import ToTensor
from tqdm import tqdm


def compute_psnr(im1, im2):
    if im1.size != im2.size:
        raise Exception('the shapes of two images are not equal')

    rmse = np.sqrt(((np.asfarray(im1) - np.asfarray(im2)) ** 2).mean())
    psnr = 20 * np.log10(255.0 / rmse)
    return psnr


path = '../City100/City100_NikonD5500/'  # 27.1413
path = '../City100/City100_iphoneX/'     # 24.333565211970758

# single-image check: Y channel vs RGB
lr_path = path + '001L.png'
hr_path = path + '001H.png'
lr_y, _, _ = Image.open(lr_path).convert('YCbCr').split()
hr_y, _, _ = Image.open(hr_path).convert('YCbCr').split()
print(compute_psnr(lr_y, hr_y))
print(compute_psnr(Image.open(lr_path), Image.open(hr_path)))

# the five images listed in Table 1, on [0, 1] tensors
psnr_list = []
psnr_list_y = []
bar = tqdm([1, 16, 48, 60, 98], desc='the bar')
for i in bar:
    lr_path = path + "{:0>3d}".format(i) + 'L.png'
    hr_path = path + "{:0>3d}".format(i) + 'H.png'
    lr = Image.open(lr_path)
    hr = Image.open(hr_path)

    # Y-channel PSNR (PIL YCbCr conversion)
    lr_y, _, _ = lr.convert('YCbCr').split()
    hr_y, _, _ = hr.convert('YCbCr').split()
    hr_y = ToTensor()(hr_y)
    lr_y = ToTensor()(lr_y)
    mse_y = ((hr_y - lr_y) ** 2).mean()
    psnr_y = 10 * log10(1 / mse_y)
    psnr_list_y.append(psnr_y)

    # RGB PSNR
    hr = ToTensor()(hr)
    lr = ToTensor()(lr)
    mse = ((hr - lr) ** 2).mean()
    psnr = 10 * log10(1 / mse)

    print(psnr, psnr_y)

    psnr_list.append(psnr)

PSNR = np.array(psnr_list)
PSNR_Y = np.array(psnr_list_y)
print('average RGB')
print(PSNR.mean())
print("average luminance")
print(PSNR_Y.mean())
```

By the way, thanks for your work. I am really interested in the City100 dataset. Thank you again.

ngchc commented 5 years ago

Glad to know your interest in our work. Such a difference may be caused by the conversion function from RGB to YCbCr. The function adopted in the paper is `rgb2ycbcr` from the MATLAB toolbox, which might differ from the one in the Python library. In fact, the CameraSR results for reconstruction accuracy are provided in the Models/VDSR/results folder, and they match the scores reported in the paper. Also, the interpolated LR images can be found in Models/VDSR/testset. Could you please check again?
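
For reference, a minimal sketch of a MATLAB-style luminance PSNR in Python, assuming the ITU-R BT.601 coefficients used by MATLAB's `rgb2ycbcr` (limited-range Y in 16-235), which differ from PIL's full-range `convert('YCbCr')`; the helper names `rgb2y_matlab` and `psnr_y` are illustrative, not part of this repository:

```python
import numpy as np
from PIL import Image


def rgb2y_matlab(img):
    """Luminance channel using MATLAB rgb2ycbcr's BT.601 coefficients.

    Takes a PIL RGB image (or HxWx3 uint8 array) and returns a float64
    array with Y in the nominal 16-235 range. MATLAB additionally rounds
    back to uint8 for uint8 input; add np.round() to mimic that exactly.
    """
    rgb = np.asarray(img, dtype=np.float64) / 255.0
    return 16.0 + 65.481 * rgb[..., 0] + 128.553 * rgb[..., 1] + 24.966 * rgb[..., 2]


def psnr_y(lr_path, hr_path):
    """PSNR on the MATLAB-style Y channel with a peak value of 255."""
    y_lr = rgb2y_matlab(Image.open(lr_path))
    y_hr = rgb2y_matlab(Image.open(hr_path))
    mse = ((y_lr - y_hr) ** 2).mean()
    return 10 * np.log10(255.0 ** 2 / mse)
```

This is only a sketch of the conversion difference; the authoritative reference numbers are the ones shipped in Models/VDSR/results.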

LoSealL commented 4 years ago

For details on how RGB-to-YCbCr conversion differs among MATLAB, OpenCV, and Python PIL, you can refer to ImageProcess.
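
To illustrate the kind of difference being referred to (a rough sketch, assuming OpenCV is available as `cv2`): PIL's `convert('YCbCr')` and OpenCV's `COLOR_RGB2YCrCb` both produce full-range Y = 0.299 R + 0.587 G + 0.114 B, whereas MATLAB's `rgb2ycbcr` uses the limited-range BT.601 mapping sketched above, so the same pixel yields different Y values:

```python
import cv2
import numpy as np
from PIL import Image

# a single mid-gray RGB pixel as a 1x1 image
px = np.array([[[128, 128, 128]]], dtype=np.uint8)

y_pil = np.asarray(Image.fromarray(px, 'RGB').convert('YCbCr'))[0, 0, 0]
y_cv = cv2.cvtColor(px, cv2.COLOR_RGB2YCrCb)[0, 0, 0]
y_matlab_style = 16 + 219 * (128 / 255.0)  # limited-range BT.601 Y for gray 128

print(y_pil, y_cv, y_matlab_style)  # 128 128 ~125.9
```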