Closed · MKFMIKU closed this issue 2 years ago
How do you load the images? I just tried and I got 0.2698.
I use the script https://github.com/richzhang/PerceptualSimilarity/blob/master/lpips_2dirs.py and average the final LPIPS values.
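For reference, the pairing-and-averaging that lpips_2dirs.py performs can be sketched as below. The `distance_fn` argument is a hypothetical stand-in for the real metric: in the actual script it is an `lpips.LPIPS(net='alex')` model applied to image tensors normalized to [-1, 1].

```python
from pathlib import Path

def average_distance(dir0, dir1, distance_fn):
    """Pair files by name across two directories and average a per-pair distance.

    distance_fn is a placeholder; with the lpips package it would wrap
    lpips.LPIPS(net='alex') applied to tensors normalized to [-1, 1].
    """
    files0 = sorted(p for p in Path(dir0).iterdir() if p.is_file())
    scores = []
    for f0 in files0:
        f1 = Path(dir1) / f0.name  # matching file name in the second directory
        if f1.exists():
            scores.append(distance_fn(f0, f1))
    # final reported value is the mean over all matched pairs
    return sum(scores) / len(scores)
```
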
According to here, the PSNR should be 26.91. Are you using the official weights to generate the outputs?
I got a PSNR of 26.91 and an LPIPS of 0.2698.
Sorry, I forgot to remove the border cropping in the config files. The PSNR is 26.89918989839131 now, but the LPIPS is still 0.2864.
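As a sanity check for the border-cropping effect mentioned above, a minimal PSNR routine with an optional border crop could look like the following. The parameter names are my own illustration, not the repo's actual config keys.

```python
import numpy as np

def psnr(img1, img2, crop_border=0, max_val=255.0):
    """PSNR between two images in [0, max_val], optionally shaving a border.

    crop_border > 0 removes that many pixels from every edge before comparing,
    which is why toggling it shifts the reported PSNR slightly.
    """
    a = np.asarray(img1, dtype=np.float64)
    b = np.asarray(img2, dtype=np.float64)
    if crop_border > 0:
        a = a[crop_border:-crop_border, crop_border:-crop_border]
        b = b[crop_border:-crop_border, crop_border:-crop_border]
    mse = np.mean((a - b) ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 20.0 * np.log10(max_val / np.sqrt(mse))
```
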
Could it be affected by the CelebaHQ files? I used the imresize to process the LR files.
Can you tell me where you got the CelebaHQ files? Or could you share the validation set?
If I remember correctly, I downloaded the dataset from here. The LR images are downsampled with the MATLAB imresize.
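MATLAB's imresize defaults to bicubic interpolation with antialiasing, and small resampling differences can shift LPIPS noticeably. In Python, Pillow's bicubic downscaling is a close (though not bit-exact) approximation; a hedged sketch, with a function name of my own choosing:

```python
from PIL import Image

def downsample_like_matlab(path_in, path_out, scale=16):
    """Approximate MATLAB imresize bicubic downsampling with Pillow.

    Pillow scales the filter support when shrinking, which roughly matches
    MATLAB's antialiased bicubic, but outputs are not guaranteed identical.
    """
    img = Image.open(path_in)
    w, h = img.size
    lr = img.resize((w // scale, h // scale), Image.BICUBIC)
    lr.save(path_out)
    return lr
```

For bit-exact MATLAB behavior, generating the LR images in MATLAB itself (as the dataset authors apparently did) is the safer choice.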
Thanks a lot! Now I get the 0.2698 LPIPS value.
Hi, I tried to test the LPIPS performance on CelebaHQ100 with your glean_ffhq_16x weights. However, I get 0.2864, which is higher than the 0.2681 reported in the paper. I use https://github.com/richzhang/PerceptualSimilarity version 0.1 with AlexNet. Do you have any idea why? Besides, I got a PSNR of 26.847, which is very close to the paper's reported 26.84.
Thanks