chaos5958 opened this issue 6 years ago
Hi @chaos5958, please refer to https://github.com/twtygqyy/pytorch-SRDenseNet/blob/master/eval.py. Let me know if you have further problems.
@chaos5958 Hi, I wonder if you solved that problem. It also happens for me: there is a difference between the pre-trained weights and the model I trained myself. @twtygqyy Hi, I'd like your help: how can I get the same PSNR as your weights?
Hello, I tested the given model and the model I trained with the default parameters you provide. I also constructed the test dataset using MATLAB and checked that the bicubic performance reproduces the paper. However, the VDSR performance is a little different from what you provide in README.md.
Besides, I once implemented VDSR with TensorFlow+PIL, but the results were too bad, so I'm currently testing your repo using PyTorch+MATLAB. Anyway, thank you for the code!
Why do you 'shave_border' when evaluating PSNR? Is it the norm in super-resolution research?
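For reference, a minimal sketch of what border-shaved PSNR typically looks like in SR evaluation (this is my own illustration, not necessarily the exact code in eval.py); cropping `scale` pixels from each edge is a common convention because bicubic padding distorts the borders:

```python
import numpy as np

def psnr(pred, gt, shave_border=0):
    """PSNR between two images in [0, 255], optionally ignoring a border.

    shave_border crops that many pixels from every edge of both images
    before computing the error, so boundary artifacts from interpolation
    padding do not affect the score.
    """
    height, width = pred.shape[:2]
    pred = pred[shave_border:height - shave_border,
                shave_border:width - shave_border]
    gt = gt[shave_border:height - shave_border,
            shave_border:width - shave_border]
    diff = pred.astype(np.float64) - gt.astype(np.float64)
    rmse = np.sqrt(np.mean(diff ** 2))
    if rmse == 0:
        return float("inf")
    return 20 * np.log10(255.0 / rmse)
```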
How could I reproduce your result (e.g. better than the paper)?
Should I do bicubic interpolation on normalized values (e.g. 0-1)? The results seem a little different. It's a little strange that, in the training stage, bicubic is done on double values, but, in the test stage, bicubic is done on integer values.
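To illustrate the point above: upscaling the same image with bicubic on float values in [0, 1] versus on rounded uint8 values gives slightly different outputs because of the intermediate quantization (and uint8 clipping of bicubic overshoot). This is a hypothetical sketch using Pillow; MATLAB's imresize uses a different antialiased kernel, so the exact numbers differ there too:

```python
import numpy as np
from PIL import Image

rng = np.random.default_rng(0)
lr = rng.random((16, 16)).astype(np.float32)  # toy low-res image in [0, 1]

# Float path (analogous to resizing a double image before quantization):
up_float = np.array(
    Image.fromarray(lr, mode="F").resize((64, 64), Image.BICUBIC)
)

# Integer path (quantize to uint8 first, as in test-time preprocessing):
lr_uint8 = (lr * 255.0).round().astype(np.uint8)
up_uint8 = np.array(
    Image.fromarray(lr_uint8).resize((64, 64), Image.BICUBIC)
) / 255.0

# The two results disagree by a small but nonzero amount per pixel.
print(np.abs(up_float - up_uint8).max())
```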
Set5, Scale 4, shave_border = 4:
- bicubic = 28.414
- vdsr (given) = 30.880
- vdsr (trained) = 30.727
- vdsr (README) = 31.35 (I want to reproduce this one!)
```python
for filename in filenames:
    # ... per-image PSNR is accumulated into total_psnr_bicubic / total_psnr_sr here
    ...

print("[{} Scale {}] PSNR_bicubic={:.3f}".format(opt.dataset, opt.scale, total_psnr_bicubic / len(filenames)))
print("[{} Scale {}] PSNR_sr={:.3f}".format(opt.dataset, opt.scale, total_psnr_sr / len(filenames)))
```