This paper has been accepted for publication in IEEE Transactions on Multimedia.
In this paper, we propose two novel generative adversarial network (GAN) techniques to produce photo-realistic images for single image super-resolution.
Instead of producing a single score that classifies a whole image as real or fake, our first variant, the Fine-grained Attention Generative Adversarial Network for image super-resolution (FASRGAN), discriminates each pixel as real or fake.
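As a rough illustration of the per-pixel idea (not the exact FASRGAN architecture defined in /codes/models; the layer widths, depth, and encoder-decoder layout below are assumptions for the sketch), a discriminator can output one real/fake logit per pixel instead of a single scalar:

```python
# Minimal sketch of a pixel-wise discriminator in PyTorch.
# NOT the paper's architecture: channels/depth are illustrative assumptions.
import torch
import torch.nn as nn

class PixelwiseDiscriminator(nn.Module):
    def __init__(self, in_channels=3, base_channels=64):
        super().__init__()
        # Encoder: extract features while downsampling once.
        self.enc = nn.Sequential(
            nn.Conv2d(in_channels, base_channels, 3, stride=1, padding=1),
            nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(base_channels, base_channels * 2, 3, stride=2, padding=1),
            nn.LeakyReLU(0.2, inplace=True),
        )
        # Decoder: upsample back to the input resolution, one logit per pixel.
        self.dec = nn.Sequential(
            nn.Upsample(scale_factor=2, mode='bilinear', align_corners=False),
            nn.Conv2d(base_channels * 2, base_channels, 3, stride=1, padding=1),
            nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(base_channels, 1, 3, stride=1, padding=1),
        )

    def forward(self, x):
        # Returns an (N, 1, H, W) map of real/fake logits.
        return self.dec(self.enc(x))
```

Such a score map can be trained with a pixel-wise adversarial loss against an all-ones (real) or all-zeros (fake) target of the same spatial size.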
Instead of using completely separate networks for the generator and the discriminator in the SR problem, our second variant (Fs-SRGAN) shares a feature-extraction network between the generator and the discriminator.
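The sharing idea can be sketched as follows; this is a minimal assumption-laden illustration (module names and layer sizes are hypothetical, not the paper's exact design), showing both networks holding the same feature-extractor instance so its weights are updated from both sides:

```python
# Illustrative sketch of sharing a low-level feature extractor between
# generator and discriminator. Hypothetical modules, not the paper's design.
import torch.nn as nn

class SharedFeatureExtractor(nn.Module):
    def __init__(self, in_channels=3, channels=64):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(in_channels, channels, 3, padding=1),
            nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
        )

    def forward(self, x):
        return self.body(x)

class Generator(nn.Module):
    def __init__(self, shared, channels=64, scale=4):
        super().__init__()
        self.shared = shared  # shared feature extractor
        self.upsampler = nn.Sequential(
            nn.Conv2d(channels, channels * scale * scale, 3, padding=1),
            nn.PixelShuffle(scale),
            nn.Conv2d(channels, 3, 3, padding=1),
        )

    def forward(self, lr):
        return self.upsampler(self.shared(lr))

class Discriminator(nn.Module):
    def __init__(self, shared, channels=64):
        super().__init__()
        self.shared = shared  # same instance as the generator's
        self.head = nn.Sequential(
            nn.Conv2d(channels, channels, 3, stride=2, padding=1),
            nn.LeakyReLU(0.2, inplace=True),
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(channels, 1),
        )

    def forward(self, img):
        return self.head(self.shared(img))

# Both networks reference the same module, so low-level features are learned jointly.
shared = SharedFeatureExtractor()
G, D = Generator(shared), Discriminator(shared)
```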
We evaluated our methods on several datasets in terms of PSNR/SSIM/PI/LPIPS, where PSNR/SSIM measure the accuracy of SR images and PI/LPIPS measure perceptual quality. The Perceptual Index (PI) is the metric used in the PIRM Challenge on Perceptual Super-Resolution, and the Learned Perceptual Image Patch Similarity (LPIPS) metric, proposed in "The Unreasonable Effectiveness of Deep Features as a Perceptual Metric", measures the distance between image patches. For both PI and LPIPS, a lower value indicates higher perceptual similarity.
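For reproducing the paper's numbers, use the repository's own Test_scripts (see below). Purely as an illustration, PSNR/SSIM/LPIPS can also be computed with public packages (scikit-image >= 0.19 and the `lpips` pip package); PI is omitted here because it combines Ma's score and NIQE, which have no single standard Python implementation:

```python
# Illustrative metric computation with public packages, not the repo's Test_scripts.
import torch
import lpips
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

def evaluate_pair(sr, hr):
    """sr, hr: uint8 HxWx3 numpy arrays in [0, 255]."""
    psnr = peak_signal_noise_ratio(hr, sr, data_range=255)
    ssim = structural_similarity(hr, sr, channel_axis=2, data_range=255)

    # LPIPS expects NCHW float tensors scaled to [-1, 1].
    to_tensor = lambda x: torch.from_numpy(x).permute(2, 0, 1).float()[None] / 127.5 - 1.0
    loss_fn = lpips.LPIPS(net='alex')  # AlexNet backbone, as in the LPIPS paper
    lpips_val = loss_fn(to_tensor(sr), to_tensor(hr)).item()
    return psnr, ssim, lpips_val
```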
The scripts in Test_scripts can be used to calculate PSNR/SSIM/PI/LPIPS.
The model architectures are defined in /codes/models.
Pre-trained models can be downloaded from Baidu Netdisk (code: 723l) or Google Drive.
Train the models with:
python train.py -opt /options/train/train_FASRGAN.json
python train.py -opt /options/train/train_FsSRGAN.json
Test the models with:
python test.py -opt /options/test/test_FASRGAN.json
python test.py -opt /options/test/test_FsSRGAN.json
If you find this repository useful for your research, please cite the following.
@ARTICLE{9377002,
  author={Y. {Yan} and C. {Liu} and C. {Chen} and X. {Sun} and L. {Jin} and P. {Xinyi} and X. {Zhou}},
  journal={IEEE Transactions on Multimedia},
  title={Fine-grained Attention and Feature-sharing Generative Adversarial Networks for Single Image Super-Resolution},
  year={2021},
  volume={},
  number={},
  pages={1-1},
  doi={10.1109/TMM.2021.3065731}
}
This repository is built on the BasicSR repository.