
LIIF

This repository contains the official implementation for LIIF introduced in the following paper:

Learning Continuous Image Representation with Local Implicit Image Function
Yinbo Chen, Sifei Liu, Xiaolong Wang
CVPR 2021 (Oral)

The project page with video is at https://yinboc.github.io/liif/.

Citation

If you find our work useful in your research, please cite:

@inproceedings{chen2021learning,
  title={Learning continuous image representation with local implicit image function},
  author={Chen, Yinbo and Liu, Sifei and Wang, Xiaolong},
  booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
  pages={8628--8638},
  year={2021}
}

Environment
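The dependency list is not reproduced here. As a minimal sketch (the package names and versions below are assumptions, not an official requirements file), the code is PyTorch-based and the scripts rely on a few common utilities:

# Assumed setup; check the repository for the authoritative dependency list.
pip install torch torchvision tensorboardX pyyaml numpy tqdm imageio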

Quick Start

  1. Download a DIV2K pre-trained model.
Model                File size   Download
EDSR-baseline-LIIF   18M         Dropbox | Google Drive
RDN-LIIF             256M        Dropbox | Google Drive
  2. Convert your image to LIIF and render it at a given resolution (on GPU 0; [MODEL_PATH] denotes the downloaded .pth file):
python demo.py --input xxx.png --model [MODEL_PATH] --resolution [HEIGHT],[WIDTH] --output output.png --gpu 0
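
For example, to render an image at 1080x1920 with the EDSR-baseline model (the file names below are illustrative; substitute your own input and checkpoint paths):

python demo.py --input input.png --model edsr-baseline-liif.pth --resolution 1080,1920 --output output.png --gpu 0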

Reproducing Experiments

Data

Run mkdir load to create the folder that holds the dataset directories (see the layout sketch below).
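
The per-dataset download instructions are not reproduced here. As a sketch, the extracted dataset folders are expected to live under load/ (the folder names below are assumptions; they should match the paths referenced in the config files):

mkdir load
# place the extracted dataset folders inside, e.g. load/div2k, load/benchmark, load/celebAHQ (names assumed)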

Running the code

0. Preliminaries

1. DIV2K experiments

Train: python train_liif.py --config configs/train-div2k/train_edsr-baseline-liif.yaml (with the EDSR-baseline backbone; for RDN, replace edsr-baseline with rdn, as shown below). We use 1 GPU for training EDSR-baseline-LIIF and 4 GPUs for RDN-LIIF.
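
For example, the RDN variant of the same experiment is launched as:

python train_liif.py --config configs/train-div2k/train_rdn-liif.yaml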

Test: bash scripts/test-div2k.sh [MODEL_PATH] [GPU] for the DIV2K validation set, bash scripts/test-benchmark.sh [MODEL_PATH] [GPU] for the benchmark datasets. [MODEL_PATH] is the path to a .pth file; we use epoch-last.pth in the corresponding save folder (see the example below).
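
A concrete invocation might look like this (the save folder name is illustrative; substitute the path produced by your own training run), evaluating on the DIV2K validation set with GPU 0:

bash scripts/test-div2k.sh save/train_edsr-baseline-liif/epoch-last.pth 0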

2. celebAHQ experiments

Train: python train_liif.py --config configs/train-celebAHQ/[CONFIG_NAME].yaml.

Test: python test.py --config configs/test/test-celebAHQ-32-256.yaml --model [MODEL_PATH] (or test-celebAHQ-64-128.yaml for the other task). We use epoch-best.pth in the corresponding save folder.
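
For example, with a trained model (the bracketed save folder below is a placeholder for your own training output):

python test.py --config configs/test/test-celebAHQ-32-256.yaml --model save/[CELEBAHQ_EXP]/epoch-best.pth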