
ImageInpainting

The official implementation of the paper "Deep Generative Model for Image Inpainting with Local Binary Pattern Learning and Spatial Attention".

Table of Contents

- Background
- Dependency
- Demo
- Citation
- Acknowledgments

Background

Inpainting results generated by our proposed model. In each pair, the upper image is the input with an irregular or center mask, and the lower image is the inpainting result.

In this paper, we propose a deep generative model for image inpainting with Local Binary Pattern (LBP) learning and a new spatial attention mechanism. The proposed model consists of two networks: an LBP learning network, which learns the LBP features of the missing region, and an image inpainting network, which generates the inpainting result using the learned LBP as guidance. Furthermore, we design a new spatial attention layer and incorporate it into the image inpainting network. The proposed spatial attention strategy considers not only the dependency between the known region and the filled region, but also the dependency within the filled region.
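For readers unfamiliar with LBP: the descriptor encodes each pixel by thresholding its 8 neighbours against the centre value and packing the comparison results into an 8-bit code. Below is a minimal NumPy sketch of this classic formulation; the function name and bit ordering are illustrative assumptions and do not reproduce the exact LBP variant or the learning network used in this repository.

```python
import numpy as np

def lbp_8neighbour(gray):
    """Classic 8-neighbour LBP: each pixel becomes an 8-bit code whose
    bits record whether the corresponding neighbour is >= the centre."""
    # Pad with edge values so border pixels have a full neighbourhood.
    p = np.pad(gray, 1, mode="edge")
    h, w = gray.shape
    code = np.zeros((h, w), dtype=np.uint8)
    # Clockwise neighbour offsets, starting from the top-left pixel
    # (the starting point / direction is an arbitrary convention here).
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    for bit, (dy, dx) in enumerate(offsets):
        neighbour = p[1 + dy : 1 + dy + h, 1 + dx : 1 + dx + w]
        code |= (neighbour >= gray).astype(np.uint8) << bit
    return code

# Example: LBP codes of a random 64x64 grayscale image.
img = (np.random.rand(64, 64) * 255).astype(np.uint8)
codes = lbp_8neighbour(img)
```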

Overview of our proposed generative inpainting network.

An example of our spatial attention layer.
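To make the attention idea concrete, here is a toy, self-contained sketch in which every filled-region position attends over all positions, known and filled alike, so each filled pixel borrows from both the known region and the rest of the filled region. The actual layer in the paper operates on feature-map patches inside the inpainting network; everything below (names, cosine similarity, per-position vectors) is an illustrative assumption, not the repository's implementation.

```python
import numpy as np

def spatial_attention(features, mask, eps=1e-8):
    """Toy attention over per-position feature vectors.

    features: (N, C) float array, one C-dim feature per spatial position.
    mask:     (N,) boolean array, True where the region was missing/filled.
    """
    # Cosine similarity between every pair of positions.
    norm = features / (np.linalg.norm(features, axis=1, keepdims=True) + eps)
    sim = norm @ norm.T                          # (N, N)
    # Row-wise softmax over the positions being attended to.
    sim = np.exp(sim - sim.max(axis=1, keepdims=True))
    attn = sim / sim.sum(axis=1, keepdims=True)
    out = features.copy()
    # Attention runs over ALL positions, so each filled position uses the
    # known region and the other filled positions when it is reconstructed.
    out[mask] = attn[mask] @ features
    return out
```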

Dependency

Demo

To train or test the proposed model:

python main.py {train,test}

For example, to test the proposed model:

python main.py test

Then the model will inpaint the images in ./demo/input/ using the corresponding masks in ./demo/mask/, and save the results in the ./demo/output/ directory. The pre-trained weights should be placed in the ./weights/ directory.

Note: The pre-trained weights can be downloaded from Google Drive or Baidu Yun (Code: a6n2).
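As a quick sanity check of the layout described above, the following illustrative snippet pairs each input with its mask by filename before running the test command; the same-filename convention is an assumption and is not verified against the repository code.

```python
import os

# Paths are taken from the README; the filename-matching convention
# between input and mask is assumed for illustration only.
for name in sorted(os.listdir("./demo/input/")):
    mask = os.path.join("./demo/mask/", name)
    assert os.path.exists(mask), f"missing mask for {name}"
    print(f"{name}: input + mask found; result goes to ./demo/output/{name}")
```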

Citation

If you use this code for your research, please cite our paper:

@ARTICLE{9537606,
  author={Wu, Haiwei and Zhou, Jiantao and Li, Yuanman},
  journal={IEEE Transactions on Multimedia},
  title={Deep Generative Model for Image Inpainting with Local Binary Pattern Learning and Spatial Attention},
  year={2021},
  volume={},
  number={},
  pages={1-1},
  doi={10.1109/TMM.2021.3111491}}

Acknowledgments

Our code is based on Shift-Net.