
Neural-Style-MMD

This repository holds the MXNet code for the paper

Demystifying Neural Style Transfer, Yanghao Li, Naiyan Wang, Jiaying Liu, and Xiaodi Hou, International Joint Conference on Artificial Intelligence (IJCAI), 2017

[arXiv Preprint]

Introduction

Neural-Style-MMD implements neural style transfer based on a new interpretation of the style loss. Instead of matching Gram matrices as in the original neural style transfer method, this repo provides two alternative style losses: a Maximum Mean Discrepancy (MMD) loss and a Batch Normalization (BN) statistics loss. The paper also shows that matching Gram matrices is equivalent to minimizing an MMD with a specific polynomial kernel; details can be found in the paper. Our implementation is based on the neural-style example of MXNet.
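
To make the equivalence concrete, the snippet below (an illustration, not code from this repo) checks numerically with NumPy that the squared Frobenius distance between two Gram matrices equals a scaled, biased squared-MMD estimate with the second-order polynomial kernel k(x, y) = (x^T y)^2, where the MMD samples are the per-position feature columns. The matrices F and S are random stand-ins for feature maps.

```python
import numpy as np

# Numerical check of the Gram/MMD equivalence (illustration only, not part
# of the repo): ||G_F - G_S||_F^2 equals M^2 times the biased squared-MMD
# estimate with kernel k(x, y) = (x^T y)^2, where the samples are the
# per-position feature vectors (columns of F and S).
rng = np.random.default_rng(0)
N, M = 64, 256                       # N feature channels, M spatial positions
F = rng.standard_normal((N, M))      # hypothetical features of the generated image
S = rng.standard_normal((N, M))      # hypothetical features of the style image

gram_loss = np.sum((F @ F.T - S @ S.T) ** 2)        # ||G_F - G_S||_F^2

# M^2 * MMD^2 with kernel (x^T y)^2: within-set kernel sums minus twice the
# cross-set kernel sum.
mmd2_scaled = (np.sum((F.T @ F) ** 2)
               - 2 * np.sum((F.T @ S) ** 2)
               + np.sum((S.T @ S) ** 2))

print(np.allclose(gram_loss, mmd2_scaled))          # True
```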

Prerequisites

Before running this code, you should make the following preparations:

Usage

Basic Usage:

python neural-style.py --mmd-kernel linear --gpu 0 --style-weight 5.0 --content-image input/brad_pitt.jpg --style-image input/starry_night.jpg --output brad_pitt-starry_night --output-folder output_images

We support four single transfer methods: three MMD kernels (linear, poly, and Gaussian) and a BN statistics matching method. The code also supports fusing different transfer methods with specified weights.
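
For intuition, here is a minimal NumPy sketch of a BN-statistics matching loss; the function name and exact normalization are assumptions for illustration, not the repo's implementation.

```python
import numpy as np

# Minimal sketch (an assumption for illustration; the exact normalization in
# the paper and repo may differ): match the per-channel mean and variance of
# the generated features to those of the style features, instead of matching
# full Gram matrices.
def bn_style_loss(F, S):
    """F, S: (channels, positions) feature matrices of generated / style images."""
    mu_f, mu_s = F.mean(axis=1), S.mean(axis=1)
    var_f, var_s = F.var(axis=1), S.var(axis=1)
    return np.mean((mu_f - mu_s) ** 2 + (var_f - var_s) ** 2)
```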

Options

Run python neural-style.py -h to see the full list of options.