
GAN Slimming: All-in-One GAN Compression by A Unified Optimization Framework #71

AkiraTOSEI opened this issue 4 years ago

TL;DR

Research on reducing the size of GAN generators by combining distillation, quantization, and channel pruning in a single optimization framework. Quantization is normally non-differentiable, but the authors use a pseudo-gradient for the quantizer so the whole pipeline can be trained end-to-end. They reduce an existing model to 1/47th of its original size.
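A minimal sketch of the pseudo-gradient idea mentioned above: the forward pass rounds weights to a discrete grid, while the backward pass pretends rounding is the identity (a straight-through-style estimator) so gradients can flow. This is an illustrative toy in NumPy, not the paper's implementation; `quantize`, `ste_backward`, and all values here are assumptions for demonstration.

```python
import numpy as np

def quantize(w, num_bits=8):
    # Uniform quantization: clip to [-1, 1], then round to
    # 2**num_bits symmetric levels. Rounding has zero gradient
    # almost everywhere, so it cannot be trained directly.
    scale = (2 ** (num_bits - 1)) - 1
    return np.round(np.clip(w, -1.0, 1.0) * scale) / scale

def ste_backward(grad_output):
    # Pseudo-gradient: treat the rounding step as the identity,
    # so the upstream gradient passes through to the real weights.
    return grad_output

# Toy training step: loss = mean((quantize(w) - target)**2)
w = np.array([0.31, -0.72, 0.05])
target = np.array([0.3, -0.7, 0.1])

q = quantize(w)
grad_q = 2.0 * (q - target) / w.size  # d loss / d quantize(w)
grad_w = ste_backward(grad_q)         # pseudo-gradient w.r.t. w
w = w - 0.5 * grad_w                  # gradient descent update
```

Each weight moves toward its target even though the loss is computed on the quantized values, which is exactly what the pseudo-gradient enables.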

Why it matters

Paper URL

https://arxiv.org/abs/2008.11062

Submission Date (yyyy/mm/dd)

2020/08/25

Authors and institutions

Haotao Wang, Shupeng Gui, Haichuan Yang, Ji Liu, Zhangyang Wang

Methods

Results

Comments

ECCV 2020