# TinyGAN

**Keywords:** BigGAN; Knowledge Distillation; Black-Box; Fast Training; 16× compression

**Requirements:** Python 3.7, PyTorch 1.2.0

This repository contains the official PyTorch implementation of the following paper:

**TinyGAN: Distilling BigGAN for Conditional Image Generation** (ACCV 2020)
Ting-Yun Chang and Chi-Jen Lu

Paper: https://arxiv.org/abs/2009.13829

Video: https://www.youtube.com/watch?v=EsUxQT1su6s

Abstract: Generative Adversarial Networks (GANs) have become a powerful approach for generative image modeling. However, GANs are notorious for their training instability, especially on large-scale, complex datasets. While the recent work of BigGAN has significantly improved the quality of image generation on ImageNet, it requires a huge model, making it hard to deploy on resource-constrained devices. To reduce the model size, we propose a black-box knowledge distillation framework for compressing GANs, which highlights a stable and efficient training process. Given BigGAN as the teacher network, we manage to train a much smaller student network to mimic its functionality, achieving competitive performance on Inception and FID scores but with the generator having 16 times fewer parameters.
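In the black-box setting described above, the student never sees the teacher's weights or gradients; it only observes the teacher's outputs for sampled inputs. A minimal NumPy sketch of the core idea, using a pixel-level L1 loss between the two generators' outputs for the same latent code and class label (the function and variable names are illustrative, and the actual method also combines adversarial and feature-level losses):

```python
import numpy as np

def pixel_distillation_loss(student_img, teacher_img):
    """Mean L1 distance between student and teacher images for the same (z, y)."""
    return np.mean(np.abs(student_img - teacher_img))

# Black-box setup: sample a latent z and class label y, query the teacher
# (e.g. BigGAN) once to obtain a target image, then train the much smaller
# student to reproduce that image.
rng = np.random.default_rng(0)
z = rng.standard_normal(128)                 # shared latent code
y = 207                                      # an ImageNet class index
teacher_img = rng.random((3, 128, 128))      # stand-in for the teacher's output on (z, y)
student_img = teacher_img + 0.1              # stand-in for the student's output

loss = pixel_distillation_loss(student_img, teacher_img)  # ~0.1 here
```

Because only (input, output) pairs are needed, the teacher can be queried offline to build a fixed training set, which is part of what makes the training process stable and cheap.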

The trained model (73 MB) is in `gan/models` and can be downloaded directly from GitHub.

## Training

```
$ bash train.sh
```

## Evaluation

```
$ bash eval.sh
```


## Citation

```
@InProceedings{Chang_2020_ACCV,
    author    = {Chang, Ting-Yun and Lu, Chi-Jen},
    title     = {TinyGAN: Distilling BigGAN for Conditional Image Generation},
    booktitle = {Proceedings of the Asian Conference on Computer Vision (ACCV)},
    month     = {November},
    year      = {2020}
}
```