Reimplementation of the Paper: Large Scale GAN Training for High Fidelity Natural Image Synthesis
A simple reimplementation of the great paper (BigGAN) *Large Scale GAN Training for High Fidelity Natural Image Synthesis*, which can generate very realistic images. However, due to my limited hardware 😭, I only train on 32x32 images from CIFAR-10 and 64x64 images from ImageNet64. Be warned that training is quite slow.
- Image 32x32: CIFAR-10: http://www.cs.toronto.edu/~kriz/cifar-10-matlab.tar.gz
- Image 64x64: ImageNet64: https://drive.google.com/open?id=1uN9O69eeqJEPV797d05ZuUmJ23kGVtfU

Download the datasets and put them into the folder `dataset`.
Training iterations: 100,000. Truncation threshold: 1.0.
| | Discriminator | Generator |
| --- | --- | --- |
| Update steps | 2 | 1 |
| Learning rate | 4e-4 | 1e-4 |
| Orthogonal reg | ✔️ | ✔️ |
| Orthogonal init | ✔️ | ✔️ |
| Hierarchical latent | ❌ | ✔️ |
| Projection batchnorm | ✔️ | ❌ |
| Truncation threshold | ❌ | ✔️ |
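The orthogonal regularization applied to both networks above can be sketched as follows. This is not the repo's actual code, just a minimal NumPy illustration of the BigGAN-style penalty, which punishes off-diagonal entries of the Gram matrix W·Wᵀ so filters decorrelate without being forced to unit norm:

```python
import numpy as np

def ortho_reg(weight, beta=1e-4):
    """BigGAN-style orthogonal regularization:
    beta * || (W W^T) * (1 - I) ||_F^2.
    Only off-diagonal filter correlations are penalized, so
    filter norms themselves are left unconstrained."""
    w = weight.reshape(weight.shape[0], -1)   # flatten each filter to a row
    gram = w @ w.T                            # pairwise filter correlations
    off_diag = gram * (1.0 - np.eye(gram.shape[0]))
    return beta * np.sum(off_diag ** 2)
```

In training, this term is simply added to each network's loss before backpropagation; `beta` here mirrors the paper's default of 1e-4.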
Truncation threshold = 1.0: a little mode collapse (the truncation threshold is too small).

Truncation threshold = 2.0.
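The truncation trick behind the two settings above can be sketched as below. This is an illustrative NumPy version (not the repo's code): latent entries are resampled until they fall within the threshold, so a smaller threshold trades sample diversity (hence the mode collapse at 1.0) for per-sample fidelity:

```python
import numpy as np

def truncated_noise(batch, dim, threshold, rng=None):
    """Sample z ~ N(0, I) and resample any entry with |z_i| > threshold.
    A small threshold concentrates z near the mode of the prior:
    higher fidelity, lower diversity."""
    rng = np.random.default_rng() if rng is None else rng
    z = rng.standard_normal((batch, dim))
    mask = np.abs(z) > threshold
    while mask.any():                          # resample out-of-range entries
        z[mask] = rng.standard_normal(int(mask.sum()))
        mask = np.abs(z) > threshold
    return z
```

At inference time the generator is simply fed `truncated_noise(...)` instead of raw Gaussian noise; the threshold is only applied at test time, not during training.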
| car2plane | ship2horse | cat2bird |
| --- | --- | --- |
Training iterations: 100,000.
| | Discriminator | Generator |
| --- | --- | --- |
| Update steps | 2 | 1 |
| Learning rate | 4e-4 | 1e-4 |
| Orthogonal reg | ✔️ | ✔️ |
| Orthogonal init | ✔️ | ✔️ |
| Hierarchical latent | ❌ | ✔️ |
| Projection batchnorm | ✔️ | ❌ |
| Truncation threshold | ❌ | ✔️ |
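The hierarchical latent input used by the generator (per the table above) can be sketched as follows. This is a hypothetical NumPy helper, not the repo's code: the latent vector is chunked, with the first slice feeding the initial linear layer and each remaining slice conditioning one generator block (in BigGAN, concatenated with the class embedding for conditional BatchNorm):

```python
import numpy as np

def split_latent(z, num_blocks):
    """Split z of shape (batch, dim) into num_blocks + 1 chunks:
    the first chunk goes to the generator's input linear layer,
    the rest condition the generator blocks one chunk each."""
    chunks = np.array_split(z, num_blocks + 1, axis=1)
    return chunks[0], chunks[1:]
```

With the paper's 120-dimensional latent and five generator blocks, each block receives a 20-dimensional slice.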