Progressive Growing of GANs for Improved Quality, Stability, and Variation
Submission video of our paper, published at ICLR 2018. Please see the final version at https://youtu.be/G06dEcZ-QTg
Authors:
Tero Karras (NVIDIA)
Timo Aila (NVIDIA)
Samuli Laine (NVIDIA)
Jaakko Lehtinen (NVIDIA and Aalto University)
For business inquiries, please contact researchinquiries@nvidia.com
For press and other inquiries, please contact Hector Marinez at hmarinez@nvidia.com
Links:
http://arxiv.org/abs/1710.10196
https://github.com/tkarras/progressive_growing_of_gans
Abstract:
We describe a new training methodology for generative adversarial networks. The key idea is to grow both the generator and discriminator progressively: starting from a low resolution, we add new layers that model increasingly fine details as training progresses. This both speeds the training up and greatly stabilizes it, allowing us to produce images of unprecedented quality, e.g., CelebA images at 1024². We also propose a simple way to increase the variation in generated images, and achieve a record inception score of 8.80 in unsupervised CIFAR10. Additionally, we describe several implementation details that are important for discouraging unhealthy competition between the generator and discriminator. Finally, we suggest a new metric for evaluating GAN results, both in terms of image quality and variation. As an additional contribution, we construct a higher-quality version of the CelebA dataset.
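The progressive-growing idea in the abstract hinges on smoothly fading in each newly added, higher-resolution layer rather than switching to it abruptly. A minimal sketch of that fade-in blend (an illustrative NumPy version, not the paper's actual implementation; `nearest_upsample` and `fade_in` are hypothetical helper names):

```python
import numpy as np

def nearest_upsample(x):
    # Double spatial resolution by nearest-neighbor upsampling: (H, W, C) -> (2H, 2W, C).
    return x.repeat(2, axis=0).repeat(2, axis=1)

def fade_in(low_res_out, high_res_out, alpha):
    """Blend the upsampled output of the previous (lower-resolution) stage
    with the output of the newly added (higher-resolution) layer.

    alpha ramps linearly from 0 to 1 over the transition phase, so the new
    layer starts as an identity-like pass-through and gradually takes over.
    """
    return (1.0 - alpha) * nearest_upsample(low_res_out) + alpha * high_res_out

# Example: growing from a 4x4 stage to an 8x8 stage, halfway through the fade.
low = np.ones((4, 4, 3))    # output of the existing 4x4 stage
high = np.zeros((8, 8, 3))  # output of the freshly added 8x8 layer
blended = fade_in(low, high, alpha=0.5)
```

The same blending is applied mirror-wise in the discriminator, so both networks transition between resolutions without destabilizing training.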