
Collections of Image Generation Models

python pytorch pytorch-lightning hydra

An easily scalable, hierarchical framework covering many image generation methods and datasets.



Highlights 💡:

  • Various types of image generation methods (continuously updated):

    • GANs: WGAN, InfoGAN, BiGAN

    • VAEs: VQ-VAE, Beta-VAE, FactorVAE

    • Autoregressive Models: PixelCNN

    • Diffusion Models: DDPM

  • Decomposition of model training, datasets and networks (a minimal run.py sketch of this composition follows the list):

    python run.py model=wgan networks=conv64 datamodule=celeba exp_name=wgan/celeba_conv64
  • Hierarchical configuration of experiments in YAML files

    • Manually edit configs in configs/model, configs/datamodule and configs/networks

    • Run predefined experiments in configs/experiment

      python run.py experiment=vanilla_gan/cifar10
    • Override hyperparameters from command line

      python run.py experiment=vanilla_gan/cifar10 model.lrG=1e-3 model.lrD=1e-3 exp_name=vanilla_gan/custom_lr
  • Run multiple experiments at the same time:

    • Grid search of hyperparameters:

      python run.py -m experiment=vae/mnist_conv model.lr=1e-3,5e-4,1e-4 "exp_name=vae/lr_${model.lr}"
    • Run multiple experiments from config files:

      python run.py -m experiment=vae/mnist_conv,vae/cifar10,vae/celeba
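
This decomposition follows standard Hydra + PyTorch Lightning usage. Purely as an illustration (not the repository's actual run.py), a minimal entry point along these lines might look like the sketch below; the config groups model, datamodule and networks and the exp_name key mirror the overrides shown above, while the trainer settings and the model's networks argument are assumptions.

```python
# Minimal sketch of a Hydra + PyTorch Lightning entry point (illustrative only;
# the real run.py in this repository may be organized differently).
import hydra
import pytorch_lightning as pl
from omegaconf import DictConfig
from pytorch_lightning.loggers import TensorBoardLogger


@hydra.main(config_path="configs", config_name="config")
def main(cfg: DictConfig) -> None:
    # Build the datamodule and model from their config groups, e.g.
    # configs/datamodule/celeba.yaml and configs/model/wgan.yaml.
    datamodule = hydra.utils.instantiate(cfg.datamodule)
    model = hydra.utils.instantiate(cfg.model, networks=cfg.networks)  # assumed signature

    # exp_name is assumed here to only control where logs and checkpoints go.
    logger = TensorBoardLogger(save_dir="logs", name=cfg.exp_name)
    trainer = pl.Trainer(max_epochs=cfg.get("max_epochs", 100), logger=logger)
    trainer.fit(model, datamodule=datamodule)


if __name__ == "__main__":
    main()
```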

Setup

  • Clone this repo

    git clone https://github.com/Victarry/Image-Generation-models.git
  • Create new python environment using conda and install requirements

    conda create -n image-generation python=3.10
    conda activate image-generation
    pip install -r requirements.txt
  • Run your first experiment ✔️

    python run.py experiment=vae/mnist_conv

For different datasets, refer to the datasets documentation.

Project Structure

Generative Adversarial Networks (GANs)

GAN

Generative Adversarial Nets.
Ian J. Goodfellow, Jean Pouget-Abadie, Mehdi Mirza, Bing Xu, David Warde-Farley, Sherjil Ozair, Aaron Courville, Yoshua Bengio.
NeurIPS 2014. [PDF] [Tutorial]

Datasets: MNIST, CelebA, CIFAR10

Results: sample images (mnist_mlp, celeba_conv, cifar10_conv)
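
For orientation, a minimal sketch of the non-saturating GAN objective from the paper, written in plain PyTorch; it is illustrative only (not this repository's training step), and G, D, real and latent_dim are assumed to be supplied by the caller.

```python
import torch
import torch.nn.functional as F

def gan_losses(G, D, real, latent_dim):
    """Sketch of the non-saturating GAN objectives; G, D and real are assumed."""
    z = torch.randn(real.size(0), latent_dim, device=real.device)
    fake = G(z)

    real_logits = D(real)
    fake_logits = D(fake.detach())
    # Discriminator: push real logits toward 1 and fake logits toward 0.
    d_loss = (
        F.binary_cross_entropy_with_logits(real_logits, torch.ones_like(real_logits))
        + F.binary_cross_entropy_with_logits(fake_logits, torch.zeros_like(fake_logits))
    )
    # Generator (non-saturating form): push D to classify fakes as real.
    gen_logits = D(fake)
    g_loss = F.binary_cross_entropy_with_logits(gen_logits, torch.ones_like(gen_logits))
    return d_loss, g_loss
```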

LSGAN

Least Squares Generative Adversarial Networks.
Xudong Mao, Qing Li, Haoran Xie, Raymond Y.K. Lau, Zhen Wang, Stephen Paul Smolley.
ICCV 2017. [PDF]

Datasets: MNIST, CelebA, CIFAR10

Results: sample images (mnist_mlp, celeba_conv, cifar10_conv)

WGAN

Wasserstein GAN
Martin Arjovsky, Soumith Chintala, Léon Bottou.
ICML 2017. [PDF]

Datasets: MNIST, CelebA, CIFAR10

Results: sample images (mnist_mlp, celeba_conv, cifar10_conv)
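
The key change relative to the vanilla GAN is the critic objective and the Lipschitz constraint enforced by weight clipping. A hedged sketch (not the repository's implementation; G, D, real and latent_dim are assumed):

```python
import torch

def wgan_losses(G, D, real, latent_dim):
    """Sketch of the WGAN critic and generator objectives (both to be minimized)."""
    z = torch.randn(real.size(0), latent_dim, device=real.device)
    fake = G(z)
    # Critic loss is the negated Wasserstein estimate D(real) - D(fake).
    critic_loss = D(fake.detach()).mean() - D(real).mean()
    # Generator tries to maximize the critic score of its samples.
    gen_loss = -D(fake).mean()
    return critic_loss, gen_loss

def clip_critic_weights(D, clip_value=0.01):
    # Weight clipping is WGAN's way of enforcing the Lipschitz constraint;
    # it is applied after every critic update.
    for p in D.parameters():
        p.data.clamp_(-clip_value, clip_value)
```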

WGAN-GP

Improved Training of Wasserstein GANs
Ishaan Gulrajani, Faruk Ahmed, Martin Arjovsky, Vincent Dumoulin, Aaron Courville
NeurIPS 2017. [PDF]

Datasets: MNIST, CelebA, CIFAR10

Results: sample images (mnist_mlp, celeba_conv, cifar10_conv)
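
WGAN-GP replaces weight clipping with a gradient penalty on interpolates between real and generated samples. A minimal sketch, assuming 4D (NCHW) image batches and a critic D:

```python
import torch

def gradient_penalty(D, real, fake, lambda_gp=10.0):
    """Sketch of the WGAN-GP term: lambda * (||grad_x D(x_hat)|| - 1)^2 on interpolates."""
    eps = torch.rand(real.size(0), 1, 1, 1, device=real.device)  # assumes NCHW images
    x_hat = (eps * real + (1.0 - eps) * fake).requires_grad_(True)
    d_hat = D(x_hat)
    grads = torch.autograd.grad(
        outputs=d_hat,
        inputs=x_hat,
        grad_outputs=torch.ones_like(d_hat),
        create_graph=True,
        retain_graph=True,
    )[0]
    grad_norm = grads.flatten(start_dim=1).norm(2, dim=1)
    return lambda_gp * ((grad_norm - 1.0) ** 2).mean()
```

The penalty is added to the critic loss from the WGAN sketch above.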

VAE-GAN

Autoencoding beyond pixels using a learned similarity metric.
Anders Boesen Lindbo Larsen, Søren Kaae Sønderby, Hugo Larochelle, Ole Winther.
ICML 2016. [PDF]

Datasets: MNIST, CelebA, CIFAR10

Results: sample images (mnist_mlp, celeba_conv, cifar10_conv)

BiGAN/ALI

BiGAN: Adversarial Feature Learning
Jeff Donahue, Philipp Krähenbühl, Trevor Darrell.
ICLR 2017. [PDF]

ALI: Adversarial Learned Inference
Vincent Dumoulin, Ishmael Belghazi, Ben Poole, Olivier Mastropietro, Alex Lamb, Martin Arjovsky, Aaron Courville
ICLR 2017. [PDF]

Datasets: MNIST, CelebA, CIFAR10

Results: sample images (mnist_mlp, celeba_conv, cifar10_conv)

GGAN

Geometric GAN
Jae Hyun Lim, Jong Chul Ye.
arXiv 2017. [PDF]

Dataset MNIST CelebA CIFAR10

Results

mnist_mlp

cleba_conv

cifar10_conv

InfoGAN

InfoGAN: Interpretable Representation Learning by Information Maximizing Generative Adversarial Nets
Xi Chen, Yan Duan, Rein Houthooft, John Schulman, Ilya Sutskever, Pieter Abbeel
NeurIPS 2016. [PDF]

Manipulated latents: random samples, discrete latent (class label), continuous latent-1 (rotation), continuous latent-2 (thickness)

Results: MNIST sample images for each manipulated latent
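
InfoGAN adds an auxiliary Q head that reconstructs the latent codes fed to the generator, maximizing a variational lower bound on their mutual information with the samples. A hedged sketch of that extra term, with the Q-head outputs (q_logits, q_cont_mean) assumed:

```python
import torch
import torch.nn.functional as F

def info_loss(q_logits, q_cont_mean, c_disc, c_cont, lambda_info=1.0):
    """Sketch of InfoGAN's mutual-information term for a batch of generated samples.

    q_logits:    Q-head logits over the discrete code, shape (B, n_classes) -- assumed
    q_cont_mean: Q-head estimate of the continuous codes, shape (B, n_cont) -- assumed
    c_disc:      discrete code indices fed to G, shape (B,)
    c_cont:      continuous codes fed to G, shape (B, n_cont)
    """
    # Categorical code: cross-entropy between Q's prediction and the sampled label.
    disc_term = F.cross_entropy(q_logits, c_disc)
    # Continuous codes: a fixed-variance Gaussian log-likelihood reduces to MSE
    # up to an additive constant.
    cont_term = F.mse_loss(q_cont_mean, c_cont)
    return lambda_info * (disc_term + cont_term)
```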

Variational Autoencoders (VAEs)

VAE

Auto-Encoding Variational Bayes.
Diederik P. Kingma, Max Welling.
ICLR 2014. [PDF]

Datasets: MNIST, CelebA, CIFAR10

Results: sample images (mnist_mlp, celeba_conv, cifar10_conv)
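
A minimal sketch of the negative ELBO with the reparameterization trick, assuming a Gaussian encoder returning (mu, logvar) and a decoder producing pixel logits (illustrative, not the repository's loss):

```python
import torch
import torch.nn.functional as F

def vae_loss(encoder, decoder, x):
    """Sketch of the negative ELBO: reconstruction term + KL to a standard normal prior."""
    mu, logvar = encoder(x)  # assumed encoder interface
    # Reparameterization trick: z = mu + sigma * eps keeps sampling differentiable.
    z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)

    # Bernoulli likelihood on pixels (decoder outputs logits), averaged over the batch.
    recon = F.binary_cross_entropy_with_logits(decoder(z), x, reduction="sum") / x.size(0)
    # Closed-form KL(q(z|x) || N(0, I)).
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp()) / x.size(0)
    return recon + kl
```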

cVAE

Learning Structured Output Representation using Deep Conditional Generative Models
Kihyuk Sohn, Honglak Lee, Xinchen Yan.
NeurIPS 2015. [PDF]

Datasets: MNIST, CIFAR10

Results: sample images (mnist_mlp, cifar10_conv)

Beta-VAE

beta-VAE: Learning Basic Visual Concepts with a Constrained Variational Framework
Irina Higgins, Loic Matthey, Arka Pal, Christopher Burgess, Xavier Glorot, Matthew Botvinick, Shakir Mohamed, Alexander Lerchner.
ICLR 2017. [PDF]

Datasets: CelebA, dSprites

Results: samples and latent-interpolation images for each dataset
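
Beta-VAE keeps the VAE objective but scales the KL term by a factor beta > 1 to trade reconstruction quality for disentanglement; in terms of the VAE sketch above, the only change is the weighting:

```python
def beta_vae_loss(recon, kl, beta=4.0):
    # Same two terms as the VAE sketch above, with the KL term weighted by beta > 1.
    return recon + beta * kl
```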

Factor-VAE

Disentangling by Factorising
Hyunjik Kim, Andriy Mnih.
ICML 2018. [PDF]

Datasets: CelebA, dSprites

Results: samples and latent-interpolation images for each dataset

AAE

Adversarial Autoencoders.
Alireza Makhzani, Jonathon Shlens, Navdeep Jaitly, Ian Goodfellow, Brendan Frey.
arXiv 2015. [PDF]

Datasets: MNIST, CelebA, CIFAR10

Results: sample images (mnist_mlp, celeba_conv, cifar10_conv)

AGE

AGE Adversarial Generator-Encoder Networks.
Dmitry Ulyanov, Andrea Vedaldi, Victor Lempitsky.
AAAI 2018. [PDF]

Datasets: MNIST, CelebA, CIFAR10

Results: MNIST sample images (mnist_mlp); CelebA and CIFAR10 are TODO

VQ-VAE

Neural Discrete Representation Learning.
Aaron van den Oord, Oriol Vinyals, Koray Kavukcuoglu
NeurIPS 2017. [PDF]

Datasets: MNIST, CelebA, CIFAR10

Results: ground-truth and reconstruction images for each dataset (mnist_mlp, celeba_conv, cifar10_conv)
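
The core of VQ-VAE is nearest-neighbour quantization against a learned codebook, with a straight-through estimator so gradients reach the encoder. A minimal sketch (illustrative; the codebook size, code dimension and beta are assumed defaults):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class VectorQuantizer(nn.Module):
    """Sketch of VQ-VAE quantization with codebook/commitment losses and a straight-through gradient."""

    def __init__(self, num_codes=512, code_dim=64, beta=0.25):
        super().__init__()
        self.codebook = nn.Embedding(num_codes, code_dim)
        self.codebook.weight.data.uniform_(-1.0 / num_codes, 1.0 / num_codes)
        self.beta = beta

    def forward(self, z_e):
        # z_e: encoder output with the code dimension last, e.g. (B, H, W, code_dim).
        flat = z_e.reshape(-1, self.codebook.weight.size(1))
        indices = torch.cdist(flat, self.codebook.weight).argmin(dim=-1)
        z_q = self.codebook(indices).view_as(z_e)

        # Codebook loss pulls the codes toward encoder outputs; the commitment
        # loss keeps the encoder close to its assigned codes.
        codebook_loss = F.mse_loss(z_q, z_e.detach())
        commitment_loss = self.beta * F.mse_loss(z_e, z_q.detach())

        # Straight-through estimator: copy decoder gradients straight to the encoder.
        z_q = z_e + (z_q - z_e).detach()
        return z_q, indices, codebook_loss + commitment_loss
```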

Autoregressive Models

MADE: Masked Autoencoder for Distribution Estimation

MADE: Masked Autoencoder for Distribution Estimation
Mathieu Germain, Karol Gregor, Iain Murray, Hugo Larochelle
ICML 2015. [PDF]

Dataset: MNIST

Samples: mnist_mlp

PixelCNN

Conditional Image Generation with PixelCNN Decoders
Aaron van den Oord, Nal Kalchbrenner, Oriol Vinyals, Lasse Espeholt, Alex Graves, Koray Kavukcuoglu
NeurIPS 2016. [PDF]

Dataset: MNIST

Results: unconditional samples and class-conditional samples (mnist_mlp)
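
PixelCNN's autoregressive factorization is implemented with masked convolutions: each output pixel may only depend on pixels above it and to its left. A minimal sketch of such a layer (illustrative, single-channel masking only; the repository's layer may differ):

```python
import torch
import torch.nn as nn

class MaskedConv2d(nn.Conv2d):
    """Sketch of the masked convolution at the core of PixelCNN.

    Mask type 'A' (first layer) also hides the centre pixel; type 'B' keeps it.
    """

    def __init__(self, mask_type, *args, **kwargs):
        super().__init__(*args, **kwargs)
        assert mask_type in ("A", "B")
        mask = torch.ones_like(self.weight)
        _, _, kh, kw = self.weight.shape
        # Zero out weights at and after the centre of the middle row, and all rows
        # below it, so each output depends only on previously generated pixels.
        mask[:, :, kh // 2, kw // 2 + (mask_type == "B"):] = 0
        mask[:, :, kh // 2 + 1:] = 0
        self.register_buffer("mask", mask)

    def forward(self, x):
        self.weight.data *= self.mask
        return super().forward(x)


# Example: first layer of a PixelCNN for 1-channel MNIST images.
first = MaskedConv2d("A", in_channels=1, out_channels=64, kernel_size=7, padding=3)
```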

Transformer

Vanilla Transformer-based autoregressive model.

Dataset: MNIST

Results: unconditional samples and class-conditional samples (mnist_mlp)

Run the unconditional and class-conditional experiments with the commands below.

python run.py experiment=tar/mnist

python run.py experiment=tar/mnist_cond

Diffusion Models

DDPM

Denoising Diffusion Probabilistic Models
Jonathan Ho, Ajay Jain, Pieter Abbeel
NeurIPS 2020. [PDF]

Datasets: MNIST, CelebA, CIFAR10

Results: sample images (mnist_mlp, celeba_conv, cifar10_conv)
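
DDPM trains a noise-prediction network on the closed-form forward process x_t = sqrt(ᾱ_t) x_0 + sqrt(1 − ᾱ_t) ε. A minimal sketch of the simplified training objective, assuming an eps_model(x_t, t) network, NCHW image batches and a (T,) beta schedule:

```python
import torch
import torch.nn.functional as F

def ddpm_training_loss(eps_model, x0, betas):
    """Sketch of the simplified DDPM loss: predict the noise added at a random timestep."""
    alphas_bar = torch.cumprod(1.0 - betas, dim=0)                      # \bar{alpha}_t
    t = torch.randint(0, betas.size(0), (x0.size(0),), device=x0.device)
    noise = torch.randn_like(x0)

    # Closed-form forward (noising) process at step t.
    abar_t = alphas_bar[t].view(-1, 1, 1, 1)                            # assumes NCHW images
    x_t = abar_t.sqrt() * x0 + (1.0 - abar_t).sqrt() * noise

    # Simplified objective from the paper: MSE between true and predicted noise.
    return F.mse_loss(eps_model(x_t, t), noise)
```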
