A collection of different latent variable and generative models. This repository contains a Vanilla AE, a Convolutional AE, and two popular families of generative models: VAEs and GANs.
Variational Auto-Encoder (VAE)
In simple words, a variational autoencoder is a kind of 'approximate density estimation' method that tries to transform the complex, high-dimensional input distribution into a tractable, known distribution. The basic assumption of the VAE is that everything is Gaussian: the encoder maps every input to a normal distribution, and a random sample drawn from that distribution is passed through the decoder network. To generate new samples we only need to sample from the normal distribution and pass the sample through the decoder.
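As a concrete illustration, here is a minimal PyTorch sketch of a VAE with the reparameterization trick. The fully connected architecture, layer sizes, and loss below are illustrative assumptions and do not reproduce this repository's implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class VAE(nn.Module):
    def __init__(self, in_dim=784, hidden=400, z_dim=20):
        super().__init__()
        self.enc = nn.Linear(in_dim, hidden)
        self.mu = nn.Linear(hidden, z_dim)       # mean of q(z|x)
        self.logvar = nn.Linear(hidden, z_dim)   # log-variance of q(z|x)
        self.dec1 = nn.Linear(z_dim, hidden)
        self.dec2 = nn.Linear(hidden, in_dim)

    def encode(self, x):
        h = F.relu(self.enc(x))
        return self.mu(h), self.logvar(h)

    def reparameterize(self, mu, logvar):
        # z = mu + sigma * eps, so gradients can flow through the sampling step
        std = torch.exp(0.5 * logvar)
        return mu + std * torch.randn_like(std)

    def decode(self, z):
        return torch.sigmoid(self.dec2(F.relu(self.dec1(z))))

    def forward(self, x):
        mu, logvar = self.encode(x)
        z = self.reparameterize(mu, logvar)
        return self.decode(z), mu, logvar

def vae_loss(recon, x, mu, logvar):
    # reconstruction term + KL divergence between q(z|x) and the standard normal prior
    bce = F.binary_cross_entropy(recon, x, reduction='sum')
    kld = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return bce + kld

# Generating new samples only requires the decoder:
# z = torch.randn(16, 20); samples = model.decode(z)
```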
Generative Adversarial Network (GAN)
Unlike the method introduced above, in this type of model we actually want to learn the complex, high-dimensional data distribution directly, but there is no straightforward way to do that; instead, we learn a transformation from random noise to the data distribution. The model consists of two networks, a Generator and a Discriminator, and the objective is formed as a 'minimax' game in which each network tries to fool the other and improve itself.
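Below is a minimal PyTorch sketch of this minimax training loop with one generator and one discriminator; the architectures, optimizers, and hyperparameters are illustrative assumptions rather than this repository's code.

```python
import torch
import torch.nn as nn

z_dim, data_dim = 64, 784
G = nn.Sequential(nn.Linear(z_dim, 256), nn.ReLU(), nn.Linear(256, data_dim), nn.Tanh())
D = nn.Sequential(nn.Linear(data_dim, 256), nn.LeakyReLU(0.2), nn.Linear(256, 1), nn.Sigmoid())

opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCELoss()

def train_step(real):
    batch = real.size(0)
    ones, zeros = torch.ones(batch, 1), torch.zeros(batch, 1)

    # Discriminator step: real samples should score 1, generated samples 0
    fake = G(torch.randn(batch, z_dim))
    d_loss = bce(D(real), ones) + bce(D(fake.detach()), zeros)
    opt_d.zero_grad()
    d_loss.backward()
    opt_d.step()

    # Generator step: try to make the discriminator score fakes as real
    g_loss = bce(D(fake), ones)
    opt_g.zero_grad()
    g_loss.backward()
    opt_g.step()
    return d_loss.item(), g_loss.item()
```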
Some intuitive resources on VAEs and GANs:
- Generative Models - Stanford
- GAN - Ali Ghodsi
- Understanding Generative Adversarial Networks
- VAE - Ali Ghodsi
- Understanding Variational Autoencoders
- Python = 3.7
- PyTorch = 1.9
git clone https://github.com/Hojjat-Mokhtarabadi/Latent-variable-and-Generative-models.git
cd Latent-variable-and-Generative-models
python3 -m pip install -r requirements.txt
--config: the configuration file, chosen from src/configs
--eval: evaluate a pretrained model
--family: the model family, either gan or ae
Note: the model family and configuration file should be specified in run.sh
cd src
bash ../run.sh