Official implementation of Unconstrained Monotonic Neural Networks (UMNN) and the experiments presented in the paper:
Antoine Wehenkel and Gilles Louppe. "Unconstrained Monotonic Neural Networks." (2019). [arxiv]
The code has been tested with PyTorch 1.1 and Python 3.6. Some of the code for drawing figures and loading datasets is taken from FFJORD and Sylvester normalizing flows for variational inference.
```
python ToyExperiments.py
```
See `ToyExperiments.py` for optional arguments.
```
python MNISTExperiment.py
```
See `MNISTExperiment.py` for optional arguments.
You have to download the datasets with the following command:
```
python datasets/download_datasets.py
```
Then you can execute:
```
python UCIExperiments.py --data ['power', 'gas', 'hepmass', 'miniboone', 'bsds300']
```
See `UCIExperiments.py` for optional arguments.
You have to download the datasets:
- MNIST:
```
python datasets/download_datasets.py
```
- OMNIGLOT: the dataset can be downloaded from link.
- Caltech 101 Silhouettes: the dataset can be downloaded from link.
- Frey Faces: the dataset can be downloaded from link.

Then you can execute:
```
python TrainVaeFlow.py -d ['mnist', 'freyfaces', 'omniglot', 'caltech']
```
All the files related to the implementation of UMNN (conditioner network, integrand network, and integral computation) are located in the folder `models/UMNN`.
- `NeuralIntegral.py` computes the integral of a neural network with 1d output using Clenshaw-Curtis (CC) quadrature; it computes the evaluation points required by CC sequentially.
- `ParallelNeuralIntegral.py` processes all the evaluation points at once, making the computation almost as fast as a forward evaluation of the net, at the price of a higher memory cost.
- `UMNNMAF.py` contains the implementation of the different networks required by UMNN.
- `UMNNMAFFlow.py` contains the implementation of flows made of UMNNs.
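The idea underlying these modules is that a strictly monotonic function can be represented as the integral of a strictly positive function, and that integral can be approximated numerically with Clenshaw-Curtis quadrature. Below is a minimal pure-Python sketch of that mechanism, detached from the repository's code: the function names (`clenshaw_curtis`, `monotone_F`) are illustrative, and a simple positive closed-form integrand stands in for the positive-output neural network used by `NeuralIntegral.py`/`ParallelNeuralIntegral.py`.

```python
import math

def clenshaw_curtis(f, a, b, n=16):
    """Approximate the integral of f over [a, b] with (n+1)-point
    Clenshaw-Curtis quadrature (n must be even)."""
    assert n % 2 == 0, "n must be even"
    # Chebyshev points on [-1, 1]
    nodes = [math.cos(math.pi * k / n) for k in range(n + 1)]
    # Standard Clenshaw-Curtis weights
    weights = []
    for k in range(n + 1):
        c = 1.0 if k in (0, n) else 2.0
        s = sum((1.0 if j == n // 2 else 2.0) / (4 * j * j - 1)
                * math.cos(2.0 * math.pi * j * k / n)
                for j in range(1, n // 2 + 1))
        weights.append(c / n * (1.0 - s))
    # Affine map from [-1, 1] to [a, b]
    half, mid = 0.5 * (b - a), 0.5 * (a + b)
    return half * sum(w * f(mid + half * x) for w, x in zip(weights, nodes))

def monotone_F(x, integrand, n=32):
    """F(x) = integral of `integrand` from 0 to x; a strictly positive
    integrand makes F strictly increasing, which is the UMNN construction
    (there, the integrand is a neural network with positive output)."""
    return clenshaw_curtis(integrand, 0.0, x, n=n)
```

For instance, with the positive integrand `t -> 1 / (1 + t**2)`, `monotone_F` recovers `arctan`, a strictly increasing function. In the repository, `ParallelNeuralIntegral.py` applies the same quadrature but feeds all evaluation points through the network in a single batched forward pass instead of looping over them.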