This library contains a PyTorch implementation of the hyperspherical variational auto-encoder, or S-VAE, as presented in [1](http://arxiv.org/abs/1804.00891). Also check out our blog post: https://nicola-decao.github.io/s-vae.
- Don't use PyTorch? Take a look here for a TensorFlow implementation!
- python>=3.6
- pytorch>=0.4.1: https://pytorch.org
- scipy: https://scipy.org
- numpy: https://www.numpy.org
To install, run
$ python setup.py install
- distributions: PyTorch implementation of the von Mises-Fisher and hyperspherical Uniform distributions. Both inherit from torch.distributions.Distribution.
- ops: Low-level operations used for computing the exponentially scaled modified Bessel function of the first kind and its derivative (a short illustration follows this list).
- examples: Example code for using the library within a PyTorch project.
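The exponentially scaled Bessel function used in ops satisfies ive(v, x) = iv(v, x) · exp(-x), which avoids the overflow of the unscaled iv for large arguments. As a quick illustration of that identity using SciPy (already a requirement) rather than the library's own ops module:

```python
import numpy as np
from scipy.special import iv, ive

# Exponentially scaled modified Bessel function of the first kind:
# ive(v, x) = iv(v, x) * exp(-x). The scaling keeps values finite where
# the unscaled iv(v, x) would overflow for large x.
v, x = 2.0, 50.0
print(np.isclose(ive(v, x), iv(v, x) * np.exp(-x)))  # True
```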
Please have a look at the examples folder. We adapted our implementation to follow the structure of the PyTorch probability distributions.
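For orientation, below is a minimal sketch of pairing a von Mises-Fisher posterior with a hyperspherical uniform prior. The import path `hyperspherical_vae.distributions`, the class names `VonMisesFisher` / `HypersphericalUniform`, and the constructor arguments follow the structure of the examples folder and should be treated as assumptions; refer to the examples for the authoritative usage.

```python
import torch
from hyperspherical_vae.distributions import VonMisesFisher, HypersphericalUniform  # assumed import path

batch, z_dim = 16, 8

# Approximate posterior q(z|x): unit-norm mean direction plus a positive concentration.
loc = torch.randn(batch, z_dim)
loc = loc / loc.norm(dim=-1, keepdim=True)   # project the mean direction onto the unit hypersphere
scale = torch.ones(batch, 1) * 10.0          # concentration (kappa > 0)
q_z = VonMisesFisher(loc, scale)

# Uniform prior on the hypersphere S^(z_dim - 1).
p_z = HypersphericalUniform(z_dim - 1)

z = q_z.rsample()                                     # reparameterized samples, shape (batch, z_dim)
kl = torch.distributions.kl.kl_divergence(q_z, p_z)  # KL term of the S-VAE objective
```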
Please cite [1] when using this library in your experiments.
To sample from the von Mises-Fisher distribution we follow the rejection sampling procedure outlined by Ulrich (1984). This simulation pipeline is visualized below:
Note that as ω is a scalar, this approach does not suffer from the curse of dimensionality. For the final transformation, U(z'; μ), a Householder reflection is utilized.
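For reference, the following is a self-contained NumPy sketch of this sampling pipeline (NumPy is already a dependency). It illustrates the Ulrich (1984) procedure for a single sample and is not the library's batched, reparameterized implementation in distributions; the function name sample_vmf is ours.

```python
import numpy as np

def sample_vmf(mu, kappa):
    """Draw one sample from vMF(mu, kappa) on the unit hypersphere S^(m-1)."""
    mu = np.asarray(mu, dtype=float)
    m = mu.shape[0]

    # Rejection-sample the scalar w = <z, e_1> (Ulrich, 1984; Wood, 1994).
    b = (m - 1) / (2.0 * kappa + np.sqrt(4.0 * kappa ** 2 + (m - 1) ** 2))
    x0 = (1.0 - b) / (1.0 + b)
    c = kappa * x0 + (m - 1) * np.log(1.0 - x0 ** 2)
    while True:
        z = np.random.beta((m - 1) / 2.0, (m - 1) / 2.0)
        w = (1.0 - (1.0 + b) * z) / (1.0 - (1.0 - b) * z)
        u = np.random.uniform()
        if kappa * w + (m - 1) * np.log(1.0 - x0 * w) - c >= np.log(u):
            break

    # Uniform direction on the subsphere orthogonal to e_1, scaled to match w.
    v = np.random.randn(m - 1)
    v /= np.linalg.norm(v)
    z_prime = np.concatenate(([w], np.sqrt(1.0 - w ** 2) * v))

    # Householder reflection mapping the modal direction e_1 onto mu.
    u_vec = np.eye(m)[0] - mu
    norm = np.linalg.norm(u_vec)
    if norm > 1e-12:
        u_vec /= norm
        z_prime = z_prime - 2.0 * np.dot(u_vec, z_prime) * u_vec
    return z_prime


# Example: one sample in R^3 with mean direction e_3 and concentration 20.
print(sample_vmf(np.array([0.0, 0.0, 1.0]), kappa=20.0))
```

Because only the scalar w is rejection-sampled, the acceptance rate does not degrade as the dimension grows, which is the point made above.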
For questions and comments, feel free to contact Nicola De Cao or Tim Davidson.
MIT
[1] Davidson, T. R., Falorsi, L., De Cao, N., Kipf, T., and Tomczak, J. M. (2018). Hyperspherical Variational Auto-Encoders. 34th Conference on Uncertainty in Artificial Intelligence (UAI-18).
BibTeX format:
@article{s-vae18,
  title={Hyperspherical Variational Auto-Encoders},
  author={Davidson, Tim R. and
          Falorsi, Luca and
          De Cao, Nicola and
          Kipf, Thomas and
          Tomczak, Jakub M.},
  journal={34th Conference on Uncertainty in Artificial Intelligence (UAI-18)},
  year={2018}
}