Representations of random variates as repeated one-to-one transformations of a standard random variable, yielding both a generative model and a closed-form density.
Normalizing flows construct complex probability distributions by transforming simpler ones. They offer tractable likelihoods and a direct mechanism for learning features. The transformation maps between the space of observations and a latent space, and it must satisfy three requirements:
- It is deterministic.
- It is invertible.
- It has an easily computable, easily differentiable determinant of the Jacobian.
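To make these requirements concrete, below is a minimal sketch of an elementwise affine transform in Python with NumPy. The class name `AffineFlow`, its fixed parameters, and the standard-normal base distribution are illustrative assumptions rather than anything from the sources cited below; practical flows such as NICE's coupling layers instead make the parameters learnable functions of the data.

```python
import numpy as np


class AffineFlow:
    """Elementwise affine transform z = a * x + b.

    A hypothetical minimal flow layer: it is deterministic, invertible
    whenever every entry of a is nonzero, and its Jacobian is diagonal,
    so the log-determinant reduces to a cheap sum of logs.
    """

    def __init__(self, a, b):
        self.a = np.asarray(a, dtype=float)
        self.b = np.asarray(b, dtype=float)

    def forward(self, x):
        # Map an observation x to the latent z.
        return self.a * x + self.b

    def inverse(self, z):
        # Map a latent z back to the observation space.
        return (z - self.b) / self.a

    def log_abs_det_jacobian(self):
        # The Jacobian of forward() is diag(a), so
        # log |det J| = sum_i log |a_i|.
        return np.sum(np.log(np.abs(self.a)))


def log_density(flow, x):
    """Change of variables with a standard-normal base distribution:
    log p_X(x) = log p_Z(f(x)) + log |det J_f(x)|."""
    z = flow.forward(x)
    log_pz = -0.5 * np.sum(z ** 2) - 0.5 * z.size * np.log(2.0 * np.pi)
    return log_pz + flow.log_abs_det_jacobian()


flow = AffineFlow(a=[2.0, 0.5], b=[1.0, -1.0])
x = np.array([0.3, -0.7])
assert np.allclose(flow.inverse(flow.forward(x)), x)  # invertibility check
print(log_density(flow, x))  # exact log-likelihood of x under the flow
```

The diagonal Jacobian is what makes the log-determinant a cheap sum; more expressive flows, such as coupling and autoregressive transforms, are engineered so the Jacobian stays triangular for the same reason.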
Coupling layers (as in NICE) and the elementwise affine map sketched above are examples of functions satisfying these requirements. The probability density of an observation is then computed from the base distribution's density and the transformation using the change-of-variables formula, and models with normalizing flows are typically trained by maximizing the resulting exact log-likelihood.
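Written out, with notation assumed here for illustration: if $f$ maps an observation $x$ to the latent $z = f(x)$ and $p_Z$ is the base density, then

$$
p_X(x) \;=\; p_Z\bigl(f(x)\bigr)\,\left|\det \frac{\partial f(x)}{\partial x}\right|,
$$

and for a composition $f = f_K \circ \cdots \circ f_1$ with intermediates $z_0 = x$ and $z_k = f_k(z_{k-1})$, the log-density decomposes into a sum of per-layer terms:

$$
\log p_X(x) \;=\; \log p_Z\bigl(f(x)\bigr) \;+\; \sum_{k=1}^{K} \log\left|\det \frac{\partial f_k(z_{k-1})}{\partial z_{k-1}}\right|.
$$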
References
- Tabak and Vanden-Eijnden (Commun. Math. Sci. 2010): Density Estimation by Dual Ascent of the Log-Likelihood
- Agnelli, Cadeiras, Tabak, Turner, and Vanden-Eijnden (Multiscale Model. Simul. 2012): Clustering and Classification through Normalizing Flows in Feature Space
- Tabak and Turner (Commun. Pure and App. Math. 2013): A Family of Nonparametric Density Estimation Algorithms
- Rippel and Adams (2013): High-Dimensional Probability Estimation with Deep Density Models
- Rezende and Mohamed (ICML 2015): Variational Inference with Normalizing Flows
- Dinh, Krueger, and Bengio (ICLR 2015 Workshop): NICE: Non-linear Independent Components Estimation
- Kobyzev, Prince, and Brubaker (2019): Normalizing Flows: An Introduction and Review of Current Methods
- Papamakarios, Nalisnick, Rezende, Mohamed, and Lakshminarayanan (2019): Normalizing Flows for Probabilistic Modeling and Inference