Representations of random variates as repeated one-to-one transforms of a standard random variable, yielding both a generative model and a closed-form density.

Normalizing flows transform a simple base probability distribution into a more expressive one, giving models with tractable likelihoods and a direct mechanism for learning features. The transformation maps between the space of observations and a latent space, and it must satisfy three requirements:

  1. It is deterministic.
  2. It is invertible.
  3. It has an easily computable, easily differentiable determinant of the Jacobian.
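The simplest transform meeting all three requirements is an elementwise affine map. The sketch below (a hypothetical `AffineFlow` class, not from the original text) shows why it qualifies: the forward map is deterministic, the inverse is exact, and the Jacobian is diagonal, so its log-determinant is just a sum.

```python
import numpy as np

class AffineFlow:
    """Elementwise affine transform z -> x = exp(s) * z + t.

    Deterministic, invertible, and with a diagonal Jacobian diag(exp(s)),
    so log|det J| = sum(s) -- cheap to compute and to differentiate.
    """

    def __init__(self, s, t):
        self.s = np.asarray(s, dtype=float)  # per-dimension log-scale
        self.t = np.asarray(t, dtype=float)  # per-dimension shift

    def forward(self, z):
        return np.exp(self.s) * z + self.t

    def inverse(self, x):
        return (x - self.t) * np.exp(-self.s)

    def log_abs_det_jacobian(self):
        # Jacobian of the forward map is diag(exp(s)).
        return self.s.sum()
```

Richer flows (e.g. coupling or autoregressive layers) follow the same contract but make `s` and `t` functions of part of the input, keeping the Jacobian triangular so its determinant stays cheap.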

Some example functions satisfying these requirements are here. Details about training models that have normalizing flows are here. The probability density of an observation x = f(z) is then computed from the base distribution's density and the transformation using the change of variables formula: p_x(x) = p_z(f^{-1}(x)) |det J_{f^{-1}}(x)|.
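The change of variables formula can be evaluated directly. A minimal sketch, assuming a standard-normal base distribution and a hypothetical elementwise affine flow x = exp(s) * z + t (log|det J_f| = sum(s), so the inverse direction subtracts it):

```python
import numpy as np

def standard_normal_logpdf(z):
    # log density of z under N(0, I)
    return -0.5 * (z ** 2 + np.log(2 * np.pi)).sum()

def log_density(x, s, t):
    """log p_x(x) = log p_z(f^{-1}(x)) - log|det J_f(z)|
    for the affine flow x = exp(s) * z + t."""
    s, t = np.asarray(s, dtype=float), np.asarray(t, dtype=float)
    z = (x - t) * np.exp(-s)                    # invert the transform
    return standard_normal_logpdf(z) - s.sum()  # change of variables correction
```

For this flow the result matches the Gaussian N(t, exp(2s)) exactly; for learned flows the same two terms, base log-density plus log-determinant, are what the training objective maximizes.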
