
All Questions

4 votes · 1 answer · 2k views

Does minimizing KL-divergence result in maximum entropy principle?

The Kullback-Leibler divergence (or relative entropy) is a measure of how one probability distribution differs from a reference probability distribution. I want to know what connection it has to ...
develarist · 4,049
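
One standard connection between the two, sketched for a distribution $p$ on a finite set of $n$ points with the uniform distribution $u$ as the reference: $$D_{\mathrm{KL}}(p \,\|\, u) = \sum_x p(x)\log\frac{p(x)}{1/n} = \log n - H(p),$$ so minimizing the KL divergence to a uniform reference is the same as maximizing the entropy $H(p)$; with a non-uniform reference, minimizing KL gives the minimum relative entropy (minimum discrimination information) principle instead.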
0 votes · 0 answers · 73 views

How to minimise an expectation with respect to a parameter?

Suppose that $X$ is a random variable with distribution $G$. Let $H(X;\theta)$ be a parametric function with $\theta \in \Theta \subset {\mathbb R}^p$. I want to maximize the function $$\varphi(\...
Lex · 1
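
A common way to attack this kind of problem, sketched below under purely illustrative assumptions (a standard-normal $G$ and a made-up $H$; neither is taken from the question): fix one Monte Carlo sample from $G$, use the sample average as a deterministic approximation of $\varphi(\theta)=\mathbb{E}_G[H(X;\theta)]$, and hand that to an off-the-shelf optimizer.

    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(0)
    x = rng.standard_normal(10_000)   # fixed sample from G (assumed N(0, 1) here)

    def H(x, theta):
        # Placeholder parametric function; substitute the actual H(X; theta).
        return (x - theta[0]) ** 2 + (theta[1] - 1.0) ** 2

    def phi_hat(theta):
        # Sample-average estimate of E_G[H(X; theta)]; the sample is fixed,
        # so the objective is deterministic in theta. To maximize instead,
        # return the negated mean.
        return H(x, theta).mean()

    res = minimize(phi_hat, x0=np.zeros(2), method="Nelder-Mead")
    print(res.x, res.fun)   # roughly [0, 1] and 1 for this placeholder H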
2 votes · 0 answers · 225 views

Interpreting base measure in exponential family as an improper prior (because entropy)

Long-time listener, first-time caller. I'm reading the Wikipedia pages on exponential families and maximum entropy probability distributions, and trying to wrap my head round the role of the base ...
Sophie M · 121
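
For reference, the parameterization those Wikipedia pages use writes an exponential-family density as $$p(x \mid \theta) = h(x)\,\exp\!\big(\theta^{\top} T(x) - A(\theta)\big),$$ where $h$ is the base measure, $T$ the sufficient statistic, and $A$ the log-partition function. At $\theta = 0$ (when $0 \in \Theta$) the density is proportional to $h$ alone, which is what invites reading $h$ as a (possibly improper) reference measure, playing the same role the reference distribution plays in relative entropy.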
6 votes · 2 answers · 863 views

Maximum entropy sampler

I want to sample from a distribution whose mean (=0), standard deviation (=1), skewness (=0), and kurtosis are fixed to given values. I also want this distribution to be as general as possible, i.e. to ...
Adam Ryczkowski
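
One route, sketched below with hypothetical multipliers (the values of `l` are illustrative, not fitted to the asker's moments): the maximum-entropy density with the first four moments fixed has the form $p(x) \propto \exp(\lambda_1 x + \lambda_2 x^2 + \lambda_3 x^3 + \lambda_4 x^4)$ with $\lambda_4 < 0$; given the multipliers, one can draw from it by rejection sampling against a normal proposal. Fitting the multipliers to the target moments is a separate moment-matching problem.

    import numpy as np

    # Hypothetical Lagrange multipliers (l4 < 0 so the density is normalizable).
    l = np.array([0.0, -0.5, 0.05, -0.01])

    def log_p(x):
        # Unnormalized log-density of the max-entropy form.
        return l[0]*x + l[1]*x**2 + l[2]*x**3 + l[3]*x**4

    def log_q(x):
        # Standard-normal proposal, up to an additive constant.
        return -0.5 * x**2

    # Bound log(p/q) numerically on a wide grid; the bound exists because
    # l4 < 0 drives the ratio to zero in the tails.
    grid = np.linspace(-20, 20, 100_001)
    log_M = np.max(log_p(grid) - log_q(grid))

    rng = np.random.default_rng(0)

    def sample(n):
        out = []
        while len(out) < n:
            x = rng.standard_normal(n)
            u = rng.random(n)
            # Accept x with probability p(x) / (M q(x)).
            accept = np.log(u) < log_p(x) - log_q(x) - log_M
            out.extend(x[accept])
        return np.array(out[:n])

    draws = sample(50_000)
    print(draws.mean(), draws.std())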