Maximum likelihood estimation is a method of estimating the parameters of a statistical model by choosing the parameter value that maximizes the probability (or density) of observing the given sample.
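As a minimal sketch of the idea, consider an exponential distribution, where the maximizer has a closed form: the log-likelihood n·log(λ) − λ·Σxᵢ is maximized at λ̂ = n/Σxᵢ, i.e. the reciprocal of the sample mean. The sample values below are illustrative only.

```python
import math

def exp_log_likelihood(rate, sample):
    # log L(rate) = n*log(rate) - rate * sum(x_i) for an Exp(rate) model
    n = len(sample)
    return n * math.log(rate) - rate * sum(sample)

sample = [0.5, 1.2, 0.3, 2.0, 0.9, 1.5]        # illustrative data
mle = len(sample) / sum(sample)                 # closed-form MLE: 1 / sample mean

# the closed-form estimate attains a higher log-likelihood than nearby rates
for candidate in (mle * 0.8, mle * 1.2):
    assert exp_log_likelihood(mle, sample) > exp_log_likelihood(candidate, sample)
```

Any other candidate rate yields a strictly smaller log-likelihood, which is exactly what "choosing the parameter value that maximizes the probability of the sample" means.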

Given certain regularity conditions (e.g. that the support of the density does not depend on the unknown parameter), maximum-likelihood estimators are consistent, asymptotically efficient (they achieve the Cramér-Rao lower bound) and asymptotically normal, with covariance matrix given by the inverse of the Fisher information matrix.
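The asymptotic variance claim can be checked by simulation. For the exponential model the Fisher information per observation is I(λ) = 1/λ², so the MLE's variance should approach λ²/n; the seed, true rate, and replication counts below are arbitrary choices for the sketch.

```python
import random
import statistics

random.seed(0)
rate = 2.0      # true exponential rate (assumed for the sketch)
n = 400         # sample size per replication
reps = 3000     # number of simulated datasets

estimates = []
for _ in range(reps):
    sample = [random.expovariate(rate) for _ in range(n)]
    estimates.append(len(sample) / sum(sample))   # closed-form MLE

# Fisher information for Exp(rate) is I(rate) = 1 / rate**2, so the
# asymptotic variance of the MLE is rate**2 / n = 1 / (n * I(rate)).
asymptotic_var = rate ** 2 / n
empirical_var = statistics.variance(estimates)
```

With these settings the empirical variance of the 3000 estimates lands close to the theoretical value 0.01, illustrating the inverse-Fisher-information covariance.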

Because maximum likelihood is a parametric method based on a specified distribution family, it relies on the assumed distribution model being correct for the data. In many cases no closed-form solution exists, so the likelihood must be maximized numerically (e.g. by Newton-Raphson iteration).
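A standard case with no closed-form MLE is the location parameter of a Cauchy distribution; the example below (my illustration, not from the source) applies Newton-Raphson to the score function, starting from the sample median. The data values are made up for the sketch.

```python
def cauchy_score(theta, xs):
    # first derivative of the Cauchy log-likelihood in the location parameter:
    # d/dtheta [-sum log(1 + (x - theta)^2)]
    return sum(2 * (x - theta) / (1 + (x - theta) ** 2) for x in xs)

def cauchy_score_deriv(theta, xs):
    # second derivative of the log-likelihood (derivative of the score)
    return sum(2 * ((x - theta) ** 2 - 1) / (1 + (x - theta) ** 2) ** 2
               for x in xs)

def newton_mle(xs, theta0, tol=1e-10, max_iter=100):
    # Newton-Raphson search for a root of the score function
    theta = theta0
    for _ in range(max_iter):
        step = cauchy_score(theta, xs) / cauchy_score_deriv(theta, xs)
        theta -= step
        if abs(step) < tol:
            break
    return theta

data = [-0.8, 0.1, 0.4, 1.3, 2.2]          # illustrative sample
theta_hat = newton_mle(data, theta0=0.4)   # start at the sample median
```

Starting near the median matters: the Cauchy likelihood can have multiple stationary points, so Newton-Raphson finds a root of the score equation rather than a guaranteed global maximum.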