Basic examples of Maximum Likelihood Estimation
Estimation of best parameter for iid Exponential Distributions
Let $X_1, X_2, \dots, X_n$ be a random sample from the exponential distribution with probability density functions of the form $f(x;\lambda) = \frac{1}{\lambda} e^{-x/\lambda}$ for $x \geq 0$ and any parameter $\lambda > 0.$ The likelihood function is then given as the product
$$L(\lambda) = \prod_{k=1}^{n} f(x_k;\lambda) = \lambda^{-n} \exp\!\Big(-\frac{1}{\lambda}\sum_{k=1}^{n} x_k\Big).$$
We look for the parameter value that offers an absolute maximum of $L(\lambda).$ Notice that, since the logarithm is a one-to-one increasing function, the maximum of $L(\lambda)$ coincides with the maximum of
$$\ell(\lambda) = \log L(\lambda) = -n \log \lambda - \frac{1}{\lambda}\sum_{k=1}^{n} x_k.$$
The latter expression is easier to handle than the former, so we use this one to look for the extrema in the usual way:
Set $\ell'(\lambda) = 0$; it is then
$$\ell'(\lambda) = -\frac{n}{\lambda} + \frac{1}{\lambda^2}\sum_{k=1}^{n} x_k = 0.$$
Note that $\ell'(\lambda) = 0$ if and only if $\lambda = \frac{1}{n}\sum_{k=1}^{n} x_k,$ which happens to be positive and actually a maximum of $\ell,$ since $\ell''(\lambda) = \frac{n}{\lambda^2} - \frac{2}{\lambda^3}\sum_{k=1}^{n} x_k$ evaluates to $-n/\hat{\lambda}^2 < 0$ at this critical point $\hat{\lambda}.$
Note that the found parameter $\hat{\lambda} = \frac{1}{n}\sum_{k=1}^{n} x_k = \bar{x}$ is nothing but the arithmetic mean of $x_1, \dots, x_n.$
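This closed-form answer is easy to check numerically. The short sketch below (the sample values and the search grid are arbitrary choices made for illustration, not part of the derivation) evaluates the log-likelihood $\ell(\lambda) = -n\log\lambda - \frac{1}{\lambda}\sum_k x_k$ over a grid of candidate parameters and confirms that the grid maximizer agrees with the sample mean:

```python
import math

# Arbitrary sample values, chosen only for this check.
x = [0.5, 1.2, 3.1, 0.8, 2.4]

def log_likelihood(lam, xs):
    # l(lam) = -n*log(lam) - sum(xs)/lam for the density (1/lam)*exp(-x/lam).
    return -len(xs) * math.log(lam) - sum(xs) / lam

lam_hat = sum(x) / len(x)                      # the claimed maximizer: the sample mean
grid = [0.1 + 0.005 * k for k in range(1981)]  # candidate parameters 0.1, 0.105, ..., 10.0
best = max(grid, key=lambda lam: log_likelihood(lam, x))
print(lam_hat, best)  # the two values agree up to the grid spacing
```

A brute-force grid search is of course far slower than the closed form; its only role here is to confirm the calculus independently.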
Estimation of best parameter for iid Geometric Distributions
In this case, the random sample for the Geometric distribution has probability density functions of the form $f(x;p) = p(1-p)^{x-1}$ for any $x = 1, 2, 3, \dots$ and parameter $0 < p < 1.$ We operate as in the previous example, by looking for extrema of the log-likelihood function:
- Set $L(p) = \prod_{k=1}^{n} f(x_k;p) = p^n (1-p)^{\sum_{k=1}^{n} x_k - n}$ for $0 < p < 1.$
- Consider $\ell(p) = \log L(p) = n \log p + \Big(\sum_{k=1}^{n} x_k - n\Big) \log(1-p),$ but only for $0 < p < 1.$
- It is then $\ell'(p) = \dfrac{n}{p} - \dfrac{\sum_{k=1}^{n} x_k - n}{1-p}.$
- $\ell'(p) = 0$ if and only if $p = \dfrac{n}{\sum_{k=1}^{n} x_k} = \dfrac{1}{\bar{x}}.$
This time, the solution $\hat{p} = 1/\bar{x}$ coincides with the reciprocal of the arithmetic mean of the samples (which is trivially positive and at most one, since each $x_k \geq 1$). It is not hard to prove that this critical point is a maximum, and therefore $\hat{p}$ is the parameter that we are looking for.
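As before, a grid search can confirm the result. In this sketch (sample values and grid are again arbitrary illustrative choices) the log-likelihood $\ell(p) = n\log p + (\sum_k x_k - n)\log(1-p)$ is maximized numerically over $(0,1)$ and compared with $\hat{p} = 1/\bar{x}$:

```python
import math

# Arbitrary geometric counts (support 1, 2, 3, ...), chosen only for this check.
x = [2, 5, 1, 3, 4]

def log_likelihood(p, xs):
    # l(p) = n*log(p) + (sum(xs) - n)*log(1 - p), valid for 0 < p < 1.
    n = len(xs)
    return n * math.log(p) + (sum(xs) - n) * math.log(1 - p)

p_hat = len(x) / sum(x)                     # the claimed maximizer: 1 / (sample mean)
grid = [k / 5000 for k in range(1, 5000)]   # candidate parameters 0.0002, ..., 0.9998
best = max(grid, key=lambda p: log_likelihood(p, x))
print(p_hat, best)  # the two values agree up to the grid spacing
```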
Estimation of best parameter for iid Poisson Distributions
The random variables in this case have probability density functions given by $f(x;\lambda) = \dfrac{\lambda^{x} e^{-\lambda}}{x!}$ for any $x = 0, 1, 2, \dots$ and parameter $\lambda > 0.$
- Set $L(\lambda) = \prod_{k=1}^{n} f(x_k;\lambda) = \dfrac{\lambda^{\sum_{k=1}^{n} x_k}\, e^{-n\lambda}}{\prod_{k=1}^{n} x_k!}.$
- Set $\ell(\lambda) = \log L(\lambda) = \Big(\sum_{k=1}^{n} x_k\Big) \log \lambda - n\lambda - \sum_{k=1}^{n} \log(x_k!).$
- Its derivative is given by $\ell'(\lambda) = \dfrac{1}{\lambda}\sum_{k=1}^{n} x_k - n.$
- Note that $\ell'(\lambda) = 0$ only for $\lambda = \frac{1}{n}\sum_{k=1}^{n} x_k,$ which is trivially a maximum for $\ell,$ since $\ell''(\lambda) = -\frac{1}{\lambda^2}\sum_{k=1}^{n} x_k < 0.$
As in the case of exponential distributions, the computed parameter $\hat{\lambda} = \bar{x}$ is the arithmetic mean of $x_1, \dots, x_n.$
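The same numerical check works here. Since the term $\sum_k \log(x_k!)$ does not depend on $\lambda,$ it can be dropped when locating the maximizer; the sketch below (sample values and grid are arbitrary illustrative choices) does exactly that:

```python
import math

# Arbitrary Poisson counts, chosen only for this check.
x = [2, 0, 3, 1, 4, 2]

def log_likelihood(lam, xs):
    # l(lam) = sum(xs)*log(lam) - n*lam - sum(log(x_k!));
    # the factorial term is constant in lam, so it is omitted here.
    return sum(xs) * math.log(lam) - len(xs) * lam

lam_hat = sum(x) / len(x)                       # the claimed maximizer: the sample mean
grid = [0.05 + 0.005 * k for k in range(1991)]  # candidate parameters 0.05, ..., 10.0
best = max(grid, key=lambda lam: log_likelihood(lam, x))
print(lam_hat, best)  # the two values agree up to the grid spacing
```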
Estimation of best parameter for iid Normal Distributions
This case is a bit different, since we are dealing with two parameters instead of one: Assume $X_1, \dots, X_n$ is a random sample from the normal distribution with probability density functions of the form
$$f(x;\mu,\sigma) = \frac{1}{\sigma\sqrt{2\pi}} \exp\!\Big(-\frac{(x-\mu)^2}{2\sigma^2}\Big)$$
for any $x \in \mathbb{R}$ and parameters $\mu \in \mathbb{R}$ and $\sigma > 0.$ For ease of the computations below, and since the parameter $\sigma$ always appears squared in the expression of $f,$ we prefer to work instead with $\tau = \sigma^2$ and require the parameter $\tau$ to be positive. Note the abuse of notation, and how this does not really affect the final result. We proceed to compute the likelihood function and its logarithm as before:
$$L(\mu,\tau) = (2\pi\tau)^{-n/2} \exp\!\Big(-\frac{1}{2\tau}\sum_{k=1}^{n} (x_k-\mu)^2\Big), \qquad \ell(\mu,\tau) = -\frac{n}{2}\log(2\pi\tau) - \frac{1}{2\tau}\sum_{k=1}^{n} (x_k-\mu)^2.$$
- The partial derivatives of $\ell$ are given by
$$\frac{\partial \ell}{\partial \mu} = \frac{1}{\tau}\sum_{k=1}^{n} (x_k-\mu), \qquad \frac{\partial \ell}{\partial \tau} = -\frac{n}{2\tau} + \frac{1}{2\tau^2}\sum_{k=1}^{n} (x_k-\mu)^2.$$
- Note that $\frac{\partial \ell}{\partial \mu} = 0$ if and only if $\mu = \frac{1}{n}\sum_{k=1}^{n} x_k.$ Let us denote it by $\bar{x},$ since it represents the mean of the values $x_1, \dots, x_n.$
- Also, by virtue of the previous statement, a solution for $\frac{\partial \ell}{\partial \tau} = 0$ is given uniquely by $\hat{\tau} = \frac{1}{n}\sum_{k=1}^{n} (x_k-\bar{x})^2.$ Note that this value (which is positive, and hence satisfies the constraints) coincides with the variance of the set $\{x_1, \dots, x_n\}.$ It is a priori a valid parameter for $f.$
- It is not hard to see that the computed critical point offers indeed an absolute maximum for $\ell.$ Indeed, the Hessian of $\ell$ is given by:
$$H(\mu,\tau) = \begin{pmatrix} -\dfrac{n}{\tau} & -\dfrac{1}{\tau^2}\displaystyle\sum_{k=1}^{n} (x_k-\mu) \\ -\dfrac{1}{\tau^2}\displaystyle\sum_{k=1}^{n} (x_k-\mu) & \dfrac{n}{2\tau^2} - \dfrac{1}{\tau^3}\displaystyle\sum_{k=1}^{n} (x_k-\mu)^2 \end{pmatrix}$$
Its determinant at $(\bar{x},\hat{\tau})$ is always positive: the off-diagonal entries vanish there, so
$$\det H(\bar{x},\hat{\tau}) = \Big(-\frac{n}{\hat{\tau}}\Big)\Big(-\frac{n}{2\hat{\tau}^2}\Big) = \frac{n^2}{2\hat{\tau}^3} > 0,$$
and since $\dfrac{\partial^2 \ell}{\partial \mu^2}(\bar{x},\hat{\tau}) = -\dfrac{n}{\hat{\tau}}$ is always negative, a maximum is attained.
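A two-parameter grid search is clumsier, so the sketch below checks the critical point differently: it computes $\hat{\mu} = \bar{x}$ and $\hat{\tau} = \frac{1}{n}\sum_k (x_k-\bar{x})^2$ for an arbitrary illustrative sample and verifies that perturbing either parameter away from $(\hat{\mu},\hat{\tau})$ strictly lowers the log-likelihood, as the Hessian argument predicts:

```python
import math

# Arbitrary sample values, chosen only for this check.
x = [1.2, -0.4, 0.9, 2.1, 0.3]

def log_likelihood(mu, tau, xs):
    # l(mu, tau) = -(n/2)*log(2*pi*tau) - sum((x_k - mu)^2) / (2*tau), with tau = sigma^2.
    n = len(xs)
    ss = sum((xi - mu) ** 2 for xi in xs)
    return -0.5 * n * math.log(2 * math.pi * tau) - ss / (2 * tau)

mu_hat = sum(x) / len(x)
tau_hat = sum((xi - mu_hat) ** 2 for xi in x) / len(x)  # variance with divisor n, as in the text
ll_hat = log_likelihood(mu_hat, tau_hat, x)

# Moving either parameter off the critical point must decrease the log-likelihood.
for d in (-0.1, 0.1):
    assert log_likelihood(mu_hat + d, tau_hat, x) < ll_hat
    assert log_likelihood(mu_hat, tau_hat + d, x) < ll_hat
print(mu_hat, tau_hat)
```

Note that the maximizing $\hat{\tau}$ divides by $n$, not $n-1$: the maximum-likelihood variance is the biased sample variance.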