Maximum Likelihood Estimation
Definition
Maximum likelihood estimation, or MLE, is a method for estimating the parameters of a chosen statistical model under investigation. The likelihood is a function of those parameters, and maximizing it tells us which parameter values most likely produced the observed data.
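Formally, given observed data x1, …, xn, the maximum likelihood estimate is the parameter value that maximizes the likelihood function:

$$\hat{\theta}_{\mathrm{MLE}} = \underset{\theta}{\arg\max}\; L(\theta; x_1, \ldots, x_n).$$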
Example
I have two examples of maximum likelihood estimation, one for the binomial distribution and another for the normal distribution.
Here, I will work through the example for the normal distribution.
Suppose that we have a random variable X which follows a normal distribution:

X ~ N(μ, σ)

where μ is the mean and σ is the standard deviation. Let μ = θ, which is the parameter we need to estimate, and for convenience we will let the standard deviation σ equal 1.
So, the probability density function of the normal distribution is given below.
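$$f(x; \mu, \sigma) = \frac{1}{\sigma\sqrt{2\pi}} \exp\!\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)$$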
We substitute our parameter symbol θ, whose value we need to estimate, for the mean, and set the standard deviation to 1, obtaining the following equation in return:
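$$f(x; \theta) = \frac{1}{\sqrt{2\pi}} \exp\!\left(-\frac{(x-\theta)^2}{2}\right)$$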
Now, we need to derive the likelihood function for a general normal distribution. Assume that we have a sample of n i.i.d. random variables, that is, random variables which are independent and identically distributed.
A general likelihood function would then be as follows:
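$$L(\theta; x_1, \ldots, x_n) = \prod_{i=1}^{n} f(x_i; \theta)$$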
Now, for our normal model this accordingly will be
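$$L(\theta) = \prod_{i=1}^{n} \frac{1}{\sqrt{2\pi}}\exp\!\left(-\frac{(x_i-\theta)^2}{2}\right) = (2\pi)^{-n/2}\exp\!\left(-\frac{1}{2}\sum_{i=1}^{n}(x_i-\theta)^2\right)$$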
Finally, taking the logarithm of the likelihood, differentiating with respect to θ, and setting the derivative to zero, we will have
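$$\ell(\theta) = \log L(\theta) = -\frac{n}{2}\log(2\pi) - \frac{1}{2}\sum_{i=1}^{n}(x_i-\theta)^2,$$

$$\frac{d\ell}{d\theta} = \sum_{i=1}^{n}(x_i-\theta) = 0 \quad\Longrightarrow\quad \hat{\theta} = \frac{1}{n}\sum_{i=1}^{n} x_i = \bar{x}.$$

In other words, the maximum likelihood estimate of the mean of a normal distribution (with known standard deviation) is simply the sample mean.

As a quick numerical check, here is a minimal Python sketch of this result; the seed, sample size, and true mean of 2.5 are arbitrary illustrative choices, and NumPy and SciPy are assumed to be available. It maximizes the log-likelihood numerically and compares the answer with the sample mean.

import numpy as np
from scipy.optimize import minimize_scalar

# Draw a sample from N(theta, 1); the "true" theta = 2.5 is only for illustration.
rng = np.random.default_rng(0)
x = rng.normal(loc=2.5, scale=1.0, size=1000)

def neg_log_likelihood(theta):
    # Negative log-likelihood of N(theta, 1), constant term included.
    return 0.5 * x.size * np.log(2 * np.pi) + 0.5 * np.sum((x - theta) ** 2)

result = minimize_scalar(neg_log_likelihood)
print("numerical MLE:", result.x)   # value found by the optimizer
print("sample mean:  ", x.mean())   # closed-form MLE derived above

The two printed values should agree up to numerical tolerance.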