Parametric Estimation
This section discusses how to estimate the parameters of a distribution, i.e. of the density $p(x \mid \theta)$, from a sample drawn from it.
We denote the parameters by $\theta$.
The likelihood of $\theta$ given a sample $\mathcal{X}$ is given by:

$$l(\theta \mid \mathcal{X}) = p(\mathcal{X} \mid \theta) = \prod_{t=1}^{N} p(x^t \mid \theta)$$

Therefore, the log likelihood of $\theta$ given a sample $\mathcal{X}$ is given by:

$$\mathcal{L}(\theta \mid \mathcal{X}) = \log l(\theta \mid \mathcal{X}) = \sum_{t=1}^{N} \log p(x^t \mid \theta)$$

This factorization assumes that the observations in $\mathcal{X}$ are independent.
The Maximum Likelihood Estimator (MLE) is given by:

$$\hat{\theta} = \underset{\theta}{\arg\max}\; \mathcal{L}(\theta \mid \mathcal{X})$$
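As an illustrative sketch, the MLE can be found numerically by evaluating the log likelihood on a grid of parameter values and taking the argmax; here for a Bernoulli sample (the sample and grid resolution are hypothetical choices, and the Bernoulli case is treated in closed form below):

```python
import math

# Hypothetical Bernoulli sample: 1 = success, 0 = failure.
X = [1, 1, 0, 1, 0, 1, 1]

def log_likelihood(theta, sample):
    # L(theta | X) = sum_t [ x^t log(theta) + (1 - x^t) log(1 - theta) ]
    return sum(x * math.log(theta) + (1 - x) * math.log(1 - theta)
               for x in sample)

# Grid search over theta in (0, 1), excluding the endpoints where log blows up.
grid = [i / 1000 for i in range(1, 1000)]
theta_hat = max(grid, key=lambda t: log_likelihood(t, X))
print(theta_hat)   # ≈ 0.714, near the sample mean 5/7
```

In practice the argmax is usually obtained analytically (by setting the derivative of $\mathcal{L}$ to zero) or with a numerical optimizer rather than a grid.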
Estimating the Parameter of a Bernoulli Distribution
X is a Bernoulli random variable, taking the value 1 with probability $p_0$ and 0 with probability $1 - p_0$:

$$P(x) = p_0^{x}(1 - p_0)^{1 - x}, \quad x \in \{0, 1\}$$
For example, consider the following:
Let 1 denote Heads, and 0 denote Tails. Say $\mathcal{X} = \{1, 1, 0\}$. We need to determine $p_0$, i.e. $P(x = 1)$, the probability of Heads.
We have:

$$l(p_0 \mid \mathcal{X}) = p_0 \cdot p_0 \cdot (1 - p_0)$$

More generally, for $x^t \in \{0, 1\}$, we have:

$$l(p_0 \mid \mathcal{X}) = \prod_{t=1}^{N} p_0^{x^t} (1 - p_0)^{1 - x^t}$$

It can be proved (by setting the derivative of the log likelihood to zero) that the MLE is the sample mean:

$$\hat{p}_0 = \frac{\sum_{t} x^t}{N}$$
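The closed-form Bernoulli MLE is just the fraction of 1s in the sample, which for the coin-toss sample above gives:

```python
# Bernoulli MLE: fraction of 1s (Heads) in the sample.
X = [1, 1, 0]                 # the sample from the text
p_hat = sum(X) / len(X)
print(p_hat)                  # 2/3 ≈ 0.667
```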
Estimating the Parameters of a Multinomial Distribution
Consider a die with 6 faces numbered from 1 to 6.
If X is a multinomial random variable, it takes one of $K > 2$ possible values (here, $K = 6$).
Say X={5, 4, 6}. We can imagine indicator vectors for each observation as [0 0 0 0 1 0], [0 0 0 1 0 0] and [0 0 0 0 0 1].
Say $\mathcal{X} = \{4, 6, 4, 2, 3, 3\}$. The MLE of $p_i$, i.e. the probability that side $i$ shows up, is given by:

$$\hat{p}_i = \frac{\sum_{t} x_i^t}{N}$$

where $x_i^t = 1$ if trial $t$ shows side $i$, and 0 otherwise.
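Since $\sum_t x_i^t$ is just the number of times side $i$ appears, the multinomial MLE amounts to counting relative frequencies; a minimal sketch for the die sample above:

```python
from collections import Counter

# Multinomial MLE: estimated probability of side i is its relative
# frequency in the sample.
X = [4, 6, 4, 2, 3, 3]                      # the die rolls from the text
N = len(X)
counts = Counter(X)
p_hat = {side: counts.get(side, 0) / N for side in range(1, 7)}
print(p_hat[4])                             # 2/6 ≈ 0.333
```

Sides that never appear (here 1 and 5) get estimate 0, and the estimates sum to 1 by construction.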
Estimating the Parameters of a Gaussian Distribution
The MLE for the mean is the sample average

$$m = \frac{\sum_{t} x^t}{N}$$

and the MLE for the variance is

$$s^2 = \frac{\sum_{t} (x^t - m)^2}{N}$$
However, if we divide by $N - 1$ instead of $N$ when computing the variance, we obtain the unbiased estimate.
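A minimal sketch of both variance normalizations on a hypothetical sample (the data values are made up for illustration):

```python
# Gaussian MLEs: sample mean, plus variance with the 1/N (MLE, biased)
# versus 1/(N-1) (unbiased) normalization. Sample values are hypothetical.
X = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
N = len(X)

m = sum(X) / N                                       # MLE of the mean
ss = sum((x - m) ** 2 for x in X)                    # sum of squared deviations
s2_mle = ss / N                                      # divides by N
s2_unbiased = ss / (N - 1)                           # divides by N - 1
print(m, s2_mle, s2_unbiased)                        # 5.0 4.0 ≈4.571
```

The two estimates differ by the factor $N / (N - 1)$, which vanishes as $N$ grows.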