Bias and Variance of an Estimator
Consider the following estimators of the mean of a distribution, where $X = \{x_1, \ldots, x_N\}$ is an i.i.d. sample from the distribution:

- $d_1(X) = x_1$, the first observation alone
- $d_2(X) = m = \frac{1}{N}\sum_{i=1}^{N} x_i$, the sample mean (this is the MLE)
- $d_3(X) = 5$, a constant that ignores the data entirely

Now, draw a sample of size $N$ (say $N = 3$): $X = \{6, 1, 5\}$. Then $d_1(X) = 6$, $d_2(X) = 4$, and $d_3(X) = 5$.
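To make this concrete, here is a minimal Python sketch of these estimators applied to that sample (the names `d1`, `d2`, `d3` mirror the definitions above):

```python
# Three estimators of the mean, applied to the sample X = {6, 1, 5}
X = [6, 1, 5]

def d1(sample):
    """First observation alone."""
    return sample[0]

def d2(sample):
    """Sample mean (the MLE)."""
    return sum(sample) / len(sample)

def d3(sample):
    """Constant guess: always 5, ignoring the data."""
    return 5

print(d1(X), d2(X), d3(X))  # 6 4.0 5
```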
Each of these estimators is a function of the random sample $X$, so each is itself a random variable with its own expectation and variance.
Say we want to estimate a parameter $\theta$ (here, the mean $\mu$ of the distribution from which we are drawing $X$). A desirable property of an estimator $d$ of $\theta$ is that its expected value equal the quantity we want to estimate, i.e. $E[d(X)] = \theta$; $d$ is then called an unbiased estimator.
The bias of an estimator $d$ is given by:

$$b_\theta(d) = E[d(X)] - \theta$$

If $b_\theta(d) = 0$, $d$ is an unbiased estimator.
Is $m$ an unbiased estimator of $\mu$?

$$E[m] = E\left[\frac{1}{N}\sum_{i=1}^{N} x_i\right] = \frac{1}{N}\sum_{i=1}^{N} E[x_i] = \frac{1}{N} \cdot N\mu = \mu$$

since $E[x_i] = \mu$ (by definition). Therefore, $m$ is an unbiased estimator of the mean $\mu$.
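The claim $E[m] = \mu$ can be checked empirically. A small Monte Carlo sketch (the Gaussian distribution, $\mu = 10$, $\sigma = 3$, and the sample size are arbitrary choices for illustration):

```python
import random

random.seed(0)
mu, sigma, N, trials = 10.0, 3.0, 5, 20000

# Draw many i.i.d. samples and average the sample mean m over them;
# if m is unbiased, this average should approach mu.
avg_m = sum(
    sum(random.gauss(mu, sigma) for _ in range(N)) / N
    for _ in range(trials)
) / trials
print(avg_m)  # close to 10.0
```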
Is $d_3(X) = 5$ an unbiased estimator of $\mu$?
The variance of an estimator $d$ is given by:

$$\operatorname{Var}(d) = E\left[(d - E[d])^2\right]$$
More data leads to lower variance: for the sample mean, $\operatorname{Var}(m) = \sigma^2 / N$, which shrinks as $N$ grows.
Clearly, $E[d_3] = 5 = \mu$ iff $\mu = 5$. Therefore, $d_3$ is not, in general, an unbiased estimator of the mean.
$d_3$ has the least variance, namely zero (it is always 5!). $m$ has a lower variance than $d_1$, since $\operatorname{Var}(m) = \sigma^2/N \le \sigma^2 = \operatorname{Var}(d_1)$.
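A quick simulation comparing the three variances (again with arbitrary illustrative parameters $\mu = 10$, $\sigma = 3$, $N = 5$):

```python
import random
import statistics

random.seed(1)
mu, sigma, N, trials = 10.0, 3.0, 5, 20000

d1_vals, m_vals, d3_vals = [], [], []
for _ in range(trials):
    sample = [random.gauss(mu, sigma) for _ in range(N)]
    d1_vals.append(sample[0])          # first observation
    m_vals.append(sum(sample) / N)     # sample mean
    d3_vals.append(5.0)                # constant estimator

v1 = statistics.pvariance(d1_vals)    # about sigma^2 = 9
vm = statistics.pvariance(m_vals)     # about sigma^2 / N = 1.8
v3 = statistics.pvariance(d3_vals)    # exactly 0
print(v1, vm, v3)
```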
The square error of an estimator is given by:

$$r(d, \theta) = E\left[(d - \theta)^2\right] = \operatorname{Var}(d) + \left(b_\theta(d)\right)^2$$

that is, the mean square error decomposes into variance plus squared bias.
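This decomposition can be verified numerically. The sketch below uses a deliberately biased estimator, a sample mean shrunk by a factor of 0.9 (an arbitrary choice, made only so that the bias term is nonzero), and checks that the empirical square error matches variance plus squared bias:

```python
import random
import statistics

random.seed(2)
mu, sigma, N, trials = 10.0, 3.0, 5, 20000
theta = mu  # the quantity being estimated

# A deliberately biased estimator: shrink the sample mean by 0.9.
vals = []
for _ in range(trials):
    sample = [random.gauss(mu, sigma) for _ in range(N)]
    vals.append(0.9 * sum(sample) / N)

mse = sum((v - theta) ** 2 for v in vals) / trials
var = statistics.pvariance(vals)
bias = sum(vals) / trials - theta
print(mse, var + bias ** 2)  # the two quantities coincide
```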