Statistics [10]: Evaluation of Point Estimation


Properties of point estimation, mean square error, and minimum variance unbiased estimation.


Properties of Point Estimation

Consistency

Assume $\theta$ is an unknown parameter and $\hat{\theta}_n$ is an estimator obtained from samples $X_1, \dots, X_n$. If for any $\varepsilon > 0$ we have $\lim_{n \to \infty} P\left(|\hat{\theta}_n - \theta| \geq \varepsilon\right) = 0$, then $\hat{\theta}_n$ is called a consistent estimator of $\theta$.
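The definition can be illustrated with a quick simulation sketch (the exponential population with mean $2$ and the sample sizes are arbitrary choices for illustration): the deviation of $\bar{X}$ from the true mean should shrink as $n$ grows.

```python
import random

random.seed(0)

def sample_mean(n, mean=2.0):
    """Draw n exponential variates with the given mean and return X-bar."""
    return sum(random.expovariate(1.0 / mean) for _ in range(n)) / n

# As n grows, the sample mean should concentrate around the true mean 2.0,
# illustrating that X-bar is a consistent estimator of the population mean.
for n in (10, 1_000, 100_000):
    print(n, abs(sample_mean(n) - 2.0))
```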

Bias

Assume $\theta$ is an unknown parameter and $\hat{\theta}$ is an estimator obtained from samples. If for any $\theta$ we have $E(\hat{\theta}) = \theta$, then $\hat{\theta}$ is called an unbiased estimator of $\theta$.

Efficiency

Assume $\hat{\theta}_1$ and $\hat{\theta}_2$ are two unbiased estimators of $\theta$. If for any $\theta$ we have $\mathrm{Var}(\hat{\theta}_1) \leq \mathrm{Var}(\hat{\theta}_2)$, and there exists at least one $\theta$ such that $\mathrm{Var}(\hat{\theta}_1) < \mathrm{Var}(\hat{\theta}_2)$, then $\hat{\theta}_1$ is said to be more efficient than $\hat{\theta}_2$.


Example 1

Assume $X_1, \dots, X_n$ are samples from the uniform distribution $U(0, \theta)$. The moment estimator is $\hat{\theta}_1 = 2\bar{X}$ and the maximum likelihood estimator is $\hat{\theta}_2 = X_{(n)} = \max_i X_i$. Consider the unbiasedness of $\hat{\theta}_1$ and $\hat{\theta}_2$.

Solution. First, $E(\hat{\theta}_1) = 2E(\bar{X}) = 2 \cdot \frac{\theta}{2} = \theta$; hence $\hat{\theta}_1$ is an unbiased estimator.

As for $\hat{\theta}_2$, when $0 \leq x \leq \theta$, the distribution function of $X_{(n)}$ is

$$F_{X_{(n)}}(x) = P(X_{(n)} \leq x) = \prod_{i=1}^n P(X_i \leq x) = \left(\frac{x}{\theta}\right)^n.$$

Then, the probability density function would be

$$f_{X_{(n)}}(x) = F'_{X_{(n)}}(x) = \frac{n x^{n-1}}{\theta^n}, \quad 0 \leq x \leq \theta.$$

Hence,

$$E(\hat{\theta}_2) = \int_0^\theta x \cdot \frac{n x^{n-1}}{\theta^n} \, dx = \frac{n}{n+1}\theta \neq \theta.$$

Hence, $\hat{\theta}_2$ is a biased estimator (while the corrected estimator $\frac{n+1}{n} X_{(n)}$ is unbiased).
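A small simulation can confirm the bias just derived (the values $\theta = 3$ and $n = 5$ are arbitrary choices): the average of the sample maximum over many replications should sit near $\frac{n}{n+1}\theta = 2.5$, below the true $\theta = 3$.

```python
import random

random.seed(0)

theta, n, reps = 3.0, 5, 200_000

# Average of the sample maximum over many replications; the derivation
# above gives E[X_(n)] = n/(n+1) * theta = 2.5 here, so the MLE
# underestimates theta on average.
avg_max = sum(max(random.uniform(0, theta) for _ in range(n))
              for _ in range(reps)) / reps
print(avg_max)  # close to 2.5, visibly below theta = 3.0
```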

Example 2

Assume $X_1, \dots, X_n$ ($n \geq 2$) are samples from the uniform distribution $U(0, \theta)$, and let $\hat{\theta}_1 = 2\bar{X}$ and $\hat{\theta}_2 = \frac{n+1}{n} X_{(n)}$; compare their efficiency.

Solution. For $\hat{\theta}_1$, we have

$$\mathrm{Var}(\hat{\theta}_1) = 4\,\mathrm{Var}(\bar{X}) = \frac{4}{n} \cdot \frac{\theta^2}{12} = \frac{\theta^2}{3n}.$$

For $\hat{\theta}_2$, firstly, from Example 1 we have

$$E(X_{(n)}) = \frac{n}{n+1}\theta, \qquad E(X_{(n)}^2) = \int_0^\theta x^2 \cdot \frac{n x^{n-1}}{\theta^n}\,dx = \frac{n}{n+2}\theta^2.$$

Then, we have

$$\mathrm{Var}(X_{(n)}) = \frac{n}{n+2}\theta^2 - \left(\frac{n}{n+1}\theta\right)^2 = \frac{n\theta^2}{(n+1)^2(n+2)},$$

so that

$$\mathrm{Var}(\hat{\theta}_2) = \left(\frac{n+1}{n}\right)^2 \mathrm{Var}(X_{(n)}) = \frac{\theta^2}{n(n+2)}.$$

Therefore, for $n \geq 2$,

$$\mathrm{Var}(\hat{\theta}_2) = \frac{\theta^2}{n(n+2)} < \frac{\theta^2}{3n} = \mathrm{Var}(\hat{\theta}_1),$$

and $\hat{\theta}_2$ is more efficient than $\hat{\theta}_1$.
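As a numerical check (with the illustrative choices $\theta = 2$, $n = 10$), the empirical variances of the two estimators should land near the theoretical values $\theta^2/(3n)$ and $\theta^2/(n(n+2))$:

```python
import random
import statistics

random.seed(0)

theta, n, reps = 2.0, 10, 100_000

est1, est2 = [], []
for _ in range(reps):
    xs = [random.uniform(0, theta) for _ in range(n)]
    est1.append(2 * sum(xs) / n)            # moment estimator 2 * X-bar
    est2.append((n + 1) / n * max(xs))      # corrected MLE (n+1)/n * X_(n)

# Theory: Var ~ theta^2/(3n) = 0.133... vs theta^2/(n(n+2)) = 0.033...,
# so the corrected MLE is markedly more efficient.
print(statistics.variance(est1), statistics.variance(est2))
```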


Mean Square Error (MSE)

The mean square error of an estimator $\hat{\theta}$ is $\mathrm{MSE}(\hat{\theta}) = E(\hat{\theta} - \theta)^2 = \mathrm{Var}(\hat{\theta}) + \left(E(\hat{\theta}) - \theta\right)^2$; for an unbiased estimator it reduces to the variance.

Example 3

Assume $X_1, \dots, X_n$ are samples from $N(\mu, \sigma^2)$. Calculate the MSE of the moment estimators and the maximum likelihood estimators of $\mu$ and $\sigma^2$.

Solution. Firstly, both the moment method and the maximum likelihood method give $\bar{X}$ as the estimator of $\mu$ and $\hat{\sigma}^2 = \frac{1}{n}\sum_{i=1}^n (X_i - \bar{X})^2$ as the estimator of $\sigma^2$; we also compare the latter with the unbiased sample variance $S^2 = \frac{1}{n-1}\sum_{i=1}^n (X_i - \bar{X})^2$.

For $\bar{X}$, which is unbiased,

$$\mathrm{MSE}(\bar{X}) = \mathrm{Var}(\bar{X}) = \frac{\sigma^2}{n}.$$

To obtain $\mathrm{Var}(S^2)$, we can use the facts that $\frac{(n-1)S^2}{\sigma^2} \sim \chi^2(n-1)$ and $\mathrm{Var}\left(\chi^2(n-1)\right) = 2(n-1)$, so that

$$\mathrm{MSE}(S^2) = \mathrm{Var}(S^2) = \left(\frac{\sigma^2}{n-1}\right)^2 \cdot 2(n-1) = \frac{2\sigma^4}{n-1}.$$

For $\hat{\sigma}^2 = \frac{n-1}{n} S^2$, the bias and variance are

$$E(\hat{\sigma}^2) - \sigma^2 = -\frac{\sigma^2}{n}, \qquad \mathrm{Var}(\hat{\sigma}^2) = \left(\frac{n-1}{n}\right)^2 \cdot \frac{2\sigma^4}{n-1} = \frac{2(n-1)\sigma^4}{n^2}.$$

Therefore,

$$\mathrm{MSE}(\hat{\sigma}^2) = \frac{2(n-1)\sigma^4}{n^2} + \frac{\sigma^4}{n^2} = \frac{2n-1}{n^2}\sigma^4 < \frac{2\sigma^4}{n-1} = \mathrm{MSE}(S^2),$$

so the biased maximum likelihood estimator has the smaller mean square error.
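A simulation sketch of the comparison (with the arbitrary choices $\mu = 0$, $\sigma^2 = 1$, $n = 5$): the empirical MSEs should approach the theoretical values $\frac{2\sigma^4}{n-1} = 0.5$ for $S^2$ and $\frac{2n-1}{n^2}\sigma^4 = 0.36$ for the maximum likelihood estimator.

```python
import random

random.seed(0)

mu, sigma2, n, reps = 0.0, 1.0, 5, 200_000

mse_s2 = mse_mle = 0.0
for _ in range(reps):
    xs = [random.gauss(mu, sigma2 ** 0.5) for _ in range(n)]
    xbar = sum(xs) / n
    ss = sum((x - xbar) ** 2 for x in xs)
    mse_s2 += (ss / (n - 1) - sigma2) ** 2   # unbiased S^2
    mse_mle += (ss / n - sigma2) ** 2        # MLE: biased but lower MSE

# Theory: MSE(S^2) = 2/(n-1) = 0.5, MSE(MLE) = (2n-1)/n^2 = 0.36 here.
print(mse_s2 / reps, mse_mle / reps)
```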


Minimum Variance Unbiased (MVU)

Assume $\hat{\theta}$ is an unbiased estimator of $\theta$. If for any unbiased estimator $\tilde{\theta}$ of $\theta$, $\mathrm{Var}(\hat{\theta}) \leq \mathrm{Var}(\tilde{\theta})$ holds for any $\theta$, then $\hat{\theta}$ is called the minimum variance unbiased estimator of $\theta$.

Cramér-Rao Inequality

Suppose $\hat{\theta} = \hat{\theta}(X_1, \dots, X_n)$ is an unbiased estimator of $\theta$; the variance of any such unbiased estimator is then bounded by

$$\mathrm{Var}(\hat{\theta}) \geq \frac{1}{n I(\theta)},$$

where

$$I(\theta) = E\left[\left(\frac{\partial \ln f(X; \theta)}{\partial \theta}\right)^2\right]$$

is called the Fisher information, $n$ is the number of samples, and $f(x; \theta)$ is the probability density function.

Proof. Firstly, since $\int f(x; \theta)\,dx = 1$, differentiating both sides with respect to $\theta$ gives

$$\int \frac{\partial f(x; \theta)}{\partial \theta}\,dx = \int \frac{\partial \ln f(x; \theta)}{\partial \theta} f(x; \theta)\,dx = E\left[\frac{\partial \ln f(X; \theta)}{\partial \theta}\right] = 0.$$

Denoting the score

$$Z = \sum_{i=1}^n \frac{\partial \ln f(X_i; \theta)}{\partial \theta},$$

then

$$E(Z) = 0, \qquad \mathrm{Var}(Z) = n\,\mathrm{Var}\left(\frac{\partial \ln f(X; \theta)}{\partial \theta}\right) = n I(\theta).$$

On the other hand, differentiating $E(\hat{\theta}) = \theta$ with respect to $\theta$ gives

$$1 = \int \hat{\theta}(x_1, \dots, x_n)\, \frac{\partial}{\partial \theta} \prod_{i=1}^n f(x_i; \theta)\, dx_1 \cdots dx_n = E(\hat{\theta} Z) = \mathrm{Cov}(\hat{\theta}, Z).$$

Therefore, by the Cauchy-Schwarz inequality,

$$1 = \mathrm{Cov}(\hat{\theta}, Z)^2 \leq \mathrm{Var}(\hat{\theta})\,\mathrm{Var}(Z) = n I(\theta)\,\mathrm{Var}(\hat{\theta}),$$

which gives the bound. $\blacksquare$

Example 4

Assume $X_1, \dots, X_n$ are samples from the exponential distribution with density $f(x; \theta) = \frac{1}{\theta} e^{-x/\theta}$, $x > 0$. Verify that $\bar{X}$ is the minimum variance unbiased estimator of $\theta$.

Solution. Since $E(X) = \theta$ and $\mathrm{Var}(X) = \theta^2$, $\bar{X}$ is unbiased with $\mathrm{Var}(\bar{X}) = \frac{\theta^2}{n}$. For the Fisher information,

$$\ln f(x; \theta) = -\ln\theta - \frac{x}{\theta}, \qquad \frac{\partial \ln f}{\partial \theta} = -\frac{1}{\theta} + \frac{x}{\theta^2},$$

so

$$I(\theta) = E\left[\left(\frac{X - \theta}{\theta^2}\right)^2\right] = \frac{\mathrm{Var}(X)}{\theta^4} = \frac{1}{\theta^2}.$$

Therefore,

$$\mathrm{Var}(\bar{X}) = \frac{\theta^2}{n} = \frac{1}{n I(\theta)},$$

so $\bar{X}$ attains the Cramér-Rao lower bound and is the minimum variance unbiased estimator.
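As a numerical check of the bound being attained (with the illustrative choices $\theta = 2$, $n = 20$), the empirical variance of $\bar{X}$ should be close to $\theta^2/n = 0.2$:

```python
import random

random.seed(0)

theta, n, reps = 2.0, 20, 100_000   # theta is the exponential mean

means = [sum(random.expovariate(1.0 / theta) for _ in range(n)) / n
         for _ in range(reps)]
var_xbar = sum((m - theta) ** 2 for m in means) / reps

# Cramer-Rao bound: 1/(n I(theta)) = theta^2 / n = 0.2; X-bar attains it.
print(var_xbar, theta ** 2 / n)
```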

Example 5

Assume $X_1, \dots, X_n$ are samples from the Poisson distribution $P(\lambda)$. Verify that $\bar{X}$ is the minimum variance unbiased estimator of $\lambda$.

Solution. Since $E(X) = \mathrm{Var}(X) = \lambda$, $\bar{X}$ is unbiased with $\mathrm{Var}(\bar{X}) = \frac{\lambda}{n}$. From $p(x; \lambda) = \frac{\lambda^x}{x!} e^{-\lambda}$,

$$\frac{\partial \ln p}{\partial \lambda} = \frac{x}{\lambda} - 1, \qquad I(\lambda) = E\left[\left(\frac{X - \lambda}{\lambda}\right)^2\right] = \frac{\mathrm{Var}(X)}{\lambda^2} = \frac{1}{\lambda}.$$

Therefore,

$$\mathrm{Var}(\bar{X}) = \frac{\lambda}{n} = \frac{1}{n I(\lambda)},$$

so $\bar{X}$ attains the Cramér-Rao lower bound and is the minimum variance unbiased estimator.

Example 6

Assume $X_1, \dots, X_n$ are samples from the normal distribution $N(\mu, \sigma^2)$, where $\sigma^2$ is known. Verify that $\bar{X}$ is the minimum variance unbiased estimator of $\mu$.

Solution. Clearly $\bar{X}$ is unbiased with $\mathrm{Var}(\bar{X}) = \frac{\sigma^2}{n}$. From $\ln f(x; \mu) = -\frac{1}{2}\ln(2\pi\sigma^2) - \frac{(x-\mu)^2}{2\sigma^2}$,

$$\frac{\partial \ln f}{\partial \mu} = \frac{x - \mu}{\sigma^2}, \qquad I(\mu) = E\left[\left(\frac{X - \mu}{\sigma^2}\right)^2\right] = \frac{1}{\sigma^2}.$$

Therefore,

$$\mathrm{Var}(\bar{X}) = \frac{\sigma^2}{n} = \frac{1}{n I(\mu)},$$

so $\bar{X}$ attains the Cramér-Rao lower bound and is the minimum variance unbiased estimator.


Improvement of Unbiased Estimation

Rao-Blackwell Inequality

Assume the probability density function of the population is $f(x; \theta)$ and $X_1, \dots, X_n$ are samples. If $T = T(X_1, \dots, X_n)$ is a sufficient statistic for $\theta$, then for any unbiased estimator $\hat{\theta}$ of $\theta$, $\tilde{\theta} = E(\hat{\theta} \mid T)$ is also an unbiased estimator of $\theta$, and $\mathrm{Var}(\tilde{\theta}) \leq \mathrm{Var}(\hat{\theta})$.

Sufficient Statistics

Assume $X_1, \dots, X_n$ are samples from a population with distribution function $F(x; \theta)$. If, given $T(X_1, \dots, X_n) = t$, the conditional distribution of $(X_1, \dots, X_n)$ is independent of $\theta$, then $T$ is called a sufficient statistic for $\theta$.

Theorem. Assume the probability (density) function of the population is $f(x; \theta)$ and $X_1, \dots, X_n$ are samples from the population. Then $T = T(X_1, \dots, X_n)$ is a sufficient statistic for $\theta$ if and only if there exist two functions $g(t, \theta)$ and $h(x_1, \dots, x_n)$ such that, for any $\theta$ and any sample values $x_1, \dots, x_n$,

$$f(x_1, \dots, x_n; \theta) = g\left(T(x_1, \dots, x_n), \theta\right)\, h(x_1, \dots, x_n).$$

Example 7

Assume $X_1, \dots, X_n$ are samples from a normal distribution $N(\mu, \sigma^2)$ with $\sigma^2$ known. Then $\bar{X}$ is a sufficient statistic for $\mu$.

Solution. Using $\sum_{i=1}^n (x_i - \mu)^2 = \sum_{i=1}^n (x_i - \bar{x})^2 + n(\bar{x} - \mu)^2$, the joint density factorizes as

$$f(x_1, \dots, x_n; \mu) = (2\pi\sigma^2)^{-n/2} \exp\left(-\frac{1}{2\sigma^2}\sum_{i=1}^n (x_i - \mu)^2\right) = \underbrace{\exp\left(-\frac{n(\bar{x} - \mu)^2}{2\sigma^2}\right)}_{g(\bar{x},\,\mu)}\, \underbrace{(2\pi\sigma^2)^{-n/2} \exp\left(-\frac{1}{2\sigma^2}\sum_{i=1}^n (x_i - \bar{x})^2\right)}_{h(x_1, \dots, x_n)}.$$

By the factorization theorem, $\bar{X}$ is a sufficient statistic for $\mu$.

Example 8

Assume the number of customers arriving each minute follows the Poisson distribution $P(\lambda)$, and $X_1, \dots, X_n$ are samples over consecutive minutes. Estimate the probability $p = P(X = 0) = e^{-\lambda}$ that no customer comes within one minute.

Solution.

$$\hat{p} = \mathbf{1}\{X_1 = 0\}$$

is an unbiased estimator of $p = e^{-\lambda}$, since $E(\hat{p}) = P(X_1 = 0) = e^{-\lambda}$.

$$T = \sum_{i=1}^n X_i$$

is a sufficient statistic for $\lambda$ (by the factorization theorem). Given $T = t$, $X_1 \sim B\left(t, \frac{1}{n}\right)$, so

$$\tilde{p} = E(\hat{p} \mid T) = P(X_1 = 0 \mid T) = \left(1 - \frac{1}{n}\right)^{T}$$

is an improved unbiased estimator of $e^{-\lambda}$, with $\mathrm{Var}(\tilde{p}) \leq \mathrm{Var}(\hat{p})$ by the Rao-Blackwell inequality.
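The variance reduction can be verified by simulation (the choices $\lambda = 1$ and $n = 5$ are arbitrary, and the Poisson sampler below is a simple CDF-inversion sketch): both estimators average near $e^{-\lambda} \approx 0.368$, but the Rao-Blackwellized one has much smaller variance.

```python
import math
import random
import statistics

random.seed(0)

lam, n, reps = 1.0, 5, 100_000

def poisson(mean):
    """Poisson variate via CDF inversion; fine for small means."""
    u, k = random.random(), 0
    p = cdf = math.exp(-mean)
    while u > cdf and p > 0.0:
        k += 1
        p *= mean / k
        cdf += p
    return k

naive, improved = [], []
for _ in range(reps):
    xs = [poisson(lam) for _ in range(n)]
    naive.append(1.0 if xs[0] == 0 else 0.0)      # p-hat = 1{X1 = 0}
    improved.append(((n - 1) / n) ** sum(xs))     # E(p-hat | T), T = sum X_i

# Both averages should be near e^{-lambda}, but the Rao-Blackwellized
# estimator should show a much smaller sample variance.
print(statistics.mean(naive), statistics.mean(improved))
print(statistics.variance(naive), statistics.variance(improved))
```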

