
Consistent estimator

From Encyclopedia of Mathematics

An abbreviated form of the term "consistent sequence of estimators", applied to a sequence of statistical estimators that converges to the value being estimated.

In probability theory there are several different notions of convergence, of which the most important for the theory of statistical estimation are convergence in probability and convergence with probability 1. If a sequence of statistical estimators converges in probability to the value being estimated, the sequence is called "weakly consistent", or simply "consistent"; the term "strongly consistent" is reserved for a sequence of estimators that converges with probability 1 to the value being estimated.

Example 1) Let $X_1, \dots, X_n$ be independent random variables with the same normal distribution $N(a, \sigma^2)$. Then the statistics

$$\bar{X}_n = \frac{1}{n} \sum_{i=1}^n X_i$$

and

$$s_n^2 = \frac{1}{n} \sum_{i=1}^n (X_i - \bar{X}_n)^2$$

are consistent estimators for the parameters $a$ and $\sigma^2$, respectively.
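The consistency of these two statistics can be checked numerically. The sketch below is an illustration, not part of the original article; the parameter values $a = 2$, $\sigma = 0.5$ and the sample sizes are chosen arbitrarily. It draws normal samples of increasing size and prints the sample mean and sample variance, which settle near the true parameters.

```python
import random

def mean_and_variance(n, a=2.0, sigma=0.5, seed=0):
    """Sample mean and (biased) sample variance of n iid N(a, sigma^2) draws."""
    rng = random.Random(seed)
    xs = [rng.gauss(a, sigma) for _ in range(n)]
    m = sum(xs) / n
    s2 = sum((x - m) ** 2 for x in xs) / n
    return m, s2

# Both estimators drift toward the true values a = 2 and sigma^2 = 0.25 as n grows.
for n in (100, 10_000, 100_000):
    m, s2 = mean_and_variance(n)
    print(f"n={n:>7}: mean={m:.4f}, variance={s2:.4f}")
```

Here consistency shows up as the printed values stabilising around $a$ and $\sigma^2$ as $n$ grows; any single run fluctuates, but the fluctuations shrink.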

Example 2) Let $X_1, \dots, X_n$ be independent random variables subject to the same probability law, with distribution function $F(x)$. In this case the empirical distribution function $F_n(x)$ constructed from the initial sample $X_1, \dots, X_n$ is a consistent estimator of $F(x)$.
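A small simulation makes this concrete. The sketch below is illustrative only: it picks the exponential law with mean 1 (an assumption made here because its distribution function has the closed form $1 - e^{-x}$) and measures the largest gap between the empirical and true distribution functions over a grid; the gap shrinks as the sample grows.

```python
import math
import random

def max_ecdf_error(n, seed=0):
    """Largest gap |F_n(x) - F(x)| over a grid, for n iid Exp(1) draws.

    The true distribution function is F(x) = 1 - exp(-x) for x >= 0.
    """
    rng = random.Random(seed)
    xs = [rng.expovariate(1.0) for _ in range(n)]
    worst = 0.0
    for k in range(51):                      # grid on [0, 5] with step 0.1
        x = k / 10
        fn = sum(1 for v in xs if v <= x) / n   # empirical F_n(x)
        f = 1.0 - math.exp(-x)                  # true F(x)
        worst = max(worst, abs(fn - f))
    return worst

for n in (100, 10_000):
    print(f"n={n:>6}: max |F_n - F| = {max_ecdf_error(n):.4f}")
```

The same check works for any law whose distribution function is available in closed form; only the sampler and the expression for $F$ change.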

Example 3) Let $X_1, \dots, X_n$ be independent random variables subject to the same Cauchy law, whose probability density is

$$p(x) = \frac{1}{\pi [1 + (x - a)^2]}, \quad -\infty < x < \infty.$$

For any natural number $n$, the statistic

$$\bar{X}_n = \frac{1}{n}(X_1 + \dots + X_n)$$

is subject to the same Cauchy law, hence the sequence $\{\bar{X}_n\}$ does not converge in probability to $a$; that is, in this example the sequence of sample means is not consistent. A consistent estimator for $a$ here is the sample median.
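The contrast between the two estimators is easy to see in simulation. The sketch below is illustrative (the centre $a = 1$ and the sample sizes are arbitrary choices); it samples the Cauchy law by the inverse-distribution-function method and compares the sample mean with the sample median across a few runs.

```python
import math
import random
import statistics

def cauchy_mean_and_median(n, a=1.0, seed=0):
    """Sample mean and sample median of n iid draws from the Cauchy law centred at a.

    Inverse-CDF sampling: if U ~ Uniform(0, 1), then a + tan(pi*(U - 1/2))
    follows the Cauchy law with centre a.
    """
    rng = random.Random(seed)
    xs = [a + math.tan(math.pi * (rng.random() - 0.5)) for _ in range(n)]
    return sum(xs) / n, statistics.median(xs)

# The median concentrates near a = 1; the mean remains Cauchy-distributed
# at every sample size and keeps fluctuating wildly from run to run.
for seed in range(3):
    m, med = cauchy_mean_and_median(10_000, seed=seed)
    print(f"seed={seed}: mean={m:+.3f}, median={med:+.4f}")
```

Printing several seeds rather than one run is deliberate: a single sample mean can land near $a$ by chance, but across runs it does not concentrate, while the median does.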

A consistent estimator has the following property: if $g$ is a continuous function and $T_n$ is a consistent estimator of a parameter $\theta$, then $g(T_n)$ is a consistent estimator of $g(\theta)$. The most common method for obtaining statistical point estimators, the maximum-likelihood method, gives consistent estimators. It must be noted that a consistent estimator of a parameter $\theta$ is not unique, since any estimator of the form $T_n + \beta_n$ is also consistent, where $\beta_n$ is a sequence of random variables converging in probability to zero. This fact reduces the value of the concept of a consistent estimator.
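The continuity property can be illustrated numerically. The sketch below is an illustration, not from the original article: it takes the exponential function as the continuous map and the sample mean of normal draws as the consistent estimator (the value $a = 0.3$ is an arbitrary choice), so the plug-in estimator converges to the exponential of the true mean.

```python
import math
import random

def exp_of_mean(n, a=0.3, seed=0):
    """g(T_n) with g = exp and T_n the sample mean of n iid N(a, 1) draws."""
    rng = random.Random(seed)
    t_n = sum(rng.gauss(a, 1.0) for _ in range(n)) / n
    return math.exp(t_n)

# Since the sample mean is consistent for a and exp is continuous,
# exp(sample mean) is consistent for exp(a).
target = math.exp(0.3)
for n in (100, 100_000):
    print(f"n={n:>6}: exp(mean) = {exp_of_mean(n):.4f}  (target {target:.4f})")
```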

References

[1] H. Cramér, "Mathematical methods of statistics" , Princeton Univ. Press (1946)
[2] I.A. Ibragimov, R.Z. Khas'minskii, "Statistical estimation: asymptotic theory" , Springer (1981) (Translated from Russian)
How to Cite This Entry:
Consistent estimator. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Consistent_estimator&oldid=14325
This article was adapted from an original article by M.S. Nikulin (originator), which appeared in Encyclopedia of Mathematics - ISBN 1402006098. See original article