Statistics


A term used in mathematical statistics as a name for functions of the results of observations.

Let a random variable $ X $ take values in the sample space $ ( \mathfrak X, {\mathcal B}, {\mathsf P} ^ {X} ) $. Any $ {\mathcal B} $-measurable mapping $ T( \cdot ) $ from $ \mathfrak X $ onto a measurable space $ ( \mathfrak Y, {\mathcal A} ) $ is then called a statistic, and the probability distribution of the statistic $ T $ is defined by the formula

$$ {\mathsf P} ^ {T} \{ B \} = {\mathsf P} \{ T( X) \in B \} = {\mathsf P} \{ X \in T ^ {-1} ( B) \} = {\mathsf P} ^ {X} \{ T ^ {-1} ( B) \} \ \ (\forall B \in {\mathcal A}). $$
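This identity can be checked numerically. The sketch below approximates $ {\mathsf P} ^ {T} \{ B \} $ and $ {\mathsf P} ^ {X} \{ T ^ {-1} ( B) \} $ by Monte Carlo; the choices $ X \sim N( 0, 1) $, $ T( x) = x ^ {2} $ and $ B = [ 0, 1] $ are illustrative assumptions only.

```python
# Pushforward distribution P^T of a statistic T, approximated by Monte Carlo.
# The choices X ~ N(0, 1), T(x) = x**2 and B = [0, 1] are illustrative only.
import numpy as np

rng = np.random.default_rng(0)

def T(x):
    """An example statistic: a measurable map from the sample space into R."""
    return x ** 2

x = rng.standard_normal(100_000)   # realizations of X
t = T(x)                           # pushed forward through T

# P^T{B} = P{T(X) in B} = P{X in T^{-1}(B)}; here T^{-1}([0, 1]) = [-1, 1].
prob_T_in_B = np.mean((t >= 0.0) & (t <= 1.0))
prob_X_in_preimage = np.mean((x >= -1.0) & (x <= 1.0))
print(prob_T_in_B, prob_X_in_preimage)   # identical, as the identity requires
```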

Examples.

1) Let $ X _ {1} , \dots, X _ {n} $ be independent, identically distributed random variables with finite variance. The statistics

$$ \overline{X} = \frac{1}{n} \sum _ {i=1} ^ { n } X _ {i} \ \textrm{ and } \ \ s ^ {2} = \frac{1}{n-1} \sum _ {i=1} ^ { n } ( X _ {i} - \overline{X} ) ^ {2} $$

are then unbiased estimators for the mathematical expectation $ {\mathsf E} X _ {1} $ and the variance $ {\mathsf D} X _ {1} $, respectively.
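A minimal computational sketch of these two estimators; the i.i.d. normal sample below, with $ {\mathsf E} X _ {1} = 2 $ and $ {\mathsf D} X _ {1} = 9 $, is an illustrative choice and not fixed by the example.

```python
# Unbiased estimators of the mean and the variance from Example 1.
# The i.i.d. normal sample is illustrative; any distribution with finite
# variance would do.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(loc=2.0, scale=3.0, size=1_000)   # X_1, ..., X_n with E X_1 = 2, D X_1 = 9

n = X.size
X_bar = X.sum() / n                          # sample mean, estimates E X_1
s2 = ((X - X_bar) ** 2).sum() / (n - 1)      # unbiased sample variance, estimates D X_1

# The same quantities via library routines (ddof=1 gives the 1/(n-1) factor).
assert np.isclose(X_bar, X.mean())
assert np.isclose(s2, X.var(ddof=1))
print(X_bar, s2)
```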

2) The terms of the variational series (series of order statistics, cf. Order statistic)

$$ X _ {(1)} \leq \dots \leq X _ {(n)} , $$

constructed from the observations $ X _ {1} , \dots, X _ {n} $, are statistics.
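Computationally, the variational series is simply the sorted sample; in the sketch below the uniform observations are an illustrative assumption, and two familiar functions of the order statistics are shown alongside.

```python
# Variational series (order statistics) of Example 2.
# The uniform sample is an illustrative choice.
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(size=10)                      # observations X_1, ..., X_n

X_ordered = np.sort(X)                        # X_(1) <= X_(2) <= ... <= X_(n)
X_min, X_max = X_ordered[0], X_ordered[-1]    # extreme order statistics
sample_median = np.median(X)                  # another statistic built from the order statistics
print(X_ordered)
print(X_min, X_max, sample_median)
```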

3) Let the random variables $ X _ {1} , \dots, X _ {n} $ form a stationary stochastic process with spectral density $ f( \cdot ) $. In this case the statistic

$$ I _ {n} ( \lambda ) = \frac{1}{2 \pi n } \left | \sum _{k=1} ^ { n } X _ {k} e ^ {- ik \lambda } \right | ^ {2} ,\ \ \lambda \in [- \pi , \pi ], $$

called the periodogram, is an asymptotically unbiased estimator for $ f( \cdot ) $ under certain regularity conditions on $ f( \cdot ) $, i.e.

$$ \lim\limits _ {n \rightarrow \infty } {\mathsf E} I _ {n} ( \lambda ) = \ f( \lambda ),\ \ \lambda \in [- \pi , \pi ]. $$
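A sketch of the periodogram evaluated on a grid of frequencies in $ [- \pi , \pi ] $; the AR(1) sample and the grid size below are illustrative assumptions, not part of the example.

```python
# Periodogram I_n(lambda) of Example 3 on a grid of frequencies in [-pi, pi].
# The AR(1) sample is illustrative; any stationary sequence would do.
import numpy as np

rng = np.random.default_rng(0)
n = 512
X = np.empty(n)
X[0] = rng.standard_normal() / np.sqrt(1 - 0.25)   # X_1 drawn from the stationary law
for j in range(1, n):                              # AR(1) recursion X_k = 0.5 X_{k-1} + eps_k
    X[j] = 0.5 * X[j - 1] + rng.standard_normal()

lambdas = np.linspace(-np.pi, np.pi, 401)
k = np.arange(1, n + 1)

# I_n(lambda) = |sum_{k=1}^n X_k e^{-i k lambda}|^2 / (2 pi n)
I_n = np.abs(X @ np.exp(-1j * np.outer(k, lambdas))) ** 2 / (2 * np.pi * n)

# For this AR(1) process, f(lambda) = 1 / (2 pi |1 - 0.5 e^{-i lambda}|^2),
# and E I_n(lambda) -> f(lambda) as n -> infinity.
print(I_n.shape, I_n.max())
```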

In the theory of estimation and of statistical hypothesis testing, great importance is attached to the concept of a sufficient statistic, which reduces the data without any loss of information about the (parametric) family of distributions under consideration.
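For example, for independent Bernoulli trials $ X _ {1} , \dots, X _ {n} $ with success probability $ \theta $,

$$ {\mathsf P} _ \theta \{ X _ {1} = x _ {1} , \dots, X _ {n} = x _ {n} \} = \theta ^ {t} ( 1 - \theta ) ^ {n - t} ,\ \ t = \sum _ {i=1} ^ { n } x _ {i} , $$

which depends on the sample only through $ t $; by the factorization criterion the statistic $ T = \sum _ {i=1} ^ { n } X _ {i} $ is therefore sufficient for $ \theta $.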
