{{TEX|done}}

''$t$-test''
  
 
A [[Significance test|significance test]] for the mean value of a [[Normal distribution|normal distribution]].
 
==The single-sample Student test.==
 
Let the independent random variables $X_1, \dots, X_n$ be subject to the normal law $N_1(a, \sigma^2)$, the parameters $a$ and $\sigma^2$ of which are unknown, and let a [[Simple hypothesis|simple hypothesis]] $H_0$: $a = a_0$ be tested against the composite alternative $H_1$: $a \neq a_0$. In solving this problem, a Student test is used, based on the statistic

$$
t_{n-1} = \sqrt{n}\, \frac{\overline{X} - a_0}{s},
$$

where

$$
\overline{X} = \frac{1}{n} \sum_{i=1}^{n} X_i \quad \textrm{ and } \quad s^2 = \frac{1}{n-1} \sum_{i=1}^{n} (X_i - \overline{X})^2
$$

are estimators of the parameters $a$ and $\sigma^2$, calculated with respect to the sample $X_1, \dots, X_n$. When $H_0$ is correct, the statistic $t_{n-1}$ is subject to the [[Student distribution|Student distribution]] with $f = n - 1$ degrees of freedom, i.e.

$$
{\mathsf P} \{ |t_{n-1}| < t \mid H_0 \} = 2 S_{n-1}(t) - 1, \qquad t > 0,
$$

where $S_f(t)$ is the Student distribution function with $f$ degrees of freedom. According to the single-sample Student test with significance level $\alpha$, $0 < \alpha < 0.5$, the hypothesis $H_0$ must be accepted if

$$
\left| \sqrt{n}\, \frac{\overline{X} - a_0}{s} \right| < t_{n-1} \left( 1 - \frac{\alpha}{2} \right),
$$

where $t_{n-1}(1 - \alpha/2)$ is the [[Quantile|quantile]] of level $1 - \alpha/2$ of the Student distribution with $f = n - 1$ degrees of freedom, i.e. $t_{n-1}(1 - \alpha/2)$ is the solution of the equation $S_{n-1}(t) = 1 - \alpha/2$. On the other hand, if

$$
\left| \sqrt{n}\, \frac{\overline{X} - a_0}{s} \right| \geq t_{n-1} \left( 1 - \frac{\alpha}{2} \right),
$$

then, according to the Student test of level $\alpha$, the tested hypothesis $H_0$: $a = a_0$ has to be rejected, and the alternative hypothesis $H_1$: $a \neq a_0$ has to be accepted.
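
As a practical illustration of this decision rule, a minimal sketch in Python can be written as follows (NumPy and SciPy are assumed to be available; the function name, the level $\alpha = 0.05$ and the simulated data are arbitrary choices made only for the example, not part of the test's definition).

<pre>
# One-sample Student test of H0: a = a0 against H1: a != a0.
# Illustrative sketch; the names and data below are assumptions, not prescribed.
import numpy as np
from scipy import stats

def one_sample_student_test(x, a0, alpha=0.05):
    """Return the statistic t_{n-1}, the critical value and the decision on H0."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    x_bar = x.mean()                    # estimator of a
    s = x.std(ddof=1)                   # square root of the unbiased variance estimator s^2
    t = np.sqrt(n) * (x_bar - a0) / s   # statistic t_{n-1}
    critical = stats.t.ppf(1 - alpha / 2, df=n - 1)   # quantile t_{n-1}(1 - alpha/2)
    return t, critical, abs(t) < critical             # accept H0 iff |t| < critical value

# Example: simulated N(0.2, 1) sample of size 25, testing a0 = 0 at level 0.05.
rng = np.random.default_rng(0)
sample = rng.normal(loc=0.2, scale=1.0, size=25)
t, crit, accept = one_sample_student_test(sample, a0=0.0)
print(f"t = {t:.3f}, critical value = {crit:.3f}, accept H0: {accept}")
</pre>

The same computation is available in SciPy as <code>scipy.stats.ttest_1samp</code>, which reports the statistic together with a two-sided p-value instead of a critical value.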
  
 
==The two-sample Student test.==
 
Let $X_1, \dots, X_n$ and $Y_1, \dots, Y_m$ be mutually independent normally-distributed random variables with the same unknown variance $\sigma^2$, and let

$$
{\mathsf E} X_1 = \dots = {\mathsf E} X_n = a_1,
$$

$$
{\mathsf E} Y_1 = \dots = {\mathsf E} Y_m = a_2,
$$

where the parameters $a_1$ and $a_2$ are also unknown (it is often said that there are two independent normal samples). Moreover, let the hypothesis $H_0$: $a_1 = a_2$ be tested against the alternative $H_1$: $a_1 \neq a_2$. In this instance, both hypotheses are composite. Using the observations $X_1, \dots, X_n$ and $Y_1, \dots, Y_m$ it is possible to calculate the estimators

$$
\overline{X} = \frac{1}{n} \sum_{i=1}^{n} X_i \quad \textrm{ and } \quad \overline{Y} = \frac{1}{m} \sum_{j=1}^{m} Y_j
$$

for the unknown mathematical expectations $a_1$ and $a_2$, as well as the estimators

$$
s_1^2 = \frac{1}{n-1} \sum_{i=1}^{n} (X_i - \overline{X})^2 \quad \textrm{ and } \quad s_2^2 = \frac{1}{m-1} \sum_{j=1}^{m} (Y_j - \overline{Y})^2
$$

for the unknown variance $\sigma^2$. Moreover, let

$$
s^2 = \frac{1}{n + m - 2} \left[ (n-1) s_1^2 + (m-1) s_2^2 \right].
$$

Then, when $H_0$ is correct, the statistic

$$
t_{n+m-2} = \frac{\overline{X} - \overline{Y}}{s \sqrt{1/n + 1/m}}
$$

is subject to the Student distribution with $f = n + m - 2$ degrees of freedom. This fact forms the basis of the two-sample Student test for testing $H_0$ against $H_1$. According to the two-sample Student test of level $\alpha$, $0 < \alpha < 0.5$, the hypothesis $H_0$ is accepted if

$$
|t_{n+m-2}| < t_{n+m-2} \left( 1 - \frac{\alpha}{2} \right),
$$

where $t_{n+m-2}(1 - \alpha/2)$ is the quantile of level $1 - \alpha/2$ of the Student distribution with $f = n + m - 2$ degrees of freedom. If

$$
|t_{n+m-2}| \geq t_{n+m-2} \left( 1 - \frac{\alpha}{2} \right),
$$

then, according to the Student test of level $\alpha$, the hypothesis $H_0$ is rejected in favour of $H_1$.
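
Analogously, the two-sample test with the pooled variance estimator $s^2$ can be sketched in Python as follows (again an illustrative sketch assuming NumPy and SciPy; the helper name and the simulated samples are arbitrary choices for the example).

<pre>
# Two-sample Student test with pooled variance: H0: a1 = a2 against H1: a1 != a2.
# Illustrative sketch; the names and data below are assumptions, not prescribed.
import numpy as np
from scipy import stats

def two_sample_student_test(x, y, alpha=0.05):
    """Return the statistic t_{n+m-2}, the critical value and the decision on H0."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    n, m = len(x), len(y)
    s1_sq, s2_sq = x.var(ddof=1), y.var(ddof=1)                 # s_1^2 and s_2^2
    s_sq = ((n - 1) * s1_sq + (m - 1) * s2_sq) / (n + m - 2)    # pooled estimator s^2
    t = (x.mean() - y.mean()) / np.sqrt(s_sq * (1.0 / n + 1.0 / m))
    critical = stats.t.ppf(1 - alpha / 2, df=n + m - 2)         # quantile of level 1 - alpha/2
    return t, critical, abs(t) < critical                       # accept H0 iff |t| < critical value

# Example: two simulated normal samples with common variance 1.
rng = np.random.default_rng(1)
x = rng.normal(loc=0.0, scale=1.0, size=20)
y = rng.normal(loc=0.5, scale=1.0, size=30)
t, crit, accept = two_sample_student_test(x, y)
print(f"t = {t:.3f}, critical value = {crit:.3f}, accept H0: {accept}")
</pre>

For samples with equal variances this coincides with <code>scipy.stats.ttest_ind(x, y, equal_var=True)</code>, up to the reporting of a p-value instead of a critical value.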
  
 
====References====
 
<table><TR><TD valign="top">[1]</TD> <TD valign="top">  H. Cramér,  "Mathematical methods of statistics" , Princeton Univ. Press  (1946)</TD></TR><TR><TD valign="top">[2]</TD> <TD valign="top">  S.S. Wilks,  "Mathematical statistics" , Wiley  (1962)</TD></TR><TR><TD valign="top">[3]</TD> <TD valign="top">  N.V. Smirnov,  I.V. Dunin-Barkovskii,  "Mathematische Statistik in der Technik" , Deutsch. Verlag Wissenschaft.  (1969)  (Translated from Russian)</TD></TR><TR><TD valign="top">[4]</TD> <TD valign="top">  L.N. Bol'shev,  N.V. Smirnov,  "Tables of mathematical statistics" , ''Libr. math. tables'' , '''46''' , Nauka  (1983)  (In Russian)  (Processed by L.S. Bark and E.S. Kedrova)</TD></TR><TR><TD valign="top">[5]</TD> <TD valign="top">  Yu.V. Linnik,  "Methoden der kleinsten Quadraten in moderner Darstellung" , Deutsch. Verlag Wissenschaft.  (1961)  (Translated from Russian)</TD></TR></table>
 