Sign test

A non-parametric test for a hypothesis $H_0$, according to which a random variable $\mu$ has a binomial distribution with parameters $(n;\ p = 0.5)$. If the hypothesis $H_0$ is true, then

$$ \mathsf{P} \left\{ \mu \leq k \;\middle|\; n, \frac{1}{2} \right\} = \sum_{i=0}^{k} \binom{n}{i} \left( \frac{1}{2} \right)^{n} = I_{0.5}(n-k,\, k+1), $$

$$ k = 0, 1, \dots, n, $$

where

$$ I_{z}(a, b) = \frac{1}{B(a, b)} \int_{0}^{z} t^{a-1} (1-t)^{b-1} \, dt, \qquad 0 \leq z \leq 1, $$

and $B(a, b)$ is the beta-function. According to the sign test with significance level $\alpha$, $0 < \alpha \leq 0.5$, the hypothesis $H_0$ is rejected if

$$ \min \{ \mu,\, n - \mu \} \leq m, $$

where $m = m(\alpha, n)$, the critical value of the test, is the integer solution of the inequalities

$$ \sum_{i=0}^{m} \binom{n}{i} \left( \frac{1}{2} \right)^{n} \leq \frac{\alpha}{2}, \qquad \sum_{i=0}^{m+1} \binom{n}{i} \left( \frac{1}{2} \right)^{n} > \frac{\alpha}{2}. $$
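The critical value can be computed directly from these inequalities by accumulating binomial probabilities. The following is a minimal sketch, not part of the original article: it assumes NumPy/SciPy are available, and the helper name sign_test_critical_value is purely illustrative. It also checks numerically the incomplete-beta identity for the binomial left tail displayed above.

# Sketch: critical value m(alpha, n) of the sign test and the
# incomplete-beta identity for the binomial left tail.
from scipy.stats import binom
from scipy.special import betainc

def sign_test_critical_value(alpha, n):
    # Largest integer m with P{mu <= m | n, 1/2} <= alpha/2 (returns -1 if none).
    m = -1
    while binom.cdf(m + 1, n, 0.5) <= alpha / 2:
        m += 1
    return m

n, alpha, k = 20, 0.05, 5
left_tail = binom.cdf(k, n, 0.5)            # P{mu <= k | n, 1/2}
identity = betainc(n - k, k + 1, 0.5)       # I_{0.5}(n - k, k + 1)
print(left_tail, identity)                  # the two values agree (about 0.0207)
print(sign_test_critical_value(alpha, n))   # m(0.05, 20) = 5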

The sign test can be used to test a hypothesis $H_0$ according to which the unknown continuous distribution of independent identically-distributed random variables $X_1, \dots, X_n$ is symmetric about zero, i.e. for any real $x$,

$$ \mathsf{P} \{ X_i < -x \} = \mathsf{P} \{ X_i > x \}. $$

In this case the sign test is based on the statistic

$$ \mu = \sum_{i=1}^{n} \delta(X_i), \qquad \delta(x) = \begin{cases} 1 & \text{if } x > 0, \\ 0 & \text{if } x < 0, \end{cases} $$

which is governed by a binomial law with parameters $(n;\ p = 0.5)$ if the hypothesis $H_0$ is true.
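In practice the statistic $\mu$ is just the number of positive observations. A small illustrative sketch under the same assumptions as above, reusing the hypothetical sign_test_critical_value helper (zero observations are dropped; for a continuous distribution they occur with probability zero):

import numpy as np

def sign_test(x, alpha=0.05):
    # Return (mu, reject) for H0: positive and negative signs are equally likely.
    x = np.asarray(x, dtype=float)
    x = x[x != 0]                     # discard zeros: they carry no sign
    n = x.size
    mu = int(np.sum(x > 0))           # number of positive observations
    m = sign_test_critical_value(alpha, n)
    return mu, min(mu, n - mu) <= m

rng = np.random.default_rng(0)
sample = rng.normal(size=20)          # distribution symmetric about zero
print(sign_test(sample))              # H0 is typically not rejected here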

Similarly, the sign test is used to test a hypothesis $H_0$ according to which the median of an unknown continuous distribution to which independent random variables $X_1, \dots, X_n$ are subject is $\xi_0$; to this end one simply replaces the given random variables by $Y_1 = X_1 - \xi_0, \dots, Y_n = X_n - \xi_0$.
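A brief sketch of this reduction, under the same assumptions as the previous sketches (the value of $\xi_0$ below is purely illustrative):

xi0 = 0.3                             # hypothesized median (illustrative value)
y = sample - xi0                      # Y_i = X_i - xi0
print(sign_test(y, alpha=0.05))       # sign test applied to the shifted sample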
