Sheppard corrections

for moments

Corrections applied when the moments of continuous random variables are estimated from discretized (rounded-off) realizations, used to diminish the systematic errors introduced by a given system of rounding-off. Such corrections were first proposed by W.F. Sheppard [1].

Let $ X $ be a continuously-distributed random variable for which the probability density $ p( x) $, $ x \in \mathbf R ^ {1} $, has an everywhere continuous derivative $ p ^ {( s)} ( x) $ of order $ s $ on $ \mathbf R ^ {1} $ such that

$$ p ^ {( s)} ( x) = O( | x | ^ {- 1- \delta } ) \ \textrm{ as } x \rightarrow \infty $$

for some $ \delta > 0 $, and let the moments (cf. Moment) $ \alpha _ {k} = {\mathsf E} X ^ {k} $ exist. Further, let a system of rounding-off the results of observations be given, i.e. an origin $ x _ {0} $ and a step $ h > 0 $. Under such a system, instead of realizations of the initial continuous random variable $ X $ one actually observes realizations $ x _ {m} = x _ {0} + mh $, $ m = 0, \pm 1 , \pm 2 \dots $ of the discrete random variable

$$ Y = x _ {0} + h \left [ \frac{X- x _ {0} }{h} + \frac{1}{2} \right ] , $$
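As an illustration (not part of the original article), the rounding map can be coded directly; the Python/`numpy` sketch below, with hypothetical origin $ x _ {0} = 0 $ and step $ h = 0.1 $, sends each observation to the nearest grid point:

```python
import numpy as np

def round_to_grid(x, x0=0.0, h=0.1):
    # Y = x0 + h * [(X - x0)/h + 1/2]; the integer part is taken as floor,
    # which replaces every observation by the nearest grid point x_m = x0 + m*h.
    return x0 + h * np.floor((x - x0) / h + 0.5)

x = np.array([0.237, -0.149, 1.084])
print(round_to_grid(x))   # each value becomes the nearest multiple of h = 0.1
```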

where $ [ a] $ is the integer part of $ a $. The moments $ a _ {i} = {\mathsf E} Y ^ {i} $, $ i = 1 \dots k $, of $ Y $ are computed from the formula

$$ a _ {i} = \sum _ {m=- \infty } ^ {+ \infty } x _ {m} ^ {i} {\mathsf P} \{ Y = x _ {m} \} = \sum _ {m=- \infty } ^ {+ \infty } x _ {m} ^ {i} \int\limits _ {x _ {m} - h/2 } ^ {x _ {m} + h/2 } p( x) dx. $$

Generally speaking, $ a _ {i} \neq \alpha _ {i} $. Thus a question arises: Is it possible to adjust the moments $ a _ {1} \dots a _ {k} $ in order to obtain "good" approximations to the moments $ \alpha _ {1} \dots \alpha _ {k} $? The Sheppard corrections give a positive answer to this question.
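A small Monte Carlo sketch of this discrepancy (added here as an illustration, not part of the original article; it assumes Python with `numpy` and takes $ X $ standard normal, so that $ \alpha _ {2} = 1 $):

```python
import numpy as np

rng = np.random.default_rng(1)
h = 0.5                               # deliberately coarse rounding step
x = rng.normal(size=1_000_000)        # realizations of X
y = h * np.floor(x / h + 0.5)         # the recorded (rounded) realizations of Y, x0 = 0

# The grouped second moment comes out near 1.02, not alpha_2 = 1;
# the excess is roughly h**2 / 12.
print(np.mean(y**2))
```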

Let $ g( t) $ be the characteristic function of the random variable $ X $, let $ f( t) $ be the characteristic function of the random variable $ Y $, and let

$$ \phi ( t) = {\mathsf E} e ^ {it \eta } = \frac{2}{th} \sin \frac{th}{2} $$

be the characteristic function of a random variable $ \eta $ which is uniformly distributed on $ [- h/2, h/2] $ and which is stochastically independent of $ X $. Under these conditions, for a small $ h $,

$$ f( t) = g( t) \phi ( t) + O( h ^ {s- 1} ), $$

hence the moments of the discrete random variable $ Y $ coincide up to $ O( h ^ {s- 1} ) $ with the moments of the random variable $ X + \eta $ and, thus, up to $ O( h ^ {s- 1} ) $, the following equalities hold:

$$ \alpha _ {1} = a _ {1} ,\ \ \alpha _ {2} = a _ {2} - \frac{1}{12} h ^ {2} ,\ \ \alpha _ {3} = a _ {3} - \frac{1}{4} a _ {1} h ^ {2} , $$

$$ \alpha _ {4} = a _ {4} - \frac{1}{2} a _ {2} h ^ {2} + \frac{7}{240} h ^ {4} ,\ \ \alpha _ {5} = a _ {5} - \frac{5}{6} a _ {3} h ^ {2} + \frac{7}{48} a _ {1} h ^ {4} , $$

$$ \alpha _ {6} = a _ {6} - \frac{5}{4} a _ {4} h ^ {2} + \frac{7}{16} a _ {2} h ^ {4} - \frac{31}{1344} h ^ {6} \dots $$

which contain the so-called Sheppard corrections for the moments $ a _ {1} \dots a _ {k} $.
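For instance, since $ \eta $ is independent of $ X $ with $ {\mathsf E} \eta = 0 $ and $ {\mathsf E} \eta ^ {2} = h ^ {2} /12 $, the first two equalities are immediate: up to $ O( h ^ {s- 1} ) $ one has $ a _ {1} = {\mathsf E} ( X + \eta ) = \alpha _ {1} $ and $ a _ {2} = {\mathsf E} ( X + \eta ) ^ {2} = \alpha _ {2} + h ^ {2} /12 $.

A numerical sketch of the corrections (an added illustration, not part of the original article; it assumes Python with `numpy`/`scipy` and takes $ X $ standard normal, whose true moments $ 0, 1, 0, 3 $ are known):

```python
import numpy as np
from scipy.stats import norm

h, x0 = 0.5, 0.0
m = np.arange(-200, 201)                  # truncate the infinite sum; the tails are negligible
xm = x0 + m * h
pm = norm.cdf(xm + h / 2) - norm.cdf(xm - h / 2)              # P{Y = x_m}
a1, a2, a3, a4 = (np.sum(xm**i * pm) for i in range(1, 5))    # grouped moments a_1..a_4

alpha1 = a1
alpha2 = a2 - h**2 / 12
alpha3 = a3 - a1 * h**2 / 4
alpha4 = a4 - a2 * h**2 / 2 + 7 * h**4 / 240

# The raw grouped moments a2, a4 are visibly biased, while the corrected
# values are close to the true moments 0, 1, 0, 3 of the standard normal.
print(a2, a4)
print(alpha1, alpha2, alpha3, alpha4)
```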

References

[1] W.F. Sheppard, "On the calculation of the most probable values of frequency-constants, for data arranged according to equidistant divisions of a scale" Proc. Lond. Math. Soc. , 29 (1898) pp. 353–380
[2] H. Cramér, "Mathematical methods of statistics" , Princeton Univ. Press (1946)
[3] S.S. Wilks, "Mathematical statistics" , Wiley (1962)
[4] B.L. van der Waerden, "Mathematische Statistik" , Springer (1957)
How to Cite This Entry:
Sheppard corrections. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Sheppard_corrections&oldid=51394
This article was adapted from an original article by M.S. Nikulin (originator), which appeared in Encyclopedia of Mathematics, ISBN 1402006098.