Random variables, transformations of
The determination of functions of given arbitrary random variables for which the probability distributions possess given properties.

Example 1. Let $ X $ be a random variable having a continuous and strictly increasing distribution function $ F $. Then the random variable $ Y = F ( X) $ has a uniform distribution on the interval $ [ 0 , 1 ] $, and the random variable $ Z = \Phi ^ {-1} ( F ( X) ) $ (where $ \Phi $ is the standard normal distribution function) has a normal distribution with parameters 0 and 1. Conversely, the formula $ X = F ^ {-1} ( \Phi ( Z) ) $ enables one to obtain a random variable $ X $ that has the given distribution function $ F $ from a random variable $ Z $ with a standard normal distribution.
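The following is a minimal numerical sketch of Example 1, assuming Python with NumPy and SciPy; the choice of the exponential distribution function $ F ( x) = 1 - e ^ {-x} $ (continuous and strictly increasing on $ ( 0 , \infty ) $) is an illustrative assumption, not something fixed by the article.

```python
# Sketch of Example 1 (Python with NumPy/SciPy assumed).
# F is taken to be the exponential distribution function F(x) = 1 - exp(-x),
# which is continuous and strictly increasing on (0, infinity).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
x = rng.exponential(scale=1.0, size=100_000)   # X distributed according to F

y = 1.0 - np.exp(-x)                           # Y = F(X): should be uniform on [0, 1]
z = stats.norm.ppf(y)                          # Z = Phi^{-1}(F(X)): should be standard normal

# Kolmogorov-Smirnov distances to the target laws (both should be small).
print(stats.kstest(y, "uniform").statistic)
print(stats.kstest(z, "norm").statistic)

# Conversely, X = F^{-1}(Phi(Z)) has distribution function F when Z is standard
# normal; for the exponential law, F^{-1}(u) = -log(1 - u).
z_new = rng.standard_normal(100_000)
x_new = -np.log(1.0 - stats.norm.cdf(z_new))
print(stats.kstest(x_new, "expon").statistic)
```

The converse construction is the usual device for simulating a random variable with a prescribed continuous distribution function from uniform or normal inputs (inverse transform sampling).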

Transformations of random variables are often used in connection with limit theorems of probability theory. For example, let a sequence of random variables $ Z _ {n} $ be asymptotically normal with parameters $ ( 0 , 1 ) $. One then poses the problem of constructing simple (and simply invertible) functions $ f _ {n} $ such that the random variables $ V _ {n} = Z _ {n} + f _ {n} ( Z _ {n} ) $ are "more normal" than $ Z _ {n} $.

Example 2. Let $ X _ {1} , X _ {2} ,\dots $ be independent random variables, each having a uniform distribution on $ [ - 1 , 1 ] $, and put

$$ Z _ {n} = \frac{X _ {1} + \dots + X _ {n} }{\sqrt {n / 3 } } . $$

By the central limit theorem,

$$ {\mathsf P} \{ Z _ {n} < x \} - \Phi ( x) = O \left ( \frac{1}{n} \right ) . $$

If one sets

$$ V _ {n} = Z _ {n} - \frac{1}{20n} ( 3 Z _ {n} - Z _ {n} ^ {3} ) , $$

then

$$ {\mathsf P} \{ V _ {n} < x \} - \Phi ( x) = O \left ( \frac{1}{n ^ {2} } \right ) . $$
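The gain in normality described in Example 2 can be illustrated by simulation. The sketch below assumes Python with NumPy and SciPy; the sample size $ n = 5 $ and the number of replications are illustrative choices, and the comparison is accurate only up to Monte-Carlo noise.

```python
# Simulation sketch of Example 2 (Python with NumPy/SciPy assumed).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n, reps = 5, 200_000

x = rng.uniform(-1.0, 1.0, size=(reps, n))     # X_1, ..., X_n uniform on [-1, 1]
z = x.sum(axis=1) / np.sqrt(n / 3.0)           # Z_n, asymptotically N(0, 1)
v = z - (3.0 * z - z**3) / (20.0 * n)          # corrected variable V_n

# Uniform distance between the empirical distribution function and Phi;
# V_n should come out noticeably closer to normal than Z_n.
print("Z_n:", stats.kstest(z, "norm").statistic)
print("V_n:", stats.kstest(v, "norm").statistic)
```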

Example 3. The random variables $ \chi _ {n} ^ {2} $, $ \sqrt {2 \chi _ {n} ^ {2} } $ and $ ( \chi _ {n} ^ {2} / n ) ^ {1/3} $ are asymptotically normal as $ n \rightarrow \infty $ (see Chi-squared distribution). The uniform deviation of the corresponding distribution functions from their normal approximations becomes less than $ 0.01 $ for $ \chi _ {n} ^ {2} $ when $ n \geq 354 $, and for $ \sqrt {2 \chi _ {n} ^ {2} } $ (the Fisher transformation) when $ n \geq 23 $; for $ ( \chi _ {n} ^ {2} / n ) ^ {1/3} $ (the Wilson–Hilferty transformation) this deviation does not exceed $ 0.0007 $ already when $ n \geq 3 $.
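The deviations quoted in Example 3 can be checked numerically. The sketch below assumes Python with NumPy and SciPy and compares the exact chi-squared distribution function with the three normal approximations on a grid; the approximating means and variances ($ N ( n , 2 n ) $ for $ \chi _ {n} ^ {2} $ itself, $ N ( \sqrt {2 n - 1 } , 1 ) $ for the Fisher transformation, $ N ( 1 - 2 / ( 9 n ) , 2 / ( 9 n ) ) $ for the Wilson–Hilferty transformation) are the standard ones; they are not stated in the article and are assumptions of this sketch, as are the grid resolution and the values of $ n $ used.

```python
# Numerical check of Example 3 (Python with NumPy/SciPy assumed):
# estimate sup_x |P(chi^2_n < x) - Phi(...)| for each normal approximation.
import numpy as np
from scipy import stats

def sup_deviation(n, grid_size=20_000):
    # Grid covering the bulk of the chi^2_n distribution (mean n, st.dev. sqrt(2n)).
    x = np.linspace(1e-9, n + 10.0 * np.sqrt(2.0 * n), grid_size)
    exact = stats.chi2.cdf(x, df=n)

    # Direct normal approximation: chi^2_n approx N(n, 2n).
    direct = stats.norm.cdf((x - n) / np.sqrt(2.0 * n))
    # Fisher transformation: sqrt(2 chi^2_n) approx N(sqrt(2n - 1), 1).
    fisher = stats.norm.cdf(np.sqrt(2.0 * x) - np.sqrt(2.0 * n - 1.0))
    # Wilson-Hilferty transformation: (chi^2_n / n)^(1/3) approx N(1 - 2/(9n), 2/(9n)).
    wh = stats.norm.cdf(((x / n) ** (1.0 / 3.0) - (1.0 - 2.0 / (9.0 * n)))
                        / np.sqrt(2.0 / (9.0 * n)))

    return {"direct": np.max(np.abs(exact - direct)),
            "Fisher": np.max(np.abs(exact - fisher)),
            "Wilson-Hilferty": np.max(np.abs(exact - wh))}

print(sup_deviation(3))    # Wilson-Hilferty already stays below about 0.001 here
print(sup_deviation(23))   # the Fisher deviation drops below about 0.01 around this n
print(sup_deviation(354))  # the direct approximation needs n of this order to reach 0.01
```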

Transformations of random variables have long been applied in problems of mathematical statistics as the basis for constructing simple asymptotic formulas of high precision. Transformations of random variables are also used in the theory of stochastic processes (for example, the method of the "single probability space").


Comments

Related to the transformations above are the Edgeworth expansions (see, e.g., [a1]; cf. also Edgeworth series).

References

[a1] V.V. Petrov, "Sums of independent random variables" , Springer (1975) (Translated from Russian)
How to Cite This Entry:
Random variables, transformations of. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Random_variables,_transformations_of&oldid=48428

This article was adapted from an original article by V.I. Pagurova, Yu.V. Prokhorov (originator), which appeared in the Encyclopedia of Mathematics, ISBN 1402006098.