Uniformly integrable set of random variables

A set of random variables (cf. Random variable) having finite expectations such that the integrated tails of their distribution functions are uniformly small. Let a set $ {\mathcal X} $ consist of random variables defined on a common probability space $ ( \Omega, {\mathcal F}, {\mathsf P} ) $. It is called uniformly integrable if

$$ {\lim\limits } _ {c \rightarrow \infty } \sup _ {X \in {\mathcal X} } {\mathsf E} ( | X | ; | X | > c ) = {\lim\limits } _ {c \rightarrow \infty } \sup _ {X \in {\mathcal X} } \int\limits _ {\{ \omega : | {X ( \omega ) } | > c \} } | X ( \omega ) | \, { {\mathsf P} ( d \omega ) } = 0 . $$
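
For example, any set of random variables dominated by a single integrable random variable is uniformly integrable: if $ | X | \leq Y $ for every $ X \in {\mathcal X} $, where $ {\mathsf E} Y < \infty $, then $ \{ | X | > c \} \subseteq \{ Y > c \} $ and hence

$$ \sup _ {X \in {\mathcal X} } {\mathsf E} ( | X | ; | X | > c ) \leq {\mathsf E} ( Y ; Y > c ) \rightarrow 0 \quad ( c \rightarrow \infty ) . $$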

Uniform integrability is a kind of compactness of sets of random variables or their distribution functions. It plays a key role in a variety of convergence problems. An example of this is the following theorem [a1].

Theorem 1.

Let a sequence $ {\mathcal X} = \{ X _ {n} \} _ {n \geq 0 } $ of random variables with $ {\mathsf E} | X _ {n} | < \infty $, $ n \geq 0 $, converge in probability to a random variable $ X $ (cf. Convergence in probability). Then $ {\mathsf E} | X | < \infty $ and $ {\lim\limits } _ {n \rightarrow \infty } {\mathsf E} | X _ {n} - X | = 0 $ if and only if the set $ {\mathcal X} $ is uniformly integrable.
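
The uniform integrability assumption in Theorem 1 cannot be dropped. For instance, on $ ( [ 0,1 ] , {\mathcal B} , {\mathsf P} ) $ with $ {\mathsf P} $ the Lebesgue measure, consider (writing $ \mathbf{1} _ {A} $ for the indicator function of $ A $)

$$ X _ {n} = n \cdot \mathbf{1} _ {( 0 , 1/n ) } , \quad n \geq 1 . $$

Then $ X _ {n} \rightarrow 0 $ in probability, but $ {\mathsf E} | X _ {n} - 0 | = 1 $ for all $ n $; accordingly, the set $ \{ X _ {n} \} $ is not uniformly integrable, since $ {\mathsf E} ( X _ {n} ; X _ {n} > c ) = 1 $ whenever $ n > c $.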

In fact, the definition of uniform integrability is stated in terms of marginal distribution functions of random variables $ X \in {\mathcal X} $ and does not necessarily require that all these random variables are defined on the same probability space.

Each finite set of random variables having finite absolute expectations is uniformly integrable. This does not hold, in general, for infinite sets.

Theorem 2.

A set $ {\mathcal X} $ of random variables is uniformly integrable if and only if there exists a non-negative increasing convex function $ G : {\mathbf R _ {+} } \rightarrow {\mathbf R _ {+} } $ such that

$$ {\lim\limits } _ {t \rightarrow \infty } { \frac{G ( t ) }{t} } = \infty $$

and

$$ \sup _ {X \in {\mathcal X} } {\mathsf E} G ( \left | X \right | ) = g < \infty . $$
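
In particular, taking $ G ( t ) = t ^ {2} $ shows that every set of random variables bounded in mean square,

$$ \sup _ {X \in {\mathcal X} } {\mathsf E} X ^ {2} < \infty , $$

is uniformly integrable; more generally, the choice $ G ( t ) = t ^ {p} $ with any $ p > 1 $ gives the same conclusion for sets bounded in $ L _ {p} $.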

The criterion above leads to a quantification of the notion of uniform integrability: the straightforward estimate

$$ \sup _ {X \in {\mathcal X} } {\mathsf E} ( \left | X \right | ; \left | X \right | > c ) \leq { \frac{c ( g - G ( 0 ) ) }{G ( c ) - G ( 0 ) } } $$

represents a uniform upper bound of the integrated tails of all random variables belonging to a uniformly integrable set.
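
The estimate can be obtained from the convexity of $ G $, as follows: for $ t > c $ (with $ c $ large enough that $ G ( c ) > G ( 0 ) $) one has $ ( G ( t ) - G ( 0 ) ) / t \geq ( G ( c ) - G ( 0 ) ) / c $, so that $ | X | \leq c ( G ( | X | ) - G ( 0 ) ) / ( G ( c ) - G ( 0 ) ) $ on the event $ \{ | X | > c \} $; therefore, using $ G ( | X | ) - G ( 0 ) \geq 0 $,

$$ {\mathsf E} ( | X | ; | X | > c ) \leq { \frac{c}{G ( c ) - G ( 0 ) } } {\mathsf E} ( G ( | X | ) - G ( 0 ) ) \leq { \frac{c ( g - G ( 0 ) ) }{G ( c ) - G ( 0 ) } } . $$

Since $ G ( c ) / c \rightarrow \infty $, the right-hand side tends to $ 0 $ as $ c \rightarrow \infty $, uniformly in $ X \in {\mathcal X} $.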

References

[a1] P.A. Meyer, "Probability and potentials", Blaisdell (1966)
This article was adapted from an original article by V. Kalashnikov (originator), which appeared in Encyclopedia of Mathematics - ISBN 1402006098.