A sequence of measurable functions $\{ f_i \}$ such that

$$ \mu \{ x : f_1 (x) < \alpha_1, \dots, f_n (x) < \alpha_n \} = \prod_{i=1}^{n} \mu \{ x : f_i (x) < \alpha_i \} $$

for any $n$ and any $\alpha_1, \dots, \alpha_n$. The simplest example of a system of independent functions is the Rademacher system.
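The defining product formula can be checked empirically. The sketch below (an illustration, not part of the original article) approximates Lebesgue measure on $[0, 1]$ by uniform random samples and verifies that the first three Rademacher functions $r_k(x) = \operatorname{sign} \sin (2^k \pi x)$ satisfy the product identity for $\alpha_i = 1/2$:

```python
import math
import random

# Sketch: the Rademacher functions r_k(x) = sign(sin(2^k * pi * x)) on
# ([0, 1], Lebesgue measure) form a system of independent functions.
# Uniform random samples of x approximate Lebesgue measure.

def rademacher(k, x):
    """k-th Rademacher function (its value on the measure-zero set of
    sign changes is irrelevant for the measures below)."""
    return -1.0 if math.sin(2.0**k * math.pi * x) < 0 else 1.0

random.seed(0)
samples = [random.random() for _ in range(200_000)]

# mu{x : r_1(x) < 1/2, r_2(x) < 1/2, r_3(x) < 1/2}: since r_k takes the
# values +1 and -1, this is the event that all three equal -1.
joint = sum(
    1 for x in samples
    if all(rademacher(k, x) < 0.5 for k in (1, 2, 3))
) / len(samples)

marginals = [
    sum(1 for x in samples if rademacher(k, x) < 0.5) / len(samples)
    for k in (1, 2, 3)
]
product = marginals[0] * marginals[1] * marginals[2]

print(abs(joint - product) < 0.01)  # product formula holds empirically
print(abs(joint - 0.125) < 0.01)    # each event has measure 1/2, so 1/8
```

Both measures come out close to $1/8$, as the product of the three marginal measures $1/2$ requires.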

Kolmogorov's criterion for the almost-everywhere convergence of a series of independent functions: For a series of independent functions $\sum_{i=1}^{\infty} f_i$ to converge almost everywhere it is necessary and sufficient that for some $C > 0$ the following three series converge:

$$ \sum_i \mu \{ x : | f_i (x) | > C \}, \qquad \sum_i \int\limits f_i^{C} (x) \, dx , $$

$$ \sum_i \left( \int\limits ( f_i^{C} (x) )^2 \, dx - \left( \int\limits f_i^{C} (x) \, dx \right)^2 \right) , $$

where

$$ f_i^{C} (x) = \begin{cases} f_i (x), & | f_i (x) | \leq C , \\ 0 , & | f_i (x) | > C . \end{cases} $$
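A standard way to see the criterion in action (a sketch, not from the article) is the series $\sum a_i r_i$ with Rademacher functions $r_i$ and truncation level $C = 1$. When $|a_i| \leq 1$, truncation does nothing, the first two series vanish, and the third reduces to $\sum a_i^2$:

```python
import math

# Sketch: Kolmogorov's criterion for f_i = a_i * r_i (Rademacher r_i), C = 1.
# With |a_i| <= 1 = C we have f_i^C = f_i, hence:
#   first series:  sum_i mu{x : |f_i(x)| > C} = 0       (converges trivially)
#   second series: sum_i integral of f_i^C dx = 0       (each r_i has mean zero)
#   third series:  sum_i Var(f_i^C) = sum_i a_i^2
# So sum a_i r_i converges almost everywhere iff sum a_i^2 < infinity.

def variance_series_partial(a, n):
    """Partial sum of the third (variance) series, sum_{i <= n} a(i)^2."""
    return sum(a(i) ** 2 for i in range(1, n + 1))

s_convergent = variance_series_partial(lambda i: 1.0 / i, 10_000)
s_divergent = variance_series_partial(lambda i: 1.0 / math.sqrt(i), 10_000)

# a_i = 1/i: the third series tends to pi^2/6, so sum r_i/i converges a.e.
print(abs(s_convergent - math.pi**2 / 6) < 1e-3)  # True
# a_i = 1/sqrt(i): the third series is harmonic and diverges,
# so sum r_i/sqrt(i) diverges almost everywhere.
print(s_divergent > 9.0)  # True (the partial sums grow like log n)
```

The contrast between $a_i = 1/i$ and $a_i = 1/\sqrt{i}$ shows that the criterion is sharp: both coefficient sequences tend to zero, but only the first gives a square-summable variance series.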

Comments

Of course, to be able to introduce the concept of a system of independent functions one needs a measure space $(X, \mu)$ on which the functions are defined and measurable (with respect to $\mu$). Moreover, $\mu$ must be positive and finite, so $\mu$ can be taken to be a probability measure (then $(X, \mu)$ is a probability space). An example is $(X, \mu) = ([0, 1], \text{Lebesgue measure})$.

In this abstract setting, instead of functions one takes random variables, thus obtaining a system of independent random variables.

The notion of a system of independent functions (random variables) should not be confused with that of a linearly independent set of elements of a vector space $V$ over a field $K$: a set of elements $\{ x_1, \dots, x_n \}$ in $V$ such that, for $c_i \in K$, $c_1 x_1 + \dots + c_n x_n = 0$ implies $c_1 = \dots = c_n = 0$; see also Vector space.
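The vector-space notion is a purely algebraic condition and can be decided by row reduction. The sketch below (a hypothetical helper, not from the article) tests linear independence over $K = \mathbb{Q}$ exactly, using rational arithmetic: the vectors are independent iff elimination produces one pivot per vector.

```python
from fractions import Fraction

# Sketch: linear independence in the vector-space sense over K = Q.
# x_1, ..., x_n are independent iff c_1 x_1 + ... + c_n x_n = 0 forces
# c_1 = ... = c_n = 0, i.e. iff Gaussian elimination on the matrix whose
# rows are the x_i yields n pivots.  Fractions keep the arithmetic exact.

def linearly_independent(vectors):
    """Exact rank test over the rationals by row reduction."""
    rows = [[Fraction(v) for v in vec] for vec in vectors]
    pivots = 0
    for col in range(len(rows[0]) if rows else 0):
        # find a not-yet-used row with a nonzero entry in this column
        pivot_row = next(
            (r for r in range(pivots, len(rows)) if rows[r][col] != 0), None
        )
        if pivot_row is None:
            continue
        rows[pivots], rows[pivot_row] = rows[pivot_row], rows[pivots]
        # eliminate this column from every other row
        for r in range(len(rows)):
            if r != pivots and rows[r][col] != 0:
                factor = rows[r][col] / rows[pivots][col]
                rows[r] = [a - factor * b for a, b in zip(rows[r], rows[pivots])]
        pivots += 1
    return pivots == len(rows)

print(linearly_independent([(1, 0, 0), (0, 1, 0)]))  # True
print(linearly_independent([(1, 2, 3), (2, 4, 6)]))  # False: second = 2 * first
```

Note the two notions are unrelated in general: the Rademacher functions happen to be both stochastically independent and linearly independent, but neither property implies the other.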

References

[a1] J.-P. Kahane, "Some random series of functions" , Cambridge Univ. Press (1985)
How to Cite This Entry:
Independent functions, system of. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Independent_functions,_system_of&oldid=13159
This article was adapted from an original article by E.M. Semenov (originator), which appeared in Encyclopedia of Mathematics - ISBN 1402006098. See original article