Martingale

From Encyclopedia of Mathematics
{{MSC|60G42|60G44}}
{{TEX|done}}
[[Category:Stochastic processes]]
A [[Stochastic process|stochastic process]] $  X = ( X _ {t} , {\mathcal F} _ {t} ) $, $  t \in T \subseteq [ 0 , \infty ) $, defined on a probability space $  ( \Omega , {\mathcal F} , {\mathsf P} ) $ with a non-decreasing family of $  \sigma $-algebras $  ( {\mathcal F} _ {t} ) _ {t \in T }  $, $  {\mathcal F} _ {s} \subseteq {\mathcal F} _ {t} \subseteq {\mathcal F} $, $  s \leq  t $, such that $  {\mathsf E} | X _ {t} | < \infty $, $  X _ {t} $ is $  {\mathcal F} _ {t} $-measurable and

$$ \tag{1 }
{\mathsf E} ( X _ {t} \mid  {\mathcal F} _ {s} )  = X _ {s}  $$

(with probability 1). In the case of discrete time $  T = \{ 1 , 2 ,\dots \} $; in the case of continuous time $  T = [ 0 , \infty ) $. Related notions are stochastic processes which form a submartingale, if

$$
{\mathsf E} ( X _ {t} \mid  {\mathcal F} _ {s} )  \geq  X _ {s} ,
$$

or a supermartingale, if

$$
{\mathsf E} ( X _ {t} \mid  {\mathcal F} _ {s} )  \leq  X _ {s} .
$$
  
Example 1. If $  \xi _ {1} , \xi _ {2} ,\dots $ is a sequence of independent random variables with $  {\mathsf E} \xi _ {j} = 0 $, then $  X = ( X _ {n} , {\mathcal F} _ {n} ) $, $  n \geq  1 $, with $  X _ {n} = \xi _ {1} + \dots + \xi _ {n} $ and $  {\mathcal F} _ {n} = \sigma \{ \xi _ {1} \dots \xi _ {n} \} $ the $  \sigma $-algebra generated by $  \xi _ {1} \dots \xi _ {n} $, is a martingale.
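As an illustrative sketch (not part of the original article), the martingale property in Example 1 can be checked by Monte Carlo. Here the $ \xi _ {j} $ are taken to be symmetric Bernoulli variables — an assumption for concreteness; any zero-mean distribution works — and $ {\mathsf E} ( X _ {n+1} \mid {\mathcal F} _ {n} ) $ is estimated for a fixed observed prefix, resampling only the independent increment.

```python
import random

def conditional_mean_next(prefix, n_samples=100_000, seed=0):
    """Estimate E[X_{n+1} | F_n] for the partial-sum martingale of Example 1.
    F_n is represented by a fixed observed prefix xi_1, ..., xi_n; only the
    independent increment xi_{n+1} is resampled."""
    rng = random.Random(seed)
    x_n = sum(prefix)                  # X_n is F_n-measurable: held fixed
    total = 0.0
    for _ in range(n_samples):
        xi_next = rng.choice([-1, 1])  # symmetric, E xi = 0 (assumed law)
        total += x_n + xi_next         # one draw of X_{n+1}
    return x_n, total / n_samples

x4, cond_mean = conditional_mean_next([1, 1, -1, 1])
# cond_mean should agree with X_4 = 2 up to Monte Carlo error
```

The estimate tracks $ X _ {n} $ itself, as (1) requires, because the increment has zero mean regardless of the past.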
  
Example 2. Let $  Y = ( Y _ {n} , {\mathcal F} _ {n} ) $ be a martingale (submartingale), $  V = ( V _ {n} , {\mathcal F} _ {n} ) $ a predictable sequence (that is, $  V _ {n} $ is not only $  {\mathcal F} _ {n} $-measurable but also $  {\mathcal F} _ {n-1} $-measurable, $  n \geq  1 $), $  {\mathcal F} _ {0} = \{ \emptyset , \Omega \} $, and let
  
$$
( V \cdot Y ) _ {n}  = \
V _ {1} Y _ {1} +
\sum _ { k=2}^ { n }  V _ {k} \Delta Y _ {k} ,\ \
\Delta Y _ {k}  = Y _ {k} - Y _ {k-1} .
$$
  
Then, if the variables $  ( V \cdot Y ) _ {n} $ are integrable, the stochastic process $  ( ( V \cdot Y ) _ {n} , {\mathcal F} _ {n} ) $ forms a martingale (submartingale). In particular, if $  \xi _ {1} , \xi _ {2} ,\dots $ is a sequence of independent random variables corresponding to a Bernoulli scheme
  
$$
{\mathsf P} \{ \xi _ {i} = \pm  1 \}  =
\frac{1}{2}
,\ \
Y _ {k}  = \xi _ {1} + \dots + \xi _ {k} ,
$$

$$
{\mathcal F} _ {k}  = \sigma \{ \xi _ {1} \dots \xi _ {k} \} ,
$$
  
 
and

$$ \tag{2 }
V _ {k}  = \
\left \{
\begin{array}{ll}
2  ^ {k-1}  &\textrm{ if }  \xi _ {1} = \dots = \xi _ {k-1} = - 1 ,  \\
0   &\textrm{ otherwise } ,  \\
\end{array}
\right .$$
  
then $  ( ( V \cdot Y ) _ {n} , {\mathcal F} _ {n} ) $ is a martingale. This stochastic process is a mathematical model of a game in which a player wins one unit of capital if $  \xi _ {k} = + 1 $ and loses one unit of capital if $  \xi _ {k} = - 1 $, and $  V _ {k} $ is the stake at the $  k $-th game. The game-theoretic sense of the function $  V _ {k} $ defined by (2) is that the player doubles his stake when he loses and stops the game on his first win. In the gambling world such a system is called a martingale, which explains the origin of the mathematical term "martingale".
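The doubling system can be checked exhaustively over fair-coin sequences (an illustrative sketch, not part of the original article): every path containing at least one win yields a gain of exactly one unit, the all-loss path loses $ 2 ^ {n} - 1 $ units, and the expected gain is exactly $ 0 $, as the martingale property of $ ( V \cdot Y ) _ {n} $ requires.

```python
from itertools import product

def doubling_transform(xis):
    """(V.Y)_n for the doubling system of (2): the stake is 2^{k-1} as long
    as every previous toss was a loss (xi = -1); after the first win the
    stake is 0 (the player stops)."""
    total, stake, playing = 0, 1, True
    for xi in xis:
        if playing:
            total += stake * xi
            if xi == 1:
                playing = False    # first win: stop the game
            else:
                stake *= 2         # loss: double the stake
    return total

# Enumerate all 2^10 equally likely sequences of 10 fair tosses.
gains = [doubling_transform(seq) for seq in product([-1, 1], repeat=10)]
mean_gain = sum(gains) / len(gains)
```

The enumeration makes the expectation exact: $ 1023 $ paths gain $ +1 $, one path loses $ 1023 $, and the mean is $ 0 $ — the system produces no expected profit, only a small chance of a large loss.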
  
One of the basic facts of the theory of martingales is that the structure of a martingale (submartingale) $  X = ( X _ {t} , {\mathcal F} _ {t} ) $ is preserved under a random change of time. A precise statement of this (called the optional sampling theorem) is the following: If $  \tau _ {1} $ and $  \tau _ {2} $ are two finite stopping times (cf. [[Markov moment|Markov moment]]), if $  {\mathsf P} \{ \tau _ {1} \leq  \tau _ {2} \} = 1 $ and if
  
$$ \tag{3 }
{\mathsf E} | X _ {\tau _ {i}  } | \
< \infty ,\  \lim\limits _ { t } \
\inf  \int\limits _
{\{ \tau _ {i} > t \} }
| X _ {t} |  d {\mathsf P}  = 0 ,
$$
  
then $  {\mathsf E} ( X _ {\tau _ {2}  } \mid  {\mathcal F} _ {\tau _ {1}  } ) ( \geq  ) = X _ {\tau _ {1}  } $ (with probability 1), where

$$
{\mathcal F} _ {\tau _ {1}  }  = \
\{ {A \in {\mathcal F} } : {A \cap \{ \tau _ {1} \leq  t \}
\in {\mathcal F} _ {t}  \textrm{ for  all  }  t \in T } \} .
$$
  
 
As a particular case of this the [[Wald identity|Wald identity]] follows:

$$
{\mathsf E} ( \xi _ {1} + \dots + \xi _  \tau  )  = {\mathsf E}
\xi _ {1} {\mathsf E} \tau .
$$
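A numerical sanity check of the Wald identity (a sketch, not from the article): take $ \xi _ {i} \in \{ 0 , 1 \} $ fair coin tosses and $ \tau $ the first $ i $ with $ \xi _ {i} = 1 $. Then $ \xi _ {1} + \dots + \xi _ \tau = 1 $ always, $ {\mathsf E} \tau = 2 $, and the identity reads $ 1 = \tfrac{1}{2} \cdot 2 $, which simulation reproduces.

```python
import random

def sample_until_first_success(rng):
    """Draw fair xi_i in {0, 1} until the stopping time
    tau = min{i : xi_i = 1}; return (S_tau, tau)."""
    s = tau = 0
    while True:
        tau += 1
        xi = rng.randint(0, 1)
        s += xi
        if xi == 1:
            return s, tau

rng = random.Random(42)
draws = [sample_until_first_success(rng) for _ in range(200_000)]
mean_s = sum(s for s, _ in draws) / len(draws)    # E S_tau (here exactly 1)
mean_tau = sum(t for _, t in draws) / len(draws)  # E tau, close to 2
# Wald: E S_tau = E xi_1 * E tau = 0.5 * E tau
```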
  
Among the basic results of the theory of martingales is Doob's inequality: If $  X = ( X _ {n} , {\mathcal F} _ {n} ) $ is a non-negative submartingale,

$$
X _ {n}  ^ {*}  = \max _ {1 \leq  j \leq  n }  X _ {j} ,
$$

$$
\| X _ {n} \| _ {p}  = ( {\mathsf E} | X _ {n} |  ^ {p} )  ^ {1/p} ,\  p \geq  1 ,\  n \geq  1 ,
$$
  
 
then

$$ \tag{4 }
{\mathsf P} \{ X _ {n}  ^ {*} \geq  \epsilon \}  \leq  \
\frac{ {\mathsf E} X _ {n} } \epsilon
,
$$

$$ \tag{5 }
\| X _ {n} \| _ {p}  \leq  \| X _ {n}  ^ {*} \| _ {p}  \leq 
\frac{p}{p-1} \| X _ {n} \| _ {p} ,\  p > 1 ,
$$

$$ \tag{6 }
\| X _ {n}  ^ {*} \| _ {p}  \leq 
\frac{e}{e-1} [ 1 +
\| X _ {n}  \mathop{\rm ln}  ^ {+}  X _ {n} \| _ {p} ] ,\  p = 1 .
$$
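Inequality (4) can be verified exactly on a small example (an illustrative sketch, not part of the article): take the non-negative submartingale $ X _ {j} = | S _ {j} | $ for a simple symmetric random walk $ S _ {j} $ and enumerate all paths, so that both sides of (4) are computed without sampling error.

```python
from itertools import product

def running_max_and_final(path):
    """For X_j = |S_j| along one walk path, return (max_j X_j, X_n)."""
    s, running_max = 0, 0
    for step in path:
        s += step
        running_max = max(running_max, abs(s))
    return running_max, abs(s)

n, eps = 10, 4
stats = [running_max_and_final(p) for p in product([-1, 1], repeat=n)]
p_exceed = sum(m >= eps for m, _ in stats) / len(stats)   # P(X_n^* >= eps)
doob_bound = sum(x for _, x in stats) / len(stats) / eps  # E X_n / eps
# Doob's maximal inequality (4): p_exceed <= doob_bound
```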
+
If $  X = ( X _ {n} , {\mathcal F} _ {n} ) $ is a martingale, then for $  p > 1 $ the Burkholder inequalities hold (generalizations of the inequalities of Khinchin and Marcinkiewicz–Zygmund for sums of independent random variables):

$$ \tag{7 }
A _ {p}  \| \sqrt {[ X ] _ {n} } \| _ {p}  \leq  \| X _ {n} \| _ {p}  \leq  B _ {p}  \| \sqrt {[ X ] _ {n} } \| _ {p} ,
$$
  
where $  A _ {p} $ and $  B _ {p} $ are certain universal constants (not depending on $  X $ or $  n $), for which one can take

$$
A _ {p}  = \
\left (  
\frac{18 p  ^ {3/2} }{p - 1 }
\right ) ^ {-1} ,\ \
B _ {p}  = \
\frac{18 p  ^ {3/2} }{( p - 1 )  ^ {1/2} }
,
$$
  
 
and

$$
[ X ] _ {n}  = \
\sum _ { i=1} ^ { n }
( \Delta X _ {i} )  ^ {2} ,\ \
X _ {0}  = 0 .
$$
  
 
Taking (5) and (7) into account, it follows that

$$ \tag{8 }
A _ {p}  \| \sqrt {[ X ] _ {n} } \| _ {p} \
\leq  \| X _ {n}  ^ {*} \| _ {p} \
\leq  \widetilde{B}  _ {p}  \| \sqrt {[ X ] _ {n} } \| _ {p} ,
$$
  
 
where

$$
\widetilde{B}  _ {p}  = \
\frac{18 p  ^ {5/2} }{( p - 1 )  ^ {3/2} }
.
$$

When $  p = 1 $ inequality (8) can be generalized. Namely, Davis' inequality holds: There are universal constants $  A $ and $  B $ such that

$$
A  \| \sqrt {[ X ] _ {n} } \| _ {1} \
\leq  \| X _ {n}  ^ {*} \| _ {1} \
\leq  B  \| \sqrt {[ X ] _ {n} } \| _ {1} .
$$
  
In the proof of a different kind of theorem on the convergence of submartingales with probability 1, a key role is played by Doob's inequality for the mathematical expectation $  {\mathsf E} \beta _ {n} ( a , b ) $ of the number of upcrossings, $  \beta _ {n} ( a , b ) $, of the interval $  [ a , b ] $ by the submartingale $  X = ( X _ {n} , {\mathcal F} _ {n} ) $ in $  n $ steps; namely

$$ \tag{9 }
{\mathsf E} \beta _ {n} ( a , b )  \leq  \
\frac{ {\mathsf E} | X _ {n} | + | a | }{b - a }
.
$$
  
The basic result on the convergence of submartingales is Doob's theorem: If $  X = ( X _ {n} , {\mathcal F} _ {n} ) $ is a submartingale and $  \sup  {\mathsf E} | X _ {n} | < \infty $, then with probability 1, $  \lim\limits _ {n \rightarrow \infty }  X _ {n} $ ($  = X _  \infty  $) exists and $  {\mathsf E} | X _  \infty  | < \infty $. If the submartingale $  X $ is uniformly integrable, then, in addition to convergence with probability $  1 $, convergence in $  L _ {1} $ holds, that is,

$$
{\mathsf E} | X _ {n} - X _  \infty  |  \rightarrow  0 ,\  n \rightarrow \infty .
$$
  
A corollary of this result is Lévy's theorem on the continuity of conditional mathematical expectations: If $  {\mathsf E} | \xi | < \infty $, then

$$
{\mathsf E} ( \xi | {\mathcal F} _ {n} )  \rightarrow  {\mathsf E} ( \xi | {\mathcal F} _  \infty  ) ,
$$

where $  {\mathcal F} _ {1} \subseteq {\mathcal F} _ {2} \subseteq \dots $ and $  {\mathcal F} _  \infty  = \sigma ( \cup _ {n} {\mathcal F} _ {n} ) $.
  
A natural generalization of a martingale is the concept of a local martingale, that is, a stochastic process $  X = ( X _ {t} , {\mathcal F} _ {t} ) $ for which there is a sequence $  ( \tau _ {m} ) _ {m \geq  1 }  $ of finite stopping times $  \tau _ {m} \uparrow \infty $ (with probability 1), $  m \geq  1 $, such that for each $  m \geq  1 $ the "stopped" processes

$$
X ^ {\tau _ {m} }  = \
( X _ {t \wedge \tau _ {m}  } I ( \tau _ {m} > 0 ) , {\mathcal F} _ {t} )
$$
  
are martingales. In the case of discrete time each local martingale $  X = ( X _ {n} , {\mathcal F} _ {n} ) $ is a martingale transform, that is, can be represented in the form $  X _ {n} = ( V \cdot Y ) _ {n} $, where $  V $ is a predictable process and $  Y $ is a martingale.
  
Each submartingale $  X = ( X _ {t} , {\mathcal F} _ {t} ) $ has, moreover, a unique Doob–Meyer decomposition $  X _ {t} = M _ {t} + A _ {t} $, where $  M = ( M _ {t} , {\mathcal F} _ {t} ) $ is a local martingale and $  A = ( A _ {t} , {\mathcal F} _ {t} ) $ is a predictable non-decreasing process. In particular, if $  m = ( m _ {t} , {\mathcal F} _ {t} ) $ is a square-integrable martingale, then its square $  m  ^ {2} = ( m _ {t}  ^ {2} , {\mathcal F} _ {t} ) $ is a submartingale in whose Doob–Meyer decomposition $  m _ {t}  ^ {2} = M _ {t} + \langle  m \rangle _ {t} $ the process $  \langle  m \rangle = ( \langle  m \rangle _ {t} , {\mathcal F} _ {t} ) $ is called the quadratic characteristic of the martingale $  m $. For each square-integrable martingale $  m $ and predictable process $  V = ( V _ {t} , {\mathcal F} _ {t} ) $ such that $  \int _ {0}  ^ {t} V _ {s}  ^ {2}  d \langle  m \rangle _ {s} < \infty $ (with probability 1), $  t > 0 $, it is possible to define the [[Stochastic integral|stochastic integral]]
  
$$
( V \cdot m ) _ {t}  = \int\limits _ { 0 } ^ { t }  V _ {s}  d m _ {s} ,
$$
  
which is a local martingale. In the case of a Wiener process <img align="absmiddle" border="0" src="https://www.encyclopediaofmath.org/legacyimages/m/m062/m062570/m062570116.png" />, which is a square-integrable martingale, <img align="absmiddle" border="0" src="https://www.encyclopediaofmath.org/legacyimages/m/m062/m062570/m062570117.png" /> and the stochastic integral <img align="absmiddle" border="0" src="https://www.encyclopediaofmath.org/legacyimages/m/m062/m062570/m062570118.png" /> is none other than the Itô stochastic integral with respect to the Wiener process.
+
which is a local martingale. In the case of a Wiener process $  W = ( W _ {t} , {\mathcal F} _ {t} ) $,  
 +
which is a square-integrable martingale, $  \langle  m \rangle _ {t} = t $
 +
and the stochastic integral $  ( V \cdot W ) _ {t} $
 +
is none other than the Itô stochastic integral with respect to the Wiener process.
  
 
In the case of continuous time the Doob, Burkholder and Davis inequalities are still true (for right-continuous processes having left limits).
 
In the case of continuous time the Doob, Burkholder and Davis inequalities are still true (for right-continuous processes having left limits).
  
 
====References====
 
====References====
<table><TR><TD valign="top">[1]</TD> <TD valign="top">  J.L. Doob,   "Stochastic processes" , Chapman &amp; Hall (1953)</TD></TR><TR><TD valign="top">[2]</TD> <TD valign="top">  I.I. [I.I. Gikhman] Gihman,   A.V. [A.V. Skorokhod] Skorohod,   "The theory of stochastic processes" , '''1''' , Springer (1974) (Translated from Russian)</TD></TR></table>
+
{|
 
+
|valign="top"|{{Ref|D}}|| J.L. Doob, "Stochastic processes" , Chapman &amp; Hall (1953) {{MR|1570654}} {{MR|0058896}} {{ZBL|0053.26802}}
 
+
|-
 +
|valign="top"|{{Ref|GS}}|| I.I. Gihman, A.V. Skorohod, "The theory of stochastic processes" , '''1''' , Springer (1974) (Translated from Russian) {{MR|0346882}} {{ZBL|0291.60019}}
 +
|}
  
 
====Comments====
 
====Comments====
 
Stopping times are also called optimal times, or, in the older literature, Markov times or Markov moments, cf. [[Markov moment|Markov moment]]. The optimal sampling theorem is also called the stopping theorem or Doob's stopping theorem.
 
Stopping times are also called optimal times, or, in the older literature, Markov times or Markov moments, cf. [[Markov moment|Markov moment]]. The optimal sampling theorem is also called the stopping theorem or Doob's stopping theorem.
  
The notion of a martingale is one of the most important concepts in modern probability theory. It is basic in the theories of Markov processes and stochastic integrals, and is useful in many parts of analysis (convergence theorems in ergodic theory, derivatives and lifting in measure theory, inequalities in the theory of singular integrals, etc.). More generally one can define martingales with values in <img align="absmiddle" border="0" src="https://www.encyclopediaofmath.org/legacyimages/m/m062/m062570/m062570119.png" />, <img align="absmiddle" border="0" src="https://www.encyclopediaofmath.org/legacyimages/m/m062/m062570/m062570120.png" />, a Hilbert or a Banach space; Banach-valued martingales are used in the study of Banach spaces (Radon–Nikodým property, etc.).
+
The notion of a martingale is one of the most important concepts in modern probability theory. It is basic in the theories of Markov processes and stochastic integrals, and is useful in many parts of analysis (convergence theorems in ergodic theory, derivatives and lifting in measure theory, inequalities in the theory of singular integrals, etc.). More generally one can define martingales with values in $  \mathbf C $,  
 +
$  \mathbf R  ^ {n} $,  
 +
a Hilbert or a Banach space; Banach-valued martingales are used in the study of Banach spaces (Radon–Nikodým property, etc.).
  
 
====References====
 
====References====
<table><TR><TD valign="top">[a1]</TD> <TD valign="top">  C. Dellacherie,   P.A. Meyer,   "Probabilities and potential" , '''1–3''' , North-Holland (1978–1988) pp. Chapts. V-VIII. Theory of martingales (Translated from French)</TD></TR><TR><TD valign="top">[a2]</TD> <TD valign="top"J.L. Doob,   "Classical potential theory and its probabilistic counterpart" , Springer (1984) pp. 390</TD></TR><TR><TD valign="top">[a3]</TD> <TD valign="top">  J. Neveu,   "Discrete-parameter martingales" , North-Holland (1975) (Translated from French)</TD></TR><TR><TD valign="top">[a4]</TD> <TD valign="top">  J. Ville,   "Etude critique de la notion de collectif" , Gauthier-Villars (1939)</TD></TR><TR><TD valign="top">[a5]</TD> <TD valign="top"P. Wall,   C.C. Heyde,   "Martingale limit theory and its application" , Acad. Press (1980)</TD></TR></table>
+
{|
 +
|valign="top"|{{Ref|DM}}|| C. Dellacherie, P.A. Meyer, "Probabilities and potential" , '''1–3''' , North-Holland (1978–1988) pp. Chapts. V-VIII. Theory of martingales (Translated from French) {{MR|0939365}} {{MR|0898005}} {{MR|0727641}} {{MR|0745449}} {{MR|0566768}} {{MR|0521810}} {{ZBL|0716.60001}} {{ZBL|0494.60002}} {{ZBL|0494.60001}}
 +
|-
 +
|valign="top"|{{Ref|D2}}|| J.L. Doob, "Classical potential theory and its probabilistic counterpart" , Springer (1984) pp. 390 {{MR|0731258}} {{ZBL|0549.31001}}
 +
|-
 +
|valign="top"|{{Ref|N}}|| J. Neveu, "Discrete-parameter martingales" , North-Holland (1975) (Translated from French) {{MR|0402915}} {{ZBL|0345.60026}}
 +
|-
 +
|valign="top"|{{Ref|V}}|| J. Ville, "Etude critique de la notion de collectif" , Gauthier-Villars (1939) {{MR|}} {{ZBL|0021.14601}} {{ZBL|0021.14505}} {{ZBL|65.0547.05}}
 +
|-
 +
|valign="top"|{{Ref|WH}}|| P. Wall, C.C. Heyde, "Martingale limit theory and its application" , Acad. Press (1980) {{MR|624435}} {{ZBL|}}
 +
|}

Latest revision as of 08:14, 9 January 2024


2020 Mathematics Subject Classification: Primary: 60G42 Secondary: 60G44 [MSN][ZBL]

A stochastic process $ X = ( X _ {t} , {\mathcal F} _ {t} ) $, $ t \in T \subseteq [ 0 , \infty ) $, defined on a probability space $ ( \Omega , {\mathcal F} , {\mathsf P} ) $ with a non-decreasing family of $ \sigma $-algebras $ ( {\mathcal F} _ {t} ) _ {t \in T } $, $ {\mathcal F} _ {s} \subseteq {\mathcal F} _ {t} \subseteq {\mathcal F} $, $ s \leq t $, such that $ {\mathsf E} | X _ {t} | < \infty $, $ X _ {t} $ is $ {\mathcal F} _ {t} $-measurable and

$$ \tag{1 } {\mathsf E} ( X _ {t} \mid {\mathcal F} _ {s} ) = X _ {s} $$

(with probability 1). In the case of discrete time $ T = \{ 1 , 2 ,\dots \} $; in the case of continuous time $ T = [ 0 , \infty ) $. Related notions are stochastic processes which form a submartingale, if

$$ {\mathsf E} ( X _ {t} \mid {\mathcal F} _ {s} ) \geq X _ {s} , $$

or a supermartingale, if

$$ {\mathsf E} ( X _ {t} \mid {\mathcal F} _ {s} ) \leq X _ {s} . $$

Example 1. If $ \xi _ {1} , \xi _ {2} \dots $ is a sequence of independent random variables with $ {\mathsf E} \xi _ {j} = 0 $, then $ X = ( X _ {n} , {\mathcal F} _ {n} ) $, $ n \geq 1 $, with $ X _ {n} = \xi _ {1} + \dots + \xi _ {n} $ and $ {\mathcal F} _ {n} = \sigma \{ \xi _ {1} \dots \xi _ {n} \} $ the $ \sigma $-algebra generated by $ \xi _ {1} \dots \xi _ {n} $, is a martingale.
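The defining property (1) can be checked exactly for this example by brute-force enumeration. The following Python sketch (the helper name `conditional_mean_next` is ours, not standard) takes $ \xi _ {k} = \pm 1 $ with probability $ 1/2 $ each and verifies that the conditional expectation of $ X _ {n+1} $ given any prefix of the walk equals $ X _ {n} $:

```python
from itertools import product
from fractions import Fraction

# A sketch (helper name is ours): verify the martingale property (1) of
# Example 1 exactly, for xi_k = +/-1 with probability 1/2 each.
def conditional_mean_next(prefix):
    """E[X_{n+1} | xi_1, ..., xi_n = prefix]: average over the two
    equally likely continuations of the walk."""
    x_n = sum(prefix)
    return Fraction((x_n + 1) + (x_n - 1), 2)

# For every prefix of length 3, E[X_4 | F_3] = X_3.
for prefix in product((+1, -1), repeat=3):
    assert conditional_mean_next(prefix) == sum(prefix)
```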

Example 2. Let $ Y = ( Y _ {n} , {\mathcal F} _ {n} ) $ be a martingale (submartingale), $ V = ( V _ {n} , {\mathcal F} _ {n} ) $ a predictable sequence (that is, $ V _ {n} $ is not only $ {\mathcal F} _ {n} $-measurable but also $ {\mathcal F} _ {n-1} $-measurable, $ n \geq 1 $), $ {\mathcal F} _ {0} = \{ \emptyset , \Omega \} $, and let

$$ ( V \cdot Y ) _ {n} = \ V _ {1} Y _ {1} + \sum _ { k=2}^ { n } V _ {k} \Delta Y _ {k} ,\ \ \Delta Y _ {k} = Y _ {k} - Y _ {k-1} . $$

Then, if the variables $ ( V \cdot Y ) _ {n} $ are integrable, the stochastic process $ ( ( V \cdot Y ) _ {n} , {\mathcal F} _ {n} ) $ forms a martingale (submartingale). In particular, if $ \xi _ {1} , \xi _ {2} \dots $ is a sequence of independent random variables corresponding to a Bernoulli scheme

$$ {\mathsf P} \{ \xi _ {i} = \pm 1 \} = \frac{1}{2} ,\ \ Y _ {k} = \xi _ {1} + \dots + \xi _ {k} , $$

$$ {\mathcal F} _ {k} = \sigma \{ \xi _ {1} \dots \xi _ {k} \} , $$

and

$$ \tag{2 } V _ {k} = \ \left \{ \begin{array}{ll} 2 ^ {k-1} &\textrm{ if } \xi _ {1} = \dots = \xi _ {k-1} = - 1 , \\ 0 &\textrm{ otherwise } , \\ \end{array} \right .$$

then $ ( ( V \cdot Y ) _ {n} , {\mathcal F} _ {n} ) $ is a martingale. This stochastic process is a mathematical model of a game in which a player wins one unit of capital if $ \xi _ {k} = + 1 $ and loses one unit of capital if $ \xi _ {k} = - 1 $, and $ V _ {k} $ is the stake at the $ k $-th game. The game-theoretic sense of the function $ V _ {k} $ defined by (2) is that the player doubles his stake when he loses and stops the game on his first win. In the gambling world such a system is called a martingale, which explains the origin of the mathematical term "martingale".
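The doubling system can be worked through exactly in a few lines of Python. This is a sketch (the helper name `transform_value` is ours) of the strategy as described verbally above: initial stake one unit, double after each loss, stop after the first win. Enumerating all $ 2 ^ {n} $ equally likely outcome sequences confirms that the transform has expectation zero, i.e. the system does not beat a fair game:

```python
from itertools import product

# A sketch (helper name transform_value is ours) of the doubling system
# described in the text: initial stake 1, double after each loss, stop
# after the first win.  All 2^n outcome sequences are equally likely.
def transform_value(xis):
    """(V . Y)_n: cumulative gain of the doubling strategy along one path."""
    total, stake = 0, 1
    for xi in xis:
        if stake == 0:          # the game has already been stopped
            break
        total += stake * xi     # win or lose the current stake
        stake = 0 if xi == +1 else 2 * stake
    return total

n = 5
values = [transform_value(seq) for seq in product((+1, -1), repeat=n)]
assert sum(values) == 0                       # fair game: E(V.Y)_n = 0
assert all(v == 1 for v in values if v > 0)   # net gain is +1 after any win
assert min(values) == -(2**n - 1)             # all-loss path: lose 2^n - 1
```

The last two assertions make the gambler's dilemma explicit: the strategy wins one unit on every sequence containing a win, at the price of a single catastrophic loss of $ 2 ^ {n} - 1 $ on the all-loss path, which exactly balances the expectation.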

One of the basic facts of the theory of martingales is that the structure of a martingale (submartingale) $ X = ( X _ {t} , {\mathcal F} _ {t} ) $ is preserved under a random change of time. A precise statement of this (called the optional sampling theorem) is the following: If $ \tau _ {1} $ and $ \tau _ {2} $ are two finite stopping times (cf. Markov moment), if $ {\mathsf P} \{ \tau _ {1} \leq \tau _ {2} \} = 1 $ and if

$$ \tag{3 } {\mathsf E} | X _ {\tau _ {i} } | < \infty ,\ \ \liminf _ { t } \int\limits _ {\{ \tau _ {i} > t \} } | X _ {t} | \, d {\mathsf P} = 0 , $$

then $ {\mathsf E} ( X _ {\tau _ {2} } \mid {\mathcal F} _ {\tau _ {1} } ) = X _ {\tau _ {1} } $ ($ \geq X _ {\tau _ {1} } $ for a submartingale) with probability 1, where

$$ {\mathcal F} _ {\tau _ {1} } = \ \{ {A \in {\mathcal F} } : {A \cap \{ \tau _ {1} \leq t \} \in {\mathcal F} _ {t} \textrm{ for all } t \in T } \} . $$

As a particular case of this the Wald identity follows:

$$ {\mathsf E} ( \xi _ {1} + \dots + \xi _ \tau ) = {\mathsf E} \xi _ {1} {\mathsf E} \tau . $$
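The Wald identity can be verified exactly for a bounded stopping time by enumerating all paths. The following sketch takes fair coin tosses $ \xi _ {k} \in \{ 0 , 1 \} $ (so $ {\mathsf E} \xi _ {1} = 1/2 $) and the stopping time $ \tau = \min ( \textrm{first } k \textrm{ with } \xi _ {k} = 1 , N ) $, using exact rational arithmetic:

```python
from itertools import product
from fractions import Fraction

# A sketch: exact check of the Wald identity for the bounded stopping time
# tau = min(first k with xi_k = 1, N), where xi_k in {0, 1} are fair coin
# tosses (E xi_1 = 1/2).  All 2^N paths are enumerated, each with
# probability 2^{-N}.
N = 6
lhs = Fraction(0)      # E(xi_1 + ... + xi_tau)
e_tau = Fraction(0)    # E tau
for path in product((0, 1), repeat=N):
    tau = next((k + 1 for k, xi in enumerate(path) if xi == 1), N)
    lhs += Fraction(sum(path[:tau]), 2**N)
    e_tau += Fraction(tau, 2**N)

assert lhs == Fraction(1, 2) * e_tau   # E(xi_1 + ... + xi_tau) = E xi_1 * E tau
```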

Among the basic results of the theory of martingales is Doob's inequality: If $ X = ( X _ {n} , {\mathcal F} _ {n} ) $ is a non-negative submartingale,

$$ X _ {n} ^ {*} = \max _ {1 \leq j \leq n } X _ {j} , $$

$$ \| X _ {n} \| _ {p} = ( {\mathsf E} | X _ {n} | ^ {p} ) ^ {1/p} ,\ p \geq 1 ,\ n \geq 1 , $$

then

$$ \tag{4 } {\mathsf P} \{ X _ {n} ^ {*} \geq \epsilon \} \leq \ \frac{ {\mathsf E} X _ {n} } \epsilon , $$

$$ \tag{5 } \| X _ {n} \| _ {p} \leq \| X _ {n} ^ {*} \| _ {p} \leq \frac{p}{p-1} \| X _ {n} \| _ {p} ,\ p > 1 , $$

$$ \tag{6 } \| X _ {n} ^ {*} \| _ {p} \leq \frac{e}{e-1} [ 1 + \| X _ {n} \mathop{\rm ln} ^ {+} X _ {n} \| _ {p} ] ,\ p = 1 . $$
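Inequality (4) can be checked exactly for a small concrete submartingale. The sketch below uses $ X _ {k} = | S _ {k} | $, where $ S _ {k} $ is a $ \pm 1 $ random walk (the modulus of a martingale is a non-negative submartingale), and enumerates all $ 2 ^ {n} $ equally likely paths:

```python
from itertools import product, accumulate
from fractions import Fraction

# A sketch: exact verification of Doob's inequality (4) for the
# non-negative submartingale X_k = |S_k|, S_k a +/-1 random walk.
n, eps = 6, 3
hit = Fraction(0)    # P(X_n^* >= eps)
mean = Fraction(0)   # E X_n
for steps in product((+1, -1), repeat=n):
    walk = list(accumulate(steps))
    if max(abs(s) for s in walk) >= eps:   # maximal process X_n^*
        hit += Fraction(1, 2**n)
    mean += Fraction(abs(walk[-1]), 2**n)

assert hit <= mean / eps   # inequality (4)
```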

If $ X = ( X _ {n} , {\mathcal F} _ {n} ) $ is a martingale, then for $ p > 1 $ the Burkholder inequalities hold (generalizations of the inequalities of Khinchin and Marcinkiewicz–Zygmund for sums of independent random variables):

$$ \tag{7 } A _ {p} \| \sqrt {[ X ] _ {n} } \| _ {p} \leq \| X _ {n} \| _ {p} \leq B _ {p} \| \sqrt {[ X ] _ {n} } \| _ {p} , $$

where $ A _ {p} $ and $ B _ {p} $ are certain universal constants (not depending on $ X $ or $ n $), for which one can take

$$ A _ {p} = \ \left ( \frac{18 p ^ {3/2} }{p - 1 } \right ) ^ {-1} ,\ \ B _ {p} = \ \frac{18 p ^ {3/2} }{( p - 1 ) ^ {1/2} } , $$

and

$$ [ X ] _ {n} = \ \sum _ { i=1} ^ { n } ( \Delta X _ {i} ) ^ {2} ,\ \ X _ {0} = 0 . $$

Taking (5) and (7) into account, it follows that

$$ \tag{8 } A _ {p} \| \sqrt {[ X ] _ {n} } \| _ {p} \ \leq \| X _ {n} ^ {*} \| _ {p} \ \leq \widetilde{B} _ {p} \| \sqrt {[ X ] _ {n} } \| _ {p} , $$

where

$$ \widetilde{B} _ {p} = \ \frac{18 p ^ {5/2} }{( p - 1 ) ^ {3/2} } . $$

When $ p = 1 $ inequality (8) can be generalized. Namely, Davis' inequality holds: There are universal constants $ A $ and $ B $ such that

$$ A \| \sqrt {[ X ] _ {n} } \| _ {1} \ \leq \| X _ {n} ^ {*} \| _ {1} \ \leq B \| \sqrt {[ X ] _ {n} } \| _ {1} . $$

In the proofs of various theorems on the convergence of submartingales with probability 1, a key role is played by Doob's inequality for the mathematical expectation $ {\mathsf E} \beta _ {n} ( a , b) $ of the number $ \beta _ {n} ( a , b ) $ of upcrossings of the interval $ [ a , b ] $ by the submartingale $ X = ( X _ {n} , {\mathcal F} _ {n} ) $ in $ n $ steps; namely

$$ \tag{9 } {\mathsf E} \beta _ {n} ( a , b ) \leq \ \frac{ {\mathsf E} | X _ {n} | + | a | }{b - a } . $$
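The upcrossing count and the bound (9) can both be computed exactly for a small example. In this sketch (the helper name `upcrossings` is ours) the submartingale is again $ X _ {k} = | S _ {k} | $ for a $ \pm 1 $ random walk:

```python
from itertools import product, accumulate
from fractions import Fraction

# A sketch (helper name upcrossings is ours): count upcrossings of [a, b]
# and verify Doob's bound (9) exactly for the submartingale X_k = |S_k|.
def upcrossings(xs, a, b):
    """Number of passages of the path from a level <= a to a level >= b."""
    count, below = 0, False
    for x in xs:
        if x <= a:
            below = True
        elif x >= b and below:
            count, below = count + 1, False
    return count

n, a, b = 8, 0, 2
e_beta = Fraction(0)   # E beta_n(a, b)
e_abs = Fraction(0)    # E |X_n|
for steps in product((+1, -1), repeat=n):
    xs = [abs(s) for s in accumulate(steps)]
    e_beta += Fraction(upcrossings(xs, a, b), 2**n)
    e_abs += Fraction(xs[-1], 2**n)

assert e_beta <= (e_abs + abs(a)) / (b - a)   # inequality (9)
```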

The basic result on the convergence of submartingales is Doob's theorem: If $ X = ( X _ {n} , {\mathcal F} _ {n} ) $ is a submartingale and $ \sup {\mathsf E} | X _ {n} | < \infty $, then with probability 1 the limit $ \lim\limits _ {n \rightarrow \infty } X _ {n} $ ($ = X _ \infty $) exists and $ {\mathsf E} | X _ \infty | < \infty $. If the submartingale $ X $ is uniformly integrable, then, in addition to convergence with probability $ 1 $, convergence in $ L _ {1} $ holds, that is,

$$ {\mathsf E} | X _ {n} - X _ \infty | \rightarrow 0 ,\ n \rightarrow \infty . $$

A corollary of this result is Lévy's theorem on the continuity of conditional mathematical expectations: If $ {\mathsf E} | \xi | < \infty $, then

$$ {\mathsf E} ( \xi | {\mathcal F} _ {n} ) \rightarrow {\mathsf E} ( \xi | {\mathcal F} _ \infty ) , $$

where $ {\mathcal F} _ {1} \subseteq {\mathcal F} _ {2} \subseteq \dots $ and $ {\mathcal F} _ \infty = \sigma ( \cup _ {n} {\mathcal F} _ {n} ) $.

A natural generalization of a martingale is the concept of a local martingale, that is, a stochastic process $ X = ( X _ {t} , {\mathcal F} _ {t} ) $ for which there is a sequence $ ( \tau _ {m} ) _ {m \geq 1 } $ of finite stopping times $ \tau _ {m} \uparrow \infty $ (with probability 1), $ m \geq 1 $, such that for each $ m \geq 1 $ the "stopped" processes

$$ X ^ {\tau _ {m} } = \ ( X _ {t \wedge \tau _ {m} } I ( \tau _ {m} > 0 ) , {\mathcal F} _ {t} ) $$

are martingales. In the case of discrete time each local martingale $ X = ( X _ {n} , {\mathcal F} _ {n} ) $ is a martingale transform, that is, can be represented in the form $ X _ {n} = ( V \cdot Y ) _ {n} $, where $ V $ is a predictable process and $ Y $ is a martingale.

Each submartingale $ X = ( X _ {t} , {\mathcal F} _ {t} ) $ has, moreover, a unique Doob–Meyer decomposition $ X _ {t} = M _ {t} + A _ {t} $, where $ M = ( M _ {t} , {\mathcal F} _ {t} ) $ is a local martingale and $ A = ( A _ {t} , {\mathcal F} _ {t} ) $ is a predictable non-decreasing process. In particular, if $ m = ( m _ {t} , {\mathcal F} _ {t} ) $ is a square-integrable martingale, then its square $ m ^ {2} = ( m _ {t} ^ {2} , {\mathcal F} _ {t} ) $ is a submartingale in whose Doob–Meyer decomposition $ m _ {t} ^ {2} = M _ {t} + \langle m \rangle _ {t} $ the process $ \langle m \rangle = ( \langle m \rangle _ {t} , {\mathcal F} _ {t} ) $ is called the quadratic characteristic of the martingale $ m $. For each square-integrable martingale $ m $ and predictable process $ V = ( V _ {t} , {\mathcal F} _ {t} ) $ such that $ \int _ {0} ^ {t} V _ {s} ^ {2} d \langle m \rangle _ {s} < \infty $ (with probability 1), $ t > 0 $, it is possible to define the stochastic integral

$$ ( V \cdot m ) _ {t} = \int\limits _ { 0 } ^ { t } V _ {s} d m _ {s} , $$

which is a local martingale. In the case of a Wiener process $ W = ( W _ {t} , {\mathcal F} _ {t} ) $, which is a square-integrable martingale, $ \langle W \rangle _ {t} = t $ and the stochastic integral $ ( V \cdot W ) _ {t} $ is none other than the Itô stochastic integral with respect to the Wiener process.
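The discrete-time analogue of the quadratic characteristic is $ \langle m \rangle _ {n} = \sum _ {k \leq n} {\mathsf E} ( ( \Delta m _ {k} ) ^ {2} \mid {\mathcal F} _ {k-1} ) $. For a $ \pm 1 $ random walk this gives $ \langle m \rangle _ {n} = n $, so $ m _ {n} ^ {2} - n $ is a martingale, which the following sketch (the helper name `cond_mean_next` is ours) checks exactly:

```python
from itertools import product
from fractions import Fraction

# A sketch of the discrete-time analogue: for the +/-1 random walk m_n the
# quadratic characteristic is <m>_n = n, i.e. m_n^2 - n is a martingale.
def cond_mean_next(prefix):
    """E[m_{n+1}^2 - (n+1) | xi_1, ..., xi_n = prefix], averaging the
    two equally likely next steps."""
    m, n1 = sum(prefix), len(prefix) + 1
    return Fraction((m + 1) ** 2 - n1 + (m - 1) ** 2 - n1, 2)

# Martingale property of m_n^2 - <m>_n, checked for every length-4 prefix.
for prefix in product((+1, -1), repeat=4):
    assert cond_mean_next(prefix) == sum(prefix) ** 2 - len(prefix)
```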

In the case of continuous time the Doob, Burkholder and Davis inequalities are still true (for right-continuous processes having left limits).

References

[D] J.L. Doob, "Stochastic processes" , Chapman & Hall (1953) MR1570654 MR0058896 Zbl 0053.26802
[GS] I.I. Gihman, A.V. Skorohod, "The theory of stochastic processes" , 1 , Springer (1974) (Translated from Russian) MR0346882 Zbl 0291.60019

Comments

Stopping times are also called optional times, or, in the older literature, Markov times or Markov moments, cf. Markov moment. The optional sampling theorem is also called the stopping theorem or Doob's stopping theorem.

The notion of a martingale is one of the most important concepts in modern probability theory. It is basic in the theories of Markov processes and stochastic integrals, and is useful in many parts of analysis (convergence theorems in ergodic theory, derivatives and lifting in measure theory, inequalities in the theory of singular integrals, etc.). More generally one can define martingales with values in $ \mathbf C $, $ \mathbf R ^ {n} $, a Hilbert or a Banach space; Banach-valued martingales are used in the study of Banach spaces (Radon–Nikodým property, etc.).

References

[DM] C. Dellacherie, P.A. Meyer, "Probabilities and potential" , 1–3 , North-Holland (1978–1988) pp. Chapts. V-VIII. Theory of martingales (Translated from French) MR0939365 MR0898005 MR0727641 MR0745449 MR0566768 MR0521810 Zbl 0716.60001 Zbl 0494.60002 Zbl 0494.60001
[D2] J.L. Doob, "Classical potential theory and its probabilistic counterpart" , Springer (1984) pp. 390 MR0731258 Zbl 0549.31001
[N] J. Neveu, "Discrete-parameter martingales" , North-Holland (1975) (Translated from French) MR0402915 Zbl 0345.60026
[V] J. Ville, "Etude critique de la notion de collectif" , Gauthier-Villars (1939) Zbl 0021.14601 Zbl 0021.14505 Zbl 65.0547.05
[WH] P. Hall, C.C. Heyde, "Martingale limit theory and its application" , Acad. Press (1980) MR624435
How to Cite This Entry:
Martingale. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Martingale&oldid=14031
This article was adapted from an original article by A.N. Shiryaev (originator), which appeared in Encyclopedia of Mathematics - ISBN 1402006098. See original article