Infinitely-divisible distribution

From Encyclopedia of Mathematics
{{TEX|done}}
 
{{MSC|60E07}}

[[Category:Distribution theory]]
  
A probability distribution which, for any $ n = 2, 3 ,\dots $, may be represented as a composition (convolution) of $ n $ identical probability distributions. The definition of an infinitely-divisible distribution applies equally to distributions on the straight line, on a finite-dimensional Euclidean space, and in a number of other, even more general, cases. The one-dimensional case is considered below.
  
The characteristic function $ f( t) $ of an infinitely-divisible distribution is called infinitely divisible. Such a function may be represented, for any value of $ n $, as the $ n $-th power of some other characteristic function:
  
$$
f( t)  = ( f _ {n} ( t))  ^ {n} .
$$
  
 
Examples of infinitely-divisible distributions include the [[Normal distribution|normal distribution]], the [[Poisson distribution|Poisson distribution]], the [[Cauchy distribution|Cauchy distribution]], and the [[Chi-squared distribution|"chi-squared" distribution]]. The property of infinite divisibility is most easily tested by using characteristic functions. The composition of infinitely-divisible distributions and the limit of weakly-convergent sequences of infinitely-divisible distributions are again infinitely divisible.
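As a numerical illustration (our own sketch, not part of the original article; the helper name `poisson_cf` is invented here), one can verify the defining property $ f(t) = (f_n(t))^n $ for the Poisson distribution, whose characteristic function is $ f(t) = \exp(\lambda(e^{it}-1)) $: its $ n $-th root is again a Poisson characteristic function, with parameter $ \lambda/n $.

```python
import cmath

# Characteristic function of a Poisson distribution with parameter lam:
# f(t) = exp(lam * (e^{it} - 1)).
def poisson_cf(lam, t):
    return cmath.exp(lam * (cmath.exp(1j * t) - 1))

lam, n, t = 3.0, 7, 1.25
f = poisson_cf(lam, t)          # f(t) for Poisson(lam)
f_n = poisson_cf(lam / n, t)    # characteristic function of Poisson(lam/n)

# Infinite divisibility: f(t) = (f_n(t))^n for every n = 2, 3, ...
assert abs(f - f_n ** n) < 1e-12
```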
  
A random variable, defined on some probability space, is called infinitely divisible if it can be represented, for any $ n $, as a sum of $ n $ independent identically-distributed random variables defined on that space. The distribution of each such variable is infinitely divisible, but the converse is not always true. Consider, e.g., the discrete probability space formed by $ \{ 0, 1, 2 ,\dots \} $ with Poisson probabilities
  
$$
P ( m )  =  \frac{\lambda  ^ {m} }{m!} e ^ {- \lambda } \quad ( m = 0 , 1 ,\dots ) .
$$
  
The random variable $ X( m) = m $ is not infinitely divisible, even though its probability distribution (Poisson distribution) is infinitely divisible.
  
Infinitely-divisible distributions first appeared in connection with the study of stochastically-continuous homogeneous stochastic processes with stationary independent increments (cf. [[Stochastic process with stationary increments|Stochastic process with stationary increments]]; [[Stochastic process with independent increments|Stochastic process with independent increments]]) {{Cite|Fin}}, {{Cite|Ko}}, {{Cite|L}}. These are processes $ X ( \tau ) $, $ \tau \geq 0 $, which satisfy the following requirements: 1) $ X( 0) = 0 $; 2) the probability distribution of the increment $ X ( \tau _ {2} ) - X ( \tau _ {1} ) $, $ \tau _ {2} > \tau _ {1} $, depends only on $ \tau _ {2} - \tau _ {1} $; 3) for $ \tau _ {1} \leq \dots \leq \tau _ {k} $ the differences
  
$$
X ( \tau _ {2} ) - X ( \tau _ {1} ) ,\dots, X ( \tau _ {k} ) - X ( \tau _ {k-1} )
$$
  
are mutually-independent random variables; 4) for any $ \epsilon > 0 $,
  
$$
{\mathsf P} ( | X ( \tau ) | > \epsilon )  \rightarrow  0
$$
  
as $ \tau \rightarrow 0 $. For such a process the value $ X( \tau ) $, for any $ \tau $, will be an infinitely-divisible random variable, and the corresponding characteristic function satisfies the relation
  
$$
f _  \tau  ( t)  = ( f _ {1} ( t))  ^  \tau  .
$$
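As a concrete sketch (our own example, not from the article): for Brownian motion with drift $ a $ and diffusion coefficient $ \sigma^2 $, the increment over time $ \tau $ is normal $ N(a\tau, \sigma^2 \tau) $, and the relation $ f_\tau(t) = (f_1(t))^\tau $ can be checked directly.

```python
import cmath

# Characteristic function of X(tau) for Brownian motion with drift a and
# diffusion sigma^2 (a concrete homogeneous process with independent,
# stationary increments): f_tau(t) = exp(tau * (i*a*t - sigma^2 * t^2 / 2)).
def f_tau(tau, t, a=0.5, sigma=2.0):
    return cmath.exp(tau * (1j * a * t - sigma**2 * t**2 / 2))

t, tau = 0.8, 3.5
# The relation f_tau(t) = (f_1(t))^tau:
assert abs(f_tau(tau, t) - f_tau(1.0, t) ** tau) < 1e-12
```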
  
The general form of $ f _ \tau ( t) $ for such processes — on the assumption that the variances $ {\mathsf D} X ( \tau ) $ are finite — was found by A.N. Kolmogorov {{Cite|Ko}} (a special case of the canonical representation of infinitely-divisible distributions presented below).
  
 
The characteristic function of an infinitely-divisible distribution never vanishes, and its logarithm (in the sense of the principal value) permits a representation of the form:
  
$$ \tag{* }
\mathop{\rm ln}  f ( t )  =  i \gamma t + \int\limits _ {- \infty } ^ { + \infty } L( u, t) \frac{1+ u  ^ {2} }{u  ^ {2} }  dG( u)
$$
  
 
(the so-called Lévy–Khinchin canonical representation), where
  
$$
L( u, t )  = e  ^ {itu} - 1 - \frac{itu}{1+ u  ^ {2} } ,
$$
  
$ \gamma $ is some real constant and $ G( u) $ is a non-decreasing function of bounded variation with $ G ( - \infty ) = 0 $. The integrand is taken to be equal to $ - t ^ {2} /2 $ for $ u = 0 $. Whatever the value of the constant $ \gamma $ and of the function $ G $ with the above properties, formula (*) defines the logarithm of the characteristic function of some infinitely-divisible distribution. The correspondence between infinitely-divisible distributions and pairs $ ( \gamma , G ) $ is one-to-one and bicontinuous. This means that a sequence of infinitely-divisible distributions, with parameters $ ( \gamma _ {n} , G _ {n} ) $, converges weakly to an infinitely-divisible limit distribution, with parameters $ ( \gamma , G ) $, if and only if $ \gamma _ {n} \rightarrow \gamma $ and $ G _ {n} $ converges to $ G $ as $ n \rightarrow \infty $.
  
Examples. Let $ U( x) = 0 $ for $ x \leq 0 $, and $ U( x) = 1 $ for $ x > 0 $. Then, in order to have a normal distribution with mathematical expectation $ a $ and variance $ \sigma ^ {2} $ in formula (*), one must put
  
$$
\gamma  = a ,\ \
G( x)  = \sigma  ^ {2} U ( x) .
$$
  
For a Poisson distribution with parameter $ \lambda $ one has
  
$$
\gamma  =  \frac \lambda {2} ,\ \
G( x)  =  \frac \lambda {2} U ( x- 1).
$$
  
 
For a Cauchy distribution with density
  
$$
p( x)  =  \frac{1}{\pi ( 1+ x  ^ {2} ) }
$$

one has $ \gamma = 0 $,

$$
G( x)  =  \frac{1} \pi  \mathop{\rm arctan}  x + \frac{1}{2} .
$$
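For the Cauchy case the measure $ dG( u) = du / ( \pi ( 1 + u^2)) $ is continuous, and the representation can be checked by crude numerical integration (our own sketch; the truncation radius and step size are arbitrary choices). The imaginary part of the integral cancels by symmetry, and the result should approximate $ \mathop{\rm ln} f(t) = -|t| $.

```python
import cmath, math

def L(u, t):
    # Kernel of the Levy-Khinchin representation (*).
    return cmath.exp(1j * t * u) - 1 - 1j * t * u / (1 + u**2)

def log_cf_cauchy(t, R=2000.0, n=200_000):
    # Midpoint rule for the integral of L(u,t) * (1+u^2)/u^2 * dG(u) with
    # dG(u) = du / (pi * (1+u^2)); the density factors cancel to 1/(pi*u^2).
    # gamma = 0 for the Cauchy distribution.
    h = 2 * R / n
    s = 0j
    for k in range(n):
        u = -R + (k + 0.5) * h      # symmetric grid that never hits u = 0
        s += L(u, t) / (math.pi * u**2) * h
    return s

t = 1.5
assert abs(log_cf_cauchy(t) - (-abs(t))) < 1e-2   # ln f(t) = -|t|
```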
  
The canonical representation (*) is convenient from a purely "technical" point of view (owing to the fact that $ G $ has bounded variation), but the function $ G $ has no direct probabilistic interpretation. For this reason another form of representation of infinitely-divisible distributions, which permits a direct probabilistic interpretation, is employed as well. Let the functions $ M( u) $ and $ N( u) $ be defined, for $ u < 0 $ and $ u > 0 $ respectively, by the formulas:
  
$$
dM( u)  =  \frac{1+ u  ^ {2} }{u  ^ {2} }  dG( u),
$$
  
$$
dN( u)  =  \frac{1+ u  ^ {2} }{u  ^ {2} }  dG( u),
$$
  
$$
M(- \infty )  = N ( \infty )  = 0 .
$$
  
These functions are non-decreasing, $ M( u) \geq 0 $ for $ u < 0 $, and $ N( u) \leq 0 $ for $ u > 0 $; in a neighbourhood of zero the functions may be unbounded. If one denotes by $ \sigma ^ {2} $ the jump of $ G $ at zero, formula (*) may be rewritten as follows:
  
$$
\mathop{\rm ln}  f( t)  = i \gamma t - \frac{1}{2} \sigma  ^ {2} t  ^ {2} + \int\limits _ {- \infty } ^ {-0} L( u, t)  dM( u) + \int\limits _ {+0} ^  \infty  L ( u, t)  dN ( u)
$$
  
(Lévy's canonical representation). The functions $ M $ and $ N $ describe, roughly speaking, the frequency of the jumps of varying quantities in the homogeneous process $ X ( \tau ) $ with independent increments for which
  
$$
\mathop{\rm ln}  f _  \tau  ( t)  = \tau  \mathop{\rm ln}  f( t).
$$
  
The importance of the role played in the limit theorems of probability theory by infinitely-divisible distributions is due to the fact that these and only these distributions can be the limit distributions for sums of independent random variables subject to the requirement of [[Asymptotic negligibility|asymptotic negligibility]]. Consider the triangular array $ X _ {n1} ,\dots, X _ {nk _ {n} } $, $ n = 1, 2 ,\dots $ of mutually-independent random variables and select mutually-independent random variables $ Y _ {n1} ,\dots, Y _ {nk _ {n} } $ with infinitely-divisible distributions (the so-called accompanying infinitely-divisible distributions); the characteristic function $ g _ {nk} ( t) $ of the variable $ Y _ {nk} $ is defined in terms of the characteristic function $ f _ {nk} ( t) $ of the variable $ X _ {nk} $ so as to preserve the following basic property: the distributions of the sums
  
$$
\sum _ {k=1} ^ {k _ {n} } X _ {nk} - A _ {n}
$$
  
converge to the same limit distribution (for a certain selection of the constants $ A _ {n} $) if and only if the sums
  
$$
\sum _ {k=1} ^ {k _ {n} } Y _ {nk} - A _ {n}
$$
  
converge to a limit distribution. For a symmetric distribution $ X _ {nk} $ it is assumed that

$$
g _ {nk} ( t)  =  \mathop{\rm exp} ( f _ {nk} ( t) - 1) .
$$
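A minimal sketch of how the accompanying laws work (our own example, not from the article): take the symmetric array $ X _ {nk} = \pm 1/\sqrt{n} $ with $ k _ {n} = n $, so that $ f _ {nk} ( t) = \cos ( t/\sqrt{n} ) $. Both the sums of the $ X _ {nk} $ and the sums of the accompanying variables $ Y _ {nk} $, with $ g _ {nk} ( t) = \exp ( f _ {nk} ( t) - 1) $, converge to the same normal limit.

```python
import math

n, t = 10_000, 1.3
f_nk = math.cos(t / math.sqrt(n))       # cf of X_{nk} = +-1/sqrt(n)

sum_cf = f_nk ** n                      # cf of X_{n1} + ... + X_{nn}
accomp_cf = math.exp(n * (f_nk - 1))    # cf of Y_{n1} + ... + Y_{nn}
limit_cf = math.exp(-t**2 / 2)          # cf of the N(0, 1) limit

# Both sequences of sums are close to the same normal limit for large n.
assert abs(sum_cf - limit_cf) < 1e-3
assert abs(accomp_cf - limit_cf) < 1e-3
```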
In other cases the expression for $ g _ {nk} $ is more complex, and contains the so-called truncated mathematical expectations of $ X _ {nk} $. The properties of infinitely-divisible distributions are described in terms of the functions forming part of the canonical representations. For instance, an infinitely-divisible distribution function $ F( x) $ is continuous if and only if $ \int _ {- \infty } ^ {+ \infty } u ^ {-2} dG ( u) = \infty $.
  
 
An important special case of infinitely-divisible distributions are the so-called stable distributions (cf. [[Stable distribution|Stable distribution]]). See also [[Infinitely-divisible distributions, factorization of|Infinitely-divisible distributions, factorization of]].
====Comments====

====References====

Latest revision as of 09:00, 13 January 2024


2020 Mathematics Subject Classification: Primary: 60E07 [MSN][ZBL]

A probability distribution which, for any $ n = 2, 3 \dots $ may be represented as a composition (convolution) of $ n $ identical probability distributions. The definition of an infinitely-divisible distribution is applicable to an equal degree to a distribution on the straight line, on a finite-dimensional Euclidean space and to a number of other, even more general, cases. The one-dimensional case will be considered below.

The characteristic function $ f( t) $ of an infinitely-divisible distribution is called infinitely divisible. Such a function may be represented, for any value of $ n $, as the $ n $- th power of some other characteristic function:

$$ f( t) = ( f _ {n} ( t)) ^ {n} . $$

Examples of infinitely-divisible distributions include the normal distribution, the Poisson distribution, the Cauchy distribution, and the "chi-squared" distribution. The property of infinite divisibility is most easily tested by using characteristic functions. The composition of infinitely-divisible distributions and the limit of weakly-convergent sequences of infinitely-divisible distributions are again infinitely divisible.

A random variable, defined on some probability space, is called infinitely divisible if it can be represented, for any $ n $, as a sum of $ n $ independent identically-distributed random variables defined on that space. The distribution of each such variable is infinitely divisible, but the converse is not always true. Consider, e.g., the discrete probability space formed by $ \{ 0, 1, 2 ,\dots \} $ with Poisson probabilities

$$ P ( m ) = \frac{\lambda ^ {m} }{m!} e ^ {- \lambda } \ \ ( m = 0 , 1 ,\dots ) . $$

The random variable $ X( m) = m $ is not infinitely divisible, even though its probability distribution (Poisson distribution) is infinitely divisible.

Infinitely-divisible distributions first appeared in connection with the study of stochastically-continuous homogeneous stochastic processes with stationary independent increments (cf. Stochastic process with stationary increments; Stochastic process with independent increments) [Fin], [Ko], [L]. This is the name of processes $ X ( \tau ) $, $ \tau \geq 0 $, which satisfy the following requirements: 1) $ X( 0) = 0 $; 2) the probability distribution of the increment $ X ( \tau _ {2} ) - X ( \tau _ {1} ) $, $ \tau _ {2} > \tau _ {1} $, depends only on $ \tau _ {2} - \tau _ {1} $; 3) for $ \tau _ {1} \leq \dots \leq \tau _ {k} $ the differences

$$ X ( \tau _ {2} ) - X ( \tau _ {1} ) , \dots, X ( \tau _ {k} ) - X ( \tau _ {k-1} ) $$

are mutually-independent random variables; 4) for any $ \epsilon > 0 $,

$$ {\mathsf P} ( | X ( \tau ) | > \epsilon ) \rightarrow 0 $$

as $ \tau \rightarrow 0 $. For such a process the value $ X( \tau ) $ for any $ \tau $ will be an infinitely-divisible random variable, and the corresponding characteristic function satisfies the relation

$$ f _ \tau ( t) = ( f _ {1} ( t)) ^ \tau . $$

The general form of $ f _ \tau ( t) $ for such processes — on the assumption that the variances $ {\mathsf D} X ( \tau ) $ are finite — was found by A.N. Kolmogorov [Ko] (a special case of the canonical representation of infinitely-divisible distributions presented below).
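The relation $ f _ \tau ( t) = ( f _ {1} ( t)) ^ \tau $ can be illustrated numerically (a sketch, not part of the original article) with the Poisson process of rate $ \lambda $: the increment $ X( \tau ) $ has characteristic function $ \mathop{\rm exp} ( \lambda \tau ( e ^ {it} - 1)) $, which is the $ \tau $-th power of $ f _ {1} $, also for non-integer $ \tau $. The rate $ \lambda = 2 $ is an illustrative choice.

```python
import numpy as np

def cf_increment(t, lam, tau):
    """Characteristic function of X(tau) for a Poisson process of rate lam."""
    return np.exp(lam * tau * (np.exp(1j * t) - 1.0))

lam = 2.0                      # illustrative rate; it keeps the imaginary part
                               # of lam*(e^{it}-1) inside (-pi, pi), so the
                               # principal complex power below is unambiguous
t = np.linspace(-4.0, 4.0, 81)
f_1 = cf_increment(t, lam, 1.0)

for tau in (0.5, 1.7, 3.0):    # non-integer tau is allowed
    assert np.allclose(cf_increment(t, lam, tau), f_1 ** tau)
```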

The characteristic function of an infinitely-divisible distribution never vanishes, and its logarithm (in the sense of the principal value) permits a representation of the form:

$$ \tag{* } \mathop{\rm ln} f ( t ) = \ i \gamma t + \int\limits _ {- \infty } ^ { + \infty } L( u, t) \frac{1+ u ^ {2} }{u ^ {2} } dG( u) $$

(the so-called Lévy–Khinchin canonical representation), where

$$ L( u, t ) = e ^ {itu} - 1 - \frac{itu}{1+ u ^ {2} } , $$

$ \gamma $ is some real constant and $ G( u) $ is a non-decreasing function of bounded variation with $ G ( - \infty ) = 0 $. The integrand is taken to be equal to $ - t ^ {2} /2 $ for $ u = 0 $. Whatever the value of the constant $ \gamma $ and of the function $ G $ with the above properties, formula (*) defines the logarithm of the characteristic function of some infinitely-divisible distribution. The correspondence between infinitely-divisible distributions and pairs $ ( \gamma , G ) $ is one-to-one and bicontinuous. This means that a sequence of infinitely-divisible distributions with pairs $ ( \gamma _ {n} , G _ {n} ) $ converges weakly to an infinitely-divisible limit distribution with pair $ ( \gamma , G ) $ if and only if $ \gamma _ {n} \rightarrow \gamma $ and $ G _ {n} $ converges weakly to $ G $ as $ n \rightarrow \infty $.

Examples. Let $ U( x) = 0 $, $ x \leq 0 $, $ U( x) = 1 $, $ x > 0 $. Then, in order to have a normal distribution with mathematical expectation $ a $ and variance $ \sigma ^ {2} $ in formula (*), one must put

$$ \gamma = a ,\ \ G( x) = \frac{\sigma ^ {2} }{2} U ( x) . $$

For a Poisson distribution with parameter $ \lambda $ one has

$$ \gamma = \frac \lambda {2} ,\ \ G( x) = \frac \lambda {2} U ( x- 1). $$

For a Cauchy distribution with density

$$ p( x) = \frac{1}{\pi ( 1+ x ^ {2} ) } $$

one has $ \gamma = 0 $,

$$ G( x) = \frac{1}{\pi} \mathop{\rm arctan} x + \frac{1}{2} . $$
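The Cauchy example can be verified numerically (a sketch, not part of the original article). Substituting $ \gamma = 0 $ and $ dG( u) = du / ( \pi ( 1+ u ^ {2} )) $ into formula (*), the integrand reduces to $ L( u, t) / ( \pi u ^ {2} ) $; its imaginary part integrates to zero by symmetry, and the real part should integrate to $ \mathop{\rm ln} f( t) = - | t | $, the known logarithm of the Cauchy characteristic function.

```python
import numpy as np
from scipy.integrate import quad

def integrand(u, t):
    """Real part of L(u, t) (1 + u^2)/u^2 * dG/du for the Cauchy pair
    gamma = 0, dG(u) = du / (pi (1 + u^2)).  The singularity at u = 0
    is removable, with limiting value -t^2 / (2 pi)."""
    if u == 0.0:
        return -t * t / (2.0 * np.pi)
    return (np.cos(t * u) - 1.0) / (np.pi * u * u)

for t in (0.7, 1.5, -2.3):
    # Split at 0 for better behaviour of the infinite-range quadrature.
    left, _ = quad(integrand, -np.inf, 0.0, args=(t,), limit=500)
    right, _ = quad(integrand, 0.0, np.inf, args=(t,), limit=500)
    log_cf = left + right
    assert abs(log_cf - (-abs(t))) < 1e-3   # ln f(t) = -|t| for the Cauchy law
```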

The canonical representation (*) is convenient from a purely "technical" point of view (owing to the fact that $ G $ has bounded variation), but the function $ G $ has no direct probabilistic interpretation. For this reason another form of representation of infinitely-divisible distributions, which permits a direct probabilistic interpretation, is employed as well. Let the functions $ M( u) $ and $ N( u) $ be defined, for $ u < 0 $ and $ u > 0 $ respectively, by the formulas:

$$ dM( u) = \frac{1+ u ^ {2} }{u ^ {2} } dG( u) \ \ ( u < 0 ) , $$

$$ dN( u) = \frac{1+ u ^ {2} }{u ^ {2} } dG( u) \ \ ( u > 0 ) , $$

$$ M(- \infty ) = N ( \infty ) = 0 . $$

These functions are non-decreasing, $ M( u) \geq 0 $ for $ u < 0 $, and $ N( u) \leq 0 $ for $ u > 0 $; in a neighbourhood of zero the functions may be unbounded. If one denotes by $ \sigma ^ {2} $ the jump of $ G $ at zero, formula (*) may be rewritten as follows:

$$ \mathop{\rm ln} f( t) = i \gamma t - \frac{1}{2} \sigma ^ {2} t ^ {2} + \int\limits _ {- \infty } ^ {-0} L( u, t) dM( u) + $$

$$ + \int\limits_{+0}^ \infty L ( u, t) dN ( u) $$

(Lévy's canonical representation). The functions $ M $ and $ N $ describe, roughly speaking, the frequency of jumps of various sizes in the homogeneous process $ X ( \tau ) $ with independent increments for which

$$ \mathop{\rm ln} f _ \tau ( t) = \tau \mathop{\rm ln} f( t). $$
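This jump interpretation can be made concrete by simulation (a sketch, not part of the original article). For a compound Poisson process with jump rate $ \lambda $ and, say, exponentially-distributed jump sizes with mean $ 1 $, one has $ - N( u) = \lambda e ^ {-u} $ for $ u > 0 $: the expected number of jumps of size exceeding $ u $ per unit time. The rate, the time horizon and the jump law below are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
lam, T = 1.5, 5000.0                    # jump rate and time horizon (illustrative)
n_jumps = rng.poisson(lam * T)          # total number of jumps in [0, T]
jumps = rng.exponential(1.0, n_jumps)   # i.i.d. Exp(1) jump sizes

# -N(u) = lam * exp(-u): the expected number of jumps of size > u
# per unit time; compare with the empirical jump frequency.
for u in (0.5, 1.0, 2.0):
    empirical = np.count_nonzero(jumps > u) / T
    expected = lam * np.exp(-u)
    assert abs(empirical - expected) / expected < 0.1
```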

The importance of the role played in the limit theorems of probability theory by infinitely-divisible distributions is due to the fact that these and only these distributions can be the limit distributions for sums of independent random variables subject to the requirement of asymptotic negligibility. Consider the triangular array $ X _ {n1 } \dots X _ {nk _ {n} } $, $ n = 1, 2 \dots $ of mutually-independent random variables and select mutually-independent random variables $ Y _ {n1 } \dots Y _ {nk _ {n} } $ with infinitely-divisible distributions (the so-called accompanying infinitely-divisible distributions); the characteristic function $ g _ {nk } ( t) $ of the variable $ Y _ {nk } $ is defined in terms of the characteristic function $ f _ {nk } ( t) $ of the variable $ X _ {nk } $ so as to preserve the following basic property: The distributions of the sums

$$ \sum_{k=1}^ { {k _ n} } X _ {nk} - A _ {n} $$

converge to the same limit distribution (for a certain selection of the constants $ A _ {n} $) if and only if the sums

$$ \sum_{k=1}^ { {k _ n} } Y _ {nk} - A _ {n} $$

converge to a limit distribution. When the distribution of $ X _ {nk} $ is symmetric, one takes

$$ g _ {nk} ( t) = \mathop{\rm exp} ( f _ {nk} ( t) - 1) . $$

In other cases the expression for $ g _ {nk} $ is more complex, and contains the so-called truncated mathematical expectations of $ X _ {nk} $. The properties of infinitely-divisible distributions are described in terms of the functions entering their canonical representations. For instance, an infinitely-divisible distribution function $ F( x) $ is continuous if and only if $ \int _ {- \infty } ^ {+ \infty } u ^ {-2} dG ( u) = \infty $.
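The accompanying construction can be illustrated numerically (a sketch, not part of the original article). Take the symmetric array $ X _ {nk} = \pm 1 / \sqrt n $ with probability $ 1/2 $ each, $ k = 1, \dots, n $, so that $ f _ {nk} ( t) = \cos ( t / \sqrt n ) $ and $ g _ {nk} ( t) = \mathop{\rm exp} ( \cos ( t / \sqrt n ) - 1 ) $; the characteristic functions of both row sums approach that of the standard normal law, $ e ^ {- t ^ {2} /2 } $.

```python
import numpy as np

n = 200                               # number of summands in row n (illustrative)
t = np.linspace(-3.0, 3.0, 61)

# X_nk = +-1/sqrt(n) with probability 1/2 each; its characteristic function:
f_nk = np.cos(t / np.sqrt(n))
# Accompanying infinitely-divisible law: g_nk = exp(f_nk - 1).
g_nk = np.exp(f_nk - 1.0)

sum_cf_x = f_nk ** n                  # cf of the sum of the X_nk
sum_cf_y = g_nk ** n                  # cf of the sum of the accompanying Y_nk
gauss = np.exp(-t**2 / 2.0)           # common limit: the standard normal cf

assert np.max(np.abs(sum_cf_x - gauss)) < 0.02
assert np.max(np.abs(sum_cf_y - gauss)) < 0.02
```

Both sums converge to the same (normal) limit, which is the basic property the accompanying laws are built to preserve.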

An important special case of infinitely-divisible distributions are the so-called stable distributions (cf. Stable distribution). See also Infinitely-divisible distributions, factorization of.

References

[Fin] D. de Finetti, "Sulla funzione a incremento aleatorio" Atti Accad. Naz. Lincei Sez. 1 (6) , 10 (1929) pp. 163–168
[Ko] A.N. Kolmogorov, "Ancora sulla forma generale di un processo stocastico omogeneo" Atti Accad. Naz. Lincei Sez. 1 (6) , 15 (1932) pp. 866–869
[L] P. Lévy, "Sur les intégrales dont les éléments sont des variables aléatoires indépendantes" Ann. Scuola Norm. Sup. Pisa Ser. 2 , 3 (1934) pp. 337–366 MR1556734 Zbl 0010.07004 Zbl 60.1157.01
[L2] P. Lévy, "Théorie de l'addition des variables aléatoires" , Gauthier-Villars (1937)
[Kh] A.Ya. Khinchin, "Limit laws for sums of independent random variables" , Moscow-Leningrad (1938) (In Russian)
[GK] B.V. Gnedenko, A.N. Kolmogorov, "Limit distributions for sums of independent random variables" , Addison-Wesley (1954) (Translated from Russian) MR0062975 Zbl 0056.36001
[Fis] M. Fisz, "Infinitely divisible distributions: recent results and applications" Ann. of Math. Statist. , 33 (1962) pp. 68–84 MR0139188 Zbl 0102.35103
[P] V.V. Petrov, "Sums of independent random variables" , Springer (1975) (Translated from Russian) MR0388499 Zbl 0322.60043 Zbl 0322.60042
[ST] V.V. Sazonov, V.N. Tutubalin, "Probability distributions on topological groups" Theory Probab. Appl. , 11 : 1 (1966) pp. 1–45 Teor. Veroyatnost. i Primenen. , 11 : 1 (1966) pp. 3–55 MR0199872 Zbl 0171.38701

Comments

References

[S] F.W. Steutel, "Infinite divisibility in theory and practice" Scand. J. Statist., 6 (1979) pp. 57–64 MR0538596 Zbl 0402.62007
[F] W. Feller, "An introduction to probability theory and its applications", 1–2, Wiley (1968–1971)
How to Cite This Entry:
Infinitely-divisible distribution. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Infinitely-divisible_distribution&oldid=26539
This article was adapted from an original article by Yu.V. Prokhorov (originator), which appeared in Encyclopedia of Mathematics - ISBN 1402006098. See original article