Poisson distribution

From Encyclopedia of Mathematics
2020 Mathematics Subject Classification: Primary: 60E99

A probability distribution of a random variable $ X $ taking non-negative integer values $ k = 0, 1, \dots $ with probabilities

$$ {\mathsf P} \{ X = k \} = e ^ {- \lambda } \frac{\lambda ^ {k} }{k!} , $$

where $ \lambda > 0 $ is a parameter. The generating function and the characteristic function of the Poisson distribution are defined by

$$ \phi ( z) = e ^ {\lambda ( z - 1 ) } \quad \textrm{ and } \quad f ( t) = \mathop{\rm exp} [ \lambda ( e ^ {it} - 1 ) ] , $$

respectively. The mean, variance and the semi-invariants of higher order are all equal to $ \lambda $. The distribution function of the Poisson distribution,

$$ F(x) = \sum_{i=0}^{[x]} e^{-\lambda} \frac{\lambda^{i}}{i!}, $$

is given at the points $ k = 0 , 1 ,\dots $ by

$$ F ( k) = \frac{1}{k!} \int\limits _ \lambda ^ \infty y ^ {k} e ^ {- y } \, d y = 1 - S _ {k+1} ( \lambda ) , $$

where $ S _ {k+1} ( \lambda ) $ is the value at the point $ \lambda $ of the gamma-distribution function with parameter $ k + 1 $ (or, equivalently, $ F ( k) = 1 - H _ {2k+2} ( 2 \lambda ) $, where $ H _ {2k+2} ( 2 \lambda ) $ is the value at the point $ 2 \lambda $ of the "chi-squared" distribution function with $ 2 k + 2 $ degrees of freedom), whence, in particular,

$$ {\mathsf P} \{ X = k \} = F ( k) - F ( k - 1 ) = S _ {k} ( \lambda ) - S _ {k+1} ( \lambda ) . $$
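The statement that the mean, the variance and the semi-invariants of all higher orders are equal to $ \lambda $ can be read off from the expansion of the logarithm of the characteristic function:

$$ \mathop{\rm log} f ( t) = \lambda ( e ^ {it} - 1 ) = \lambda \sum _ {m=1} ^ \infty \frac{( i t ) ^ {m} }{m!} , $$

so the semi-invariant (cumulant) of every order $ m \geq 1 $ equals $ \lambda $.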

The sum of independent variables $ X _ {1} \dots X _ {n} $ each having a Poisson distribution with parameters $ \lambda _ {1} \dots \lambda _ {n} $ has a Poisson distribution with parameter $ \lambda _ {1} + \dots + \lambda _ {n} $.
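This additivity follows at once from the generating function: the generating function of a sum of independent variables is the product of their generating functions, and

$$ \prod _ {j=1} ^ {n} e ^ {\lambda _ {j} ( z - 1 ) } = e ^ {( \lambda _ {1} + \dots + \lambda _ {n} ) ( z - 1 ) } . $$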

Conversely, if the sum $ X _ {1} + X _ {2} $ of two independent random variables $ X _ {1} $ and $ X _ {2} $ has a Poisson distribution, then each random variable $ X _ {1} $ and $ X _ {2} $ is subject to a Poisson distribution (Raikov's theorem). There are general necessary and sufficient conditions for the convergence of the distribution of sums of independent random variables to a Poisson distribution. In the limit, as $ \lambda \rightarrow \infty $, the random variable $ ( X - \lambda ) / \sqrt \lambda $ has the standard normal distribution.
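Written out, the last statement means that for every fixed $ x $,

$$ {\mathsf P} \left \{ \frac{X - \lambda }{\sqrt \lambda } \leq x \right \} \rightarrow \frac{1}{\sqrt {2 \pi } } \int\limits _ {- \infty } ^ {x} e ^ {- u ^ {2} / 2 } \, d u \quad \textrm{ as } \ \lambda \rightarrow \infty . $$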

The Poisson distribution was first obtained by S. Poisson (1837) when deriving approximate formulas for the binomial distribution when $ n $ (the number of trials) is large and $ p $ (the probability of success) is small; see Poisson theorem, 2), and the limit relation given below. The Poisson distribution describes many physical phenomena with good approximation (see [F], Vol. 1, Chapt. 6). The Poisson distribution is the limiting case for many discrete distributions such as, for example, the hypergeometric distribution, the negative binomial distribution, the Pólya distribution, and for the distributions arising in problems about the arrangements of particles in cells with a given variation in the parameters. The Poisson distribution also plays an important role in probabilistic models as an exact probability distribution. The nature of the Poisson distribution as an exact probability distribution is discussed more fully in the theory of random processes (see Poisson process), where the Poisson distribution appears as the distribution of the number $ X ( t) $ of certain random events occurring in the course of time $ t $ in a fixed interval:

$$ {\mathsf P} \{ X ( t) = k \} = e ^ {- \lambda t } \frac{( \lambda t ) ^ {k} }{k!} , \quad k = 0 , 1 , \dots $$

(the parameter $ \lambda $ is the mean number of events in unit time), or, more generally, as the distribution of a random number of points in a certain fixed domain of Euclidean space (the parameter of the distribution is proportional to the volume of the domain).
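The limit relation behind Poisson's original approximation is the following: if $ n \rightarrow \infty $ and $ p \rightarrow 0 $ in such a way that $ np \rightarrow \lambda $, then for every fixed $ k $

$$ \frac{n!}{k! ( n - k ) ! } p ^ {k} ( 1 - p ) ^ {n-k} \rightarrow e ^ {- \lambda } \frac{\lambda ^ {k} }{k!} . $$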

Along with the Poisson distribution, as defined above, one considers the so-called generalized or compound Poisson distribution. This is the probability distribution of the sum $ X _ {1} + \dots + X _ \nu $ of a random number $ \nu $ of identically-distributed random variables $ X _ {1} , X _ {2} , \dots $ (where $ \nu , X _ {1} , X _ {2} , \dots $ are considered to be mutually independent and $ \nu $ is distributed according to the Poisson distribution with parameter $ \lambda $). The characteristic function $ \phi ( t) $ of the compound Poisson distribution is

$$ \phi ( t) = \mathop{\rm exp} \{ \lambda ( \psi ( t) - 1 ) \} , $$

where $ \psi ( t) $ is the characteristic function of $ X _ \nu $. For example, the negative binomial distribution with parameters $ n $ and $ p $ is a compound Poisson distribution, since one can put

$$ \psi ( t) = \frac{n}{\lambda} \mathop{\rm log} \frac{1}{1 - q e ^ {it} } , \quad \lambda = n \mathop{\rm log} \frac{1}{p} , \quad q = 1 - p . $$
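Indeed, with these choices,

$$ \mathop{\rm exp} \{ \lambda ( \psi ( t) - 1 ) \} = \mathop{\rm exp} \left \{ n \mathop{\rm log} \frac{1}{1 - q e ^ {it} } - n \mathop{\rm log} \frac{1}{p} \right \} = \left ( \frac{p}{1 - q e ^ {it} } \right ) ^ {n} , $$

which is the characteristic function of the negative binomial distribution with parameters $ n $ and $ p $.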

The compound Poisson distributions are infinitely divisible and every infinitely-divisible distribution is a limit of compound Poisson distributions (perhaps "shifted", that is, with characteristic functions of the form $ \mathop{\rm exp} ( \lambda _ {n} ( \psi _ {n} ( t) - 1 - i t a _ {n} )) $). In addition, the infinitely-divisible distributions (and these alone) can be obtained as limits of the distributions of sums of the form $ h _ {n1} X _ {n1} + \dots + h _ {nk _ {n} } X _ {nk _ {n} } - A _ {n} $, where $ ( X _ {n1} \dots X _ {nk _ {n} } ) $ form a triangular array of independent random variables each with a Poisson distribution, and where $ h _ {nk _ {n} } > 0 $ and $ A _ {n} $ are real numbers.

References

[P] S.D. Poisson, "Recherches sur la probabilité des jugements en matière criminelle et en matière civile", Paris (1837)
[F] W. Feller, "An introduction to probability theory and its applications", 1–2, Wiley (1950–1966)
[BS] L.N. Bol'shev, N.V. Smirnov, "Tables of mathematical statistics", Libr. math. tables, 46, Nauka (1983) (In Russian) (Processed by L.S. Bark and E.S. Kedrova) Zbl 0529.62099
[LO] Yu.V. Linnik, I.V. Ostrovskii, "Decomposition of random variables and vectors", Amer. Math. Soc. (1977) (Translated from Russian) MR0428382 Zbl 0358.60020

Comments

The Poisson distribution frequently occurs in queueing theory.

References

[JK] N.L. Johnson, S. Kotz, "Distributions in statistics: discrete distributions" , Houghton Mifflin (1970) MR0268996 Zbl 0292.62009
This article was adapted from an original article by A.V. Prokhorov (originator), which appeared in Encyclopedia of Mathematics - ISBN 1402006098.