Transition probabilities

The probabilities of transition of a Markov chain $ \xi ( t) $ from a state $ i $ into a state $ j $ in a time interval $ [ s, t] $:

$$ p _ {ij} ( s, t) = {\mathsf P} \{ \xi ( t) = j \mid \xi ( s) = i \} ,\ s< t. $$

In view of the basic property of a Markov chain, for any states $ i, j \in S $ (where $ S $ is the set of all states of the chain) and any $ s < t < u $,

$$ p _ {ij} ( s, u) = \sum _ {k \in S } p _ {ik} ( s, t) p _ {kj} ( t, u). $$
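In matrix notation this identity states that the matrix of transition probabilities over $ [ s, u] $ is the product of the matrices over $ [ s, t] $ and $ [ t, u] $. A minimal numerical check, assuming a hypothetical three-state chain with arbitrarily chosen row-stochastic matrices (NumPy; not part of the original article):

```python
import numpy as np

# Hypothetical row-stochastic transition matrices over [s, t] and [t, u],
# chosen only for illustration.
P_st = np.array([[0.7, 0.2, 0.1],
                 [0.3, 0.4, 0.3],
                 [0.2, 0.3, 0.5]])
P_tu = np.array([[0.6, 0.3, 0.1],
                 [0.1, 0.8, 0.1],
                 [0.25, 0.25, 0.5]])

# Chapman-Kolmogorov: p_ij(s, u) = sum_k p_ik(s, t) * p_kj(t, u),
# i.e. ordinary matrix multiplication.
P_su = P_st @ P_tu

i, j = 0, 2
direct = sum(P_st[i, k] * P_tu[k, j] for k in range(3))
assert np.isclose(P_su[i, j], direct)
print(P_su)
```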

One usually considers homogeneous Markov chains, for which the transition probabilities $ p _ {ij} ( s, t) $ depend on the length of $ [ s, t] $ but not on its position on the time axis:

$$ p _ {ij} ( s, t) = p _ {ij} ( t- s). $$

For any states $ i $ and $ j $ of a homogeneous Markov chain with discrete time, the sequence $ p _ {ij} ( n) $ has a Cesàro limit, i.e. the limit

$$ \lim\limits _ {n \rightarrow \infty } \frac{1}{n} \sum _ { k= 1} ^ { n } p _ {ij} ( k) \geq 0 $$

exists.
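A concrete illustration (a hypothetical example, not from the original article): for the two-state chain that alternates deterministically between its states, $ p _ {ij} ( n) $ oscillates and has no ordinary limit, yet the Cesàro averages converge to $ 1/2 $.

```python
import numpy as np

# Deterministically alternating two-state chain (period 2).
P = np.array([[0.0, 1.0],
              [1.0, 0.0]])

N = 1000
Pk = np.eye(2)
cesaro = np.zeros((2, 2))
for _ in range(N):
    Pk = Pk @ P        # Pk = P^k, i.e. the k-step probabilities p_ij(k)
    cesaro += Pk
cesaro /= N            # (1/N) * sum_{k=1}^{N} P^k

print(cesaro)          # every entry is close to 1/2
```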

Subject to certain additional conditions (and also for chains with continuous time), the limit exists also in the usual sense. See Markov chain, ergodic; Markov chain, class of positive states of a.

The transition probabilities $ p _ {ij} ( t) $ for a Markov chain with discrete time are determined by the values of $ p _ {ij} ( 1) $, $ i, j \in S $; for any $ t > 0 $, $ i \in S $,

$$ \sum _ {j \in S } p _ {ij} ( t) = 1. $$
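Equivalently, if $ P $ denotes the matrix of one-step probabilities $ p _ {ij} ( 1) $, then $ p _ {ij} ( t) $ is the $ ( i, j) $ entry of the matrix power $ P ^ {t} $, and every row of $ P ^ {t} $ sums to 1. A short sketch with a hypothetical three-state matrix:

```python
import numpy as np

# Hypothetical one-step matrix of probabilities p_ij(1) (row-stochastic).
P = np.array([[0.5, 0.5, 0.0],
              [0.1, 0.6, 0.3],
              [0.0, 0.2, 0.8]])

t = 5
P_t = np.linalg.matrix_power(P, t)   # p_ij(t) for a homogeneous chain

print(P_t)
print(P_t.sum(axis=1))               # each row sum equals 1
```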

In the case of Markov chains with continuous time it is usually assumed that the transition probabilities satisfy the following additional conditions: all the $ p _ {ij} ( t) $ are measurable as functions of $ t \in ( 0, \infty ) $, and

$$ \lim\limits _ {t \downarrow 0 } p _ {ij} ( t) = 0 \ \ ( i \neq j),\ \ \lim\limits _ {t \downarrow 0 } p _ {ii} ( t) = 1 ,\ \ i, j \in S. $$

Under these assumptions the following transition rates exist:

$$ \tag{1 } \lambda _ {ij} = \lim\limits _ {t \downarrow 0 } \frac{1}{t} ( p _ {ij} ( t) - p _ {ij} ( 0)) \leq \infty ,\ \ i, j \in S; $$
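As an illustration of (1), consider a hypothetical two-state chain with rate $ a $ for the transition $ 0 \rightarrow 1 $ and rate $ b $ for $ 1 \rightarrow 0 $; then $ p _ {01} ( t) = \frac{a}{a+b} ( 1 - e ^ {- ( a+ b) t } ) $, and the difference quotient in (1) recovers $ \lambda _ {01} = a $ as $ t \downarrow 0 $. A numerical check (the values of $ a $ and $ b $ are arbitrary):

```python
import math

# Hypothetical two-state chain: rate a for 0 -> 1, rate b for 1 -> 0.
a, b = 2.0, 3.0

def p01(t):
    # Closed-form transition probability of this two-state chain.
    return a / (a + b) * (1.0 - math.exp(-(a + b) * t))

# Difference quotient from definition (1): (p_01(t) - p_01(0)) / t, with p_01(0) = 0.
for t in (1e-1, 1e-3, 1e-5):
    print(t, p01(t) / t)   # approaches lambda_01 = a = 2.0
```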

if all the $ \lambda _ {ij} $ are finite and if $ \sum _ {j \in S } \lambda _ {ij} = 0 $, $ i \in S $, then the $ p _ {ij} ( t) $ satisfy the Kolmogorov–Chapman system of differential equations

$$ \tag{2 } p _ {ij} ^ \prime ( t) = \sum _ {k \in S } \lambda _ {ik} p _ {kj} ( t),\ \ p _ {ij} ^ \prime ( t) = \sum _ {k \in S } \lambda _ {kj} p _ {ik} ( t) $$

with the initial conditions $ p _ {ii} ( 0) = 1 $, $ p _ {ij} ( 0) = 0 $, $ i \neq j $, $ i, j \in S $ (see also Kolmogorov equation; Kolmogorov–Chapman equation).
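For a finite set of states the system (2) with these initial conditions reads $ P ^ \prime ( t) = \Lambda P ( t) = P ( t) \Lambda $, $ P ( 0) = I $, where $ \Lambda = ( \lambda _ {ij} ) $, and its solution is the matrix exponential $ P ( t) = e ^ {t \Lambda } $. A minimal sketch, assuming a hypothetical three-state rate matrix and using SciPy's expm (neither appears in the original article):

```python
import numpy as np
from scipy.linalg import expm

# Hypothetical rate matrix: off-diagonal entries >= 0, rows summing to 0.
L = np.array([[-3.0,  2.0,  1.0],
              [ 1.0, -1.5,  0.5],
              [ 0.5,  0.5, -1.0]])

t = 0.7
P_t = expm(t * L)                 # solves (2) with P(0) = identity

print(P_t)
print(P_t.sum(axis=1))            # row sums stay equal to 1 (proper chain)

# Numerical check of the backward equation P'(t) = L @ P(t).
h = 1e-6
P_deriv = (expm((t + h) * L) - expm((t - h) * L)) / (2 * h)
print(np.allclose(P_deriv, L @ P_t, atol=1e-5))
```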

If a Markov chain is specified by means of the transition rates (1), then the transition probabilities $ p _ {ij} ( t) $ satisfy the conditions

$$ p _ {ij} ( t) \geq 0,\ \ \sum _ {j \in S } p _ {ij} ( t) \leq 1,\ \ i, j \in S,\ \ t > 0; $$

chains for which $ \sum _ {j \in S } p _ {ij} ( t) < 1 $ for certain $ i \in S $ and $ t > 0 $ are called defective (in this case the solution to (2) is not unique); if $ \sum _ {j \in S } p _ {ij} ( t) = 1 $ for all $ i \in S $ and $ t > 0 $, the chain is called proper.

Example. The Markov chain $ \xi ( t) $ with set of states $ \{ 0, 1 ,\dots \} $ and transition rates

$$ \lambda _ {i,i+1} = - \lambda _ {ii} = \lambda _ {i} > 0,\ \ \lambda _ {ij} = 0 \ \ ( j \neq i,\ j \neq i+ 1) $$

(i.e., a pure birth process) is defective if and only if

$$ \sum _ { i= 0} ^ \infty \frac{1}{\lambda _ {i} } < \infty . $$
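For instance (a hypothetical choice of rates, not from the original article), $ \lambda _ {i} = ( i+ 1) ^ {2} $ gives $ \sum _ {i} 1/ \lambda _ {i} = \pi ^ {2} /6 < \infty $, so the process is defective, whereas $ \lambda _ {i} = i+ 1 $ gives a divergent sum and a proper chain. A quick comparison of the partial sums:

```python
def partial_sum(rate, n):
    """Partial sum of 1/lambda_i for i = 0, ..., n-1."""
    return sum(1.0 / rate(i) for i in range(n))

for n in (10, 1000, 100000):
    print(n,
          partial_sum(lambda i: (i + 1) ** 2, n),   # converges to pi^2/6 ~ 1.6449: defective
          partial_sum(lambda i: i + 1, n))          # grows like log n: proper
```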

Let

$$ \tau _ {0n} = \inf \{ {t > 0 } : {\xi ( t) = n } \} \ \ ( \xi ( 0) = 0 ), $$

$$ \tau = \lim\limits _ {n \rightarrow \infty } \tau _ {0n} ; $$

then

$$ {\mathsf E} \tau = \sum _ { i= 0} ^ \infty \frac{1}{\lambda _ {i} } $$

and if $ {\mathsf E} \tau < \infty $, then $ {\mathsf P} \{ \tau < \infty \} = 1 $, i.e. the path of $ \xi ( t) $ "tends to infinity in a finite time with probability 1" (see also Branching processes, regularity of).
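Under the same hypothetical rates $ \lambda _ {i} = ( i+ 1) ^ {2} $ as above, the time spent in state $ i $ is exponential with mean $ 1/ \lambda _ {i} $, so $ \tau $ can be simulated as a sum of independent exponential holding times; the sample mean is then close to $ {\mathsf E} \tau = \pi ^ {2} /6 \approx 1.645 $. A minimal simulation sketch (the state space is truncated, which is an approximation):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical rates lambda_i = (i + 1)^2, truncated at a large state index.
rates = (np.arange(2000) + 1.0) ** 2

# The holding time in state i is exponential with mean 1/lambda_i, so each
# simulated row is (a truncation of) the explosion time tau.
tau = rng.exponential(1.0 / rates, size=(2000, rates.size)).sum(axis=1)

print(tau.mean())              # close to E tau = sum 1/lambda_i ~ pi^2/6 ~ 1.645
print(np.quantile(tau, 0.99))  # tau is finite with probability 1
```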

References

[1] K.L. Chung, "Markov chains with stationary transition probabilities" , Springer (1967)

Comments

For additional references see also Markov chain; Markov process.

In (1), $ \lambda _ {ij} \geq 0 $ if $ i \neq j $ and $ \lambda _ {ii} \leq 0 $.

References

[a1] M. Iosifescu, "Finite Markov processes and their applications" , Wiley (1980)
[a2] D. Revuz, "Markov chains" , North-Holland (1984)
This article was adapted from an original article by A.M. Zubkov (originator), which appeared in Encyclopedia of Mathematics - ISBN 1402006098.