Markov chain, decomposable

2020 Mathematics Subject Classification: Primary: 60J10 Secondary: 60J27 [MSN][ZBL]

A Markov chain whose transition probabilities $p_{ij}(t)$ have the following property: There are states $i,j$ such that $p_{ij}(t) = 0$ for all $t \ge 0$. Decomposability of a Markov chain is equivalent to decomposability of its matrix of transition probabilities $P = \left( {p_{ij}} \right)$ for a discrete-time Markov chain, and of its matrix of transition probability densities $Q = \left( {p'_{ij}(0)} \right)$ for a continuous-time Markov chain. The state space of a decomposable Markov chain consists either of inessential states or of more than one class of communicating states (cf. Markov chain).
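
For illustration (this example is not part of the original article), consider the three-state discrete-time chain with transition matrix
$$ P = \begin{pmatrix} 1/2 & 1/2 & 0 \\ 1/3 & 2/3 & 0 \\ 0 & 0 & 1 \end{pmatrix}. $$
Here $p_{13}(t) = 0$ for every $t \ge 0$, so the chain is decomposable; its state space splits into the two closed communicating classes $\{1,2\}$ and $\{3\}$, and $P$ is correspondingly block diagonal.

For a finite chain, decomposability of $P$ amounts to the transition graph of positive one-step probabilities not being strongly connected, which can be checked by a reachability computation. The following is a minimal sketch of such a check (an illustrative addition, not from the original article; it assumes NumPy, and the function name is hypothetical):

```python
import numpy as np

def is_decomposable(P, tol=1e-12):
    """Return True if the stochastic matrix P is decomposable (reducible),
    i.e. some state j can never be reached from some other state i."""
    P = np.asarray(P, dtype=float)
    n = P.shape[0]
    A = (P > tol).astype(int)  # adjacency matrix of the transition graph
    # (I + A)^(n-1) is positive in entry (i, j) iff j is reachable from i
    # by a path of length at most n - 1; a zero entry means decomposability.
    R = np.linalg.matrix_power(np.eye(n, dtype=int) + A, n - 1)
    return bool((R == 0).any())

P_dec = [[1/2, 1/2, 0], [1/3, 2/3, 0], [0, 0, 1]]  # the example above
P_irr = [[0.1, 0.9], [0.5, 0.5]]                   # all states communicate
print(is_decomposable(P_dec))  # True
print(is_decomposable(P_irr))  # False
```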


Comments

References

[F] W. Feller, "An introduction to probability theory and its applications", 1–2, Wiley (1966)
[Fr] D. Freedman, "Markov chains", Holden-Day (1975) MR0686269 MR0681291 MR0556418 MR0428472 MR0292176 MR0237001 MR0211464 MR0164375 MR0158435 MR0152015 Zbl 0501.60071 Zbl 0501.60069 Zbl 0426.60064 Zbl 0325.60059 Zbl 0322.60057 Zbl 0212.49801 Zbl 0129.30605
[I] M. Iosifescu, "Finite Markov processes and their applications", Wiley (1980) MR0587116 Zbl 0436.60001
[KS] J.G. Kemeny, J.L. Snell, "Finite Markov chains", Van Nostrand (1960) MR1531032 MR0115196 Zbl 0089.13704
[KSK] J.G. Kemeny, J.L. Snell, A.W. Knapp, "Denumerable Markov chains", Springer (1976) MR0407981 Zbl 0348.60090
[Re] D. Revuz, "Markov chains", North-Holland (1975) MR0415773 Zbl 0332.60045
[Ro] V.I. Romanovsky, "Discrete Markov chains", Wolters-Noordhoff (1970) (Translated from Russian) MR0266312 Zbl 0201.20002
[S] E. Seneta, "Non-negative matrices and Markov chains", Springer (1981) MR2209438 Zbl 0471.60001
How to Cite This Entry:
Markov chain, decomposable. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Markov_chain,_decomposable&oldid=26565
This article was adapted from an original article by B.A. Sevast'yanov (originator), which appeared in Encyclopedia of Mathematics - ISBN 1402006098.