
Markov chain, non-decomposable

From Encyclopedia of Mathematics

Latest revision as of 20:22, 29 October 2016

2020 Mathematics Subject Classification: Primary: 60J10 Secondary: 60J27 [MSN][ZBL]

A Markov chain whose transition probabilities $p_{ij}(t)$ have the following property: For any states $i$ and $j$ there is a time $t_{ij}$ such that $p_{ij}(t_{ij}) > 0$. The non-decomposability of a Markov chain is equivalent to non-decomposability of its matrix of transition probabilities $P = \left( {p_{ij}} \right)$ for a discrete-time Markov chain, and of its matrix of transition probability densities $Q = \left( {p'_{ij}(0)} \right)$ for a continuous-time Markov chain. The state space of a non-decomposable Markov chain consists of one class of communicating states (cf. Markov chain).
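For a finite discrete-time chain, the defining property can be checked directly from the matrix $P = (p_{ij})$: the chain is non-decomposable exactly when the directed graph whose edges are the positive entries of $P$ is strongly connected. The following sketch (not part of the original article; the function name and example matrices are illustrative) tests this by computing the set of states reachable from each state:

```python
def is_non_decomposable(P):
    """Return True iff for every pair of states (i, j) some power of P
    has a positive (i, j) entry, i.e. the chain is irreducible."""
    n = len(P)

    def reachable(start):
        # Depth-first search along edges with positive transition probability.
        seen = {start}
        stack = [start]
        while stack:
            i = stack.pop()
            for j in range(n):
                if P[i][j] > 0 and j not in seen:
                    seen.add(j)
                    stack.append(j)
        return seen

    # Non-decomposable: every state reaches every state.
    return all(len(reachable(i)) == n for i in range(n))


# A two-state chain in which each state can reach the other: irreducible.
P1 = [[0.5, 0.5],
      [0.3, 0.7]]

# A chain with an absorbing state 1: state 0 is unreachable from state 1,
# so the state space splits into more than one class.
P2 = [[0.5, 0.5],
      [0.0, 1.0]]

print(is_non_decomposable(P1))  # True
print(is_non_decomposable(P2))  # False
```

The graph test avoids computing matrix powers: a positive $(i,j)$ entry of some $P^t$ corresponds exactly to a directed path from $i$ to $j$ through positive entries of $P$.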


Comments

Cf. also Markov chain and Markov chain, decomposable for references.

How to Cite This Entry:
Markov chain, non-decomposable. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Markov_chain,_non-decomposable&oldid=39529
This article was adapted from an original article by B.A. Sevast'yanov (originator), which appeared in Encyclopedia of Mathematics - ISBN 1402006098. See original article