Entropy theory of a dynamical system

{{MSC|37A35|60G10}}

[[Category:Ergodic theory]]
A branch of [[Ergodic theory|ergodic theory]] closely connected with probability theory and information theory. In broad outline, the nature of this connection is as follows.
  
Let $\{ T_t \}$ be a dynamical system (usually a [[Measurable flow|measurable flow]] or a [[Cascade|cascade]]) with phase space $W$ and [[Invariant measure|invariant measure]] $\mu$. Let $f : W \rightarrow \mathbf R$ be a measurable function and let $\xi$ be the [[Measurable decomposition|measurable decomposition]] (measurable partition) of $W$ into the inverse images $f^{-1}(c)$, $c \in \mathbf R$. (For what follows it is sufficient to consider functions $f$ taking a countable, and as a rule even finite, number of values, and the corresponding partition $\xi$.) Then
  
$$
\{ t \mapsto f ( T_t w ) \}
$$
  
is a stationary [[Stochastic process|stochastic process]] (in the narrow sense of the word) with $W$ as its space of elementary events. Usually it can be regarded as a process $\{ X_t ( \omega ) \}$ whose space of elementary events is the space $\Omega$ of sample functions (cf. [[Sample function|Sample function]]) $\omega$, endowed with a suitable measure $\nu$, with $X_t ( \omega ) = \omega ( t )$. The mapping
  
$$
\pi : W \rightarrow \Omega , \qquad ( \pi w ) ( t ) = f ( T_t w )
$$
  
is a homomorphism of measure spaces (see the definition in the article [[Metric isomorphism|Metric isomorphism]]) that carries $\{ T_t \}$ into the shift $\{ S_t \}$, where $( S_t \omega ) ( \tau ) = \omega ( t + \tau )$.
  
The process $\{ X_t ( \omega ) \}$ contains some information about the original system $\{ T_t \}$. This can even be complete information, namely when $\pi$ is an isomorphism. (One then says that $\xi$ is a generator for $\{ T_t \}$; if $T$ is an automorphism, the partition is called a one-sided generator for $T$ if it is a generator for the cascade $\{ T^n : n \geq 0 \}$, and a two-sided generator for $T$ if it is a generator for $\{ T^n : n \in \mathbf Z \}$.) However, $\{ X_t ( \omega ) \}$ also depends on the choice of $f$, that is, first of all, on $\xi$ (the specific values of $f$ on the elements of $\xi$ are less important here). Of interest in ergodic theory are those properties of an individual process $\{ X_t ( \omega ) \}$, or of a collection of such processes (obtained for various $\xi$), that are properties of the system $\{ T_t \}$ itself. However, for a long time it was not easy to select such properties unless they reduced to known ones.
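For example, let $T$ be the doubling map $Tw = 2w \pmod 1$ on $W = [ 0 , 1 )$ with Lebesgue measure, and let $f$ be the indicator function of $[ 1/2 , 1 )$. Then $X_n ( \omega ) = f ( T^n w )$ is the $( n + 1 )$-st binary digit of $w$, the process $\{ X_n \}$ consists of independent symbols equal to $0$ or $1$ with probability $1/2$ each, and the two-element partition $\xi = \{ [ 0 , 1/2 ) , [ 1/2 , 1 ) \}$ is a generator for the cascade $\{ T^n : n \geq 0 \}$, since the refinements $\xi \lor T^{-1} \xi \lor \dots \lor T^{-(n-1)} \xi$ are the partitions of $[ 0 , 1 )$ into dyadic intervals of length $2^{-n}$.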
  
This difficulty was successfully overcome in the middle of the 1950s by A.N. Kolmogorov, who introduced a fundamentally new (non-spectral) invariant, the metric [[Entropy|entropy]] of a dynamical system, and emphasized the role of increasing measurable partitions $\eta$, that is, those for which $T_t \eta$ is finer than $\eta$ ($\mathop{\rm mod} 0$) for $t > 0$. (In this way a partition describes the "past" of the process $\{ X_t ( \omega ) \}$; see also [[K-system(2)|$K$-system]]; [[Exact endomorphism|Exact endomorphism]].) The elaboration of this range of problems (including that of the existence and properties of generating partitions) is the object of the entropy theory of dynamical systems in the form in which it was put together in the middle of the 1960s (see {{Cite|R}}). A substantial addition was the more complete and somewhat more special theory of D. Ornstein, in which the auxiliary stochastic processes $\{ X_t ( \omega ) \}$ are used in a more direct way (see {{Cite|O}}). In view of the need to ensure invariance under metric isomorphisms, in both the "Kolmogorov" and the "Ornstein" entropy theory of dynamical systems probability-theoretical and information-theoretical ideas appear in an essentially transformed form.
  
Two conditions of "regularity" type for a stochastic process occurring in the entropy theory of dynamical systems may serve as examples. One of them leads to the definition of a $K$-system. The other, more restrictive one, the very weak Bernoulli property, turns out to be necessary and sufficient for the shift in the space of sample functions to be isomorphic to a [[Bernoulli automorphism|Bernoulli automorphism]]. It can be verified in a number of examples whose original definitions have no relation to stochastic processes.
  
 
====References====
{|
|valign="top"|{{Ref|R}}|| V.A. Rokhlin, "Lectures on the entropy theory of measure-preserving transformations" ''Russian Math. Surveys'', '''22''' : 5 (1967) pp. 1–52; ''Uspekhi Mat. Nauk'', '''22''' : 5 (1967) pp. 3–56 {{ZBL|0174.45501}}
|-
|valign="top"|{{Ref|O}}|| D. Ornstein, "Ergodic theory, randomness, and dynamical systems", Yale Univ. Press (1974) {{MR|0447525}} {{ZBL|0296.28016}}
|}

See also the references to [[K-system(2)|$K$-system]]; [[Entropy|Entropy]]; [[Ergodic theory|Ergodic theory]].
  
 
====Comments====
Entropy in the theory of dynamical systems is defined as follows (cf. also [[Entropy|Entropy]]). For every finite measurable partition $\xi = \{ A_1 , \dots , A_m \}$ of a probability space $( W , {\mathcal A} , \mu )$, the entropy $H ( \xi )$ of $\xi$ is defined as
  
$$
H ( \xi ) = - \sum_{k=1}^{m} \mu ( A_k ) \log \mu ( A_k )
$$
  
(it is assumed that $0 \log 0 = 0$). The base of the logarithm can be any number greater than one, but as a rule one takes logarithms to the base $2$ or $e$.
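For example, if $\xi$ partitions $W$ into $m$ sets of equal measure $1/m$, then $H ( \xi ) = - \sum_{k=1}^{m} \frac{1}{m} \log \frac{1}{m} = \log m$; with logarithms to the base $2$, the partition of $[ 0 , 1 )$ into two halves of equal measure has entropy $H ( \xi ) = 1$ (one bit).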
  
Then define the entropy $h ( T , \xi )$ of a measure-preserving transformation $T$ (i.e. of a cascade) with respect to a partition $\xi$ by
  
$$
h ( T , \xi ) = \lim_{n \rightarrow \infty} \frac{1}{n} H ( \xi^n ) ,
$$
  
where $\xi^n = \bigvee_{i=0}^{n-1} T^{-i} \xi$ is the common refinement of the partitions $\xi , T^{-1} \xi , \dots , T^{-(n-1)} \xi$. Finally, the entropy of $T$ is defined as
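For the doubling map and the two-interval partition of the example above, $\xi^n$ is the partition of $[ 0 , 1 )$ into the $2^n$ dyadic intervals of length $2^{-n}$, so that $H ( \xi^n ) = n \log 2$ and hence $h ( T , \xi ) = \log 2$.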
  
$$
h ( T ) = \sup h ( T , \xi ) ,
$$
  
where the supremum is over all finite measurable partitions of $W$.
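By the Kolmogorov–Sinai theorem the supremum is attained on generators: if $\xi$ is a finite generator for $T$, then $h ( T ) = h ( T , \xi )$. In particular, for a [[Bernoulli automorphism|Bernoulli shift]] with state probabilities $p_1 , \dots , p_m$ one has $h ( T ) = - \sum_{k=1}^{m} p_k \log p_k$, and for the doubling map above $h ( T ) = \log 2$.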
  
Since for a flow $\{ T_t \}$ on $W$ one has $h ( T_t ) = | t | h ( T_1 )$ for all $t \in \mathbf R$, one usually defines the entropy of a flow $\{ T_t \}$ by
  
$$
h ( \{ T_t \} ) = h ( T_1 ) .
$$
 
====References====
{|
|valign="top"|{{Ref|CFS}}|| I.P. Cornfel'd, S.V. Fomin, Ya.G. Sinai, "Ergodic theory", Springer (1982) (Translated from Russian) {{MR|0832433}}
|-
|valign="top"|{{Ref|M}}|| R. Mañé, "Ergodic theory and differentiable dynamics", Springer (1987) {{MR|0889254}} {{ZBL|0616.28007}}
|}
