Metric entropy


of a dynamical system

One of the most important invariants in ergodic theory. The basic concept is the entropy $h(S)$ of an endomorphism $S$ (see Metric isomorphism) of a Lebesgue space $(X, \mu)$. For any finite measurable decomposition (measurable partition) $\xi$ the limit

$$ h(S, \xi) = \lim_{n \rightarrow \infty} \frac{1}{n} H(\xi_S^n), $$

$$ \xi_S^n = \xi \lor S^{-1}\xi \lor \dots \lor S^{-n+1}\xi $$

(the entropy of $\xi$ in unit time relative to $S$) exists, where $H(\xi)$ is the entropy (cf. Entropy of a measurable decomposition) of $\xi$, and $\xi \lor \eta$ is the partition whose elements are the intersections of the elements of $\xi$ and $\eta$. (This definition carries over verbatim to $\xi$ with $H(\xi) < \infty$; by another method $h(S, \xi)$ can be defined for any measurable $\xi$.) The entropy $h(S)$ is defined as the least upper bound of $h(S, \xi)$ over all possible finite measurable $\xi$. (It may be $\infty$; using all $\xi$ with $H(\xi) < \infty$ or all measurable $\xi$ yields the same value.)
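For example, let $S$ be the Bernoulli shift with state probabilities $p_1, \dots, p_k$ and let $\xi$ be the partition of $X$ according to the state at time zero. The elements of $\xi_S^n$ are the cylinders determined by the states at times $0, \dots, n-1$, and independence of the coordinates gives $H(\xi_S^n) = nH(\xi)$, so that

$$ h(S, \xi) = H(\xi) = -\sum_{i=1}^k p_i \log p_i . $$

Since $\xi$ is a generating partition, this value is in fact $h(S)$ (see [2]).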

Originally the entropy was defined by A.N. Kolmogorov somewhat differently (see [1a], [1b]); the version given above came later (see [2]). In the basic case of an aperiodic automorphism of a Lebesgue space the definitions are ultimately equivalent [3].

It turns out that $h(S^n) = nh(S)$, and if $S$ is an automorphism, then $h(S^{-1}) = h(S)$. Therefore the entropy of a cascade $\{S^n\}$ is naturally taken to be $h(S)$. For a measurable flow $\{S_t\}$ it turns out that $h(S_t) = |t| h(S_1)$; therefore the entropy of a flow is naturally taken to be $h(S_1)$. The definition of the entropy for other transformation groups with an invariant measure is somewhat different (it does not reduce to the entropy of a single transformation in the group; see [5], [6]). There are modifications of the entropy for the case of an infinite invariant measure [7]; another modification is the $A$-entropy (where $A = \{k_n\}$ is an increasing sequence of natural numbers), which is obtained when $\xi_S^n$ is replaced by

$$ S^{-k_1}\xi \lor \dots \lor S^{-k_n}\xi $$

and $\lim$ by $\overline{\lim}$ (see [8]).
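To illustrate the relation $h(S^n) = nh(S)$: for the Bernoulli shift $S$ with probabilities $(1/2, 1/2)$ one has $h(S) = \log 2$, while $S^2$ is again a Bernoulli shift, now on the four equiprobable two-letter blocks, so that

$$ h(S^2) = \log 4 = 2 \log 2 = 2h(S). $$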

The entropy is a metric isomorphism invariant of dynamical systems and is fundamentally different from the earlier-known invariants, which are basically connected with the spectrum of a dynamical system. In particular, by means of the entropy of Bernoulli automorphisms (cf. Bernoulli automorphism; see [1a]) it was first established that there exist non-isomorphic ergodic systems with the same continuous spectrum (which contrasts with the situation for a discrete spectrum). In a wider setting the role of the entropy is related to the fact that a new trend arose in ergodic theory: the entropy theory of dynamical systems (see [3], [4], and Ergodic theory).

The entropy provides a tool for characterizing the rate of mixing of sets of small measure (more accurately, of collections of such sets forming a partition). Alongside this "global" role, the entropy also plays a "local" role, which is established by Breiman's ergodic theorem (the individual ergodic theorem of information theory): for ergodic $S$ and almost-all $x$,

$$ \frac{1}{n} \left| \log \mu ( C_{\xi_S^n}(x) ) \right| \rightarrow h(S, \xi) \quad \textrm{ as } n \rightarrow \infty , $$

where $C_\eta(x)$ is the element of the partition $\eta$ containing $x$, and the logarithm is taken to the same base as in the definition of $H$ (see [9a], [9b], [4]). (Breiman's theorem is true for $\xi$ with $H(\xi) < \infty$ [10], but, generally speaking, not for countable $\xi$ with $H(\xi) = \infty$ [11]; there are variants for non-ergodic $S$ (see [4], [12]) and for an infinite $\mu$ [13]. A weaker assertion on convergence in the sense of $L_1$ has been proved for a rather general class of transformation groups [6].)
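For the Bernoulli shift with probabilities $p_1, \dots, p_k$ and $\xi$ the time-zero partition, Breiman's theorem reduces to the strong law of large numbers: here $\mu(C_{\xi_S^n}(x)) = p_{x_0} \cdots p_{x_{n-1}}$, where $x_0, \dots, x_{n-1}$ are the first $n$ coordinates of $x$, so that

$$ \frac{1}{n} \left| \log \mu ( C_{\xi_S^n}(x) ) \right| = -\frac{1}{n} \sum_{i=0}^{n-1} \log p_{x_i} \rightarrow -\sum_{i=1}^k p_i \log p_i = h(S, \xi) $$

for almost-all $x$.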

For smooth dynamical systems with a smooth invariant measure a connection has been established between the entropy and the Lyapunov characteristic exponents of the equations in variations (see [14]–[16]).
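Namely, under suitable smoothness assumptions (see [15], [16]) one has Pesin's formula

$$ h(S) = \int_X \sum_{\lambda_i(x) > 0} \lambda_i(x) \, d\mu(x), $$

where the $\lambda_i(x)$ are the Lyapunov characteristic exponents at $x$, each counted with its multiplicity. For example, for the automorphism of the two-dimensional torus defined by the matrix $\begin{pmatrix} 2 & 1 \\ 1 & 1 \end{pmatrix}$, with $\mu$ the Lebesgue measure, this gives $h(S) = \log \frac{3 + \sqrt{5}}{2}$.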

The name "entropy" is explained by the analogy between the entropy of dynamical systems and that in information theory and statistical physics, right up to the fact that in certain examples these entropies coincide (see, for example, [4], [17]). The analogy with statistical physics was one of the stimuli for introducing into ergodic theory (even in a not purely metric context, and for topological dynamical systems, cf. Topological dynamical system) such new concepts as "Gibbsian measures", the "topological pressure" (an analogue of the free energy) and the "variational principle" for the latter (see the references to $Y$-system; Topological entropy).
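For a continuous transformation $T$ of a compact metric space and a continuous function $\varphi$, the variational principle states that the topological pressure satisfies

$$ P(T, \varphi) = \sup_\mu \left\{ h_\mu(T) + \int \varphi \, d\mu \right\}, $$

where the supremum is taken over all $T$-invariant Borel probability measures $\mu$ and $h_\mu(T)$ is the metric entropy with respect to $\mu$; for $\varphi = 0$ this reduces to the equality of the topological entropy and $\sup_\mu h_\mu(T)$ (see Topological entropy and [a1]).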

References

[1a] A.N. Kolmogorov, "A new metric invariant of transitive dynamical systems, and Lebesgue space automorphisms" Dokl. Akad. Nauk SSSR , 119 : 5 (1958) pp. 861–864 (In Russian)
[1b] A.N. Kolmogorov, "On entropy per unit time as a metric invariant of automorphisms" Dokl. Akad. Nauk SSSR , 124 : 4 (1959) pp. 754–755 (In Russian)
[2] Ya.G. Sinai, "On the notion of entropy of dynamical systems" Dokl. Akad. Nauk SSSR , 124 : 4 (1959) pp. 768–771 (In Russian)
[3] V.A. Rokhlin, "Lectures on the entropy theory of transformations with invariant measure" Russian Math. Surveys , 22 : 5 (1967) pp. 1–52 Uspekhi Mat. Nauk , 22 : 5 (1967) pp. 3–56
[4] P. Billingsley, "Ergodic theory and information" , Wiley (1965)
[5] A.V. Safonov, "Information parts in groups" Math. USSR. Izv. , 22 (1984) pp. 393–398 Izv. Akad. Nauk SSSR Ser. Mat. , 47 : 2 (1983) pp. 421–426
[6] J.C. Kieffer, "A generalized Shannon–McMillan theorem for the action of an amenable group on a probability space" Ann. of Probab. , 3 : 6 (1975) pp. 1031–1037
[7] U. Krengel, "Entropy of conservative transformations" Z. Wahrscheinlichkeitstheor. Verw. Geb. , 7 : 3 (1967) pp. 161–181
[8] A.G. Kushnirenko, "Metric invariants of entropy type" Russian Math. Surveys , 22 : 5 (1967) pp. 53–61 Uspekhi Mat. Nauk , 22 : 5 (1967) pp. 37–65
[9a] L. Breiman, "The individual ergodic theorem of information theory" Ann. Math. Stat. , 28 : 3 (1957) pp. 809–811
[9b] L. Breiman, "Correction to 'The individual ergodic theorem of information theory'" Ann. Math. Stat. , 31 : 3 (1960) pp. 809–810
[10] K.L. Chung, "A note on the ergodic theorem of information theory" Ann. Math. Stat. , 32 : 3 (1961) pp. 612–614
[11] B.S. Pitskel', "Nonuniform distribution of entropy for processes with a countable set of states" Probl. Peredachi Inform. , 12 : 2 (1976) pp. 98–103 (In Russian)
[12] A. Ionescu Tulcea, "Contributions to information theory for abstract alphabets" Arkiv för Mat. , 4 : 2–3 (1961) pp. 235–247
[13] E.M. Klimko, L. Sucheston, "On convergence of information in spaces with infinite invariant measure" Z. Wahrscheinlichkeitstheor. Verw. Geb. , 10 : 3 (1968) pp. 226–235
[14] V.M. Millionshchikov, "A formula for the entropy of smooth dynamical systems" Differential Eq. , 12 (1976) pp. 1527–1530 Differents. Uravnen. , 12 : 12 (1976) pp. 2188–2192
[15] Ya.B. Pesin, "Characteristic Lyapunov exponents, and smooth ergodic theory" Russian Math. Surveys , 32 : 4 (1977) pp. 55–114 Uspekhi Mat. Nauk , 32 : 4 (1977) pp. 55–112
[16] R. Mañé, "A proof of Pesin's formula" Ergod. Th. and Dynam. Syst. , 1 : 1 (1981) pp. 95–102
[17] D.W. Robinson, D. Ruelle, "Mean entropy of states in classical statistical mechanics" Comm. Math. Phys. , 5 : 4 (1967) pp. 288–300

Comments

Instead of $A$-entropy the term sequence entropy is used in the English literature; see, e.g., [a1], § 4.11. For several useful recent references concerning the computation of entropy, see [a2].

References

[a1] P. Walters, "An introduction to ergodic theory" , Springer (1982)
[a2] M.P. Wojtkowski, "Measure theoretic entropy of the system of hard spheres" Ergod. Th. and Dynam. Syst. , 8 (1988) pp. 133–153
This article was adapted from an original article by D.V. Anosov (originator), which appeared in Encyclopedia of Mathematics - ISBN 1402006098.