Information theory

From Encyclopedia of Mathematics

The branch of applied mathematics and cybernetics concerned with the mathematical description and the estimation of the quality of the transmission, preservation, extraction, and classification of information. The term "information theory", which arose in the 1950s, still (1988) has no unique, generally-accepted interpretation. Different sources delimit the branches belonging to information theory differently, and on a strictly logical interpretation one would also have to include in it certain parts of science that are traditionally not counted as belonging to it. An important feature unifying the various branches of science related to information theory is the extensive use of statistical methods. This is brought about by the fact that the extraction of information amounts to reducing the uncertainty in our knowledge of some object, and the natural numerical measure of the uncertainty of an event is its probability.
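For instance, in Shannon's approach (a standard textbook formulation, stated here for illustration rather than taken from this article), the uncertainty removed by learning that an event $A$ of probability $P(A)$ has occurred is measured by $-\log_2 P(A)$, and the average uncertainty of a discrete random variable $X$ taking values with probabilities $p_1,\dots,p_n$ is its entropy

$$ H(X) = -\sum_{i=1}^{n} p_i \log_2 p_i . $$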

The most important part of information theory in any treatment is the theory of information transmission (cf. Information, transmission of). Often, especially in purely mathematical papers, the term "information theory" is used as a synonym for "theory of information transmission". The theory of information transmission is concerned with optimal and near-optimal methods of transmitting information over a communication channel, under the assumption that the methods of encoding the message into an input signal and of decoding the output signal into a message may vary within wide limits.

The birth of the theory of information transmission is associated with the name of C. Shannon, who in 1948 proposed the solution of its basic problem: finding the rate of transmission of information that can be attained by an optimal method of coding and decoding such that the probability of an error in transmission is arbitrarily small. This optimal transmission rate, called the capacity of the channel (cf. Transmission rate of a channel), is expressed in terms of the amount of information, a quantity introduced by Shannon (cf. Information, amount of). The notion of the amount of information is extremely important and finds numerous applications in other branches of information theory as well.
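In the standard formulation for a discrete memoryless channel with transition probabilities $p(y \mid x)$ (again a textbook statement, added here for illustration), the capacity is the maximum of the amount of information between input and output over all input distributions:

$$ C = \max_{p(x)} I(X;Y), \qquad I(X;Y) = \sum_{x,y} p(x)\, p(y \mid x) \log \frac{p(y \mid x)}{\sum_{x'} p(x')\, p(y \mid x')} . $$

Shannon's theorem then asserts that reliable transmission is possible at every rate below $C$ and at no rate above it.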

Problems related to the optimal preservation of information do not differ in principle from problems of optimal transmission of information, since preservation can be regarded as transmission in time rather than in space.

The basic theorems of the theory of information transmission have the character of existence theorems: they prove the existence of optimal encoding and decoding methods, but indicate no way of constructing or technically realizing them. It is for this reason that coding theory subsequently underwent extensive development. In coding theory one attempts to construct concrete and relatively simple encoding and decoding algorithms that come close in performance to the optimal algorithms whose existence is proved in the theory of information transmission. Coding theory is distinguished by the fact that, in addition to statistical methods, it uses deep algebraic and combinatorial ideas in order to construct concrete codes; a simple example is sketched below.
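As one minimal illustration (chosen here for concreteness; it is not discussed in the article itself), the classical Hamming (7,4) code encodes 4 data bits into 7 bits using 3 parity checks and corrects any single transmission error. A sketch in Python:

    def hamming74_encode(data):
        # data is a list of 4 bits; parity bits occupy positions 1, 2 and 4
        # of the 7-bit codeword (1-based), data bits the remaining positions.
        d1, d2, d3, d4 = data
        p1 = d1 ^ d2 ^ d4
        p2 = d1 ^ d3 ^ d4
        p3 = d2 ^ d3 ^ d4
        return [p1, p2, d1, p3, d2, d3, d4]

    def hamming74_decode(code):
        # Recomputing the parity checks gives the syndrome: the 1-based
        # position of a single corrupted bit, or 0 if the word is intact.
        s1 = code[0] ^ code[2] ^ code[4] ^ code[6]
        s2 = code[1] ^ code[2] ^ code[5] ^ code[6]
        s3 = code[3] ^ code[4] ^ code[5] ^ code[6]
        syndrome = s1 + 2 * s2 + 4 * s3
        if syndrome:
            code = code[:]
            code[syndrome - 1] ^= 1  # correct the erroneous bit
        return [code[2], code[4], code[5], code[6]]

    word = hamming74_encode([1, 0, 1, 1])
    word[2] ^= 1                                # the channel flips one bit
    assert hamming74_decode(word) == [1, 0, 1, 1]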

Usually, the whole body of investigations in which statistical methods are applied to the description of ways of transforming signals at the input and output of a communication channel is also said to belong to information theory. From the mathematical point of view these are simply applications of mathematical statistics (principally of the statistics of stochastic processes), of the theory of prediction and filtering of stationary stochastic processes, of the theory of games, etc. Thus, this branch of information theory uses no mathematical tools specific to it and, in its development, approaches certain other branches of applied probability theory; a small example of such a prediction problem is sketched below.
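For instance (a hypothetical example, assumed here for illustration), for a stationary first-order autoregressive process $x_{t+1} = a x_t + \xi_t$ with independent noise $\xi_t$, the optimal one-step linear predictor is simply $a x_t$, and its mean squared error equals the noise variance:

    import random

    a = 0.9
    x, sq_errors = 0.0, []
    for _ in range(10000):
        prediction = a * x                    # best linear one-step forecast
        x = a * x + random.gauss(0.0, 1.0)    # the process moves one step on
        sq_errors.append((x - prediction) ** 2)

    print(sum(sq_errors) / len(sq_errors))    # close to the noise variance 1.0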

The theory of pattern recognition is also often included in information theory. In it one develops algorithms for assigning objects to classes that are described only at an intuitive level and admit no precise mathematical definition. Such algorithms always involve learning from a collection of objects previously classified by human beings, as in the sketch below.
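One of the simplest such algorithms (named here as an illustration; the article itself does not single out any method) is nearest-neighbour classification, which assigns a new object to the class of the closest human-labelled example:

    def nearest_neighbor(labeled_examples, point):
        # labeled_examples: list of ((x, y), class_label) pairs supplied
        # by a human teacher; point: the object to be classified.
        def dist2(p, q):
            return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2
        _, label = min(labeled_examples, key=lambda ex: dist2(ex[0], point))
        return label

    training = [((0.0, 0.1), "a"), ((0.2, 0.0), "a"),
                ((1.0, 0.9), "b"), ((0.9, 1.1), "b")]
    print(nearest_neighbor(training, (0.8, 1.0)))   # prints "b"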

Attempts to determine the boundaries of information theory from a generally-accepted definition of it, and to include in it all branches of mathematics that deal with the notion of information in its general lexical sense, would lead to an unjustified extension of the concept of information theory, at least at its present stage. In particular, all of mathematical statistics deals with problems of information extraction, the theory of algorithms with problems of information processing, the theory of formal languages with problems of information description, etc.

The concepts of information theory and its applications are extremely varied. Up to now (1988) the science of information remains a collection of separate scientific disciplines, each related to the study of one aspect of this concept. Despite intensive attempts, the unification of these disciplines is proceeding relatively slowly, and the creation of one all-embracing information theory is a matter for the distant future.

How to Cite This Entry:
Information theory. R.L. Dobrushin, V.V. Prelov (originator), Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Information_theory&oldid=18981
This text originally appeared in Encyclopedia of Mathematics - ISBN 1402006098