
Transmission rate of a channel

From Encyclopedia of Mathematics


An information-theoretic measure of the ability to transmit information over a communication channel. Let $ \eta $ and $ \widetilde \eta $ be random variables connected in a communication channel $ ( Q, V) $. Then the transmission rate $ C $ of this channel is defined by the equation

$$ \tag{1 } C = \sup I ( \eta , \widetilde \eta ), $$

where $ I ( \eta , \widetilde \eta ) $ is the amount of information (cf. Information, amount of) in $ \widetilde \eta $ relative to $ \eta $, and the supremum is taken over all pairs of random variables $ ( \eta , \widetilde \eta ) $ connected in the channel $ ( Q, V) $. When the input and output signals $ \eta = \{ {\eta ( t) } : {- \infty < t < \infty } \} $ and $ \widetilde \eta = \{ {\widetilde \eta ( t) } : {- \infty < t < \infty } \} $ are random processes in continuous or discrete time, the transmission rate of the channel is usually understood as the mean transmission rate per unit time or per symbol of the transmitted signal; that is, by definition one sets

$$ \tag{2 } C = \lim\limits _ {T - t \rightarrow \infty } { \frac{1}{T - t } } \ \sup I ( \eta _ {t} ^ {T} , \widetilde \eta {} _ {t} ^ {T} ), $$

if the limit exists; here the supremum is taken over all possible pairs of random variables $ \eta _ {t} ^ {T} = \{ {\eta ( s) } : {t < s \leq T } \} $, $ \widetilde \eta {} _ {t} ^ {T} = \{ {\widetilde \eta ( s) } : {t < s \leq T } \} $ connected in the corresponding segment of the given channel. The existence of the limit (2) has been proved for a wide class of channels, for example for a homogeneous channel with a finite memory and non-vanishing transition probabilities.
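For signals with finitely many values, the quantity $ I ( \eta , \widetilde \eta ) $ entering (1) can be evaluated directly from the joint distribution of the pair. A minimal numerical sketch in Python (the particular joint distribution, a uniform input passed through a binary symmetric channel with crossover probability $ 0.1 $, is an illustrative choice, not taken from this article):

```python
import numpy as np

# Illustrative joint pmf of (eta, eta_tilde): uniform input through a
# binary symmetric channel with crossover probability 0.1.
p_joint = np.array([[0.45, 0.05],
                    [0.05, 0.45]])

def mutual_information(p):
    """Amount of information I(eta, eta_tilde) in bits, from a joint pmf p[x, y]."""
    px = p.sum(axis=1, keepdims=True)  # marginal distribution of eta
    py = p.sum(axis=0, keepdims=True)  # marginal distribution of eta_tilde
    mask = p > 0                       # treat 0 * log 0 as 0
    return float((p[mask] * np.log2(p[mask] / (px @ py)[mask])).sum())

print(mutual_information(p_joint))  # ≈ 0.5310 bits
```

For this channel the uniform input happens to be optimal, so the printed value also equals the classical capacity $ 1 - H ( 0.1) $ of the binary symmetric channel.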

It is known that for a sufficiently wide class of channels (for example, for the channels with finite memory mentioned above) the following holds:

$$ \tag{3 } C = \sup \ \left ( \lim\limits _ {T - t \rightarrow \infty } \ { \frac{1}{T - t } } I ( \eta _ {t} ^ {T} , \widetilde \eta {} _ {t} ^ {T} ) \right ) , $$

where the supremum is taken over all pairs of stationarily-related random processes $ \eta ( t) $, $ \widetilde \eta ( t) $, $ - \infty < t < \infty $, such that for any $ - \infty < t < T < \infty $ the random variables $ \eta _ {t} ^ {T} = \{ {\eta ( s) } : {t < s \leq T } \} $ and $ \widetilde \eta {} _ {t} ^ {T} = \{ {\widetilde \eta ( s) } : {t < s \leq T } \} $ are connected in the corresponding segment of the channel under consideration. Thus, (3) shows that the transmission rate of the channel is the same as the maximum possible transmission rate of information (cf. Information, transmission rate of) along this channel.
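In the finite-alphabet case the supremum in (1) is attained and can be approximated by the Blahut–Arimoto alternating-maximization iteration, a standard algorithm not discussed in this article. A sketch in Python under that assumption (the example channel is again the binary symmetric channel):

```python
import numpy as np

def blahut_arimoto(Q, tol=1e-12, max_iter=10_000):
    """Capacity (in bits) of a discrete memoryless channel with transition
    matrix Q[x, y] = P(output y | input x), by Blahut-Arimoto iteration."""
    def divergences(p):
        q_out = p @ Q  # output distribution induced by the input distribution p
        log_ratio = np.log(Q / q_out, where=Q > 0, out=np.zeros_like(Q))
        return (Q * log_ratio).sum(axis=1)  # D(Q[x] || q_out) for each input x

    p = np.full(Q.shape[0], 1.0 / Q.shape[0])  # start from the uniform input
    for _ in range(max_iter):
        d = divergences(p)
        p_new = p * np.exp(d)          # multiplicative update toward the maximizer
        p_new /= p_new.sum()
        if np.max(np.abs(p_new - p)) < tol:
            p = p_new
            break
        p = p_new
    # I(p) = sum_x p(x) D(Q[x] || q_out), converted from nats to bits
    return float((p * divergences(p)).sum() / np.log(2))

# Binary symmetric channel with crossover probability 0.1
Q_bsc = np.array([[0.9, 0.1],
                  [0.1, 0.9]])
print(blahut_arimoto(Q_bsc))  # ≈ 0.5310, i.e. 1 - H(0.1)
```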

An explicit calculation of transmission rates is therefore of considerable interest. For example, consider a channel $ ( Q , V) $ whose input and output signals take values in the Euclidean $ n $-dimensional space $ \mathbf R ^ {n} $, with transition function $ Q ( y, \cdot ) $ defined by a density $ q ( y, \widetilde{y} ) $ (with respect to Lebesgue measure), $ y, \widetilde{y} \in \mathbf R ^ {n} $, and with the constraint $ V $ consisting of boundedness of the mean square power of the input signal, $ {\mathsf E} | \eta | ^ {2} \leq S $, where $ | \eta | $ is the length of the vector $ \eta $ in $ \mathbf R ^ {n} $ and $ S > 0 $ is a fixed constant. For such channels the following results are known (see [1a], [1b]).

1) Let $ q ( y, \widetilde{y} ) = q ( \widetilde{y} - y) $, that is, consider a channel with additive noise, so that the output signal $ \widetilde \eta $ is the sum $ \widetilde \eta = \eta + \zeta $ of the input signal $ \eta $ and a noise $ \zeta $ independent of it, and let $ {\mathsf E} | \zeta | ^ {2} = N $. Then as $ N \rightarrow 0 $ (under mild additional conditions) the following asymptotic formula holds:

$$ C = - h ( \zeta ) + { \frac{n}{2} } \mathop{\rm log} { \frac{2 \pi eS }{n} } + { \frac{n \mathop{\rm log} e }{2S} } N + o ( N), $$

where $ h ( \zeta ) $ is the differential entropy of $ \zeta $ and $ o ( N)/N \rightarrow 0 $ as $ N \rightarrow 0 $. This formula corresponds to the case of weak noise.
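In particular, when $ \zeta $ is itself Gaussian, $ h ( \zeta ) = ( n/2) \mathop{\rm log} ( 2 \pi e N/n) $ and the exact capacity $ C = ( n/2) \mathop{\rm log} ( 1 + S/N) $ is available, so the asymptotic formula can be checked numerically. A sketch in Python, working in natural logarithms so that $ \mathop{\rm log} e = 1 $ (the values of $ S $ and $ n $ are arbitrary illustrative choices):

```python
import numpy as np

def c_exact(S, N, n):
    """Exact capacity (nats) of the n-dimensional Gaussian channel:
    input power S, Gaussian noise of total power N."""
    return 0.5 * n * np.log(1.0 + S / N)

def c_asymptotic(S, N, n):
    """Asymptotic formula 1) for Gaussian noise, in nats (log e = 1):
    C = -h(zeta) + (n/2) log(2 pi e S / n) + (n/(2S)) N + o(N)."""
    h_zeta = 0.5 * n * np.log(2.0 * np.pi * np.e * N / n)
    return -h_zeta + 0.5 * n * np.log(2.0 * np.pi * np.e * S / n) + n * N / (2.0 * S)

S, n = 1.0, 2
for N in (0.1, 0.01, 0.001):
    print(N, c_exact(S, N, n), c_asymptotic(S, N, n))
```

The difference between the two values behaves like $ O ( N ^ {2} ) $, consistent with the $ o ( N) $ remainder.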

2) Let $ q ( y, \widetilde{y} ) $ be arbitrary but let $ S \rightarrow 0 $. Then

$$ C = \left ( \sup _ { y } \frac{\phi ( y) }{| y | ^ {2} } \right ) S + o ( S), $$

where

$$ \phi ( y) = \ \int\limits _ {\mathbf R ^ {n} } q ( y, \widetilde{y} ) \mathop{\rm log} \ \frac{q ( y, \widetilde{y} ) }{q ( 0, \widetilde{y} ) } \ d \widetilde{y} . $$
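The function $ \phi ( y) $ is the Kullback–Leibler divergence between the output densities corresponding to the inputs $ y $ and $ 0 $. For one-dimensional Gaussian noise of variance $ \sigma ^ {2} $, i.e. $ q ( y, \widetilde{y} ) $ the density of $ N ( y, \sigma ^ {2} ) $ (an illustrative choice, not part of the article), it has the closed form $ \phi ( y) = y ^ {2} /( 2 \sigma ^ {2} ) $ in nats, which the following sketch verifies by grid integration (the grid parameters are arbitrary):

```python
import numpy as np

def phi(y, sigma=1.0):
    """phi(y) = integral of q(y, yt) * log(q(y, yt) / q(0, yt)) dyt, in nats,
    for 1-dimensional Gaussian noise: q(y, yt) is the N(y, sigma^2) density."""
    yt = np.linspace(y - 12.0 * sigma, y + 12.0 * sigma, 200_001)
    dy = yt[1] - yt[0]
    q_y = np.exp(-((yt - y) ** 2) / (2.0 * sigma**2)) / np.sqrt(2.0 * np.pi * sigma**2)
    # log q(y, yt) - log q(0, yt): the Gaussian normalizing constants cancel
    log_ratio = (yt**2 - (yt - y) ** 2) / (2.0 * sigma**2)
    return float((q_y * log_ratio).sum() * dy)  # Riemann sum over the grid

print(phi(1.0))  # ≈ 0.5 = 1^2 / (2 * 1^2)
```

For this noise the ratio $ \phi ( y)/y ^ {2} $ is the constant $ 1/( 2 \sigma ^ {2} ) $, matching the small-$ S $ expansion $ C = S/( 2 \sigma ^ {2} ) + o ( S) $ of the exact capacity $ ( 1/2) \mathop{\rm log} ( 1 + S/ \sigma ^ {2} ) $ in nats.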

See also the references cited under Communication channel.

References

[1a] V.V. Prelov, "The asymptotic channel capacity for a continuous channel with small additive noise", Problems Inform. Transmission, 5:2 (1969), pp. 23–27; translated from Probl. Peredachi Inform., 5:2 (1969), pp. 31–36
[1b] V.V. Prelov, "Asymptotic behavior of the capacity of a continuous channel with large nonadditive noise", Problems Inform. Transmission, 8:4 (1972), pp. 285–289; translated from Probl. Peredachi Inform., 8:4 (1972), pp. 22–27

Transmission rate of a channel. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Transmission_rate_of_a_channel&oldid=49020
This article was adapted from an original article by R.L. Dobrushin and V.V. Prelov (originator), which appeared in Encyclopedia of Mathematics - ISBN 1402006098.