
Bernstein inequality

From Encyclopedia of Mathematics

2020 Mathematics Subject Classification: Primary: 60E15 Secondary: 26D05 [MSN][ZBL]

$ \newcommand{\expect}{\mathbb{E}} \newcommand{\prob}{\mathbb{P}} \newcommand{\abs}[1]{\left|#1\right|} $

Bernstein's inequality in probability theory is a sharper form of the classical Chebyshev inequality, proposed by S.N. Bernshtein [Be2] in 1911; it permits one to estimate the probability of large deviations by a monotone decreasing exponential function. Namely, suppose that the independent random variables $X_1,\ldots,X_n$ satisfy
\[ \expect X_j=0,\quad \expect X_j^2=b_j,\quad j=1,\ldots,n, \]
and
\[ \expect\abs{X_j}^l \leq \frac{b_j}{2}H^{l-2}l! \]
for all integers $l>2$, where $H$ is a constant independent of $j$. Then the following inequality of Bernstein (where $r>0$) is valid for the sum $S_n=X_1+\cdots+X_n$:
\begin{equation}\label{eq1} \prob\left( \abs{S_n} > r \right) \leq 2\exp\left( - \frac{r^2}{2(B_n + Hr)} \right), \end{equation}
where $B_n = \sum_j b_j$.

For identically-distributed bounded random variables $X_j$ ($\expect X_j = 0$, $\expect X_j^2 = \sigma^2$ and $\abs{X_j}\leq L$, $j=1,\ldots,n$) inequality \eqref{eq1} takes its simplest form (obtained from \eqref{eq1} with $r = t\sigma\sqrt{n}$, $B_n = n\sigma^2$ and $H = L/3$):
\begin{equation}\label{eq2} \prob\left( \abs{S_n} > t\sigma\sqrt{n} \right) \leq 2\exp\left( - \frac{t^2}{2(1 + a/3)} \right), \end{equation}
where $a = Lt/(\sigma\sqrt{n})$. A.N. Kolmogorov gave a lower estimate of the probability in \eqref{eq1}. The Bernstein–Kolmogorov estimates are used, in particular, in proving the law of the iterated logarithm [Ko]. Some idea of the accuracy of \eqref{eq2} may be obtained by comparing it with the approximate value of the left-hand side of \eqref{eq2} given by the central limit theorem in the form
\[ \frac{2}{\sqrt{2\pi}}\int_t^\infty \mathrm{e}^{-u^2/2}\,\mathrm{d}u = \frac{2}{\sqrt{2\pi}\,t} \left( 1-\frac{\theta}{t^2} \right) \mathrm{e}^{-t^2/2}, \]
where $0<\theta<1$. Since 1967, Bernstein's inequalities have been extended to the multi-dimensional and infinite-dimensional cases (see, for example, [Yu]).
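The bound \eqref{eq2} is easy to probe numerically. The following is a minimal Monte Carlo sketch (not part of the original article) comparing the empirical tail probability of $S_n$ with the right-hand side of \eqref{eq2} for Rademacher variables $X_j = \pm 1$, so that $\sigma = L = 1$; the values of $n$, the number of trials and the values of $t$ are arbitrary illustrative choices.

```python
import numpy as np

# Minimal Monte Carlo sketch of inequality (2) for Rademacher variables
# X_j = +/-1 (E X_j = 0, sigma = 1, |X_j| <= L = 1); n, the number of
# trials and the values of t below are illustrative assumptions only.
rng = np.random.default_rng(0)
n, trials = 100, 200_000
sigma, L = 1.0, 1.0

# A sum of n independent +/-1 variables equals 2*Binomial(n, 1/2) - n.
S = 2.0 * rng.binomial(n, 0.5, size=trials) - n

for t in (1.0, 2.0, 3.0):
    empirical = np.mean(np.abs(S) > t * sigma * np.sqrt(n))
    a = L * t / (sigma * np.sqrt(n))
    bound = 2.0 * np.exp(-t**2 / (2.0 * (1.0 + a / 3.0)))
    print(f"t = {t}:  P(|S_n| > t*sigma*sqrt(n)) ~= {empirical:.4f}  <=  bound {bound:.4f}")
```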

References

[Be2] S.N. Bernshtein, "Probability theory", Moscow-Leningrad (1946) (In Russian)
[Ko] A.N. Kolmogorov, "Über das Gesetz des iterierten Logarithmus" Math. Ann., 101 (1929) pp. 126–135
[Ho] W. Hoeffding, "Probability inequalities for sums of bounded random variables" J. Amer. Statist. Assoc., 58 (1963) pp. 13–30
[Yu] V.V. Yurinskii, "Exponential inequalities for sums of random vectors" J. Multivariate Anal., 6 (1976) pp. 473–499


A.V. Prokhorov

Bernstein's inequality for the derivative of a trigonometric or algebraic polynomial gives an estimate of this derivative in terms of the polynomial itself. If $T_n(x)$ is a trigonometric polynomial of degree not exceeding $n$ and if
\[ M = \max_{0 \leq x \leq 2\pi} \abs{T_n(x)}, \]
then the following inequalities are valid for all $x$ (cf. [Be2]):
\[ \abs{T_n^{(r)}(x)} \leq Mn^r, \quad r = 1,2,\ldots, \]
where $T_n^{(r)}$ is the $r$th derivative of $T_n$. These estimates cannot be improved: for
\[ T_n(x) = \cos n(x-x_0) \]
one has $M = 1$ and
\[ \max_{0 \leq x \leq 2\pi} \abs{T_n^{(r)}(x)} = n^r. \]
Bernstein's inequality for trigonometric polynomials is a special case of the following theorem [Be3]: If $f(x)$ is an entire function of exponential type not exceeding $\sigma$ and if
\[ M = \sup_{-\infty < x < \infty} \abs{f(x)}, \]
then
\[ \sup_{-\infty < x < \infty} \abs{f^{(r)}(x)} \leq M\sigma^r \quad (r=1,2,\ldots). \]
Bernstein's inequality for an algebraic polynomial has the following form [Be2]: If the polynomial
\[ P_n(x) = \sum_{k=0}^n \alpha_k x^k \]
satisfies the condition
\[ \abs{P_n(x)} \leq M, \quad a \leq x \leq b, \]
then its derivative $P_n^\prime(x)$ satisfies
\[ \abs{P_n^\prime(x)} \leq \frac{Mn}{\sqrt{(x-a)(b-x)}}, \quad a < x < b, \]
and this estimate cannot be improved. As was noted by S.N. Bernshtein [Be2], this inequality is a consequence of A.A. Markov's proof of the Markov inequality.
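As an illustration, here is a small numerical sketch (not from the original article) that evaluates a trigonometric polynomial with arbitrarily chosen coefficients on a dense grid and checks the bound $\max_x \abs{T_n^\prime(x)} \leq Mn$, together with the extremal case $T_n(x) = \cos nx$; the degree, the grid and the coefficients are illustrative assumptions.

```python
import numpy as np

# Numerical sketch of |T_n'(x)| <= M*n for a trigonometric polynomial with
# arbitrarily chosen coefficients; the degree n, the grid and the random
# coefficients are illustrative assumptions, not taken from the article.
rng = np.random.default_rng(1)
n = 7
a = rng.normal(size=n + 1)              # cosine coefficients a_0, ..., a_n
b = rng.normal(size=n + 1)              # sine coefficients (the k = 0 term vanishes)

x = np.linspace(0.0, 2.0 * np.pi, 20001)
k = np.arange(n + 1)[:, None]           # frequencies 0, ..., n as a column vector

T  = (a[:, None] * np.cos(k * x) + b[:, None] * np.sin(k * x)).sum(axis=0)
dT = (k * (-a[:, None] * np.sin(k * x) + b[:, None] * np.cos(k * x))).sum(axis=0)

M = np.abs(T).max()
print(np.abs(dT).max(), "<=", M * n)    # Bernstein's bound for r = 1

# The extremal polynomial T_n(x) = cos(nx) has M = 1 and max |T_n'| = n
# (on a finite grid the maximum is attained only approximately).
print(np.abs(n * np.sin(n * x)).max())
```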

Bernstein's inequalities are employed in proving converse theorems in the theory of approximation of functions. There are a number of generalizations of Bernstein's inequality, in particular for entire functions in several variables.

References

[Be2] S.N. Bernshtein, "Sur l'ordre de la meilleure approximation des fonctions continues par des polynômes" Acad. R. Belgique, Cl. Sci. Mém. Coll. 4. Sér. II, 4 (1922)
[Be3] S.N. Bernshtein, "Sur une propriété des fonctions entières" C.R. Acad. Sci. Paris, 176 (1923) pp. 1603–1605
[Ni] S.M. Nikol'skii, "Approximation of functions of several variables and imbedding theorems", Springer (1975) (Translated from Russian)


N.P. Korneichuk, V.P. Motornyi

How to Cite This Entry:
Bernstein inequality. Encyclopedia of Mathematics. URL: http://encyclopediaofmath.org/index.php?title=Bernstein_inequality&oldid=27197
This article was adapted from an original article by A.V. Prokhorov, N.P. Korneichuk, V.P. Motornyi (originator), which appeared in Encyclopedia of Mathematics - ISBN 1402006098. See original article