Bernstein inequality

From Encyclopedia of Mathematics
{{MSC|60E15|26D05}}
{{TEX|done}}

$
\newcommand{\expect}{\mathbb{E}}
\newcommand{\prob}{\mathbb{P}}
\newcommand{\abs}[1]{\left|#1\right|}
$

Bernstein's inequality in probability theory is a more precise formulation of the classical [[Chebyshev inequality in probability theory|Chebyshev inequality in probability theory]], proposed by S.N. Bernshtein {{Cite|Be2}} in 1911; it permits one to estimate the [[Probability of large deviations|probability of large deviations]] by a monotone decreasing exponential function. In fact, if the equations
\[
\expect X_j=0,\quad
\expect X_j^2=b_j,\quad
j=1,\ldots,n,
\]
hold for the independent random variables $X_1,\ldots,X_n$ with
\[
\expect\abs{X_j}^l \leq \frac{b_j}{2}H^{l-2}l!
\]
(where $l>2$ and $H$ is a constant independent of $j$), then the following inequality of Bernstein (where $r>0$) is valid for the sum $S_n=X_1+\cdots+X_n$:
\begin{equation}\label{eq1}
\prob\left( \abs{S_n} > r \right)
\leq
2\exp\left(
- \frac{r^2}{2(B_n + Hr)}
\right),
\end{equation}
where $B_n = \sum b_j$. For identically-distributed bounded random variables $X_j$ ($\expect X_j = 0$, $\expect X_j^2 = \sigma^2$ and $\abs{X_j}\leq L$, $j=1,\ldots,n$), inequality \ref{eq1} takes its simplest form:
\begin{equation}\label{eq2}
\prob\left( \abs{S_n} > t\sigma\sqrt{n} \right)
\leq
2\exp\left(
- \frac{t^2}{2(1 + a/3)}
\right),
\end{equation}
where $a = Lt/(\sigma\sqrt{n})$. A.N. Kolmogorov gave a lower estimate of the probability in \ref{eq1}. The Bernstein–Kolmogorov estimates are used, in particular, in proving the [[Law of the iterated logarithm|law of the iterated logarithm]]. Some idea of the accuracy of \ref{eq2} may be obtained by comparing it with the approximate value of the left-hand side of \ref{eq2} obtained from the [[Central limit theorem|central limit theorem]] in the form
\[
\frac{2}{\sqrt{2\pi}}\int_t^\infty
\mathrm{e}^{-u^2/2}\,\mathrm{d}u
=
\frac{2}{\sqrt{2\pi}\,t}
\left(
1-\frac{\theta}{t^2}
\right)
\mathrm{e}^{-t^2/2},
\]
where $0<\theta<1$. Subsequent to 1967, Bernstein's inequalities were extended to include multi-dimensional and infinite-dimensional cases.
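A minimal numerical sketch of the simplest bound \ref{eq2}, using symmetric $\pm 1$ (Rademacher) summands, for which $\sigma = L = 1$ (the function names and parameter choices are illustrative, not from the original article):

```python
import math
import random

def bernstein_bound(t, n, L=1.0, sigma=1.0):
    # Right-hand side of the simplest Bernstein bound:
    # 2*exp(-t^2 / (2*(1 + a/3))) with a = L*t/(sigma*sqrt(n)).
    a = L * t / (sigma * math.sqrt(n))
    return 2.0 * math.exp(-t * t / (2.0 * (1.0 + a / 3.0)))

def empirical_tail(t, n, trials=20000, seed=0):
    # Monte Carlo estimate of P(|S_n| > t*sigma*sqrt(n)) for
    # Rademacher X_j (sigma = 1, so the threshold is t*sqrt(n)).
    rng = random.Random(seed)
    threshold = t * math.sqrt(n)
    hits = 0
    for _ in range(trials):
        s = sum(1 if rng.random() < 0.5 else -1 for _ in range(n))
        if abs(s) > threshold:
            hits += 1
    return hits / trials

n, t = 200, 2.0
assert empirical_tail(t, n) <= bernstein_bound(t, n)  # the bound dominates the tail
```

For $t = 2$, $n = 200$ the bound is about $0.30$, while the empirical tail is close to the normal tail $2(1-\Phi(2)) \approx 0.046$: the estimate is conservative but decays exponentially in $t^2$, as the inequality promises.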
  
====References====  
  
{|
|-
|valign="top"|{{Ref|Be2}}||valign="top"| S.N. Bernshtein, "Probability theory", Moscow-Leningrad (1946) (In Russian)
|-
|valign="top"|{{Ref|Ko}}||valign="top"| A.N. [A.N. Kolmogorov] Kolmogoroff, "Ueber das Gesetz des iterierten Logarithmus" ''Math. Ann.'', '''101''' (1929) pp. 126–135
|-
|valign="top"|{{Ref|Ho}}||valign="top"| W. Hoeffding, "Probability inequalities for sums of independent random variables" ''J. Amer. Statist. Assoc.'', '''58''' (1963) pp. 13–30
|-
|valign="top"|{{Ref|Yu}}||valign="top"| V.V. Yurinskii, "Exponential inequalities for sums of random vectors" ''J. Multivariate Anal.'', '''6''' (1976) pp. 473–499
|-
|}
 
 
''A.V. Prokhorov''
  
Bernstein's inequality for the derivative of a trigonometric or algebraic polynomial gives an estimate of this derivative in terms of the polynomial itself. If $T_n(x)$ is a trigonometric polynomial of degree not exceeding $n$ and if
\[
M = \max_{0 \leq x \leq 2\pi} \abs{T_n(x)},
\]
then the following inequalities are valid for all $x$ (cf. {{Cite|Be2}}):
 
\[
\abs{T_n^{(r)}(x)} \leq Mn^r,
\quad
r=1,2,\ldots,
\]
where $T_n^{(r)}$ is the $r$th derivative of $T_n$. These estimates cannot be improved: for
\[
T_n(x) = \cos n(x-x_0)
\]
one has $M=1$ and
\[
\max_{0 \leq x \leq 2\pi} \abs{T_n^{(r)}(x)} = n^r,
\]
so the inequality is sharp.
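The sharpness of the trigonometric inequality can be checked numerically; a minimal sketch (grid size and names are illustrative), using $T_n(x) = \cos n(x - x_0)$, for which $M = 1$:

```python
import math

# For T_n(x) = cos(n*(x - x0)) one has M = max|T_n| = 1, and Bernstein's
# inequality gives max|T_n'(x)| <= M*n = n; that maximum is actually attained.
def max_abs_derivative(n, x0=0.3, grid=10000):
    xs = (2.0 * math.pi * k / grid for k in range(grid))
    return max(abs(-n * math.sin(n * (x - x0))) for x in xs)

n = 7
d = max_abs_derivative(n)
assert d <= n + 1e-9          # the Bernstein bound with M = 1, r = 1
assert abs(d - n) < 1e-3      # sharpness, up to grid resolution
```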
Bernstein's inequality for trigonometric polynomials is a special case of the following theorem {{Cite|Be3}}: If $f(x)$ is an entire function of exponential type no greater than $\sigma$ and if
\[
M = \sup_{-\infty < x < \infty} \abs{f(x)},
\]
then one has
 
\[
\sup_{-\infty < x < \infty} \abs{f^{(r)}(x)} \leq M\sigma^r
\quad
(r=1,2,\ldots).
\]
Bernstein's inequality for an algebraic polynomial has the following form {{Cite|Be2}}: If the polynomial
\[
P_n(x) = \sum_{k=0}^n \alpha_k x^k
\]
satisfies the condition
\[
\abs{P_n(x)} \leq M,
\quad
a \leq x \leq b,
\]
then its derivative $P_n^\prime(x)$ has the property
\[
\abs{P_n^\prime(x)} \leq
\frac{Mn}{\sqrt{(x-a)(b-x)}},
\quad
a < x < b,
\]
which cannot be improved. As was noted by S.N. Bernshtein {{Cite|Be2}}, this inequality is a consequence of the proof of the [[Markov inequality]] given by A.A. Markov.
  
Bernstein's inequalities are in fact employed in proving converse theorems in the theory of approximation of functions. There are a number of generalizations of Bernstein's inequality, in particular for entire functions in several variables.
====References====
{|
|-
|valign="top"|{{Ref|Be2}}||valign="top"| S.N. [S.N. Bernshtein] Bernstein, "Sur l'ordre de la meilleure approximation des fonctions continues par des polynômes" ''Acad. R. Belgique, Cl. Sci. Mém. Coll. 4. Sér. II'', '''4''' (1922)
|-
|valign="top"|{{Ref|Be3}}||valign="top"| S.N. [S.N. Bernshtein] Bernstein, "Sur une propriété des fonctions entières" ''C.R. Acad. Sci. Paris'', '''176''' (1923) pp. 1603–1605
|-
|valign="top"|{{Ref|Ni}}||valign="top"| S.M. Nikol'skii, "Approximation of functions of several variables and imbedding theorems", Springer (1975) (Translated from Russian)
|-
|}

''N.P. Korneichuk, V.P. Motornyi''
  
 
====Comments====

====References====
{|
|-
|valign="top"|{{Ref|Lo}}||valign="top"| G.G. Lorentz, "Approximation of functions", Holt, Rinehart &amp; Winston (1966) pp. Chapt. 2
|-
|valign="top"|{{Ref|Na}}||valign="top"| I.P. Natanson, "Constructive function theory", '''1–3''', F. Ungar (1964–1965) (Translated from Russian)
|-
|}

How to Cite This Entry:
Bernstein inequality. Encyclopedia of Mathematics. URL: http://www.encyclopediaofmath.org/index.php?title=Bernstein_inequality&oldid=27197
This article was adapted from an original article by A.V. Prokhorov, N.P. Korneichuk, V.P. Motornyi (originator), which appeared in Encyclopedia of Mathematics - ISBN 1402006098. See original article