
Janson inequality


There are a couple of inequalities referred to as the "Janson inequality". They provide exponential upper bounds for the probability that a sum of dependent zero-one random variables (cf. also Random variable) equals zero. The underlying probability space corresponds to selecting a random subset $\Gamma _ { \mathbf{p} }$ of a finite set $\Gamma$, where $\mathbf{p} = \{ p _ { i } : i \in \Gamma \}$, in such a way that the elements are chosen independently with $\mathsf{P} ( i \in \Gamma _ { \mathbf{p} } ) = p _ { i }$ for each $ i \in \Gamma$.

Let $\mathcal{S}$ be a family of subsets of $\Gamma$ and, for every $A \in \mathcal S$, let $I _ { A }$ be equal to one if $A \subseteq \Gamma _ { \mathbf{p} }$ and to zero otherwise. Then $X = \sum _ { A \in \mathcal S } I _ { A }$ counts those elements of $\mathcal{S}$ which are entirely contained in $\Gamma _ { \mathbf{p} }$. Set

\begin{equation*} \lambda = \mathsf{E} ( X ), \end{equation*}

\begin{equation*} \Delta = \frac { 1 } { 2 } \sum _ { A \neq B , \, A \cap B \neq \emptyset } \mathsf{E} ( I _ { A } I _ { B } ) , \quad \overline { \Delta } = \lambda + 2 \Delta, \end{equation*}

where the sum defining $\Delta$ runs over ordered pairs $( A , B )$ of members of $\mathcal{S}$. Then

\begin{equation} \tag{a1} \mathsf{P} ( X = 0 ) \leq \operatorname { exp } ( - \lambda + \Delta ) \end{equation}

and the following bound, which is sharper when $\Delta > \lambda / 2$:

\begin{equation} \tag{a2} \mathsf{P} ( X = 0 ) \leq \operatorname { exp } \left( - \frac { \lambda ^ { 2 } } { \overline{\Delta} } \right). \end{equation}
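Both bounds are easy to evaluate numerically. The following sketch (a minimal illustration; the code and all names in it are ours, not from the literature) computes $\lambda$, $\Delta$ and the bounds (a1) and (a2) for triangles in a small random graph, and compares them with a Monte Carlo estimate of $\mathsf{P}(X=0)$.

```python
import itertools
import math
import random

# Minimal numerical sketch (illustrative, not from the article): bounds
# (a1) and (a2) for triangles in G(5, 0.3) versus a Monte Carlo estimate.

n, p = 5, 0.3
edges = list(itertools.combinations(range(n), 2))        # ground set Gamma
triangles = [frozenset(itertools.combinations(t, 2))     # family S (edge sets)
             for t in itertools.combinations(range(n), 3)]

# E(I_A I_B) = p^{|A union B|}, since elements enter Gamma_p independently.
lam = sum(p ** len(A) for A in triangles)
delta = 0.5 * sum(p ** len(A | B)
                  for A in triangles for B in triangles
                  if A != B and A & B)
delta_bar = lam + 2 * delta

bound_a1 = math.exp(-lam + delta)
bound_a2 = math.exp(-lam ** 2 / delta_bar)

# Monte Carlo: include each edge independently with probability p.
rng = random.Random(1)
trials = 200_000
hits = 0
for _ in range(trials):
    present = {e for e in edges if rng.random() < p}
    if not any(A <= present for A in triangles):
        hits += 1

print(f"P(X=0) ~= {hits / trials:.4f}")
print(f"(a1) bound: {bound_a1:.4f}, (a2) bound: {bound_a2:.4f}")
```

On this toy instance the two bounds are close; the gap between them widens as $\Delta$ grows relative to $\lambda$.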

Research leading to these inequalities was motivated by a ground-breaking proof by B. Bollobás [a2], who, in order to estimate the chromatic number of a random graph, used martingales (cf. also Graph colouring; Graph, random; Martingale) to show that the probability of not containing a large clique is very small. Bollobás presented his proof on the opening day of the Third Conference on Random Graphs (Poznań, July 1987). By the end of the meeting, S. Janson had found a proof of inequality (a1) based on Laplace transforms (cf. also Laplace transform), while T. Łuczak had proved a related, less explicit estimate using martingales. The latter result was restricted to the special, though pivotal, context of small subgraphs of random graphs. Soon after, R. Boppana and J. Spencer [a1] gave another proof, resembling the proof of the Lovász local lemma, of the following version of (a1):

\begin{equation} \tag{a3} \mathsf{P} ( X = 0 ) \leq \operatorname { exp } \left\{ \frac { \Delta } { 1 - \epsilon } \right\} \prod _ { A } ( 1 - \mathsf{E} I _ { A } ), \end{equation}

where $\epsilon = \operatorname { max } _ { A } \mathsf{E} I_A$. This version emphasizes the comparison with the independent case: if the $I _ { A }$ were independent, then $\mathsf{P} ( X = 0 )$ would equal the product $\prod _ { A } ( 1 - \mathsf{E} I _ { A } )$.

Under modest symmetry conditions, inequality (a2) appeared in [a3], while the general version was proved in [a4]. See also [a6] for another proof of (a2), but with an extra factor $1/2$ in the exponent.

The quantity $\Delta$ is a measure of the pairwise dependence between the $I _ { A }$'s. If $\Delta = o ( \lambda )$, then the exponents in (a1) and (a2) are both equal to $- \mathsf{E} X ( 1 + o ( 1 ) )$; this matches asymptotically the lower bound obtained via the FKG inequality, provided in addition that $\max p _ { i } \rightarrow 0$.

Example.

Let $n$ be an integer, $n \geq 3$, and let $\Gamma$ be the set of all two-element subsets of $\{ 1 , \ldots , n \}$. With all $p _ { i } = p = p ( n )$, $i = 1 , \ldots , \binom{n}{2}$, the random subset $\Gamma _ { \mathbf{p} }$ is a random graph $\mathbf{G} ( n , p )$. Let $\mathcal{S}$ be the family of all triples of pairs of the form $\{ i j , i k , j k \}$, $1 \leq i < j < k \leq n$. Then $X$ is the number of triangles in $\mathbf{G} ( n , p )$, $\lambda = \binom{n}{3} p ^ { 3 }$ and $\Delta = \binom{n}{4} \binom{4}{2} p ^ { 5 }$. Thus, if $p = o ( n ^ { - 1 / 2 } )$, then $\operatorname { ln } \mathsf{P} ( X = 0 ) \sim - \lambda$, while for $p = \Omega ( n ^ { - 1 / 2 } )$, inequality (a2) yields $\mathsf{P} ( X = 0 ) \leq e ^ { - \Omega ( n ^ { 2 } p ) }$. As long as $p = o ( 1 )$, $\operatorname { var } ( X ) \sim \overline { \Delta }$, and the above exponential bounds strengthen the polynomial bound

\begin{equation*} \mathsf{P} ( X = 0 ) \leq \frac { \operatorname { var } ( X ) } { \lambda ^ { 2 } } \end{equation*}

obtained by the method of second moments, i.e., by a corollary of the Chebyshev inequality. To illustrate the strength of (a1), fix $p = 10 ^ { 5 } n ^ { - 2 / 3 }$ and assume that $n$ is divisible by $100$. Then (a1) easily implies that, with probability tending to one, more than $99 \%$ of the vertices of $\mathbf{G} ( n , p )$ are covered by vertex-disjoint triangles. Indeed, otherwise there would be a subset of $n/100$ vertices spanning no triangle. By (a1), the probability that this happens is smaller than

\begin{equation*} 2 ^ { n } \operatorname { exp } \left\{ - \binom{n / 100}{3} p ^ { 3 } + O ( n ^ { 4 } p ^ { 5 } ) \right\} = o ( 1 ). \end{equation*}
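The quantities in this example are easy to tabulate. The sketch below (illustrative code; the function name is ours) evaluates the exponents of (a1) and (a2) for triangles around the threshold $p = n^{-1/2}$, showing (a1) degrading and (a2) taking over once $\Delta > \lambda / 2$.

```python
import math

# Illustrative sketch (the function name is ours): exponents of (a1) and
# (a2) for triangles in G(n, p), i.e. lambda - Delta and lambda^2 / delta_bar,
# around the threshold p = n^(-1/2).

def triangle_exponents(n, p):
    lam = math.comb(n, 3) * p ** 3                      # lambda = E(X)
    delta = math.comb(n, 4) * math.comb(4, 2) * p ** 5  # Delta
    delta_bar = lam + 2 * delta
    return lam - delta, lam ** 2 / delta_bar

n = 1000
for c in (0.1, 1.0, 10.0):
    p = c * n ** -0.5
    e1, e2 = triangle_exponents(n, p)
    print(f"p = {p:.5f}: (a1) exponent {e1:12.1f}, (a2) exponent {e2:12.1f}")

# Below the threshold both exponents are ~ lambda; above it, the (a1)
# exponent goes negative (that bound becomes trivial), while (a2) still
# gives exp(-Theta(n^2 p)).
```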

In 1990, Janson [a4] generalized inequality (a2), obtaining an estimate of the lower tail of the distribution of $X$. With $\phi ( x ) = ( 1 + x ) \operatorname { ln } ( 1 + x ) - x$, for $0 \leq t \leq \lambda$,

\begin{equation} \tag{a4} \mathsf P ( X \leq \lambda - t ) \leq \operatorname { exp } \left( - \frac { \phi ( - t / \lambda ) \lambda ^ { 2 } } { \overline { \Delta } } \right) \leq \operatorname { exp } \left( - \frac { t ^ { 2 } } { 2 \overline { \Delta } } \right). \end{equation}

When $\mathcal{S}$ consists of mutually disjoint sets, $X$ is a sum of independent zero-one random variables and (a4) coincides with the (one-sided) Chernoff bound. No corresponding upper tail estimate is true in general.
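The coincidence with the Chernoff bound in the independent case can be checked directly. The sketch below (illustrative code, under the assumption that $X \sim \operatorname{Binomial}(m, q)$, so that $\Delta = 0$ and $\overline{\Delta} = \lambda$) evaluates both expressions in (a4) against the exact binomial lower tail.

```python
import math

# Illustrative sketch: for X ~ Binomial(m, q) (mutually disjoint sets in S),
# Delta = 0 and delta_bar = lambda, so (a4) becomes the one-sided Chernoff
# bound  P(X <= lambda - t) <= exp(-lambda * phi(-t/lambda)) <= exp(-t^2/(2*lambda)).

def phi(x):
    return (1 + x) * math.log(1 + x) - x

m, q = 100, 0.3
lam = m * q
t = 10.0                                     # deviation below the mean

chernoff = math.exp(-lam * phi(-t / lam))    # first bound in (a4)
weaker = math.exp(-t ** 2 / (2 * lam))       # second bound in (a4)

# Exact lower tail P(X <= lam - t).
k = math.floor(lam - t)
exact = sum(math.comb(m, j) * q ** j * (1 - q) ** (m - j) for j in range(k + 1))

print(f"exact {exact:.5f} <= Chernoff {chernoff:.5f} <= weaker {weaker:.5f}")
```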

Also in 1990, W.C.S. Suen [a7] proved a related inequality, which remains valid for a general probability space (and not only for random subsets). In his setting, $X$ is a sum of arbitrary indicators $I _ { \alpha }$, and the bound hinges on the dependency graph induced by them. Suen's inequality has been revived and extended in [a5].

For more details on this subject see [a8].

References

[a1] R. Boppana, J. Spencer, "A useful elementary correlation inequality", J. Combin. Th. A, 50 (1989), pp. 305–307
[a2] B. Bollobás, "The chromatic number of random graphs", Combinatorica, 8 (1988), pp. 49–56
[a3] S. Janson, T. Łuczak, A. Ruciński, "An exponential bound for the probability of nonexistence of a specified subgraph in a random graph", in M. Karoński, J. Jaworski, A. Ruciński (eds.), Random Graphs '87, Wiley (1990), pp. 73–87
[a4] S. Janson, "Poisson approximation for large deviations", Random Struct. Algor., 1 (1990), pp. 221–230
[a5] S. Janson, "New versions of Suen's correlation inequality", Random Struct. Algor., 13 (1998), pp. 467–483
[a6] J. Spencer, "Threshold functions for extension statements", J. Combin. Th. A, 53 (1990), pp. 286–305
[a7] W.C.S. Suen, "A correlation inequality and a Poisson limit theorem for nonoverlapping balanced subgraphs of a random graph", Random Struct. Algor., 1 (1990), pp. 231–242
[a8] S. Janson, T. Łuczak, A. Ruciński, "Random Graphs", Wiley (2000)
This article was adapted from an original article by A. Ruciński (originator), which appeared in Encyclopedia of Mathematics, ISBN 1402006098.