It was recently suggested to me that the proof of Khintchine’s inequality is one which I should think about deeply, and I would especially like to explore the relationship between Khintchine’s and Rudin’s inequality. This will be the first of a series of short posts about these inequalities.

We begin with Khintchine’s inequality. We have a set of real numbers $a_1, \dots, a_n$ and a corresponding set of independent random variables $\epsilon_1, \dots, \epsilon_n$, each taking the values $\pm 1$ with equal probability. Khintchine’s inequality shows that we can control the sum $\sum_i \epsilon_i a_i$ in any $L^p$ norm by the $\ell^2$ norm of the coefficients $a_i$. The slogan here is “changing the signs of a sum is well-behaved on average”.

More precisely, for any $2 \leq p < \infty$, we have the following bound:

$$\left\lVert \sum_i \epsilon_i a_i \right\rVert_p \leq C_p \left( \sum_i a_i^2 \right)^{1/2}.$$
The constant $C_p$ depends only on $p$, and we shall obtain an explicit value below. The inequality is perhaps more suggestive if we observe that $\left( \sum_i a_i^2 \right)^{1/2} = \left\lVert \sum_i \epsilon_i a_i \right\rVert_2$, where the $L^2$ norm is taken over the probability space, so that we can write the inequality as

$$\left\lVert \sum_i \epsilon_i a_i \right\rVert_p \leq C_p \left\lVert \sum_i \epsilon_i a_i \right\rVert_2.$$
Hence for random variables of this shape we have very good control over all $L^p$ norms.
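The inequality is easy to check numerically in small cases. Here is a minimal Python sketch (the coefficients and the choice $p = 4$ are arbitrary) that computes the $L^p$ norms exactly by enumerating all $2^n$ sign patterns; for $p = 4$ one can expand the fourth moment by hand to see that the constant $3^{1/4}$ works.

```python
import itertools
import math

def lp_norm(a, p):
    """Exact L^p norm of sum_i eps_i * a_i over uniform signs eps_i = ±1,
    computed by enumerating all 2^n sign patterns."""
    n = len(a)
    moment = sum(abs(sum(e * x for e, x in zip(eps, a))) ** p
                 for eps in itertools.product((1, -1), repeat=n)) / 2 ** n
    return moment ** (1 / p)

a = [3.0, 1.0, 4.0, 1.0, 5.0]          # arbitrary real coefficients
l2 = math.sqrt(sum(x * x for x in a))   # (sum_i a_i^2)^(1/2)

assert abs(lp_norm(a, 2) - l2) < 1e-9   # equality holds at p = 2
assert lp_norm(a, 4) <= 3 ** 0.25 * l2  # Khintchine at p = 4, with C_4 = 3^(1/4)
```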

We now give the proof, which is a surprisingly straightforward combination of exponential means and Markov’s inequality, polished off with a simple integral. First note that the independence of the $\epsilon_i$ combined with the elementary inequality $\cosh x \leq e^{x^2/2}$ gives, for any $\lambda > 0$,

$$\mathbb{E} \exp\left( \lambda \sum_i \epsilon_i a_i \right) = \prod_i \cosh(\lambda a_i) \leq \exp\left( \frac{\lambda^2}{2} \sum_i a_i^2 \right).$$
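Both the factorisation of the exponential mean and the $\cosh x \leq e^{x^2/2}$ bound can be verified numerically by exact enumeration; a small Python sketch (the coefficients and the value of $\lambda$ are arbitrary):

```python
import itertools
import math

a = [0.5, -1.2, 2.0, 0.7]   # arbitrary real coefficients
lam = 0.9                   # arbitrary lambda > 0
n = len(a)

# E exp(lam * sum_i eps_i a_i), computed exactly over all 2^n sign patterns
mgf = sum(math.exp(lam * sum(e * x for e, x in zip(eps, a)))
          for eps in itertools.product((1, -1), repeat=n)) / 2 ** n

# independence factors the expectation into a product of cosh's ...
prod_cosh = math.prod(math.cosh(lam * x) for x in a)
assert abs(mgf - prod_cosh) < 1e-9

# ... and cosh(x) <= exp(x^2 / 2) bounds it by a Gaussian-type term
assert prod_cosh <= math.exp(lam ** 2 * sum(x * x for x in a) / 2)
```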
For any $t > 0$, applying Markov’s inequality and setting $\lambda = t / \sum_i a_i^2$ gives the inequality

$$\mathbb{P}\left( \left\lvert \sum_i \epsilon_i a_i \right\rvert \geq t \right) \leq 2 \exp\left( -\frac{t^2}{2 \sum_i a_i^2} \right),$$

the factor of $2$ coming from applying the same argument to $-\sum_i \epsilon_i a_i$.
Recalling the distributional definition of $L^p$ norms, that is,

$$\mathbb{E} \lvert X \rvert^p = p \int_0^\infty t^{p-1} \, \mathbb{P}(\lvert X \rvert \geq t) \, dt,$$
and making the substitution $u = t^2 / (2 \sum_i a_i^2)$ to do the integration, we compute that

$$\mathbb{E} \left\lvert \sum_i \epsilon_i a_i \right\rvert^p \leq 2p \int_0^\infty t^{p-1} \exp\left( -\frac{t^2}{2 \sum_i a_i^2} \right) dt = p \, 2^{p/2} \, \Gamma(p/2) \left( \sum_i a_i^2 \right)^{p/2},$$
and the proof is complete. Let me just record for future use the following explicit form, which follows from the proof above together with Stirling’s formula:

$$\left\lVert \sum_i \epsilon_i a_i \right\rVert_p \leq C \sqrt{p} \left( \sum_i a_i^2 \right)^{1/2},$$

where $C$ is some absolute constant.
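Both the Gamma-integral evaluation and the $O(\sqrt{p})$ growth of the resulting constant $C_p = \left( p \, 2^{p/2} \, \Gamma(p/2) \right)^{1/p}$ can be checked numerically. A sketch in Python, where the test values of $p$ and $\sum_i a_i^2$ are arbitrary, and the constant $2$ in the final check is just an empirical observation on this range, not claimed to be sharp:

```python
import math

def khintchine_constant(p):
    """C_p = (p * 2**(p/2) * Gamma(p/2))**(1/p), as read off from the proof."""
    return (p * 2 ** (p / 2) * math.gamma(p / 2)) ** (1 / p)

# check 2p * int_0^oo t^(p-1) exp(-t^2 / (2 sigma^2)) dt against the closed
# form p * 2^(p/2) * Gamma(p/2) * sigma^p, via a midpoint Riemann sum
p, sigma2 = 4.0, 2.5                    # arbitrary exponent and sum of a_i^2
dt = 1e-3
ts = [(k + 0.5) * dt for k in range(int(30.0 / dt))]
integral = 2 * p * dt * sum(t ** (p - 1) * math.exp(-t ** 2 / (2 * sigma2))
                            for t in ts)
closed = khintchine_constant(p) ** p * sigma2 ** (p / 2)
assert abs(integral - closed) / closed < 1e-3

# Stirling's formula says C_p = O(sqrt(p)); empirically C_p <= 2 sqrt(p) here
assert all(khintchine_constant(q) <= 2 * math.sqrt(q) for q in range(2, 101))
```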

The proof above is taken from Thomas Wolff’s excellent lecture notes on harmonic analysis. Of course, the proof also holds for complex $a_i$ with appropriate modulus signs scattered about. Using duality and the fact that equality holds for $p = 2$, we also get a similar lower bound $\left\lVert \sum_i \epsilon_i a_i \right\rVert_p \geq c_p \left\lVert \sum_i \epsilon_i a_i \right\rVert_2$ for $0 < p < 2$.
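One concrete instance of the lower bound: Hölder’s inequality gives $\lVert X \rVert_2 \leq \lVert X \rVert_1^{1/3} \lVert X \rVert_4^{2/3}$, which combined with the upper bound $\lVert X \rVert_4 \leq 3^{1/4} \lVert X \rVert_2$ yields $\lVert X \rVert_1 \geq 3^{-1/2} \lVert X \rVert_2$. A quick exact check by sign enumeration (the coefficients are arbitrary, and the constant $3^{-1/2}$ is what this argument gives, not the sharp one):

```python
import itertools
import math

def lp_norm(a, p):
    """Exact L^p norm of sum_i eps_i * a_i over uniform random signs."""
    n = len(a)
    moment = sum(abs(sum(e * x for e, x in zip(eps, a))) ** p
                 for eps in itertools.product((1, -1), repeat=n)) / 2 ** n
    return moment ** (1 / p)

a = [2.0, -1.0, 0.5, 3.0, 1.0]                  # arbitrary real coefficients
assert lp_norm(a, 1) >= lp_norm(a, 2) / math.sqrt(3)
```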
