We call a random variable {X} (that is, a measurable function {X:\Sigma\rightarrow\mathbb{R}}, where {\Sigma} is a probability space) symmetric if {X} is identically distributed to {-X}. Symmetric random variables are often a lot more pleasant to handle. Much of what is true for symmetric random variables is also true in general, but both the proofs and the statements are a lot more complicated. For this reason, it would be very useful to know that we can always assume, in some sense, that a random variable is symmetric. This process of finding a related random variable which is symmetric is known as symmetrization. With a good understanding of this process many proofs in probability become vastly simpler: one can first reduce to the symmetric case, and then just do it by hand.

Fix a random variable {X}. One obvious related random variable which is symmetric is {X-X'}, where {X'} is an independent copy of {X} (that is, {X'} is independent of {X} and identical to it in distribution). We denote this by {X^s}. It is clearly symmetric, since swapping the roles of {X} and {X'} replaces {X^s} by {-X^s} without changing its distribution. Furthermore, note that {\mathbb{E}(X^s)=0} whenever {\mathbb{E}(X)} is finite.
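To see this concretely, here is a quick numerical sketch (not from the post; the use of `numpy` and the choice of an exponential distribution are my own) checking that {X-X'} is symmetric even when {X} itself is very far from symmetric:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# X is deliberately asymmetric: exponential with mean 1.
x = rng.exponential(scale=1.0, size=n)
x_prime = rng.exponential(scale=1.0, size=n)  # independent copy X'

xs = x - x_prime  # samples from the symmetrization X^s = X - X'

# If X^s is symmetric then its mean is 0 and its upper and lower
# quantiles mirror each other: q_{0.9}(X^s) = -q_{0.1}(X^s).
mean_xs = xs.mean()
q10, q90 = np.quantile(xs, [0.1, 0.9])
print(mean_xs, q10 + q90)  # both approximately 0
```

Here {X^s} is in fact a standard Laplace distribution, one of the classical examples of a symmetric random variable.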

Not only is {X^s} symmetric, it is also closely related to the original random variable {X}, as its definition suggests. For example, for every {x>0} and every real {a} we have

\displaystyle  \begin{array}{rcl} \mathbb{P}(\lvert X^s\rvert\geq x)&=&\mathbb{P}(\lvert X-a-(X'-a)\rvert\geq x)\\ &\leq&2\mathbb{P}(\lvert X-a\rvert\geq x/2)\end{array}

and so the tail probability of {X} is, up to constants, bounded below by that of {X^s}. (The inequality here is just the union bound: if {\lvert (X-a)-(X'-a)\rvert\geq x} then at least one of {\lvert X-a\rvert} and {\lvert X'-a\rvert} is at least {x/2}, and {X'} has the same distribution as {X}.) If {\mathbb{E}(X)=0} originally, then {X} and {X^s} are particularly strongly related. For example, for any {r\geq 1} and real number {x}, Jensen's inequality (using the convexity of {t\mapsto\lvert t\rvert^r}) gives

\displaystyle \lvert x\rvert^r=\lvert x+\mathbb{E}(X)\rvert^r\leq \mathbb{E}\lvert x+X\rvert^r,

and hence, applying this with {X} replaced by the independent copy {-X'} (which also has mean zero), taking {x=X}, and then taking expectations,

\displaystyle \mathbb{E}\lvert X\rvert^r\leq\mathbb{E}\lvert X^s\rvert^r.
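Both inequalities above can be checked by simulation. A minimal sketch (my own, assuming `numpy`), taking {X} to be a centred exponential so that {\mathbb{E}(X)=0}, with {a=0} in the tail bound and {r=3} in the moment comparison:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# A mean-zero but asymmetric X: a centred exponential.
x = rng.exponential(size=n) - 1.0
xs = x - (rng.exponential(size=n) - 1.0)  # X^s = X - X'

# Tail bound: P(|X^s| >= t) <= 2 P(|X - a| >= t/2), here with a = 0.
tails_ok = all(
    np.mean(np.abs(xs) >= t) <= 2 * np.mean(np.abs(x) >= t / 2)
    for t in (0.5, 1.0, 2.0, 3.0)
)

# Moment comparison: E|X|^r <= E|X^s|^r, here with r = 3.
r = 3
moment_x = np.mean(np.abs(x) ** r)    # roughly 2.4 for this X
moment_xs = np.mean(np.abs(xs) ** r)  # roughly 6, since X^s is Laplace(0,1)
print(tails_ok, moment_x <= moment_xs)
```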

As a quick illustration of the power of symmetric random variables, we mention the following.

Lemma 1 If {X} and {Y} are independent random variables with finite moments of order {r\geq 2}, and {Y} is symmetric, then

\displaystyle \mathbb{E}\lvert X\rvert^r+\mathbb{E}\lvert Y\rvert^r\leq\mathbb{E}\lvert X+Y\rvert^r.

To prove this, we use the following consequence of Clarkson’s inequality: for any {x,y\in\mathbb{R}} and {r\geq 2} we have

\displaystyle \lvert x+y\rvert^r+\lvert x-y\rvert^r\geq 2(\lvert x\rvert^r+\lvert y\rvert^r).

The lemma follows by taking {x=X} and {y=Y}, taking expectations of both sides, and noting that {X+Y} is identically distributed to {X-Y}, since {Y} is symmetric and independent of {X}; the two terms on the left-hand side are then equal in expectation.
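Both the pointwise inequality and the lemma itself are easy to check numerically. A sketch (my own, assuming `numpy`), with {X,Y} independent standard normals (so {Y} is symmetric) and {r=4}:

```python
import numpy as np

rng = np.random.default_rng(2)
r = 4  # Clarkson's inequality requires r >= 2

# Pointwise check of |x+y|^r + |x-y|^r >= 2(|x|^r + |y|^r) at random points.
u = rng.normal(size=1000)
v = rng.normal(size=1000)
lhs = np.abs(u + v) ** r + np.abs(u - v) ** r
rhs = 2 * (np.abs(u) ** r + np.abs(v) ** r)
clarkson_ok = bool(np.all(lhs >= rhs * (1 - 1e-9)))  # small floating-point slack

# Monte Carlo check of the lemma: E|X|^r + E|Y|^r <= E|X+Y|^r.
n = 100_000
x = rng.normal(size=n)
y = rng.normal(size=n)  # standard normal, hence symmetric
sum_of_moments = np.mean(np.abs(x) ** r) + np.mean(np.abs(y) ** r)  # roughly 6
moment_of_sum = np.mean(np.abs(x + y) ** r)  # roughly 12, since X+Y ~ N(0,2)
print(clarkson_ok, sum_of_moments <= moment_of_sum)
```

Note the comfortable gap here: for fourth moments of normals the right-hand side is about twice the left-hand side, so the lemma is far from tight in this example.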
