In this post I will derive an oft-used inequality in theoretical computer science using the Chernoff technique. The following function turns up in combinatorial geometry and in a few other places:

$$\Phi(n,k) = \sum_{i=0}^{k} \binom{n}{i}.$$

Here we assume $k \le n$, since otherwise the function is trivially seen to be equal to $2^n$. Here we will show that

$$\Phi(n,k) \le \left(\frac{en}{k}\right)^k.$$

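Before proceeding, here is a quick numerical sanity check of the claimed bound (not part of the argument; the helper names `phi` and `bound` are mine):

```python
from math import comb, e

def phi(n, k):
    # Phi(n, k) = C(n, 0) + C(n, 1) + ... + C(n, k)
    return sum(comb(n, i) for i in range(k + 1))

def bound(n, k):
    # The claimed upper bound (e*n/k)^k
    return (e * n / k) ** k

# The inequality should hold for every 1 <= k <= n
assert all(phi(n, k) <= bound(n, k)
           for n in range(1, 60) for k in range(1, n + 1))
```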
Two easy interpretations of $\Phi(n,k)$ are as follows. First, it counts the number of points in the hamming cube $\{0,1\}^n$ whose hamming distance from the origin is at most $k$. This interpretation also allows us to derive the nice identity,

$$\sum_{i=0}^{k} \binom{n}{i} = \sum_{i=0}^{k} \sum_{j=0}^{i} \binom{m}{j} \binom{n-m}{i-j}$$

for any $0 \le m \le n$. In the above expression we obey the convention $\binom{a}{b} = 0$ when $b > a$. To see this, fix a point $y$ whose distance from the origin is $m$, and interpret the above sum as counting the points whose distance from $y$ is at most $k$. The outer index $i$ is the distance from $y$, and the inner summation counts the number of points at distance exactly $i$ from $y$: such a point differs from $y$ in $j$ of the $m$ coordinates where $y$ is $1$ and in $i - j$ of the remaining $n - m$ coordinates. Then noting that the hamming cube looks the same from every point, the number of points at distance at most $k$ is the same from the origin as from $y$. Hence the identity.
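The identity is easy to check by brute force; the following sketch (helper names mine) verifies it for small parameters, relying on the fact that Python's `math.comb` already returns $0$ when the lower index exceeds the upper one:

```python
from math import comb

def lhs(n, k):
    # sum_{i<=k} C(n, i)
    return sum(comb(n, i) for i in range(k + 1))

def rhs(n, k, m):
    # sum_{i<=k} sum_{j<=i} C(m, j) * C(n-m, i-j); comb gives 0 when j > m etc.
    return sum(comb(m, j) * comb(n - m, i - j)
               for i in range(k + 1) for j in range(i + 1))

# The identity should hold for every choice of m between 0 and n
assert all(lhs(n, k) == rhs(n, k, m)
           for n in range(1, 15)
           for k in range(n + 1)
           for m in range(n + 1))
```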

This definition has a nice geometric interpretation because $\Phi(n,k)$ is also the number of points of the hamming cube inside any ball of radius $k$. The second interpretation is as follows. Consider the discrete probability space $\{0,1\}^n$ with the uniform measure $\mu(x) = 2^{-n}$. Let $X_1, \ldots, X_n$ be independent random variables defined as $\Pr[X_i = 0] = \Pr[X_i = 1] = \frac{1}{2}$ for $1 \le i \le n$. Let $X = X_1 + \cdots + X_n$. Then we have,

$$\Pr[X = i] = \frac{1}{2^n} \binom{n}{i}$$

for any $0 \le i \le n$. As such we have,

$$\Pr[X \le k] = \frac{1}{2^n} \sum_{i=0}^{k} \binom{n}{i} = \frac{\Phi(n,k)}{2^n}.$$
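To make the probabilistic reading concrete, here is a small exact check (the parameters $n$ and $k$ are mine) that enumerates the cube directly:

```python
from fractions import Fraction
from itertools import product
from math import comb

n, k = 8, 3

# Exact probability that a uniformly random point of {0,1}^n
# has at most k ones, by direct enumeration of all 2^n points.
pr = Fraction(sum(1 for x in product((0, 1), repeat=n) if sum(x) <= k),
              2 ** n)

phi = sum(comb(n, i) for i in range(k + 1))
assert pr == Fraction(phi, 2 ** n)
```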

We will use this interpretation to derive our inequality.

Lemma: $\Phi(n,k) \le \left(\frac{en}{k}\right)^k$ for $1 \le k \le n$.

Proof: Our first observation is that we may take $k \le n/2$. The function $\left(\frac{en}{k}\right)^k$ is an increasing function of $k$. Denoting it by $g(k)$ and extending it to a function on $(0, n]$ we have, by logarithmic differentiation,

$$\frac{d}{dk} \ln g(k) = \frac{d}{dk}\, k \ln\left(\frac{en}{k}\right) = \ln\left(\frac{n}{k}\right) \ge 0.$$

Thus for $k \ge n/2$ we have $g(k) \ge g(n/2) = (2e)^{n/2} \ge 2^n$, since $2e \ge 4$. On the other hand $\Phi(n,k) \le 2^n$ trivially.
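A quick numerical look at this reduction, with an arbitrary $n$ of my choosing, confirms both claims about $g$:

```python
from math import e

n = 25

def g(k):
    # The candidate bound (e*n/k)^k, extended to real k in (0, n]
    return (e * n / k) ** k

ks = [0.5 * j for j in range(1, 2 * n + 1)]

# g is increasing on (0, n] ...
assert all(g(a) <= g(b) for a, b in zip(ks, ks[1:]))
# ... and once k >= n/2 it already exceeds 2^n, the trivial bound on Phi.
assert all(g(k) >= 2 ** n for k in ks if k >= n / 2)
```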

Thus it is no loss of generality to assume $k \le n/2$. Now we have the following identity,

$$\Phi(n,k) = 2^n \Pr[X \le k],$$

so it suffices to upper bound $\Pr[X \le k]$. We now use the Chernoff technique. For any $t > 0$,

$$\Pr[X \le k] = \Pr\left[e^{-tX} \ge e^{-tk}\right] \le e^{tk} \, \mathbb{E}\left[e^{-tX}\right] = e^{tk} \prod_{i=1}^{n} \mathbb{E}\left[e^{-tX_i}\right],$$

where the inequality is Markov's and the last equality uses the independence of the $X_i$.
Now for any $i$ we have,

$$\mathbb{E}\left[e^{-tX_i}\right] = \frac{1}{2}\left(1 + e^{-t}\right),$$

and so we have,

$$\Pr[X \le k] \le e^{tk} \left(\frac{1 + e^{-t}}{2}\right)^n,$$

or equivalently, $\Phi(n,k) \le e^{tk}\left(1 + e^{-t}\right)^n$.
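Since this holds for every $t > 0$, one can spot-check the bound on a grid of values of $t$ (the parameters below are mine, not from the argument):

```python
from math import comb, exp

def phi(n, k):
    return sum(comb(n, i) for i in range(k + 1))

n, k = 30, 7

# Phi(n,k) <= e^{tk} * (1 + e^{-t})^n should hold for every t > 0
for t in [0.1 * j for j in range(1, 50)]:
    assert phi(n, k) <= exp(t * k) * (1 + exp(-t)) ** n
```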
We try to optimize the above expression by finding the $t > 0$ which minimizes it. Taking logarithms, it is enough to minimize $tk + n \ln\left(1 + e^{-t}\right)$. Noting that its derivative in $t$ is $k - n\,\frac{e^{-t}}{1 + e^{-t}}$, which changes sign from negative to positive, we have, by differentiation and observing the sign of the derivative, that the $t$ minimizing the expression is given by

$$\frac{e^{-t}}{1 + e^{-t}} = \frac{k}{n}, \qquad \text{i.e.,} \qquad t = \ln\left(\frac{n-k}{k}\right),$$

which is indeed positive since $k \le n/2$.
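Assuming the minimizer works out to $t = \ln\frac{n-k}{k}$ (as the differentiation suggests), a numeric spot check with $n$, $k$ of my choosing confirms that this choice of $t$ sandwiches $\Phi(n,k)$ below the claimed bound:

```python
from math import comb, exp, log

n, k = 40, 9  # arbitrary parameters with k <= n/2

t = log((n - k) / k)  # candidate minimizer; positive since k <= n/2
chernoff = exp(t * k) * (1 + exp(-t)) ** n

# Phi(n,k) <= optimized Chernoff expression <= (en/k)^k
assert sum(comb(n, i) for i in range(k + 1)) <= chernoff
assert chernoff <= (exp(1) * n / k) ** k
```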