Markov Inequality

The Markov inequality puts a bound on how much probability a nonnegative random variable can place at arbitrarily large values once its mean is specified.


Suppose that X is a random variable such that P(X ≥ 0) = 1. Then for every real number t > 0,

$$P(X \ge t) \le {E(X) \over t}$$
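As an informal numerical check (not part of the original text), the Python sketch below estimates the tail probability \(P(X \ge t)\) by simulation for an arbitrarily chosen nonnegative distribution and compares it with the Markov bound \(E(X)/t\).

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Illustrative choice: an exponential distribution with mean 2
# (any nonnegative distribution would do).
x = rng.exponential(scale=2.0, size=1_000_000)
mean_x = x.mean()

for t in (1.0, 5.0, 10.0, 20.0):
    tail = (x >= t).mean()   # empirical estimate of P(X >= t)
    bound = mean_x / t       # Markov bound E(X)/t
    print(f"t = {t:5.1f}   P(X >= t) ~ {tail:.5f}   E(X)/t = {bound:.5f}")
```

For small t the bound exceeds 1 and says nothing; for large t the exponential tail decays much faster than 1/t, so the bound is loose but still valid.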

Proof: For convenience, we assume that the random variable X has a discrete distribution with probability function f. For a discrete distribution we have:

$$E(X) = \sum_{x} xf(x) = \sum_{x < t} xf(x) + \sum_{x \ge t} xf(x)$$

Since \(X \ge 0\) with probability 1, all the terms in the summations are nonnegative; therefore:

$$E(X) \ge \sum_{x \ge t} xf(x) \ge \sum_{x \ge t} tf(x) = tP(X \ge t)$$

Dividing both sides by t > 0 gives the Markov inequality.


The Markov inequality is primarily of interest for large values of t. In fact, when \(t \le E(X)\), the inequality is of no interest, since the bound \(E(X)/t \ge 1\) and it is already known that \(P(X \ge t) \le 1\). However, it follows from the Markov inequality that for every nonnegative random variable X whose mean is 1, the maximum possible value of \(P(X \ge 100)\) is 0.01. Furthermore, it can be verified that this maximum value is actually attained by a random variable X for which P(X = 0) = 0.99 and P(X = 100) = 0.01.
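The extremal example above can be verified directly; the short sketch below (illustrative only) computes the mean and the tail probability of the two-point distribution with P(X = 0) = 0.99 and P(X = 100) = 0.01, showing that the Markov bound is attained with equality.

```python
# Two-point distribution from the example above:
# P(X = 0) = 0.99 and P(X = 100) = 0.01.
values = [0.0, 100.0]
probs = [0.99, 0.01]

mean_x = sum(v * p for v, p in zip(values, probs))        # E(X) = 1.0
tail = sum(p for v, p in zip(values, probs) if v >= 100)  # P(X >= 100) = 0.01

print("E(X) =", mean_x)
print("P(X >= 100) =", tail)
print("Markov bound E(X)/100 =", mean_x / 100)  # equals 0.01, so the bound is tight
```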
