How to Prove Markov’s Inequality and Chebyshev’s Inequality

Probability problems

Problem 759

(a) Let $X$ be a random variable that takes only non-negative values. Prove that for any $a > 0$,
\[P(X \geq a) \leq \frac{E[X]}{a}.\] This inequality is called Markov’s inequality.

(b) Let $X$ be a random variable with finite mean $\mu$ and variance $\sigma^2$. Prove that for any $a >0$,
\[P\left(|X - \mu| \geq a \right) \leq \frac{\sigma^2}{a^2}.\] This inequality is called Chebyshev’s inequality.


Solution.

We give two proofs of Markov’s inequality.

First Proof of Markov’s Inequality

For the first proof, let us assume that $X$ is a discrete random variable. The case when $X$ is a continuous random variable is identical, except that summations are replaced by integrals. The mean $E[X]$ is by definition
\begin{align*}
E[X] &= \sum_{x, p(x)>0} x p(x).
\end{align*}
Here, each term $x p(x)$ is non-negative because $X$ takes only non-negative values and $p(x)$ is a probability. Thus, omitting some terms can only decrease the sum.

Hence we have
\begin{align*}
E[X] &= \sum_{x} x p(x) \geq \sum_{x \geq a} x p(x).
\end{align*}
(We omitted those $x$ such that $x \lt a$.)


Now, since every $x$ appearing in the remaining sum satisfies $x \geq a$, we have
\[\sum_{x \geq a} x p(x) \geq \sum_{x \geq a} a p(x) = a \sum_{x \geq a} p(x).\] Note that
\[\sum_{x \geq a} p(x) = P(X \geq a).\] It follows that we obtain
\[E[X] \geq a P(X \geq a).\] Dividing both sides by $a>0$, we obtain Markov’s inequality
\[P(X \geq a) \leq \frac{E[X]}{a}.\]
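The bound can be checked numerically. The following sketch (the exponential distribution, the sample size, and the threshold $a$ are illustrative choices, not part of the problem) estimates $E[X]$ and $P(X \geq a)$ by simulation and confirms that the bound holds:

```python
import random

# Illustrative check of Markov's inequality: P(X >= a) <= E[X] / a
# for a non-negative random variable. We use an exponential variable
# with rate 1 (true mean 1) as an arbitrary example.
random.seed(0)
n = 100_000
samples = [random.expovariate(1.0) for _ in range(n)]

mean = sum(samples) / n                        # estimate of E[X]
a = 2.0
tail = sum(1 for x in samples if x >= a) / n   # estimate of P(X >= a)

# Markov's bound (here the true values are about 0.135 vs 0.5)
assert tail <= mean / a
```

Note that the bound is often far from tight: for this distribution the true tail probability $e^{-2} \approx 0.135$ is much smaller than the bound $E[X]/a = 0.5$.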

Second Proof of Markov’s Inequality

Let us give an alternative proof. We define a new random variable $I$ by
\begin{align*}
I =
\begin{cases}
1 & \text{ if } X \geq a\\
0 & \text{ otherwise}.
\end{cases}
\end{align*}
(This is called an indicator variable for the event $X \geq a$.)

When $X \geq a$, we have $I = 1$. Thus,
\[\frac{X}{a} \geq 1 = I.\] If, on the other hand, $X \lt a$, then as both $X$ and $a$ are non-negative, we have
\[\frac{X}{a} \geq 0 = I.\] Therefore, in either case, we have the inequality
\[\frac{X}{a} \geq I. \]

This implies the inequality of their expected values
\[E\left[\frac{X}{a}\right] \geq E[I].\tag{*}\] By linearity of expected value, we see that
\[E\left[\frac{X}{a}\right] = \frac{E[X]}{a}.\] Also, we have
\begin{align*}
E[I] &= 0\cdot P(I=0) + 1\cdot P(I=1)\\
&= P(I=1)\\
&= P(X \geq a).
\end{align*}

It follows from (*) that
\[\frac{E[X]}{a} \geq P(X \geq a).\] This completes the proof of Markov’s inequality.
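The indicator argument is easy to verify by simulation. In this hedged sketch (the uniform distribution and the threshold are arbitrary illustrative choices), we check the pointwise inequality $X/a \geq I$ on every sample and then compare the averages, mirroring the step from the pointwise bound to the bound on expectations:

```python
import random

# Illustrative check of the indicator-variable proof of Markov's
# inequality, using a uniform variable on [0, 10] as an example.
random.seed(1)
a = 3.0
samples = [random.uniform(0, 10) for _ in range(50_000)]

# I = 1 if X >= a, else 0  (the indicator of the event X >= a)
indicators = [1 if x >= a else 0 for x in samples]

# Pointwise inequality: X/a >= I holds for every single sample
assert all(x / a >= i for x, i in zip(samples, indicators))

# Averaging both sides gives E[X]/a >= E[I] = P(X >= a)
mean_over_a = sum(samples) / len(samples) / a
p_tail = sum(indicators) / len(indicators)
assert mean_over_a >= p_tail
```

The key point, reflected in the two assertions, is that an inequality holding for every outcome is preserved when taking expectations.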

Proof of Chebyshev’s Inequality

The proof of Chebyshev’s inequality relies on Markov’s inequality.
Note that, since $a > 0$, the event $|X - \mu| \geq a$ is the same as the event $(X-\mu)^2 \geq a^2$. Let us put
\[Y = (X-\mu)^2.\] Then $Y$ is a non-negative random variable.

Applying Markov’s inequality with $Y$ and constant $a^2$ gives
\begin{align*}
P(Y \geq a^2) \leq \frac{E[Y]}{a^2}.
\end{align*}

Now, the definition of the variance of $X$ yields that
\[E[Y]=E[(X-\mu)^2] = V[X] = \sigma^2.\]

Combining these computations gives
\begin{align*}
P(|X-\mu| \geq a) &= P((X-\mu)^2 \geq a^2)\\[6pt] &= P(Y \geq a^2)\\[6pt] &\leq \frac{E[Y]}{a^2}\\[6pt] &= \frac{\sigma^2}{a^2},
\end{align*}
which concludes the proof of Chebyshev’s inequality.
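As with Markov's inequality, Chebyshev's bound can be checked by simulation. In this sketch, the normal distribution and the parameters $\mu$, $\sigma$, $a$ are illustrative assumptions chosen only to exercise the inequality:

```python
import random

# Illustrative check of Chebyshev's inequality:
# P(|X - mu| >= a) <= sigma^2 / a^2,
# using a normal variable with mu = 6, sigma = 2 as an example.
random.seed(2)
mu, sigma = 6.0, 2.0
n = 100_000
samples = [random.gauss(mu, sigma) for _ in range(n)]

a = 3.0
tail = sum(1 for x in samples if abs(x - mu) >= a) / n

# Chebyshev's bound: sigma^2 / a^2 = 4/9 here, while the true
# tail probability for a normal variable is much smaller.
assert tail <= sigma**2 / a**2
```

As with Markov's inequality, the bound is valid for any distribution with finite variance but is typically not tight for any particular one.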

