Expectation, Variance, and Standard Deviation of Bernoulli Random Variables

Problem 747

A random variable $X$ is said to be a Bernoulli random variable if its probability mass function is given by
\begin{align*}
P(X=0) &= 1-p\\
P(X=1) & = p
\end{align*}
for some real number $0 \leq p \leq 1$.

(1) Find the expectation of the Bernoulli random variable $X$ with parameter $p$.

As $X$ is a Bernoulli random variable, it takes only the two values $0$ and $1$.
Thus, by the definition of expectation, we obtain
\begin{align*}
E[X] &= \sum_{i=0}^1 i \cdot P(X=i)\\
&= P(X=0) \cdot 0 + P(X=1) \cdot 1\\
&= (1-p) \cdot 0 + p \cdot 1\\
&= p.
\end{align*}
Hence, the expectation of the Bernoulli random variable $X$ with parameter $p$ is $E[X] = p$.
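The identity $E[X] = p$ can also be checked numerically. Here is a small Python sketch, computing the expectation exactly from the probability mass function and then estimating it by simulation (the parameter value $p = 0.3$ is an arbitrary choice for illustration):

```python
# Sketch: numerically checking E[X] = p for a Bernoulli random variable.
# The value p = 0.3 is an arbitrary choice for illustration.
import random

p = 0.3

# Exact expectation from the definition: a sum over the support {0, 1}.
pmf = {0: 1 - p, 1: p}
expectation = sum(i * pmf[i] for i in pmf)
assert abs(expectation - p) < 1e-12  # E[X] = p

# Monte Carlo check: the sample mean of many draws should be close to p.
random.seed(0)
samples = [1 if random.random() < p else 0 for _ in range(100_000)]
print(sum(samples) / len(samples))  # approximately 0.3
```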

(2) Find the variance of the Bernoulli random variable $X$ with parameter $p$.

Solution of (2)

We calculate the variance of the Bernoulli random variable $X$ using the definition of variance. Namely, the variance of $X$ is defined as
\[V(X) = E[X^2] - \left(E[X]\right)^2.\]
Here is an observation that makes the computation simpler: since the Bernoulli random variable takes only the values $0$ and $1$, we have $X^2 = X$. Thus, the variance can be computed as follows.
\begin{align*}
V(X) &= E[X^2] - \left(E[X]\right)^2 && \text{by definition of variance}\\
&= E[X] - \left(E[X]\right)^2 && \text{by the observation $X^2=X$}\\
&= p - p^2 && \text{by the result of (1)}\\
&= p(1-p).
\end{align*}

Thus, the variance of the Bernoulli random variable $X$ with parameter $p$ is given by
\[V(X) = p(1-p).\]
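The derivation above, including the shortcut $E[X^2] = E[X]$, can be verified numerically. A minimal Python sketch (the grid of parameter values is an arbitrary choice):

```python
# Sketch: checking V(X) = E[X^2] - (E[X])^2 = p(1 - p) numerically.
# The grid of parameter values below is an arbitrary illustration.
for p in [0.0, 0.25, 0.5, 0.9, 1.0]:
    pmf = {0: 1 - p, 1: p}
    e_x = sum(i * pr for i, pr in pmf.items())
    # Since X takes only the values 0 and 1, X^2 = X and E[X^2] = E[X].
    e_x2 = sum(i**2 * pr for i, pr in pmf.items())
    assert abs(e_x2 - e_x) < 1e-12
    variance = e_x2 - e_x**2
    assert abs(variance - p * (1 - p)) < 1e-12
```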

(3) Find the standard deviation of the Bernoulli random variable $X$ with parameter $p$.

Solution of (3)

The standard deviation is obtained by taking the square root of the variance. Hence, using the result of (2), the standard deviation of the Bernoulli random variable $X$ with parameter $p$ is
\[\sigma(X) = \sqrt{p(1-p)}.\]
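Note that $\sigma(X) = \sqrt{p(1-p)}$ is maximized at $p = 1/2$, where it equals $1/2$. A small Python sketch illustrating this (the grid of $p$ values is an arbitrary choice):

```python
import math

# Sketch: sigma(X) = sqrt(p(1 - p)) evaluated on a grid of parameter values.
# The grid is an arbitrary illustration; the maximum occurs at p = 1/2.
def sigma(p):
    return math.sqrt(p * (1 - p))

grid = [i / 100 for i in range(101)]
best = max(grid, key=sigma)
print(best, sigma(best))  # 0.5 0.5
```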

Related Problems

Given the Variance of a Bernoulli Random Variable, Find Its Expectation
Suppose that $X$ is a random variable with Bernoulli distribution $B_p$ with probability parameter $p$.
Assume that the variance $V(X) = 0.21$. We further assume that $p > 0.5$.
(a) Find the probability $p$.
(b) Find the expectation $E(X)$.
Hint.
Recall that if $X$ […]

Can a Student Pass By Randomly Answering Multiple Choice Questions?
A final exam of the course Probability 101 consists of 10 multiple-choice questions. Each question has 4 possible answers and only one of them is a correct answer. To pass the course, 8 or more correct answers are necessary. Assume that a student has not studied probability at all and […]

Expected Value and Variance of Exponential Random Variable
Let $X$ be an exponential random variable with parameter $\lambda$.
(a) For any positive integer $n$, prove that
\[E[X^n] = \frac{n}{\lambda} E[X^{n-1}].\]
(b) Find the expected value of $X$.
(c) Find the variance of $X$.
(d) Find the standard deviation of […]

Coupon Collecting Problem: Find the Expectation of Boxes to Collect All Toys
A box of some snacks includes one of five toys. The chances of getting any of the toys are equally likely and independent of the previous results.
(a) Suppose that you buy the box until you complete all the five toys. Find the expected number of boxes that you need to buy.
(b) […]

Upper Bound of the Variance When a Random Variable is Bounded
Let $c$ be a fixed positive number. Let $X$ be a random variable that takes values only between $0$ and $c$. This implies the probability $P(0 \leq X \leq c) = 1$. Then prove the next inequality about the variance $V(X)$.
\[V(X) \leq \frac{c^2}{4}.\]
Proof.
Recall that […]

How to Prove Markov’s Inequality and Chebyshev’s Inequality
(a) Let $X$ be a random variable that takes only non-negative values. Prove that for any $a > 0$,
\[P(X \geq a) \leq \frac{E[X]}{a}.\]
This inequality is called Markov's inequality.
(b) Let $X$ be a random variable with finite mean $\mu$ and variance $\sigma^2$. Prove that […]

Linearity of Expectations E(X+Y) = E(X) + E(Y)
Let $X, Y$ be discrete random variables. Prove the linearity of expectations described as
\[E(X+Y) = E(X) + E(Y).\]
Solution.
The joint probability mass function of the discrete random variables $X$ and $Y$ is defined by
\[p(x, y) = P(X=x, Y=y).\]
Note that the […]

Condition that a Function Be a Probability Density Function
Let $c$ be a positive real number. Suppose that $X$ is a continuous random variable whose probability density function is given by
\begin{align*}
f(x) = \begin{cases}
\frac{1}{x^3} & \text{ if } x \geq c\\
0 & \text{ if } x < […]
