Interchangeability of Limits and Probability of Increasing or Decreasing Sequence of Events


Problem 744

A sequence of events $\{E_n\}_{n \geq 1}$ is said to be increasing if it satisfies the ascending condition
\[E_1 \subset E_2 \subset \cdots \subset E_n \subset \cdots.\] Also, a sequence $\{E_n\}_{n \geq 1}$ is called decreasing if it satisfies the descending condition
\[E_1 \supset E_2 \supset \cdots \supset E_n \supset \cdots.\]

When $\{E_n\}_{n \geq 1}$ is an increasing sequence, we define a new event, denoted by $\lim_{n \to \infty} E_n$, by
\[\lim_{n \to \infty} E_n := \bigcup_{n=1}^{\infty} E_n.\]

Similarly, when $\{E_n\}_{n \geq 1}$ is a decreasing sequence, we define a new event, denoted by $\lim_{n \to \infty} E_n$, by
\[\lim_{n \to \infty} E_n := \bigcap_{n=1}^{\infty} E_n.\]
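
As a quick illustration of these definitions (an example added here for concreteness, not part of the original problem), take the sample space $[0, 1]$ with the uniform probability measure and let
\[E_n = \left[0,\, 1 - \frac{1}{n}\right].\]
Then $E_1 \subset E_2 \subset \cdots$ is increasing,
\[\lim_{n \to \infty} E_n = \bigcup_{n=1}^{\infty} E_n = [0, 1),\]
and $P(E_n) = 1 - \frac{1}{n} \to 1 = P\big([0, 1)\big)$, consistent with the equality to be proved in part (1).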

(1) Suppose that $\{E_n\}_{n \geq 1}$ is an increasing sequence of events. Then prove the equality of probabilities
\[\lim_{n \to \infty} P(E_n) = P\left(\lim_{n \to \infty} E_n \right).\] In other words, the limit and the probability can be interchanged.

(2) Suppose that $\{E_n\}_{n \geq 1}$ is a decreasing sequence of events. Then prove the equality of probabilities
\[\lim_{n \to \infty} P(E_n) = P\left(\lim_{n \to \infty} E_n \right). \]


Proof.

Proof of (1)

Let $\{E_n\}_{n \geq 1}$ be an increasing sequence of events. Then we define new events $F_n$ as follows.
\begin{align*}
F_1 &= E_1\\
F_n & = E_n \setminus E_{n-1} && \text{ for any } n \geq 2
\end{align*}
The event $F_n$ is depicted as a yellow region in the figure below.

[Figure: the event $F_n = E_n \setminus E_{n-1}$ shown as the yellow region between $E_{n-1}$ and $E_n$.]

From the figure, we can see that the events $\{F_n\}_{n \geq 1}$ are mutually exclusive, that is, $F_iF_j = \emptyset$ for any $i \neq j$.
Furthermore, note that
\[\bigcup_{i=1}^n F_i = \bigcup_{i=1}^n E_i \tag{*}\] for any $n$, and
\[\bigcup_{i=1}^{\infty} F_i = \bigcup_{i=1}^{\infty} E_i. \tag{**}\]
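
Both of these facts can also be verified directly, without appealing to the figure (a supplementary check, not part of the original argument). For the disjointness, if $i < j$, then
\[F_i \subset E_i \subset E_{j-1}, \qquad F_j = E_j \setminus E_{j-1} \subset E_{j-1}^c,\]
so $F_i F_j = \emptyset$. For $(*)$, the inclusion $\bigcup_{i=1}^n F_i \subset \bigcup_{i=1}^n E_i$ is clear since $F_i \subset E_i$; conversely, if $\omega \in \bigcup_{i=1}^n E_i$ and $m \leq n$ is the smallest index with $\omega \in E_m$, then $\omega \in F_m$. The identity $(**)$ follows by the same argument with $n$ replaced by $\infty$.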

Now we proceed to the main part and prove the equality
\[\lim_{n \to \infty} P(E_n) = P \left(\lim_{n \to \infty} E_n \right).\] We start with the right-hand side and show that it is equal to the left-hand side.
\begin{align*}
P\left(\lim_{n \to \infty} E_n \right) &= P \left(\bigcup_{n=1}^{\infty} E_n \right) && \text{ definition of } \lim_{n \to \infty} E_n\\[6pt]
&= P \left(\bigcup_{i=1}^{\infty} F_i \right) && \text{ by } (**) \\[6pt]
&= \sum_{i=1}^{\infty} P(F_i) && \text{ by the countable additivity axiom of probability} \\
& && \text{ with the mutually exclusive events } \{F_n\}\\[6pt]
&= \lim_{n \to \infty} \sum_{i=1}^n P(F_i) && \text{ definition of an infinite series}\\[6pt]
&= \lim_{n \to \infty} P\left( \bigcup_{i=1}^n F_i \right) && \text{ by finite additivity with the mutually exclusive events } \{F_n\}\\[6pt]
&= \lim_{n \to \infty} P\left(\bigcup_{i=1}^n E_i \right) && \text{ by } (*)\\[6pt]
&= \lim_{n \to \infty} P(E_n) && \text{ as } \{E_n\} \text{ is increasing, so } \bigcup_{i=1}^n E_i = E_n.
\end{align*}
This proves the desired equality.
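
The same limit can also be seen from a telescoping sum (a supplementary observation, not part of the original proof). Since $E_{i-1} \subset E_i$, we have $P(F_i) = P(E_i \setminus E_{i-1}) = P(E_i) - P(E_{i-1})$ for $i \geq 2$, and hence
\[\sum_{i=1}^n P(F_i) = P(E_1) + \sum_{i=2}^n \left( P(E_i) - P(E_{i-1}) \right) = P(E_n).\]
Thus the partial sums of $\sum_{i=1}^{\infty} P(F_i)$ are exactly the probabilities $P(E_n)$, and letting $n \to \infty$ recovers the equality above.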

Proof of (2)

Now, we consider a decreasing sequence of events $\{E_n\}_{n\geq 1}$.
As $\{E_n\}_{n\geq 1}$ is decreasing, the sequence of complements $\{E_n^c\}_{n \geq 1}$ is increasing. Thus, by the result of part (1), we see that
\[\lim_{n \to \infty} P(E_n^c) = P \left(\lim_{n \to \infty} E_n^c \right). \tag{***}\] By the definition of the limit of the increasing sequence $\{E_n^c\}$ and De Morgan's law, we have
\[\lim_{n \to \infty} E_n^c = \bigcup_{n=1}^{\infty} E_n^c = \left(\bigcap_{n=1}^{\infty} E_n \right)^c.\]

It follows that the right-hand-side of the equality (***) becomes
\begin{align*}
P \left(\lim_{n \to \infty} E_n^c \right) &= P\left(\left(\bigcap_{n=1}^{\infty} E_n \right)^c \right)\\[6pt] &= 1 - P\left(\bigcap_{n=1}^{\infty} E_n \right)\\[6pt] &= 1 - P\left(\lim_{n \to \infty} E_n \right).
\end{align*}
On the other hand, the left-hand-side of the equality (***) becomes
\begin{align*}
\lim_{n \to \infty} P(E_n^c) &= \lim_{n \to \infty} \left( 1 - P(E_n) \right)\\[6pt] &= 1 - \lim_{n \to \infty} P(E_n).
\end{align*}

Combining these results yields
\[1 - \lim_{n \to \infty} P(E_n) = 1 - P\left(\lim_{n \to \infty} E_n \right).\] Equivalently, we have
\[\lim_{n \to \infty} P(E_n) = P\left(\lim_{n \to \infty} E_n \right),\] which is the desired equality. This completes the proof.
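
For a concrete instance of part (2) (an illustration added here, not part of the original problem), take again the uniform probability on $[0, 1]$ and let $E_n = \left[0, \frac{1}{n}\right]$. Then $E_1 \supset E_2 \supset \cdots$ is decreasing,
\[\lim_{n \to \infty} E_n = \bigcap_{n=1}^{\infty} E_n = \{0\},\]
and $P(E_n) = \frac{1}{n} \to 0 = P(\{0\})$, as the result asserts.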

