Sequence Converges to the Largest Eigenvalue of a Matrix

Problems and Solutions on Eigenvalues and Eigenvectors in Linear Algebra

Problem 403

Let $A$ be an $n\times n$ matrix. Suppose that $A$ has real eigenvalues $\lambda_1, \lambda_2, \dots, \lambda_n$ with corresponding eigenvectors $\mathbf{u}_1, \mathbf{u}_2, \dots, \mathbf{u}_n$.
Furthermore, suppose that
\[|\lambda_1| > |\lambda_2| \geq \cdots \geq |\lambda_n|.\] Let
\[\mathbf{x}_0=c_1\mathbf{u}_1+c_2\mathbf{u}_2+\cdots+c_n\mathbf{u}_n\] for some real numbers $c_1, c_2, \dots, c_n$ with $c_1\neq 0$.

Define
\[\mathbf{x}_{k+1}=A\mathbf{x}_k \text{ for } k=0, 1, 2,\dots\] and let
\[\beta_k=\frac{\mathbf{x}_k\cdot \mathbf{x}_{k+1}}{\mathbf{x}_k \cdot \mathbf{x}_k}=\frac{\mathbf{x}_k^{\trans} \mathbf{x}_{k+1}}{\mathbf{x}_k^{\trans} \mathbf{x}_k}.\]

Prove that
\[\lim_{k\to \infty} \beta_k=\lambda_1.\]


Proof.

We have
\begin{align*}
\mathbf{x}_k=A\mathbf{x}_{k-1}=A^2\mathbf{x}_{k-2}=\cdots=A^k\mathbf{x}_0
\end{align*}
by applying the definition of $\mathbf{x}_k$ successively.
Then we have
\begin{align*}
\mathbf{x}_k&=A^k\mathbf{x}_0\\
&=A^k(c_1\mathbf{u}_1+c_2\mathbf{u}_2+\cdots+c_n\mathbf{u}_n)\\
&=c_1A^k\mathbf{u}_1+c_2A^k\mathbf{u}_2+\cdots+c_nA^k\mathbf{u}_n\\
&=c_1\lambda_1^k\mathbf{u}_1+c_2\lambda_2^k\mathbf{u}_2+\cdots+c_n\lambda_n^k\mathbf{u}_n,
\end{align*}
where the last step follows from the equalities $A^k\mathbf{u}_i=\lambda_i^k \mathbf{u}_i$, which in turn follow from applying the defining relation $A\mathbf{u}_i=\lambda_i\mathbf{u}_i$ of eigenvalues and eigenvectors $k$ times.
In summation notation, we write this as
\[\mathbf{x}_k=\sum_{i=1}^nc_i\lambda_i^k\mathbf{u}_i.\] This formula is true for any nonnegative integer $k$.
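As a quick numerical sanity check (outside the proof), the sketch below verifies the formula $\mathbf{x}_k=\sum_{i=1}^n c_i\lambda_i^k\mathbf{u}_i$ for an assumed example: the symmetric matrix $A=\begin{bmatrix} 2 & 1\\ 1 & 2\end{bmatrix}$, which has eigenpairs $\lambda_1=3$, $\mathbf{u}_1=(1,1)$ and $\lambda_2=1$, $\mathbf{u}_2=(1,-1)$. The matrix, the coefficients $c_i$, and the helper function are illustrative choices, not part of the problem.

```python
# Sanity check of x_k = A^k x_0 = sum_i c_i * lambda_i^k * u_i
# for the assumed example A = [[2, 1], [1, 2]] with eigenpairs
# (lambda_1, u_1) = (3, (1, 1)) and (lambda_2, u_2) = (1, (1, -1)).

def mat_vec(A, v):
    """Multiply a 2x2 matrix A by a 2-vector v."""
    return [A[0][0]*v[0] + A[0][1]*v[1],
            A[1][0]*v[0] + A[1][1]*v[1]]

A = [[2.0, 1.0], [1.0, 2.0]]
lam = [3.0, 1.0]                      # eigenvalues, |lambda_1| > |lambda_2|
u = [[1.0, 1.0], [1.0, -1.0]]         # corresponding eigenvectors
c = [1.0, 0.5]                        # coefficients, c_1 != 0 as required

# x_0 = c_1 u_1 + c_2 u_2
x = [c[0]*u[0][0] + c[1]*u[1][0],
     c[0]*u[0][1] + c[1]*u[1][1]]

for k in range(1, 6):
    x = mat_vec(A, x)                 # x_k = A x_{k-1}
    # closed form: c_1 lambda_1^k u_1 + c_2 lambda_2^k u_2
    closed = [c[0]*lam[0]**k*u[0][i] + c[1]*lam[1]**k*u[1][i]
              for i in range(2)]
    assert all(abs(x[i] - closed[i]) < 1e-9 for i in range(2))
```

At $k=5$ the iterate is $\mathbf{x}_5 = 243\,\mathbf{u}_1 + 0.5\,\mathbf{u}_2 = (243.5,\ 242.5)$, matching the closed form.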


Let us next compute the dot product $\mathbf{x}_k\cdot \mathbf{x}_k$. We have
\begin{align*}
\mathbf{x}_k\cdot \mathbf{x}_k&=\left(\, \sum_{i=1}^nc_i\lambda_i^k\mathbf{u}_i \,\right)\cdot \left(\, \sum_{j=1}^nc_j\lambda_j^k\mathbf{u}_j \,\right)\\
&=\sum_{i=1}^n\sum_{j=1}^nc_ic_j\lambda_i^k\lambda_j^k\mathbf{u}_i \cdot \mathbf{u}_j\\
&=\lambda_1^{2k}\sum_{i=1}^n\sum_{j=1}^nc_ic_j \left(\, \frac{\lambda_i}{\lambda_1} \,\right)^k\left(\, \frac{\lambda_j}{\lambda_1} \,\right)^k\mathbf{u}_i \cdot \mathbf{u}_j.
\end{align*}
Since $|\lambda_1| > |\lambda_i|$ for $i=2, \dots, n$, we have $|\lambda_i/\lambda_1| < 1$ for those $i$, and hence
\begin{align*}
\lim_{k \to \infty} \left(\, \frac{\lambda_i}{\lambda_1} \,\right)^k
=\begin{cases} 1 & \text{ if } i=1\\
0 & \text{ if } i=2, 3,\dots, n.
\end{cases}
\end{align*}
It follows that the double sum converges to $c_1^2\, \mathbf{u}_1\cdot \mathbf{u}_1$ as $k\to \infty$, and therefore
\begin{align*}
\lim_{k \to \infty}\frac{\mathbf{x}_k\cdot \mathbf{x}_k}{\lambda_1^{2k}}=c_1^2\, \mathbf{u}_1\cdot \mathbf{u}_1.
\end{align*}
(Note that $\mathbf{x}_k\cdot \mathbf{x}_k$ itself need not converge; the growing factor $\lambda_1^{2k}$ must be divided out before taking the limit.)


Similarly, we compute
\begin{align*}
\mathbf{x}_k\cdot \mathbf{x}_{k+1}&=\left(\, \sum_{i=1}^nc_i\lambda_i^k\mathbf{u}_i \,\right)\cdot \left(\, \sum_{j=1}^nc_j\lambda_j^{k+1}\mathbf{u}_j \,\right)\\
&=\sum_{i=1}^n\sum_{j=1}^nc_ic_j\lambda_i^k\lambda_j^{k+1}\mathbf{u}_i \cdot \mathbf{u}_j\\
&=\lambda_1^{2k+1}\sum_{i=1}^n\sum_{j=1}^nc_ic_j \left(\, \frac{\lambda_i}{\lambda_1} \,\right)^k\left(\, \frac{\lambda_j}{\lambda_1} \,\right)^{k+1}\mathbf{u}_i \cdot \mathbf{u}_j.
\end{align*}

Thus, by the same argument, we obtain
\begin{align*}
\lim_{k\to \infty} \frac{\mathbf{x}_k\cdot \mathbf{x}_{k+1}}{\lambda_1^{2k+1}}=c_1^2\,\mathbf{u}_1\cdot \mathbf{u}_1.
\end{align*}


Combining these computations and noting that
\[\beta_k=\frac{\mathbf{x}_k\cdot \mathbf{x}_{k+1}}{\mathbf{x}_k \cdot \mathbf{x}_k}=\lambda_1\cdot \frac{\mathbf{x}_k\cdot \mathbf{x}_{k+1}/\lambda_1^{2k+1}}{\mathbf{x}_k \cdot \mathbf{x}_k/\lambda_1^{2k}},\] we have
\begin{align*}
\lim_{k\to \infty}\beta_k&=\lambda_1\cdot\frac{\lim_{k\to \infty}\mathbf{x}_k\cdot \mathbf{x}_{k+1}/\lambda_1^{2k+1}}{\lim_{k\to \infty}\mathbf{x}_k \cdot \mathbf{x}_k/\lambda_1^{2k}}\\[6pt] &=\lambda_1\cdot\frac{c_1^2\, \mathbf{u}_1\cdot \mathbf{u}_1}{c_1^2\,\mathbf{u}_1\cdot \mathbf{u}_1}\\[6pt] &=\lambda_1
\end{align*}
as required. The quotient rule for limits applies here because the denominator limit $c_1^2\,\mathbf{u}_1\cdot \mathbf{u}_1$ is nonzero: $c_1\neq 0$ by assumption, and $\mathbf{u}_1\neq \mathbf{0}$ since it is an eigenvector. This completes the proof.
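The sequence $\beta_k$ is exactly the Rayleigh-quotient estimate produced by the power method, and its convergence can be observed numerically. The sketch below is a minimal illustration under assumed choices (the example matrix $A=\begin{bmatrix} 2 & 1\\ 1 & 2\end{bmatrix}$ with dominant eigenvalue $\lambda_1=3$, and the starting vector), not a statement about any particular library.

```python
# Power iteration demo: x_{k+1} = A x_k, beta_k = (x_k . x_{k+1}) / (x_k . x_k).
# Assumed example matrix with dominant eigenvalue lambda_1 = 3.

def mat_vec(A, v):
    """Multiply a square matrix A by a vector v."""
    return [sum(A[i][j]*v[j] for j in range(len(v))) for i in range(len(A))]

def dot(v, w):
    """Dot product of two vectors."""
    return sum(vi*wi for vi, wi in zip(v, w))

A = [[2.0, 1.0], [1.0, 2.0]]
x = [1.0, 0.0]        # x_0 = 0.5*u_1 + 0.5*u_2, so c_1 != 0 holds

beta = None
for _ in range(50):
    x_next = mat_vec(A, x)
    beta = dot(x, x_next) / dot(x, x)
    # Rescale to avoid overflow; beta_k only depends on the direction of x_k,
    # so rescaling does not change the computed values.
    m = max(abs(v) for v in x_next)
    x = [v / m for v in x_next]

print(beta)           # approaches lambda_1 = 3
```

The rescaling step is safe because $\beta_k$ is invariant under $\mathbf{x}_k \mapsto s\mathbf{x}_k$ for any $s\neq 0$; it only prevents the entries of $\mathbf{x}_k$ from growing like $\lambda_1^k$.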

