Prove that the Length $\|A^n\mathbf{v}\|$ Can Be Made As Small As We Like


Problem 381

Consider the matrix
\[A=\begin{bmatrix}
3/2 & 2\\
-1& -3/2
\end{bmatrix} \in M_{2\times 2}(\R).\]

(a) Find the eigenvalues and corresponding eigenvectors of $A$.

(b) Show that for $\mathbf{v}=\begin{bmatrix}
1 \\
0
\end{bmatrix}\in \R^2$, we can choose $n$ large enough so that the length $\|A^n\mathbf{v}\|$ is as small as we like.

(University of California, Berkeley, Linear Algebra Final Exam Problem)
 

Proof.

(a) Find the eigenvalues and corresponding eigenvectors of $A$.

To find the eigenvalues of $A$, we compute the characteristic polynomial $p(t)$ of $A$.
We have
\begin{align*}
p(t)&=\det(A-tI)\\
&=\begin{vmatrix}
3/2-t & 2\\
-1& -3/2-t
\end{vmatrix}\\
&=(3/2-t)(-3/2-t)-(2)(-1)\\
&=t^2-1/4.
\end{align*}
Since the eigenvalues are the roots of the characteristic polynomial, the eigenvalues of $A$ are $\pm 1/2$.
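
As a quick numerical sanity check (an illustration assuming NumPy is available, not part of the original solution), we can confirm that the eigenvalues of $A$ are $\pm 1/2$:

import numpy as np

# The matrix A, with 3/2 = 1.5 and -3/2 = -1.5
A = np.array([[1.5, 2.0],
              [-1.0, -1.5]])

# np.linalg.eigvals returns the roots of the characteristic polynomial t^2 - 1/4.
print(np.sort(np.linalg.eigvals(A)))  # expected output: [-0.5  0.5]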

Next we find the eigenvectors corresponding to eigenvalue $1/2$.
These are the solutions of $(A-\frac{1}{2}I)\mathbf{x}=\mathbf{0}$.
We have
\begin{align*}
A-\frac{1}{2}I=\begin{bmatrix}
1 & 2\\
-1& -2
\end{bmatrix}
\xrightarrow{R_2+R_1}
\begin{bmatrix}
1 & 2\\
0& 0
\end{bmatrix}.
\end{align*}
Thus, the solution $\mathbf{x}$ satisfies $x_1=-2x_2$, and the eigenvectors are
\[\mathbf{x}=x_2\begin{bmatrix}
-2 \\
1
\end{bmatrix},\] where $x_2$ is a nonzero scalar.

Similarly, we find the eigenvectors corresponding to the eigenvalue $-1/2$.
We solve $(A+\frac{1}{2}I)\mathbf{x}=\mathbf{0}$.
We have
\begin{align*}
A+\frac{1}{2}I=\begin{bmatrix}
2 & 2\\
-1& -1
\end{bmatrix}
\xrightarrow[\text{then } R_2+R_1]{\frac{1}{2}R_1}
\begin{bmatrix}
1 & 1\\
0& 0
\end{bmatrix}.
\end{align*}
Thus, we have $x_1=-x_2$, and the eigenvectors are
\[\mathbf{x}=x_2\begin{bmatrix}
-1 \\
1
\end{bmatrix},\] where $x_2$ is a nonzero scalar.
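
Before moving on, here is a brief check (again a NumPy sketch, not part of the exam solution) that the two vectors found above are indeed eigenvectors of $A$:

import numpy as np

A = np.array([[1.5, 2.0],
              [-1.0, -1.5]])

# eigenvalue 1/2 with eigenvector (-2, 1); eigenvalue -1/2 with eigenvector (-1, 1)
for lam, x in [(0.5, np.array([-2.0, 1.0])), (-0.5, np.array([-1.0, 1.0]))]:
    assert np.allclose(A @ x, lam * x)  # A x = lambda x
print("both eigenpairs verified")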

(b) We can choose $n$ large enough so that the length $\|A^n\mathbf{v}\|$ is as small as we like.

We express the vector $\mathbf{v}=\begin{bmatrix}
1 \\
0
\end{bmatrix}$ as a linear combination of eigenvectors $\begin{bmatrix}
-2 \\
1
\end{bmatrix}$ and $\begin{bmatrix}
-1 \\
1
\end{bmatrix}$ corresponding to eigenvalues $1/2$ and $-1/2$, respectively.
Let
\[\begin{bmatrix}
1 \\
0
\end{bmatrix}=c_1\begin{bmatrix}
-2 \\
1
\end{bmatrix}+c_2\begin{bmatrix}
-1 \\
1
\end{bmatrix}\] for some scalars $c_1, c_2$.
Comparing the entries on both sides gives the system $-2c_1-c_2=1$ and $c_1+c_2=0$. Solving it, we find that $c_1=-1$ and $c_2=1$, and thus we have
\[\begin{bmatrix}
1 \\
0
\end{bmatrix}=-\begin{bmatrix}
-2 \\
1
\end{bmatrix}+\begin{bmatrix}
-1 \\
1
\end{bmatrix}.\] Then for any positive integer $n$, we have
\begin{align*}
A^n\begin{bmatrix}
1 \\
0
\end{bmatrix}&=-A^n\begin{bmatrix}
-2 \\
1
\end{bmatrix}+A^n\begin{bmatrix}
-1 \\
1
\end{bmatrix}\\
&=-\left(\, \frac{1}{2} \,\right)^n\begin{bmatrix}
-2 \\
1
\end{bmatrix}+\left(\, -\frac{1}{2} \,\right)^n\begin{bmatrix}
-1 \\
1
\end{bmatrix}\\
&=\left(\, \frac{1}{2} \,\right)^n\begin{bmatrix}
2-(-1)^n \\
-1+(-1)^n
\end{bmatrix}.
\end{align*}
Note that in the second equality we used the following fact: If $A\mathbf{x}=\lambda \mathbf{x}$, then $A^n\mathbf{x}=\lambda^n \mathbf{x}$.
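
The closed form can also be checked numerically; the following sketch (assuming NumPy, not part of the original proof) compares $A^n\mathbf{v}$ with the formula above for several values of $n$:

import numpy as np

A = np.array([[1.5, 2.0],
              [-1.0, -1.5]])
v = np.array([1.0, 0.0])

for n in range(1, 8):
    closed_form = 0.5**n * np.array([2 - (-1)**n, -1 + (-1)**n])
    # A^n v computed directly should match the formula derived above.
    assert np.allclose(np.linalg.matrix_power(A, n) @ v, closed_form)
print("closed form matches A^n v for n = 1, ..., 7")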

Then the length is
\begin{align*}
\left \| A^n\begin{bmatrix}
1 \\
0
\end{bmatrix}\right \|&=\left(\, \frac{1}{2} \,\right)^n \sqrt{\left(\, 2-(-1)^n \,\right)^2+\left(\, -1+(-1)^n \,\right)^2}\\
& \leq \left(\, \frac{1}{2} \,\right)^n \sqrt{3^2+2^2}\\
&= \sqrt{13}\left(\, \frac{1}{2} \,\right)^n \to 0 \text{ as $n$ tends to infinity}.
\end{align*}
Therefore, given any $\epsilon>0$, choosing $n$ large enough that $\sqrt{13}\left(\, \frac{1}{2} \,\right)^n<\epsilon$ (for instance, any $n>\log_2(\sqrt{13}/\epsilon)$) gives $\|A^n\mathbf{v}\|<\epsilon$. That is, we can choose $n$ large enough so that the length $\|A^n\mathbf{v}\|$ is as small as we like.
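
To illustrate the conclusion concretely, the following sketch (a NumPy illustration under the same assumptions as above) finds, for a given tolerance $\epsilon$, an $n$ with $\|A^n\mathbf{v}\|<\epsilon$:

import numpy as np

A = np.array([[1.5, 2.0],
              [-1.0, -1.5]])
v = np.array([1.0, 0.0])

eps = 1e-6  # any tolerance we like
n = 1
while np.linalg.norm(np.linalg.matrix_power(A, n) @ v) >= eps:
    n += 1
# The bound sqrt(13) * (1/2)^n guarantees termination once n > log2(sqrt(13)/eps).
print(n, np.linalg.norm(np.linalg.matrix_power(A, n) @ v))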

