A Matrix Equation of a Symmetric Matrix and the Limit of its Solution

Linear Algebra exam problems and solutions at University of California, Berkeley

Problem 457

Let $A$ be a real symmetric $n\times n$ matrix with $0$ as a simple eigenvalue (that is, the algebraic multiplicity of the eigenvalue $0$ is $1$), and let us fix a vector $\mathbf{v}\in \R^n$.

(a) Prove that for sufficiently small positive real $\epsilon$, the equation
\[A\mathbf{x}+\epsilon\mathbf{x}=\mathbf{v}\] has a unique solution $\mathbf{x}=\mathbf{x}(\epsilon) \in \R^n$.

(b) Evaluate
\[\lim_{\epsilon \to 0^+} \epsilon \mathbf{x}(\epsilon)\] in terms of $\mathbf{v}$, the eigenvectors of $A$, and the inner product $\langle\, ,\,\rangle$ on $\R^n$.

 
(University of California, Berkeley, Linear Algebra Qualifying Exam)


Proof.

(a) Prove that $A\mathbf{x}+\epsilon\mathbf{x}=\mathbf{v}$ has a unique solution $\mathbf{x}=\mathbf{x}(\epsilon) \in \R^n$.

Recall that the eigenvalues of a real symmetric matrix are all real and that such a matrix is diagonalizable by an orthogonal matrix.

Note that the equation $A\mathbf{x}+\epsilon\mathbf{x}=\mathbf{v}$ can be written as
\[(A+\epsilon I)\mathbf{x}=\mathbf{v}, \tag{*}\] where $I$ is the $n\times n$ identity matrix. Thus to show that the equation (*) has a unique solution, it suffices to show that the matrix $A+\epsilon I$ is invertible.

Since $A$ is diagonalizable, there exists an invertible matrix $S$ such that
\[S^{-1}AS=\begin{bmatrix}
\lambda_1 & 0 & \cdots & 0 \\
0 & \lambda_2 & \cdots & 0\\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \cdots & \lambda_n
\end{bmatrix},\] where $\lambda_i$ are eigenvalues of $A$.
Since the algebraic multiplicity of the eigenvalue $0$ is $1$, without loss of generality we may assume that $\lambda_1=0$ and $\lambda_i\neq 0$ for $i > 1$.

Then we have
\begin{align*}
S^{-1}(A+\epsilon I)S&=S^{-1}AS+\epsilon I=\begin{bmatrix}
\epsilon & 0 & \cdots & 0 \\
0 & \epsilon+\lambda_2 & \cdots & 0\\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \cdots & \epsilon+\lambda_n
\end{bmatrix}.
\end{align*}


If $\epsilon > 0$ is smaller than $|\lambda_i|$ for every $i > 1$, then none of the diagonal entries $\epsilon+\lambda_i$ is zero.

Hence we have
\begin{align*}
\det(A+\epsilon I)&=\det(S)^{-1}\det(A+\epsilon I)\det(S)\\
&=\det\left(\, S^{-1}(A+\epsilon I) S \,\right)\\
&=\epsilon(\epsilon+\lambda_2)\cdots (\epsilon+\lambda_n)\neq 0.
\end{align*}
Since $\det(A+\epsilon I)\neq 0$, the matrix $A+\epsilon I$ is invertible, hence the equation (*) has the unique solution
\[\mathbf{x}(\epsilon)=(A+\epsilon I)^{-1}\mathbf{v}.\]
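As a quick numerical sanity check of part (a), here is a minimal sketch using NumPy. The specific matrix $A$ (the path-graph Laplacian, whose eigenvalues are $0, 1, 3$ with $0$ simple) and the vector $\mathbf{v}$ are illustrative choices, not part of the problem.

```python
import numpy as np

# Illustrative real symmetric matrix with 0 as a simple eigenvalue:
# the path-graph Laplacian, with eigenvalues 0, 1, and 3.
A = np.array([[ 1.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  1.0]])
v = np.array([1.0, 2.0, 3.0])

eps = 1e-3  # any 0 < eps < min |lambda_i| over the nonzero eigenvalues works
M = A + eps * np.eye(3)
print(np.linalg.det(M))            # nonzero, so M is invertible
x_eps = np.linalg.solve(M, v)      # the unique solution x(eps)
print(np.allclose(M @ x_eps, v))   # True
```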

Remark

This result is in fact true for any square matrix.
Instead of diagonalization, one can use the triangularization of the matrix.

(b) Evaluate $\lim_{\epsilon \to 0^+} \epsilon \mathbf{x}(\epsilon)$

As noted earlier, a real symmetric matrix is diagonalizable by an orthogonal matrix.
This means that for each $i$ there is an eigenvector $\mathbf{v}_i$ corresponding to the eigenvalue $\lambda_i$ such that the eigenvectors $\mathbf{v}_i$ form an orthonormal basis of $\R^n$.
That is,
\begin{align*}
A\mathbf{v}_i=\lambda_i \mathbf{v}_i \\
\langle \mathbf{v}_i,\mathbf{v}_j \rangle=\delta_{i,j},
\end{align*}
where $\delta_{i,j}$ is the Kronecker delta: $\delta_{i,i}=1$ and $\delta_{i,j}=0$ if $i\neq j$.
From this, and since $\lambda_i+\epsilon\neq 0$ for sufficiently small $\epsilon>0$, we deduce that
\begin{align*}
(A+\epsilon I)\mathbf{v}_i=(\lambda_i+\epsilon)\mathbf{v}_i\\
(A+\epsilon I)^{-1}\mathbf{v}_i=\frac{1}{\lambda_i+\epsilon}\mathbf{v}_i. \tag{**}
\end{align*}
Using the basis $\{\mathbf{v}_i\}$, we write
\[\mathbf{v}=\sum_{i=1}^nc_i \mathbf{v}_i\] for some $c_i\in \R$.
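Continuing the numerical sketch from part (a) (same illustrative $A$ and $\mathbf{v}$), the coefficients $c_i$ are just the inner products $\langle \mathbf{v}, \mathbf{v}_i\rangle$; `numpy.linalg.eigh` is tailored to symmetric matrices and returns an orthonormal set of eigenvectors.

```python
import numpy as np

# Same illustrative A and v as in the sketch for part (a).
A = np.array([[ 1.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  1.0]])
v = np.array([1.0, 2.0, 3.0])

# The columns of V form an orthonormal eigenbasis of A, and the
# eigenvalues in lam come back in ascending order.
lam, V = np.linalg.eigh(A)
c = V.T @ v                    # c_i = <v, v_i> by orthonormality
print(np.allclose(V @ c, v))   # True: v = sum_i c_i v_i
```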


Then we compute
\begin{align*}
A\mathbf{x}(\epsilon)&=A(A+\epsilon I)^{-1}\mathbf{v} && \text{by part (a)}\\
&=A(A+\epsilon I)^{-1}\left(\, \sum_{i=1}^nc_i \mathbf{v}_i \,\right)\\
&=\sum_{i=1}^n c_iA(A+\epsilon I)^{-1}\mathbf{v}_i\\
&=\sum _{i=1}^n c_iA\left(\, \frac{1}{\lambda_i+\epsilon}\mathbf{v}_i \,\right) && \text{by (**)}\\
&=\sum_{i=1}^n c_i\frac{\lambda_i}{\lambda_i+\epsilon}\mathbf{v}_i && \text{since $A\mathbf{v}_i=\lambda_i\mathbf{v}_i$}\\
&=\sum_{i=2}^n c_i\frac{\lambda_i}{\lambda_i+\epsilon}\mathbf{v}_i && \text{since $\lambda_1=0$}.
\end{align*}
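Numerically (same illustrative sketch as before), this identity can be checked by comparing $A\mathbf{x}(\epsilon)$ with the sum over the nonzero eigenvalues:

```python
import numpy as np

# Same illustrative A and v; eigenvalues are 0, 1, 3 (ascending in lam).
A = np.array([[ 1.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  1.0]])
v = np.array([1.0, 2.0, 3.0])

lam, V = np.linalg.eigh(A)
c = V.T @ v
eps = 1e-3

x_eps = np.linalg.solve(A + eps * np.eye(3), v)
# Sum over i >= 2 in the text's indexing (the lam[0] ~ 0 term drops out):
rhs = sum(c[i] * lam[i] / (lam[i] + eps) * V[:, i] for i in range(1, 3))
print(np.allclose(A @ x_eps, rhs))  # True
```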


Since $(A+\epsilon I)\mathbf{x}(\epsilon)=\mathbf{v}$ by (*), we have $\epsilon\mathbf{x}(\epsilon)=\mathbf{v}-A\mathbf{x}(\epsilon)$, and therefore
\begin{align*}
\lim_{\epsilon \to 0^+} \epsilon \mathbf{x}(\epsilon)&=\lim_{\epsilon \to 0^+}\left(\, \mathbf{v}-A\mathbf{x}(\epsilon) \,\right)\\
&=\mathbf{v}-\lim_{\epsilon \to 0^+}\left(\, A\mathbf{x}(\epsilon) \,\right)\\
&= \sum_{i=1}^nc_i\mathbf{v}_i-\lim_{\epsilon \to 0^+}\left(\, \sum_{i=2}^n c_i\frac{\lambda_i}{\lambda_i+\epsilon}\mathbf{v}_i \,\right)\\
&=\sum_{i=1}^n c_i \mathbf{v}_i-\sum_{i=2}^n c_i \mathbf{v}_i\\
&=c_1\mathbf{v}_1.
\end{align*}
Using the orthonormality of the basis $\{\mathbf{v}_i\}$, we have
\[\langle\mathbf{v}, \mathbf{v}_1 \rangle=\sum_{i=1}^n \langle c_i\mathbf{v}_i, \mathbf{v}_1 \rangle=c_1.\]

Hence the required expression is
\[\lim_{\epsilon \to 0^+} \epsilon \mathbf{x}(\epsilon)=\langle\mathbf{v}, \mathbf{v}_1 \rangle\mathbf{v}_1,\] where $\mathbf{v}_1$ is a unit eigenvector corresponding to the eigenvalue $0$. (The answer does not depend on the choice of $\mathbf{v}_1$: replacing $\mathbf{v}_1$ by $-\mathbf{v}_1$ leaves $\langle\mathbf{v}, \mathbf{v}_1 \rangle\mathbf{v}_1$ unchanged.)
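Finally, the limit itself can be checked on the same illustrative example, where $\mathbf{v}_1=\frac{1}{\sqrt{3}}(1,1,1)$: as $\epsilon \to 0^+$, the vector $\epsilon\,\mathbf{x}(\epsilon)$ approaches $\langle \mathbf{v},\mathbf{v}_1\rangle \mathbf{v}_1$.

```python
import numpy as np

# Same illustrative A and v as above.
A = np.array([[ 1.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  1.0]])
v = np.array([1.0, 2.0, 3.0])

lam, V = np.linalg.eigh(A)
v1 = V[:, 0]                 # unit eigenvector for the eigenvalue 0
expected = (v @ v1) * v1     # <v, v1> v1 (independent of the sign of v1)

for eps in [1e-1, 1e-3, 1e-5]:
    x_eps = np.linalg.solve(A + eps * np.eye(3), v)
    print(eps, np.linalg.norm(eps * x_eps - expected))
# The printed distances shrink to 0 as eps -> 0+.
```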

