If Matrices Commute $AB=BA$, then They Share a Common Eigenvector


Problem 608

Let $A$ and $B$ be $n\times n$ matrices and assume that they commute: $AB=BA$.
Then prove that the matrices $A$ and $B$ share at least one common eigenvector.

 

Proof.

Since the characteristic polynomial of $A$ has a root over the complex numbers, the matrix $A$ has an eigenvalue $\lambda$. Let $\mathbf{x}$ be an eigenvector corresponding to $\lambda$. That is, we have $A\mathbf{x}=\lambda \mathbf{x}$.
Then we claim that the vector $\mathbf{v}:=B\mathbf{x}$ belongs to the eigenspace $E_{\lambda}$ of $\lambda$.
In fact, as $AB=BA$ we have
\begin{align*}
A\mathbf{v}=AB\mathbf{x}=BA\mathbf{x} =\lambda B\mathbf{x}=\lambda \mathbf{v}.
\end{align*}
Hence $\mathbf{v}\in E_{\lambda}$.
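This first step can be checked numerically. Below is a minimal sketch using a hypothetical commuting pair $A, B$ (not part of the problem): for each eigenpair $(\lambda, \mathbf{x})$ of $A$, the vector $\mathbf{v}=B\mathbf{x}$ again satisfies $A\mathbf{v}=\lambda\mathbf{v}$.

```python
import numpy as np

# Hypothetical commuting pair: A = 2I + B, so AB = BA holds automatically.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
B = np.array([[0.0, 1.0],
              [1.0, 0.0]])
assert np.allclose(A @ B, B @ A)  # the pair commutes

# For each eigenpair (lam, x) of A, the vector v = Bx satisfies
# Av = lam*v, i.e. v stays in the eigenspace E_lam.
eigvals, eigvecs = np.linalg.eig(A)
for lam, x in zip(eigvals, eigvecs.T):
    v = B @ x
    assert np.allclose(A @ v, lam * v)
```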


Now, let $\{\mathbf{x}_1, \dots, \mathbf{x}_k\}$ be a basis of the eigenspace $E_{\lambda}$.
Set $\mathbf{v}_i=B\mathbf{x}_i$ for $i=1, \dots, k$.
The above claim yields that $\mathbf{v}_i \in E_{\lambda}$, and hence we can write
\[\mathbf{v}_i=c_{1i} \mathbf{x}_1+c_{2i}\mathbf{x}_2+\cdots+c_{ki}\mathbf{x}_k \tag{*}\] for some scalars $c_{1i}, c_{2i}, \dots, c_{ki}$.


Extend the basis $\{\mathbf{x}_1, \dots, \mathbf{x}_k\}$ of $E_{\lambda}$ to a basis
\[\{\mathbf{x}_1, \dots, \mathbf{x}_k, \mathbf{x}_{k+1}, \dots, \mathbf{x}_n\}\] of $\C^n$ by adjoining vectors $\mathbf{x}_{k+1}, \dots, \mathbf{x}_n$.


Then, using (*), we obtain
\begin{align*}
&B [\mathbf{x}_1,\dots , \mathbf{x}_k, \mathbf{x}_{k+1}, \dots, \mathbf{x}_n]\\
&=[B\mathbf{x}_1,\dots , B\mathbf{x}_k, B\mathbf{x}_{k+1}, \dots, B\mathbf{x}_n]\\
&=[\mathbf{v}_1,\dots , \mathbf{v}_k, B\mathbf{x}_{k+1}, \dots, B\mathbf{x}_n]\\[6pt] &=[\mathbf{x}_1,\dots , \mathbf{x}_k, \mathbf{x}_{k+1}, \dots, \mathbf{x}_n] \left[\begin{array}{c|c}
C & D\\
\hline
O & F
\end{array}
\right],\tag{**}
\end{align*}

where $C=(c_{ij})$ is the $k\times k$ matrix whose entries are the coefficients $c_{ij}$ of the linear combination (*), $O$ is the $(n-k) \times k$ zero matrix, $D$ is a $k \times (n-k)$ matrix, and $F$ is an $(n-k) \times (n-k)$ matrix.


Let $P=[\mathbf{x}_1,\dots , \mathbf{x}_k, \mathbf{x}_{k+1}, \dots, \mathbf{x}_n]$.
As the column vectors of $P$ are linearly independent, $P$ is invertible.

From (**), we obtain
\[P^{-1}BP=\left[\begin{array}{c|c}
C & D\\
\hline
O & F
\end{array}
\right].\] It follows that
\begin{align*}
\det(B-tI)&=\det(P^{-1}BP-tI)=\left|\begin{array}{c|c}
C-tI & D\\
\hline
O & F-tI
\end{array}
\right|=\det(C-tI)\det(F-tI).
\end{align*}
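The block structure (**) and the resulting factorization of $\det(B-tI)$ can be illustrated numerically. The sketch below uses a hypothetical $3\times 3$ example, built by conjugating commuting block matrices by an invertible $S$ so that the eigenvalue $2$ of $A$ has a two-dimensional eigenspace $E_2$; the matrix $P^{-1}BP$ indeed has a zero lower-left block, and the determinant factors accordingly.

```python
import numpy as np

# Hypothetical example: conjugate commuting block matrices by an
# invertible S so that A has eigenvalue 2 with dim E_2 = 2.
S = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])
Sinv = np.linalg.inv(S)
A = S @ np.diag([2.0, 2.0, 5.0]) @ Sinv
B = S @ np.array([[1.0, 3.0, 0.0],
                  [4.0, 1.0, 0.0],
                  [0.0, 0.0, 7.0]]) @ Sinv
assert np.allclose(A @ B, B @ A)  # the pair commutes

# The first k = 2 columns of S span E_2 and the last column extends
# them to a basis, so P = S plays the role of P in the proof.
P = S
T = np.linalg.inv(P) @ B @ P            # T = [[C, D], [O, F]]
assert np.allclose(T[2, :2], 0.0)       # lower-left block O is zero

# Check det(B - tI) = det(C - tI) * det(F - tI) at a sample value t.
C, F = T[:2, :2], T[2:, 2:]
t = 0.5
lhs = np.linalg.det(B - t * np.eye(3))
rhs = np.linalg.det(C - t * np.eye(2)) * np.linalg.det(F - t * np.eye(1))
assert np.isclose(lhs, rhs)
```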


Let $\mu$ be an eigenvalue of the matrix $C$ and let $\mathbf{a}$ be an eigenvector corresponding to $\mu$.
Then as $\det(C-\mu I)=0$, we see that $\det(B-\mu I)=0$ and $\mu$ is an eigenvalue of $B$.

Write
\[\mathbf{a}=\begin{bmatrix}
a_1 \\
a_2 \\
\vdots \\
a_k
\end{bmatrix}\neq \mathbf{0}\] and define a new vector by
\[\mathbf{y}=a_1\mathbf{x}_1+\cdots +a_k \mathbf{x}_k\in E_{\lambda}.\] Then $\mathbf{y}$ is an eigenvector of $A$ in $E_{\lambda}$, since it is a nonzero (as $\mathbf{a}\neq \mathbf{0}$ and the vectors $\mathbf{x}_1, \dots, \mathbf{x}_k$ are linearly independent) linear combination of the basis vectors of $E_{\lambda}$.


Multiplying $BP=P\left[\begin{array}{c|c}
C & D\\
\hline
O & F
\end{array}
\right]$ from (**) by the $n$-dimensional vector
\[\begin{bmatrix}
\mathbf{a} \\
0 \\
\vdots\\
0
\end{bmatrix}=\begin{bmatrix}
a_1 \\
\vdots \\
a_k \\
0 \\
\vdots\\
0
\end{bmatrix}\] on the right, we have
\[BP\begin{bmatrix}
\mathbf{a} \\
0 \\
\vdots\\
0
\end{bmatrix}=P\left[\begin{array}{c|c}
C & D\\
\hline
O & F
\end{array}
\right] \begin{bmatrix}
\mathbf{a} \\
0 \\
\vdots\\
0
\end{bmatrix}.\]


The left hand side is equal to
\[B[a_1\mathbf{x}_1+\cdots+a_k \mathbf{x}_k]=B\mathbf{y}.\]

On the other hand, the right hand side is equal to
\begin{align*}
P\begin{bmatrix}
C \mathbf{a}\\
\mathbf{0}
\end{bmatrix}
=P\begin{bmatrix}
\mu \mathbf{a} \\
\mathbf{0}
\end{bmatrix}=\mu P\begin{bmatrix}
\mathbf{a} \\
\mathbf{0}
\end{bmatrix}
=\mu [a_1\mathbf{x}_1+\cdots+a_k \mathbf{x}_k]=\mu \mathbf{y}.
\end{align*}


Therefore, we obtain
\[B\mathbf{y}=\mu \mathbf{y}.\] This proves that the vector $\mathbf{y}$ is an eigenvector of both $A$ and $B$.
This completes the proof.
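The whole construction can be carried out explicitly on a hypothetical commuting pair (the same kind of example as one built by conjugating commuting block matrices by an invertible $S$): an eigenvector $\mathbf{a}$ of the block $C$ lifts to $\mathbf{y}=a_1\mathbf{x}_1+a_2\mathbf{x}_2$, which is a common eigenvector of $A$ and $B$.

```python
import numpy as np

# Hypothetical commuting pair: eigenvalue 2 of A has a 2-dimensional
# eigenspace E_2 spanned by the first two columns of S.
S = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])
Sinv = np.linalg.inv(S)
A = S @ np.diag([2.0, 2.0, 5.0]) @ Sinv
B = S @ np.array([[1.0, 3.0, 0.0],
                  [4.0, 1.0, 0.0],
                  [0.0, 0.0, 7.0]]) @ Sinv

P = S                                   # basis x_1, x_2 of E_2, then x_3
C = (np.linalg.inv(P) @ B @ P)[:2, :2]  # the k x k block C from (**)

# An eigenpair (mu, a) of C lifts to y = a_1*x_1 + a_2*x_2.
mu_all, a_all = np.linalg.eig(C)
mu, a = mu_all[0], a_all[:, 0]
y = P[:, :2] @ a

assert np.allclose(A @ y, 2.0 * y)      # y is an eigenvector of A ...
assert np.allclose(B @ y, mu * y)       # ... and also of B, with eigenvalue mu
```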

