If Matrices Commute $AB=BA$, then They Share a Common Eigenvector


Problem 608

Let $A$ and $B$ be $n\times n$ complex matrices and assume that they commute: $AB=BA$.
Then prove that the matrices $A$ and $B$ share at least one common eigenvector.

 

Proof.

Since we work over the complex numbers, the characteristic polynomial of $A$ has a root, and hence $A$ has at least one eigenvalue $\lambda$. Let $\mathbf{x}$ be an eigenvector corresponding to $\lambda$. That is, we have $A\mathbf{x}=\lambda \mathbf{x}$ with $\mathbf{x}\neq \mathbf{0}$.
Then we claim that the vector $\mathbf{v}:=B\mathbf{x}$ belongs to the eigenspace $E_{\lambda}$ of $\lambda$.
In fact, as $AB=BA$ we have
\begin{align*}
A\mathbf{v}=AB\mathbf{x}=BA\mathbf{x} =\lambda B\mathbf{x}=\lambda \mathbf{v}.
\end{align*}
Hence $\mathbf{v}\in E_{\lambda}$.
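
As a quick sanity check of this claim, consider the commuting pair (chosen here purely for illustration)
\[A=\begin{bmatrix}
1 & 1\\
0 & 1
\end{bmatrix}, \quad B=\begin{bmatrix}
1 & 2\\
0 & 1
\end{bmatrix}, \quad AB=BA=\begin{bmatrix}
1 & 3\\
0 & 1
\end{bmatrix}.\] The only eigenvalue of $A$ is $\lambda=1$ with $E_{1}=\operatorname{span}\{\mathbf{e}_1\}$, and indeed $B\mathbf{e}_1=\mathbf{e}_1\in E_{1}$.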


Now, let $\{\mathbf{x}_1, \dots, \mathbf{x}_k\}$ be a basis of the eigenspace $E_{\lambda}$.
Set $\mathbf{v}_i=B\mathbf{x}_i$ for $i=1, \dots, k$.
By the claim above, $\mathbf{v}_i \in E_{\lambda}$, and hence we can write
\[\mathbf{v}_i=c_{1i} \mathbf{x}_1+c_{2i}\mathbf{x}_2+\cdots+c_{ki}\mathbf{x}_k \tag{*}\] for some scalars $c_{1i}, c_{2i}, \dots, c_{ki}$.


Extend the basis $\{\mathbf{x}_1, \dots, \mathbf{x}_k\}$ of $E_{\lambda}$ to a basis
\[\{\mathbf{x}_1, \dots, \mathbf{x}_k, \mathbf{x}_{k+1}, \dots, \mathbf{x}_n\}\] of $\C^n$ by adjoining vectors $\mathbf{x}_{k+1}, \dots, \mathbf{x}_n$.


Then, using (*), we obtain
\begin{align*}
&B [\mathbf{x}_1,\dots , \mathbf{x}_k, \mathbf{x}_{k+1}, \dots, \mathbf{x}_n]\\
&=[B\mathbf{x}_1,\dots , B\mathbf{x}_k, B\mathbf{x}_{k+1}, \dots, B\mathbf{x}_n]\\
&=[\mathbf{v}_1,\dots , \mathbf{v}_k, B\mathbf{x}_{k+1}, \dots, B\mathbf{x}_n]\\[6pt] &=[\mathbf{x}_1,\dots , \mathbf{x}_k, \mathbf{x}_{k+1}, \dots, \mathbf{x}_n] \left[\begin{array}{c|c}
C & D\\
\hline
O & F
\end{array}
\right],\tag{**}
\end{align*}

where $C=(c_{ij})$ is the $k\times k$ matrix whose entries are the coefficients $c_{ij}$ of the linear combination (*), $O$ is the $(n-k) \times k$ zero matrix, $D$ is a $k \times (n-k)$ matrix, and $F$ is an $(n-k) \times (n-k)$ matrix.
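
For the illustrative $2\times 2$ pair above we have $k=1$, and taking $\mathbf{x}_1=\mathbf{e}_1$, $\mathbf{x}_2=\mathbf{e}_2$ gives
\[B[\mathbf{e}_1, \mathbf{e}_2]=[\mathbf{e}_1, \mathbf{e}_2]\left[\begin{array}{c|c}
1 & 2\\
\hline
0 & 1
\end{array}
\right],\] so that $C=[1]$, $D=[2]$, and $F=[1]$.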


Let $P=[\mathbf{x}_1,\dots , \mathbf{x}_k, \mathbf{x}_{k+1}, \dots, \mathbf{x}_n]$.
As the column vectors of $P$ are linearly independent, $P$ is invertible.

From (**), we obtain
\[P^{-1}BP=\left[\begin{array}{c|c}
C & D\\
\hline
O & F
\end{array}
\right].\] It follows that
\begin{align*}
\det(B-tI)&=\det(P^{-1}BP-tI)=\left|\begin{array}{c|c}
C-tI & D\\
\hline
O & F-tI
\end{array}
\right|=\det(C-tI)\det(F-tI).
\end{align*}
Here the first equality holds because similar matrices have the same characteristic polynomial, and the last equality is the determinant formula for a block upper triangular matrix.
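
(In the running example, $\det(B-tI)=(1-t)^2=\det(C-tI)\det(F-tI)$, with $\det(C-tI)=\det(F-tI)=1-t$.)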


Let $\mu$ be an eigenvalue of the matrix $C$ and let $\mathbf{a}$ be an eigenvector corresponding to $\mu$.
Then as $\det(C-\mu I)=0$, we see that $\det(B-\mu I)=0$ and $\mu$ is an eigenvalue of $B$.

Write
\[\mathbf{a}=\begin{bmatrix}
a_1 \\
a_2 \\
\vdots \\
a_k
\end{bmatrix}\neq \mathbf{0}\] and define a new vector by
\[\mathbf{y}=a_1\mathbf{x}_1+\cdots +a_k \mathbf{x}_k\in E_{\lambda}.\] Since $\mathbf{a}\neq \mathbf{0}$ and the vectors $\mathbf{x}_1, \dots, \mathbf{x}_k$ are linearly independent, $\mathbf{y}$ is a nonzero vector in $E_{\lambda}$, hence an eigenvector of $A$ corresponding to $\lambda$.
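
(In the running example, $C=[1]$, so $\mu=1$ and we may take $\mathbf{a}=[1]$, which gives $\mathbf{y}=\mathbf{x}_1=\mathbf{e}_1$.)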


Multiplying $BP=P\left[\begin{array}{c|c}
C & D\\
\hline
O & F
\end{array}
\right]$ from (**) by the $n$-dimensional vector
\[\begin{bmatrix}
\mathbf{a} \\
0 \\
\vdots\\
0
\end{bmatrix}=\begin{bmatrix}
a_1 \\
\vdots \\
a_k \\
0 \\
\vdots\\
0
\end{bmatrix}\] on the right, we have
\[BP\begin{bmatrix}
\mathbf{a} \\
0 \\
\vdots\\
0
\end{bmatrix}=P\left[\begin{array}{c|c}
C & D\\
\hline
O & F
\end{array}
\right] \begin{bmatrix}
\mathbf{a} \\
0 \\
\vdots\\
0
\end{bmatrix}.\]


The left-hand side is equal to
\[B[a_1\mathbf{x}_1+\cdots+a_k \mathbf{x}_k]=B\mathbf{y}.\]

On the other hand, the right-hand side is equal to
\begin{align*}
P\begin{bmatrix}
C \mathbf{a}\\
\mathbf{0}
\end{bmatrix}
=P\begin{bmatrix}
\mu \mathbf{a} \\
\mathbf{0}
\end{bmatrix}=\mu P\begin{bmatrix}
\mathbf{a} \\
\mathbf{0}
\end{bmatrix}
=\mu [a_1\mathbf{x}_1+\cdots+a_k \mathbf{x}_k]=\mu \mathbf{y}.
\end{align*}


Therefore, we obtain
\[B\mathbf{y}=\mu \mathbf{y}.\] Since we also have $A\mathbf{y}=\lambda \mathbf{y}$ (as $\mathbf{y}\in E_{\lambda}$), the vector $\mathbf{y}$ is an eigenvector of both $A$ and $B$.
This completes the proof.
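
For readers who want to experiment, here is a minimal numerical sanity check in Python (assuming NumPy is available; the matrices are the illustrative pair used above, not part of the original problem):

import numpy as np

# Commuting pair: both are polynomials in the same nilpotent matrix N.
N = np.array([[0.0, 1.0],
              [0.0, 0.0]])
A = np.eye(2) + N        # A = I + N
B = np.eye(2) + 2 * N    # B = I + 2N
assert np.allclose(A @ B, B @ A)   # AB = BA

# y = e_1 spans the eigenspace E_1 of A; it should also be an eigenvector of B.
y = np.array([1.0, 0.0])
assert np.allclose(A @ y, y)       # A y = 1 * y
assert np.allclose(B @ y, y)       # B y = 1 * y
print("common eigenvector:", y)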

