# Two Eigenvectors Corresponding to Distinct Eigenvalues are Linearly Independent

## Problem 187

Let $A$ be an $n\times n$ matrix. Suppose that $\lambda_1, \lambda_2$ are distinct eigenvalues of the matrix $A$ and let $\mathbf{v}_1, \mathbf{v}_2$ be eigenvectors corresponding to $\lambda_1, \lambda_2$, respectively.

Show that the vectors $\mathbf{v}_1, \mathbf{v}_2$ are linearly independent.

## Proof.

To show that the vectors $\mathbf{v}_1, \mathbf{v}_2$ are linearly independent, consider a linear combination
$c_1 \mathbf{v}_1+c_2\mathbf{v}_2=\mathbf{0}, \tag{*}$ where $c_1, c_2$ are complex numbers.
Our goal is to show that $c_1=c_2=0$.

By the definition of eigenvalues and eigenvectors, we have
$A\mathbf{v}_1=\lambda_1 \mathbf{v}_1 \text{ and } A\mathbf{v}_2 =\lambda_2 \mathbf{v}_2.$ Multiplying both sides of (*) by $A$, we have
\begin{align*}
\mathbf{0}&=A\cdot \mathbf{0}\\
&=A(c_1 \mathbf{v}_1+c_2\mathbf{v}_2)\\
&=c_1 A\mathbf{v}_1+c_2A\mathbf{v}_2\\
&=c_1\lambda_1 \mathbf{v}_1+c_2\lambda_2\mathbf{v}_2. \tag{**}
\end{align*}

Next, multiplying (*) by $\lambda_2$, we obtain
$\mathbf{0}=c_1\lambda_2\mathbf{v}_1+c_2\lambda_2\mathbf{v}_2.$ Subtracting (**) from this expression, we get
\begin{align*}
\mathbf{0}&=(c_1\lambda_2\mathbf{v}_1+c_2\lambda_2\mathbf{v}_2)
-(c_1\lambda_1 \mathbf{v}_1+c_2\lambda_2\mathbf{v}_2)\\
&=c_1(\lambda_2-\lambda_1)\mathbf{v}_1.
\end{align*}

Recall that an eigenvector is by definition a nonzero vector, and hence $\mathbf{v}_1\neq \mathbf{0}$.
Thus we must have
$c_1(\lambda_2-\lambda_1)=0.$ Since $\lambda_1$ and $\lambda_2$ are distinct, we have $\lambda_2-\lambda_1\neq 0$, and hence $c_1=0$.

Substituting $c_1=0$ into (*), we also see that $c_2=0$ since $\mathbf{v}_2\neq \mathbf{0}$.
Therefore, the values of $c_1$ and $c_2$ are both zero, and hence the eigenvectors $\mathbf{v}_1, \mathbf{v}_2$ are linearly independent.
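The conclusion can also be illustrated numerically. Below is a minimal sketch using NumPy (the matrix `A` is a hypothetical example, not part of the problem): it computes the eigenvectors of a matrix with distinct eigenvalues and confirms that they are linearly independent by checking that the matrix with those vectors as columns has nonzero determinant.

```python
import numpy as np

# Hypothetical example: a symmetric 2x2 matrix with distinct eigenvalues 1 and 3.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigvals, eigvecs = np.linalg.eig(A)   # columns of eigvecs are eigenvectors
v1, v2 = eigvecs[:, 0], eigvecs[:, 1]

# The eigenvalues are distinct, so the result above applies.
assert abs(eigvals[0] - eigvals[1]) > 1e-9

# Two vectors in R^2 are linearly independent exactly when the matrix
# having them as columns has nonzero determinant.
det = np.linalg.det(np.column_stack([v1, v2]))
assert abs(det) > 1e-9  # v1, v2 are linearly independent
```

This is only a sanity check for one matrix, of course; the proof above covers the general case.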

## Related Question.

As an application of this problem, try the next problem, which provides a geometrical meaning of eigenvalues.

Problem. Let $T:\mathbb{R}^2 \to \mathbb{R}^2$ be a linear transformation and let $A$ be the matrix representation of $T$ with respect to the standard basis of $\mathbb{R}^2$.

Prove that the following two statements are equivalent.

1. There are exactly two distinct lines $L_1, L_2$ in $\mathbb{R}^2$ passing through the origin that are mapped onto themselves:
$T(L_1)=L_1 \text{ and } T(L_2)=L_2.$
2. The matrix $A$ has two distinct nonzero real eigenvalues.
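The direction from statement 2 to statement 1 can be sanity-checked numerically. The sketch below (the matrix `A` is a hypothetical example, chosen to have the distinct nonzero real eigenvalues $2$ and $5$) verifies that each eigenvector $\mathbf{v}$ satisfies $A\mathbf{v}=\lambda\mathbf{v}$ with $\lambda\neq 0$, so the line spanned by $\mathbf{v}$ is mapped onto itself.

```python
import numpy as np

# Hypothetical matrix with two distinct nonzero real eigenvalues (2 and 5).
A = np.array([[4.0, 2.0],
              [1.0, 3.0]])

eigvals, eigvecs = np.linalg.eig(A)
assert abs(eigvals[0] - eigvals[1]) > 1e-9      # eigenvalues are distinct
assert all(abs(lam) > 1e-9 for lam in eigvals)  # eigenvalues are nonzero

# For each eigenvector v, A @ v = lam * v with lam != 0, so the linear
# map sends the line span(v) onto itself.
for i in range(2):
    v = eigvecs[:, i]
    w = A @ v
    cross = v[0] * w[1] - v[1] * w[0]  # zero iff w is parallel to v
    assert abs(cross) < 1e-9
```

Since the two eigenvectors are linearly independent by the problem above, the two lines they span are genuinely distinct.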

For a proof of this problem, see the post "A Linear Transformation Preserves Exactly Two Lines If and Only If There are Two Real Non-Zero Eigenvalues".
