Two Eigenvectors Corresponding to Distinct Eigenvalues are Linearly Independent
Problem 187
Let $A$ be an $n\times n$ matrix. Suppose that $\lambda_1, \lambda_2$ are distinct eigenvalues of the matrix $A$ and let $\mathbf{v}_1, \mathbf{v}_2$ be eigenvectors corresponding to $\lambda_1, \lambda_2$, respectively.
Show that the vectors $\mathbf{v}_1, \mathbf{v}_2$ are linearly independent.
Proof.
To show that the vectors $\mathbf{v}_1, \mathbf{v}_2$ are linearly independent, consider a linear combination
\[c_1 \mathbf{v}_1+c_2\mathbf{v}_2=\mathbf{0}, \tag{*}\]
where $c_1, c_2$ are complex numbers.
Our goal is to show that $c_1=c_2=0$.
By the definitions of eigenvalues and eigenvectors we have
\[A\mathbf{v}_1=\lambda_1 \mathbf{v}_1 \text{ and } A\mathbf{v}_2 =\lambda_2 \mathbf{v}_2.\]
Multiplying both sides of (*) by $A$, we have
\begin{align*}
\mathbf{0}&=A\cdot \mathbf{0}\\
&=A(c_1 \mathbf{v}_1+c_2\mathbf{v}_2)\\
&=c_1 A\mathbf{v}_1+c_2A\mathbf{v}_2\\
&=c_1\lambda_1 \mathbf{v}_1+c_2\lambda_2\mathbf{v}_2. \tag{**}
\end{align*}
Next, multiplying both sides of (*) by $\lambda_2$, we obtain
\[\mathbf{0}=c_1\lambda_2\mathbf{v}_1+c_2\lambda_2\mathbf{v}_2.\]
We subtract (**) from the last expression, and get
\begin{align*}
\mathbf{0}&=(c_1\lambda_2\mathbf{v}_1+c_2\lambda_2\mathbf{v}_2)
-(c_1\lambda_1 \mathbf{v}_1+c_2\lambda_2\mathbf{v}_2)\\
&=c_1(\lambda_2-\lambda_1)\mathbf{v}_1.
\end{align*}
Recall that an eigenvector is by definition a nonzero vector, and hence $\mathbf{v}_1\neq \mathbf{0}$.
Thus we must have
\[c_1(\lambda_2-\lambda_1)=0.\]
Since $\lambda_1$ and $\lambda_2$ are distinct, we have $\lambda_2-\lambda_1\neq 0$, and hence $c_1=0$.
Substituting $c_1=0$ into (*), we also see that $c_2=0$ since $\mathbf{v}_2\neq \mathbf{0}$.
Therefore, the values of $c_1$ and $c_2$ are both zero, and hence the eigenvectors $\mathbf{v}_1, \mathbf{v}_2$ are linearly independent.
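As a concrete check of the result (an added illustration, not part of the original problem), take
\[A=\begin{bmatrix} 2 & 1 \\ 0 & 3 \end{bmatrix}.\]
Its eigenvalues are $\lambda_1=2$ and $\lambda_2=3$, with corresponding eigenvectors
\[\mathbf{v}_1=\begin{bmatrix} 1 \\ 0 \end{bmatrix} \text{ and } \mathbf{v}_2=\begin{bmatrix} 1 \\ 1 \end{bmatrix},\]
since $A\mathbf{v}_1=2\mathbf{v}_1$ and $A\mathbf{v}_2=3\mathbf{v}_2$. These eigenvectors are indeed linearly independent: the matrix $\begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix}$ whose columns are $\mathbf{v}_1, \mathbf{v}_2$ has determinant $1\neq 0$.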
Related Question.
As an application of this problem, try the next problem, which provides a geometrical meaning of eigenvalues.
Problem. Let $T:\R^2 \to \R^2$ be a linear transformation and let $A$ be the matrix representation of $T$ with respect to the standard basis of $\R^2$.
Prove that the following two statements are equivalent.
(i) There are exactly two distinct lines $L_1, L_2$ in $\R^2$ passing through the origin that are mapped onto themselves:
\[T(L_1)=L_1 \text{ and } T(L_2)=L_2.\]
(ii) The matrix $A$ has two distinct nonzero real eigenvalues.
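For instance (continuing the illustrative example above, which is not part of the original problems), the matrix $A=\begin{bmatrix} 2 & 1 \\ 0 & 3 \end{bmatrix}$ has the two distinct nonzero eigenvalues $2$ and $3$, and the two lines in $\R^2$ mapped onto themselves by the corresponding linear transformation are exactly the eigenvector lines: the $x$-axis, spanned by $\begin{bmatrix} 1 \\ 0 \end{bmatrix}$, and the line $y=x$, spanned by $\begin{bmatrix} 1 \\ 1 \end{bmatrix}$.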
Linear Combination of Eigenvectors is Not an Eigenvector
Suppose that $\lambda$ and $\mu$ are two distinct eigenvalues of a square matrix $A$ and let $\mathbf{x}$ and $\mathbf{y}$ be eigenvectors corresponding to $\lambda$ and $\mu$, respectively.
If $a$ and $b$ are nonzero numbers, then prove that $a \mathbf{x}+b\mathbf{y}$ is not an […]
Given Eigenvectors and Eigenvalues, Compute a Matrix Product (Stanford University Exam)
Suppose that $\begin{bmatrix} 1 \\ 1 \end{bmatrix}$ is an eigenvector of a matrix $A$ corresponding to the eigenvalue $3$ and that $\begin{bmatrix} 2 \\ 1 \end{bmatrix}$ is an eigenvector of $A$ corresponding to the eigenvalue $-2$.
Compute $A^2\begin{bmatrix} 4 […]
Given All Eigenvalues and Eigenspaces, Compute a Matrix Product
Let $C$ be a $4 \times 4$ matrix whose eigenvalues are $\lambda=2, -1$ and whose eigenspaces are
\[E_2=\Span\left\{ \begin{bmatrix} 1 \\ 1 \\ 1 \\ 1 \end{bmatrix} \right\} \text{ and } E_{-1}=\Span\left\{ \begin{bmatrix} 1 \\ 2 \\ 1 \\ 1 […]
Any Vector is a Linear Combination of Basis Vectors Uniquely
Let $B=\{\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3\}$ be a basis for a vector space $V$ over a scalar field $K$. Then show that any vector $\mathbf{v}\in V$ can be written uniquely as
\[\mathbf{v}=c_1\mathbf{v}_1+c_2\mathbf{v}_2+c_3\mathbf{v}_3,\]
where $c_1, c_2, c_3$ are […]
Linear Independent Vectors and the Vector Space Spanned By Them
Let $V$ be a vector space over a field $K$. Let $\mathbf{u}_1, \mathbf{u}_2, \dots, \mathbf{u}_n$ be linearly independent vectors in $V$. Let $U$ be the subspace of $V$ spanned by these vectors, that is, $U=\Span \{\mathbf{u}_1, \mathbf{u}_2, \dots, \mathbf{u}_n\}$.
Let […]
The Subset Consisting of the Zero Vector is a Subspace and its Dimension is Zero
Let $V$ be a subset of the vector space $\R^n$ consisting only of the zero vector of $\R^n$. Namely $V=\{\mathbf{0}\}$.
Then prove that $V$ is a subspace of $\R^n$.
Proof.
To prove that $V=\{\mathbf{0}\}$ is a subspace of $\R^n$, we check the following subspace […]
[…] general eigenvectors corresponding to distinct eigenvalues are linearly independent. Thus, $\mathbf{v}_1, \mathbf{v}_2$ are linearly independent. Hence the lines $L_1, L_2$ spanned by […]