Suppose that $\lambda$ and $\mu$ are two distinct eigenvalues of a square matrix $A$ and let $\mathbf{x}$ and $\mathbf{y}$ be eigenvectors corresponding to $\lambda$ and $\mu$, respectively.
If $a$ and $b$ are nonzero numbers, then prove that $a \mathbf{x}+b\mathbf{y}$ is not an eigenvector of $A$ (corresponding to any eigenvalue of $A$).

We use the following fact in the proof. Fact: Two eigenvectors corresponding to distinct eigenvalues are linearly independent.

Proof.

Seeking a contradiction, we assume that $a \mathbf{x}+b\mathbf{y}$ is an eigenvector corresponding to an eigenvalue $\zeta$.
Thus we have
\begin{align*}
A(a \mathbf{x}+b\mathbf{y})=\zeta (a \mathbf{x}+b\mathbf{y}). \tag{*}
\end{align*}

We compute the left-hand side of this equality as follows.
We have
\begin{align*}
A(a \mathbf{x}+b\mathbf{y})&=a A\mathbf{x}+bA\mathbf{y}\\
&=a\lambda \mathbf{x}+b\mu \mathbf{y}
\end{align*}
since $A\mathbf{x}=\lambda \mathbf{x}$ and $A\mathbf{y}=\mu \mathbf{y}$ by definition.

Therefore, from (*) we obtain
\begin{align*}
a\lambda \mathbf{x}+b\mu \mathbf{y}=\zeta a \mathbf{x}+\zeta b\mathbf{y},
\end{align*}
or equivalently
\begin{align*}
a(\lambda -\zeta) \mathbf{x}+b(\mu-\zeta)\mathbf{y}=\mathbf{0}.
\end{align*}

Recall that eigenvectors corresponding to distinct eigenvalues are linearly independent. Thus $\mathbf{x}$ and $\mathbf{y}$ are linearly independent.

Thus, the coefficients in the above linear combination must be zero:
\[a(\lambda -\zeta)=0 \text{ and } b(\mu-\zeta)=0.\]

Since $a\neq 0$ and $b\neq 0$, this implies
\[\lambda=\zeta=\mu,\]
which is a contradiction because $\lambda$ and $\mu$ are distinct.

Hence, $a \mathbf{x}+b\mathbf{y}$ cannot be an eigenvector corresponding to any eigenvalue of $A$.
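As a quick numerical sanity check (not part of the proof), we can pick a concrete matrix with distinct eigenvalues and verify that a nonzero combination of eigenvectors fails to be an eigenvector. The matrix and coefficients below are illustrative choices, not from the problem.

```python
import numpy as np

# Illustrative example: diag(2, 5) has distinct eigenvalues 2 and 5
# with eigenvectors e1 = (1, 0) and e2 = (0, 1).
A = np.diag([2.0, 5.0])
x = np.array([1.0, 0.0])   # eigenvector for lambda = 2
y = np.array([0.0, 1.0])   # eigenvector for mu = 5

a, b = 3.0, 4.0
v = a * x + b * y          # v = (3, 4)
Av = A @ v                 # A v = (6, 20)

# If v were an eigenvector, A v would be a scalar multiple of v,
# i.e., the 2x2 matrix [v | Av] would have rank 1.
rank = np.linalg.matrix_rank(np.column_stack([v, Av]))
print(rank)  # 2, so A v is not parallel to v
```

Since the rank is $2$, the vectors $v$ and $Av$ are linearly independent, matching the conclusion of the proof.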

Two Eigenvectors Corresponding to Distinct Eigenvalues are Linearly Independent
Let $A$ be an $n\times n$ matrix. Suppose that $\lambda_1, \lambda_2$ are distinct eigenvalues of the matrix $A$ and let $\mathbf{v}_1, \mathbf{v}_2$ be eigenvectors corresponding to $\lambda_1, \lambda_2$, respectively.
Show that the vectors $\mathbf{v}_1, \mathbf{v}_2$ are […]

Two Subspaces Intersecting Trivially, and the Direct Sum of Vector Spaces.
Let $V$ and $W$ be subspaces of $\R^n$ such that $V \cap W =\{\mathbf{0}\}$ and $\dim(V)+\dim(W)=n$.
(a) If $\mathbf{v}+\mathbf{w}=\mathbf{0}$, where $\mathbf{v}\in V$ and $\mathbf{w}\in W$, then show that $\mathbf{v}=\mathbf{0}$ and $\mathbf{w}=\mathbf{0}$.
(b) If $B_1$ is a […]

Compute Determinant of a Matrix Using Linearly Independent Vectors
Let $A$ be a $3 \times 3$ matrix.
Let $\mathbf{x}, \mathbf{y}, \mathbf{z}$ be linearly independent $3$-dimensional vectors. Suppose that we have
\[A\mathbf{x}=\begin{bmatrix}
1 \\
0 \\
1
\end{bmatrix}, A\mathbf{y}=\begin{bmatrix}
0 \\
1 \\
0
[…]

If Vectors are Linearly Dependent, then What Happens When We Add One More Vector?
Suppose that $\mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_r$ are linearly dependent $n$-dimensional real vectors.
For any vector $\mathbf{v}_{r+1} \in \R^n$, determine whether the vectors $\mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_r, \mathbf{v}_{r+1}$ are linearly […]
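The conclusion (the enlarged set stays linearly dependent, since any nontrivial relation among $\mathbf{v}_1, \dots, \mathbf{v}_r$ survives with a zero coefficient on $\mathbf{v}_{r+1}$) can be illustrated numerically; the vectors below are an arbitrary example, not from the problem.

```python
import numpy as np

# Illustrative example: v1, v2, v3 are linearly dependent in R^3
# because v3 = v1 + v2.
v1 = np.array([1.0, 0.0, 2.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = v1 + v2

# Any additional vector v4 keeps the set dependent: the nontrivial
# relation v1 + v2 - v3 + 0*v4 = 0 still holds.
v4 = np.array([5.0, -2.0, 7.0])

rank_before = np.linalg.matrix_rank(np.column_stack([v1, v2, v3]))
rank_after = np.linalg.matrix_rank(np.column_stack([v1, v2, v3, v4]))

# Dependence means rank < number of vectors in both cases.
print(rank_before, rank_after)  # 2 3
```

Here the rank stays strictly below the number of vectors ($2 < 3$ and $3 < 4$), so both sets are linearly dependent.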

Given Eigenvectors and Eigenvalues, Compute a Matrix Product (Stanford University Exam)
Suppose that $\begin{bmatrix}
1 \\
1
\end{bmatrix}$ is an eigenvector of a matrix $A$ corresponding to the eigenvalue $3$ and that $\begin{bmatrix}
2 \\
1
\end{bmatrix}$ is an eigenvector of $A$ corresponding to the eigenvalue $-2$.
Compute $A^2\begin{bmatrix}
4 […]
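The standard technique for this type of exam problem is to express the given vector in the eigenbasis and apply powers of the eigenvalues: if $\mathbf{v}=c_1\mathbf{v}_1+c_2\mathbf{v}_2$, then $A^2\mathbf{v}=c_1\lambda_1^2\mathbf{v}_1+c_2\lambda_2^2\mathbf{v}_2$. Since the target vector is truncated above, the sketch below uses a hypothetical vector $\begin{bmatrix} 3 \\ 2 \end{bmatrix}$ purely for illustration.

```python
import numpy as np

# Eigenpairs given in the problem:
v1 = np.array([1.0, 1.0])    # eigenvector for eigenvalue 3
v2 = np.array([2.0, 1.0])    # eigenvector for eigenvalue -2
eigvals = np.array([3.0, -2.0])

# Hypothetical target vector (the vector in the problem is elided).
v = np.array([3.0, 2.0])

# Solve [v1 | v2] c = v to get v = c1*v1 + c2*v2.
P = np.column_stack([v1, v2])
c = np.linalg.solve(P, v)    # c = (1, 1)

# Then A^2 v = c1 * 3^2 * v1 + c2 * (-2)^2 * v2.
A2v = P @ (eigvals**2 * c)
print(A2v)  # [17. 13.]
```

Note that $A$ itself is never constructed: the eigenpairs alone determine the action of $A^2$ on any vector in the span of the eigenvectors.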

Given All Eigenvalues and Eigenspaces, Compute a Matrix Product
Let $C$ be a $4 \times 4$ matrix whose eigenvalues are $\lambda=2, -1$ and whose eigenspaces are
\[E_2=\Span\left \{\quad \begin{bmatrix}
1 \\
1 \\
1 \\
1
\end{bmatrix} \quad\right \} \text{ and } E_{-1}=\Span\left \{ \quad\begin{bmatrix}
1 \\
2 \\
1 \\
1
[…]