Suppose that a real symmetric matrix $A$ has two distinct eigenvalues $\alpha$ and $\beta$.
Show that any eigenvector corresponding to $\alpha$ is orthogonal to any eigenvector corresponding to $\beta$.
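A quick numeric illustration of this fact (not a proof), using an arbitrary real symmetric matrix with three distinct eigenvalues; the matrix here is my own example, not from the problem:

```python
import numpy as np

# An arbitrary real symmetric matrix with distinct eigenvalues 2 - sqrt(2), 2, 2 + sqrt(2).
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 1.0],
              [0.0, 1.0, 2.0]])

# np.linalg.eig normalizes each eigenvector but does not orthogonalize
# across eigenvalues, so any orthogonality we see comes from the theorem.
eigvals, eigvecs = np.linalg.eig(A)

# Pairwise dot products of eigenvectors for distinct eigenvalues should vanish.
dots = [abs(eigvecs[:, i] @ eigvecs[:, j])
        for i in range(3) for j in range(i + 1, 3)]
print(dots)
```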

(Nagoya University, Linear Algebra Final Exam Problem)

Suppose that $n\times n$ matrices $A$ and $B$ are similar.

Then show that the nullity of $A$ is equal to the nullity of $B$.
In other words, the dimension of the null space (kernel) $\mathcal{N}(A)$ of $A$ is the same as the dimension of the null space $\mathcal{N}(B)$ of $B$.
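The claim can be sanity-checked numerically (again, not a proof): take a singular matrix $A$, conjugate it by any invertible $P$, and compare nullities. The matrices below are my own illustrative choices:

```python
import numpy as np

# A singular matrix of rank 2, hence nullity 1.
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [7.0, 8.0, 9.0]])

# Any invertible P works; this one has determinant 2.
P = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])

B = np.linalg.inv(P) @ A @ P  # B is similar to A

# nullity = n - rank, by the rank-nullity theorem.
n = A.shape[0]
nullity_A = n - np.linalg.matrix_rank(A)
nullity_B = n - np.linalg.matrix_rank(B)
print(nullity_A, nullity_B)
```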

For a real number $\theta$ with $0\leq \theta \leq \pi$, we define the real $3\times 3$ matrix $A$ by
\[A=\begin{bmatrix}
\cos\theta & -\sin\theta & 0 \\
\sin\theta &\cos\theta &0 \\
0 & 0 & 1
\end{bmatrix}.\]
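This $A$ is the standard rotation about the $z$-axis by angle $\theta$. As a numeric sketch (with $\theta = \pi/3$ chosen arbitrarily), one can confirm that $A$ is orthogonal with determinant $1$:

```python
import numpy as np

theta = np.pi / 3  # any sample angle in [0, pi]
c, s = np.cos(theta), np.sin(theta)
A = np.array([[c, -s, 0.0],
              [s,  c, 0.0],
              [0.0, 0.0, 1.0]])

orthogonal = np.allclose(A.T @ A, np.eye(3))  # A^T A = I
det_A = np.linalg.det(A)                      # a rotation has determinant 1
print(orthogonal, det_A)
```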

Let
\[A=\begin{bmatrix}
1 & 3 & 3 \\
-3 &-5 &-3 \\
3 & 3 & 1
\end{bmatrix} \text{ and } B=\begin{bmatrix}
2 & 4 & 3 \\
-4 &-6 &-3 \\
3 & 3 & 1
\end{bmatrix}.\]
For this problem, you may use the fact that both matrices have the same characteristic polynomial:
\[p_A(\lambda)=p_B(\lambda)=-(\lambda-1)(\lambda+2)^2.\]

(a) Find all eigenvectors of $A$.

(b) Find all eigenvectors of $B$.

(c) Which of the matrices $A$ and $B$ is diagonalizable?

(d) Diagonalize the matrix stated in (c), i.e., find an invertible matrix $P$ and a diagonal matrix $D$ such that $A=PDP^{-1}$ or $B=PDP^{-1}$.

(Stanford University Linear Algebra Final Exam Problem)
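Parts (a)–(c) can be checked symbolically with SymPy; a matrix is diagonalizable exactly when the geometric multiplicity of every eigenvalue matches its algebraic multiplicity, which `is_diagonalizable()` tests. This is only a check on the answer, not a substitute for the computation:

```python
import sympy as sp

A = sp.Matrix([[1, 3, 3], [-3, -5, -3], [3, 3, 1]])
B = sp.Matrix([[2, 4, 3], [-4, -6, -3], [3, 3, 1]])

# Both matrices share the characteristic polynomial -(x - 1)(x + 2)^2,
# so the eigenvalues are 1 (simple) and -2 (algebraic multiplicity 2).
diag_A = A.is_diagonalizable()
diag_B = B.is_diagonalizable()
print(diag_A, diag_B)
```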

In this post, we explain how to diagonalize a matrix if it is diagonalizable.

As an example, we solve the following problem.

Diagonalize the matrix
\[A=\begin{bmatrix}
4 & -3 & -3 \\
3 &-2 &-3 \\
-1 & 1 & 2
\end{bmatrix}\]
by finding a nonsingular matrix $S$ and a diagonal matrix $D$ such that $S^{-1}AS=D$.

(Update 10/15/2017. A new example problem was added.)
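A hedged symbolic check of the example: SymPy's `diagonalize()` returns $S$ and $D$ with $A = SDS^{-1}$, equivalently $S^{-1}AS = D$, so we can verify that relation directly (the particular $S$ it produces may differ from a hand computation by column scaling and ordering):

```python
import sympy as sp

A = sp.Matrix([[4, -3, -3], [3, -2, -3], [-1, 1, 2]])

# diagonalize() returns (S, D) with A = S * D * S^{-1}.
S, D = A.diagonalize()
check = sp.simplify(S.inv() * A * S - D) == sp.zeros(3, 3)
print(D, check)
```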

Determine all eigenvalues and their algebraic multiplicities of the matrix
\[A=\begin{bmatrix}
1 & a & 1 \\
a &1 &a \\
1 & a & 1
\end{bmatrix},\]
where $a$ is a real number.
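Since the first and third rows of $A$ coincide, $A$ is singular and $0$ is always an eigenvalue. A symbolic computation (a check on the answer, not the required derivation) gives the characteristic polynomial and the eigenvalues as functions of $a$:

```python
import sympy as sp

a, lam = sp.symbols('a lambda', real=True)
A = sp.Matrix([[1, a, 1], [a, 1, a], [1, a, 1]])

# Characteristic polynomial in lambda, factored; lambda = 0 is always a root.
p = sp.factor(A.charpoly(lam).as_expr())
eigs = A.eigenvals()  # dict {eigenvalue: algebraic multiplicity}
print(p)
print(eigs)
```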

Find the value(s) of $h$ for which the following set of vectors
\[\left \{ \mathbf{v}_1=\begin{bmatrix}
1 \\
0 \\
0
\end{bmatrix}, \mathbf{v}_2=\begin{bmatrix}
h \\
1 \\
-h
\end{bmatrix}, \mathbf{v}_3=\begin{bmatrix}
1 \\
2h \\
3h+1
\end{bmatrix}\right\}\]
is linearly independent.

(Boston College, Linear Algebra Midterm Exam Sample Problem)
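The three vectors are linearly independent exactly when the determinant of the matrix $[\mathbf{v}_1\ \mathbf{v}_2\ \mathbf{v}_3]$ is nonzero, so it suffices to find the roots of that determinant as a polynomial in $h$. A symbolic check of this approach:

```python
import sympy as sp

h = sp.symbols('h', real=True)
# Columns are v1, v2, v3 from the problem.
M = sp.Matrix([[1, h,     1],
               [0, 1,     2*h],
               [0, -h,    3*h + 1]])

det = sp.factor(M.det())            # vanishes iff the set is dependent
bad_h = sp.solve(sp.Eq(det, 0), h)  # values of h to exclude
print(det, bad_h)
```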

Prove that the matrix
\[A=\begin{bmatrix}
1 & 1.00001 & 1 \\
1.00001 &1 &1.00001 \\
1 & 1.00001 & 1
\end{bmatrix}\]
has one positive eigenvalue and one negative eigenvalue.

(University of California, Berkeley Qualifying Exam Problem)
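A numeric sanity check of the claim (the proof itself needs an argument, e.g. via the sign of the determinant of a $2\times 2$ principal submatrix and the trace, since the repeated first and third rows force a zero eigenvalue): computing the spectrum shows exactly one positive and one negative eigenvalue.

```python
import numpy as np

A = np.array([[1.0,     1.00001, 1.0],
              [1.00001, 1.0,     1.00001],
              [1.0,     1.00001, 1.0]])

# eigvalsh returns the real eigenvalues of a symmetric matrix in ascending order.
w = np.linalg.eigvalsh(A)
n_pos = int(np.sum(w > 1e-9))   # eigenvalues clearly above zero
n_neg = int(np.sum(w < -1e-9))  # eigenvalues clearly below zero
print(w, n_pos, n_neg)
```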