Suppose that $\lambda$ and $\mu$ are two distinct eigenvalues of a square matrix $A$ and let $\mathbf{x}$ and $\mathbf{y}$ be eigenvectors corresponding to $\lambda$ and $\mu$, respectively.
If $a$ and $b$ are nonzero numbers, then prove that $a \mathbf{x}+b\mathbf{y}$ is not an eigenvector of $A$ (corresponding to any eigenvalue of $A$).
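As a quick numerical sanity check of the statement (a sketch, not a proof), here is a concrete illustration; the diagonal matrix $A$ and the coefficients below are hypothetical choices.

```python
import numpy as np

# Hypothetical example: A has distinct eigenvalues 1 and 2
# with eigenvectors x = e1 and y = e2.
A = np.diag([1.0, 2.0])
x = np.array([1.0, 0.0])   # eigenvector for lambda = 1
y = np.array([0.0, 1.0])   # eigenvector for mu = 2
v = 1.0 * x + 1.0 * y      # a*x + b*y with a = b = 1

# If v were an eigenvector, A @ v would be a scalar multiple of v.
print(A @ v)  # [1. 2.], which is not parallel to v = [1. 1.]
```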
Let $P_4$ be the vector space consisting of all polynomials of degree $4$ or less with real number coefficients.
Let $W$ be the subspace of $P_4$ defined by
\[W=\{ p(x)\in P_4 \mid p(1)+p(-1)=0 \text{ and } p(2)+p(-2)=0 \}.\]
Find a basis of the subspace $W$ and determine the dimension of $W$.
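One way to organize the computation (a sketch, not a full solution): write $p(x)=a_0+a_1x+a_2x^2+a_3x^3+a_4x^4$, translate the two conditions into a linear system in the coefficients, and read off a basis from its null space. The short SymPy check below follows that approach; the coefficient ordering $(a_0,\dots,a_4)$ is just a convention for the illustration.

```python
from sympy import Matrix, symbols

x = symbols('x')
# Coefficients (a0, a1, a2, a3, a4) of p(x) = a0 + a1*x + ... + a4*x^4.
# p(1) + p(-1) = 0 gives 2*a0 + 2*a2 + 2*a4 = 0,
# p(2) + p(-2) = 0 gives 2*a0 + 8*a2 + 32*a4 = 0.
C = Matrix([[2, 0, 2, 0, 2],
            [2, 0, 8, 0, 32]])

basis = C.nullspace()   # coefficient vectors spanning W
print(len(basis))       # 3, so dim W = 3
for v in basis:
    print(sum(c * x**i for i, c in enumerate(v)))  # x, x**3, x**4 - 5*x**2 + 4
```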
Let $\mathbf{a}$ and $\mathbf{b}$ be vectors in $\R^n$ such that their lengths are
\[\|\mathbf{a}\|=\|\mathbf{b}\|=1\]
and their inner product is
\[\mathbf{a}\cdot \mathbf{b}=\mathbf{a}^{\trans}\mathbf{b}=-\frac{1}{2}.\]
Then determine the length $\|\mathbf{a}-\mathbf{b}\|$.
(Note that this length is the distance between $\mathbf{a}$ and $\mathbf{b}$.)
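One way to carry out the computation, using only the given data:
\[\|\mathbf{a}-\mathbf{b}\|^2=(\mathbf{a}-\mathbf{b})\cdot (\mathbf{a}-\mathbf{b})=\|\mathbf{a}\|^2-2\,\mathbf{a}\cdot\mathbf{b}+\|\mathbf{b}\|^2=1-2\left(-\frac{1}{2}\right)+1=3,\]
so $\|\mathbf{a}-\mathbf{b}\|=\sqrt{3}$.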
Determine whether the following is true or false. If it is true, then give a proof. If it is false, then give a counterexample.
Let $W_1$ and $W_2$ be subspaces of the vector space $\R^n$.
If $B_1$ and $B_2$ are bases for $W_1$ and $W_2$, respectively, then $B_1\cap B_2$ is a basis of the subspace $W_1\cap W_2$.
Let $\mathbf{u}$ and $\mathbf{v}$ be vectors in $\R^n$, and let $I$ be the $n \times n$ identity matrix. Suppose that the inner product of $\mathbf{u}$ and $\mathbf{v}$ satisfies
\[\mathbf{v}^{\trans}\mathbf{u}\neq -1.\]
Define the matrix
\[A=I+\mathbf{u}\mathbf{v}^{\trans}.\]
Prove that $A$ is invertible and the inverse matrix is given by the formula
\[A^{-1}=I-a\mathbf{u}\mathbf{v}^{\trans},\]
where
\[a=\frac{1}{1+\mathbf{v}^{\trans}\mathbf{u}}.\]
This formula is called the Sherman-Morrison formula.
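A quick numerical check of the claimed formula (a sketch, not a proof); the random vectors $\mathbf{u}$, $\mathbf{v}$ below are hypothetical choices that satisfy $\mathbf{v}^{\trans}\mathbf{u}\neq -1$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
u = rng.standard_normal(n)
v = rng.standard_normal(n)
assert not np.isclose(v @ u, -1.0)   # the hypothesis v^T u != -1

A = np.eye(n) + np.outer(u, v)
a = 1.0 / (1.0 + v @ u)
A_inv = np.eye(n) - a * np.outer(u, v)

print(np.allclose(A @ A_inv, np.eye(n)))  # True: the formula gives the inverse of A
```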
Suppose that a real symmetric matrix $A$ has two distinct eigenvalues $\alpha$ and $\beta$.
Show that any eigenvector corresponding to $\alpha$ is orthogonal to any eigenvector corresponding to $\beta$.
(Nagoya University, Linear Algebra Final Exam Problem)
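A minimal numerical illustration of the claim (the $2\times 2$ symmetric matrix below is a hypothetical example, not part of the problem):

```python
import numpy as np

# Hypothetical real symmetric matrix with distinct eigenvalues alpha = 1 and beta = 3.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
x = np.array([1.0, -1.0])  # eigenvector for alpha = 1
y = np.array([1.0, 1.0])   # eigenvector for beta = 3

print(np.allclose(A @ x, 1 * x), np.allclose(A @ y, 3 * y))  # True True
print(np.isclose(x @ y, 0.0))                                # True: x is orthogonal to y
```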
For a real number $\theta$ with $0\leq \theta \leq \pi$, we define the real $3\times 3$ matrix $A$ by
\[A=\begin{bmatrix}
\cos\theta & -\sin\theta & 0 \\
\sin\theta &\cos\theta &0 \\
0 & 0 & 1
\end{bmatrix}.\]
Find the value(s) of $h$ for which the following set of vectors
\[\left \{ \mathbf{v}_1=\begin{bmatrix}
1 \\
0 \\
0
\end{bmatrix}, \mathbf{v}_2=\begin{bmatrix}
h \\
1 \\
-h
\end{bmatrix}, \mathbf{v}_3=\begin{bmatrix}
1 \\
2h \\
3h+1
\end{bmatrix}\right\}\]
is linearly independent.
(Boston College, Linear Algebra Midterm Exam Sample Problem)
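A sketch of one way to check the computation symbolically: three vectors in $\R^3$ are linearly independent exactly when the determinant of the matrix having them as columns is nonzero, so one can factor that determinant as a polynomial in $h$ (SymPy is used here only for illustration).

```python
from sympy import Matrix, symbols, factor

h = symbols('h')
# Columns are v1, v2, v3.
M = Matrix([[1,  h,       1],
            [0,  1,     2*h],
            [0, -h, 3*h + 1]])

print(factor(M.det()))  # (h + 1)*(2*h + 1): the set is linearly independent iff this is nonzero
```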
Let $A$ be an $n\times n$ matrix. Suppose that $\lambda_1, \lambda_2$ are distinct eigenvalues of the matrix $A$ and let $\mathbf{v}_1, \mathbf{v}_2$ be eigenvectors corresponding to $\lambda_1, \lambda_2$, respectively.
Show that the vectors $\mathbf{v}_1, \mathbf{v}_2$ are linearly independent.
Let $A=(a_{ij})$ be an $n \times n$ matrix.
We say that $A=(a_{ij})$ is a right stochastic matrix if each entry $a_{ij}$ is nonnegative and the sum of the entries of each row is $1$. That is, we have
\[a_{ij}\geq 0 \quad \text{ and } \quad a_{i1}+a_{i2}+\cdots+a_{in}=1\]
for $1 \leq i, j \leq n$.
Let $A=(a_{ij})$ be an $n\times n$ right stochastic matrix. Then show the following statements.
(a) The stochastic matrix $A$ has $1$ as an eigenvalue.
(b) The absolute value of any eigenvalue of the stochastic matrix $A$ is less than or equal to $1$.
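A small numerical illustration (not a proof); the $3\times 3$ right stochastic matrix below is a hypothetical example.

```python
import numpy as np

# Hypothetical right stochastic matrix: nonnegative entries, each row sums to 1.
A = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.2, 0.2, 0.6]])

print(np.allclose(A @ np.ones(3), np.ones(3)))   # the all-ones vector is an eigenvector for 1
eigenvalues = np.linalg.eigvals(A)
print(np.isclose(eigenvalues, 1).any())          # (a) 1 is an eigenvalue
print((np.abs(eigenvalues) <= 1 + 1e-12).all())  # (b) every eigenvalue has absolute value <= 1
```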
Suppose that $A$ and $P$ are $3 \times 3$ matrices and $P$ is an invertible matrix.
If
\[P^{-1}AP=\begin{bmatrix}
1 & 2 & 3 \\
0 &4 &5 \\
0 & 0 & 6
\end{bmatrix},\]
then find all the eigenvalues of the matrix $A^2$.
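A numerical cross-check of the underlying idea (similar matrices have the same eigenvalues, and the eigenvalues of a triangular matrix are its diagonal entries). The matrix $P$ below is a hypothetical random choice, which is invertible for generic entries.

```python
import numpy as np

T = np.array([[1.0, 2.0, 3.0],
              [0.0, 4.0, 5.0],
              [0.0, 0.0, 6.0]])

rng = np.random.default_rng(1)
P = rng.standard_normal((3, 3))   # hypothetical invertible P
A = P @ T @ np.linalg.inv(P)      # so that P^{-1} A P = T

print(np.sort(np.linalg.eigvals(A @ A).real))  # approximately [ 1. 16. 36.]
```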
Let $T$ be a linear transformation from the vector space $\R^3$ to $\R^3$.
Suppose that $k=3$ is the smallest positive integer such that $T^k=\mathbf{0}$ (the zero linear transformation) and suppose that we have $\mathbf{x}\in \R^3$ such that $T^2\mathbf{x}\neq \mathbf{0}$.
Show that the vectors $\mathbf{x}, T\mathbf{x}, T^2\mathbf{x}$ form a basis for $\R^3$.
(The Ohio State University Linear Algebra Exam Problem)
Suppose that $\begin{bmatrix}
1 \\
1
\end{bmatrix}$ is an eigenvector of a matrix $A$ corresponding to the eigenvalue $3$ and that $\begin{bmatrix}
2 \\
1
\end{bmatrix}$ is an eigenvector of $A$ corresponding to the eigenvalue $-2$.
Compute $A^2\begin{bmatrix}
4 \\
3
\end{bmatrix}$.
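A quick numerical check of this computation: the two eigenpairs determine a unique $2\times 2$ matrix $A$, so one can build it explicitly and apply $A^2$.

```python
import numpy as np

S = np.array([[1.0, 2.0],
              [1.0, 1.0]])    # columns are the two given eigenvectors
D = np.diag([3.0, -2.0])      # the corresponding eigenvalues
A = S @ D @ np.linalg.inv(S)  # the matrix determined by the given eigenpairs

b = np.array([4.0, 3.0])
print(A @ A @ b)  # [26. 22.]
```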
Suppose the following information is known about a $3\times 3$ matrix $A$.
\[A\begin{bmatrix}
1 \\
2 \\
1
\end{bmatrix}=6\begin{bmatrix}
1 \\
2 \\
1
\end{bmatrix},
\quad
A\begin{bmatrix}
1 \\
-1 \\
1
\end{bmatrix}=3\begin{bmatrix}
1 \\
-1 \\
1
\end{bmatrix}, \quad
A\begin{bmatrix}
2 \\
-1 \\
0
\end{bmatrix}=3\begin{bmatrix}
1 \\
-1 \\
1
\end{bmatrix}.\]
(a) Find the eigenvalues of $A$.
(b) Find the corresponding eigenspaces.
(c) In each of the following questions, you must give a correct reason (based on the theory of eigenvalues and eigenvectors) to get full credit.
Is $A$ a diagonalizable matrix?
Is $A$ an invertible matrix?
Is $A$ an idempotent matrix?
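A numerical sketch that can be used to cross-check parts (a)-(c): the three given input vectors are linearly independent, so they determine $A$ uniquely from $AM=N$, where the columns of $M$ are the inputs and the columns of $N$ are the outputs.

```python
import numpy as np

M = np.array([[1.0,  1.0,  2.0],
              [2.0, -1.0, -1.0],
              [1.0,  1.0,  0.0]])    # columns: the given input vectors
N = np.array([[6.0,  3.0,  3.0],
              [12.0, -3.0, -3.0],
              [6.0,  3.0,  3.0]])    # columns: the given outputs

A = N @ np.linalg.inv(M)                   # the unique A with A M = N
print(np.sort(np.linalg.eigvals(A).real))  # approximately [0. 3. 6.]
# Three distinct eigenvalues, so A is diagonalizable; the eigenvalue 0 means A is singular.
print(np.isclose(np.linalg.det(A), 0.0))   # True: A is not invertible
print(np.allclose(A @ A, A))               # False: A is not idempotent
```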