Let
\[
\mathbf{v}_{1}
=
\begin{bmatrix}
1 \\ 1
\end{bmatrix}
,\;
\mathbf{v}_{2}
=
\begin{bmatrix}
1 \\ -1
\end{bmatrix}
.
\]
Let $V=\Span(\mathbf{v}_{1},\mathbf{v}_{2})$. Do $\mathbf{v}_{1}$ and $\mathbf{v}_{2}$ form an orthonormal basis for $V$?
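As a quick check (sketch): the two vectors are orthogonal but not of unit length, since
\[\mathbf{v}_1\cdot \mathbf{v}_2=1\cdot 1+1\cdot(-1)=0, \qquad \|\mathbf{v}_1\|=\|\mathbf{v}_2\|=\sqrt{1^2+1^2}=\sqrt{2}\neq 1,\]
so they form an orthogonal, but not orthonormal, basis of $V$; dividing each vector by $\sqrt{2}$ would produce an orthonormal basis.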
For this problem, use the real vectors
\[ \mathbf{v}_1 = \begin{bmatrix} -1 \\ 0 \\ 2 \end{bmatrix} , \mathbf{v}_2 = \begin{bmatrix} 0 \\ 2 \\ -3 \end{bmatrix} , \mathbf{v}_3 = \begin{bmatrix} 2 \\ 2 \\ 3 \end{bmatrix} . \]
Suppose that $\mathbf{v}_4$ is another vector that is orthogonal to both $\mathbf{v}_1$ and $\mathbf{v}_3$ and satisfies
\[ \mathbf{v}_2 \cdot \mathbf{v}_4 = -3 . \]
Let $\R^2$ be the vector space of real column vectors of size $2$, equipped with the inner product $\langle \mathbf{v}, \mathbf{w} \rangle = \mathbf{v}^{\trans} \mathbf{w}$. A linear transformation $T : \R^2 \rightarrow \R^2$ is called an orthogonal transformation if for all $\mathbf{v}, \mathbf{w} \in \R^2$,
\[\langle T(\mathbf{v}) , T(\mathbf{w}) \rangle = \langle \mathbf{v} , \mathbf{w} \rangle.\]
For a fixed angle $\theta \in [0, 2\pi)$, define the matrix
\[ [T] = \begin{bmatrix} \cos (\theta) & -\sin (\theta) \\ \sin (\theta) & \cos (\theta) \end{bmatrix} \]
and the linear transformation $T : \R^2 \rightarrow \R^2$ by
\[T( \mathbf{v} ) = [T] \mathbf{v}.\]
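A short computation (sketch) shows why this rotation satisfies the defining property of an orthogonal transformation:
\[[T]^{\trans}[T]=\begin{bmatrix}
\cos\theta & \sin\theta \\
-\sin\theta & \cos\theta
\end{bmatrix}\begin{bmatrix}
\cos\theta & -\sin\theta \\
\sin\theta & \cos\theta
\end{bmatrix}=\begin{bmatrix}
1 & 0 \\
0 & 1
\end{bmatrix},\]
hence $\langle T(\mathbf{v}), T(\mathbf{w})\rangle=([T]\mathbf{v})^{\trans}[T]\mathbf{w}=\mathbf{v}^{\trans}[T]^{\trans}[T]\mathbf{w}=\mathbf{v}^{\trans}\mathbf{w}=\langle \mathbf{v}, \mathbf{w}\rangle$.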
Let $\mathbf{v}$ be a nonzero vector in $\R^n$.
Then the dot product $\mathbf{v}\cdot \mathbf{v}=\mathbf{v}^{\trans}\mathbf{v}\neq 0$.
Set $a:=\frac{2}{\mathbf{v}^{\trans}\mathbf{v}}$ and define the $n\times n$ matrix $A$ by
\[A=I-a\mathbf{v}\mathbf{v}^{\trans},\]
where $I$ is the $n\times n$ identity matrix.
Prove that $A$ is a symmetric matrix and $AA=I$.
Conclude that the inverse matrix is $A^{-1}=A$.
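One possible outline of the computation, using only $(\mathbf{v}\mathbf{v}^{\trans})^{\trans}=\mathbf{v}\mathbf{v}^{\trans}$ and $a\,\mathbf{v}^{\trans}\mathbf{v}=2$:
\[A^{\trans}=I^{\trans}-a(\mathbf{v}\mathbf{v}^{\trans})^{\trans}=I-a\mathbf{v}\mathbf{v}^{\trans}=A,\]
\[AA=I-2a\mathbf{v}\mathbf{v}^{\trans}+a^2\mathbf{v}(\mathbf{v}^{\trans}\mathbf{v})\mathbf{v}^{\trans}=I-2a\mathbf{v}\mathbf{v}^{\trans}+2a\mathbf{v}\mathbf{v}^{\trans}=I,\]
and $AA=I$ immediately gives $A^{-1}=A$.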
Let $\mathbf{v}$ be a nonzero vector in an inner product space $V$ over $\R$.
Suppose that $\{\mathbf{u}_1, \dots, \mathbf{u}_n\}$ is an orthonormal basis of $V$.
Let $\theta_i$ be the angle between $\mathbf{v}$ and $\mathbf{u}_i$ for $i=1,\dots, n$.
Prove that
\[\cos ^2\theta_1+\cdots+\cos^2 \theta_n=1.\]
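A sketch of one approach: expand $\mathbf{v}$ in the orthonormal basis and use the definition of the angle,
\[\mathbf{v}=\sum_{i=1}^n \langle \mathbf{v}, \mathbf{u}_i\rangle \mathbf{u}_i, \qquad \cos\theta_i=\frac{\langle \mathbf{v}, \mathbf{u}_i\rangle}{\|\mathbf{v}\|\,\|\mathbf{u}_i\|}=\frac{\langle \mathbf{v}, \mathbf{u}_i\rangle}{\|\mathbf{v}\|},\]
so that $\sum_{i=1}^n \cos^2\theta_i=\frac{1}{\|\mathbf{v}\|^2}\sum_{i=1}^n \langle \mathbf{v}, \mathbf{u}_i\rangle^2=\frac{\|\mathbf{v}\|^2}{\|\mathbf{v}\|^2}=1$.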
Consider the $2\times 2$ real matrix
\[A=\begin{bmatrix}
1 & 1\\
1& 3
\end{bmatrix}.\]
(a) Prove that the matrix $A$ is positive definite.
(b) Since $A$ is positive definite by part (a), the formula
\[\langle \mathbf{x}, \mathbf{y}\rangle:=\mathbf{x}^{\trans} A \mathbf{y}\]
for $\mathbf{x}, \mathbf{y} \in \R^2$ defines an inner product on $\R^2$.
Consider $\R^2$ as an inner product space with this inner product.
Prove that the unit vectors
\[\mathbf{e}_1=\begin{bmatrix}
1 \\
0
\end{bmatrix} \text{ and } \mathbf{e}_2=\begin{bmatrix}
0 \\
1
\end{bmatrix}\]
are not orthogonal in the inner product space $\R^2$.
(c) Find an orthogonal basis $\{\mathbf{v}_1, \mathbf{v}_2\}$ of $\R^2$ from the basis $\{\mathbf{e}_1, \mathbf{e}_2\}$ using the Gram-Schmidt orthogonalization process.
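A partial check on parts (a) and (b) (sketch): the leading principal minors of $A$ are $1>0$ and $\det A=1\cdot 3-1\cdot 1=2>0$, so $A$ is positive definite by Sylvester's criterion, while
\[\langle \mathbf{e}_1, \mathbf{e}_2\rangle=\mathbf{e}_1^{\trans}A\mathbf{e}_2=1\neq 0,\]
so $\mathbf{e}_1$ and $\mathbf{e}_2$ are not orthogonal with respect to this inner product. For part (c), the Gram-Schmidt step $\mathbf{v}_1=\mathbf{e}_1$, $\mathbf{v}_2=\mathbf{e}_2-\frac{\langle \mathbf{e}_2, \mathbf{v}_1\rangle}{\langle \mathbf{v}_1, \mathbf{v}_1\rangle}\mathbf{v}_1$ is carried out with this same inner product.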
(a) Suppose that $A$ is an $n\times n$ real symmetric positive definite matrix.
Prove that
\[\langle \mathbf{x}, \mathbf{y}\rangle:=\mathbf{x}^{\trans}A\mathbf{y}\]
defines an inner product on the vector space $\R^n$.
(b) Let $A$ be an $n\times n$ real matrix. Suppose that
\[\langle \mathbf{x}, \mathbf{y}\rangle:=\mathbf{x}^{\trans}A\mathbf{y}\]
defines an inner product on the vector space $\R^n$.
Prove that $A$ is symmetric and positive definite.
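For part (b), one useful observation (sketch): evaluating the inner product on standard basis vectors pins down the entries of $A$,
\[\langle \mathbf{e}_i, \mathbf{e}_j\rangle=\mathbf{e}_i^{\trans}A\mathbf{e}_j=a_{ij},\]
so symmetry of the inner product forces $a_{ij}=a_{ji}$, and positivity of $\langle \mathbf{x}, \mathbf{x}\rangle=\mathbf{x}^{\trans}A\mathbf{x}$ for every nonzero $\mathbf{x}$ is exactly the statement that $A$ is positive definite.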
A square matrix $A$ is called idempotent if $A^2=A$.
(a) Let $\mathbf{u}$ be a vector in $\R^n$ with length $1$.
Define the matrix $P$ to be $P=\mathbf{u}\mathbf{u}^{\trans}$.
Prove that $P$ is an idempotent matrix.
(b) Suppose that $\mathbf{u}$ and $\mathbf{v}$ are orthogonal unit vectors in $\R^n$.
Let $Q=\mathbf{u}\mathbf{u}^{\trans}+\mathbf{v}\mathbf{v}^{\trans}$.
Prove that $Q$ is an idempotent matrix.
(c) Prove that every nonzero vector of the form $a\mathbf{u}+b\mathbf{v}$ with $a, b\in \R$ is an eigenvector of the matrix $Q$ in part (b) corresponding to the eigenvalue $1$.
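One possible line of computation for parts (a), (b), and (c), using $\mathbf{u}^{\trans}\mathbf{u}=\mathbf{v}^{\trans}\mathbf{v}=1$ and $\mathbf{u}^{\trans}\mathbf{v}=\mathbf{v}^{\trans}\mathbf{u}=0$:
\[P^2=\mathbf{u}(\mathbf{u}^{\trans}\mathbf{u})\mathbf{u}^{\trans}=\mathbf{u}\mathbf{u}^{\trans}=P,\]
\[Q(a\mathbf{u}+b\mathbf{v})=a\,\mathbf{u}(\mathbf{u}^{\trans}\mathbf{u})+b\,\mathbf{u}(\mathbf{u}^{\trans}\mathbf{v})+a\,\mathbf{v}(\mathbf{v}^{\trans}\mathbf{u})+b\,\mathbf{v}(\mathbf{v}^{\trans}\mathbf{v})=a\mathbf{u}+b\mathbf{v},\]
and the cross terms in $Q^2$ vanish for the same reason, giving $Q^2=Q$.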
Let $A$ be an $n\times n$ matrix. Suppose that $A$ has real eigenvalues $\lambda_1, \lambda_2, \dots, \lambda_n$ with corresponding eigenvectors $\mathbf{u}_1, \mathbf{u}_2, \dots, \mathbf{u}_n$.
Furthermore, suppose that
\[|\lambda_1| > |\lambda_2| \geq \cdots \geq |\lambda_n|.\]
Let
\[\mathbf{x}_0=c_1\mathbf{u}_1+c_2\mathbf{u}_2+\cdots+c_n\mathbf{u}_n\]
for some real numbers $c_1, c_2, \dots, c_n$ and $c_1\neq 0$.
Define
\[\mathbf{x}_{k+1}=A\mathbf{x}_k \text{ for } k=0, 1, 2,\dots\]
and let
\[\beta_k=\frac{\mathbf{x}_k\cdot \mathbf{x}_{k+1}}{\mathbf{x}_k \cdot \mathbf{x}_k}=\frac{\mathbf{x}_k^{\trans} \mathbf{x}_{k+1}}{\mathbf{x}_k^{\trans} \mathbf{x}_k}.\]
Prove that
\[\lim_{k\to \infty} \beta_k=\lambda_1.\]
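A sketch of the standard argument under the stated assumptions: writing $\mathbf{x}_k=A^k\mathbf{x}_0$ and factoring out the dominant eigenvalue gives
\[\mathbf{x}_k=\sum_{i=1}^n c_i\lambda_i^k \mathbf{u}_i=\lambda_1^k\left(c_1\mathbf{u}_1+\sum_{i=2}^n c_i\left(\frac{\lambda_i}{\lambda_1}\right)^k\mathbf{u}_i\right),\]
and since $|\lambda_i/\lambda_1|<1$ for $i\geq 2$, the sum in parentheses converges to $c_1\mathbf{u}_1$; substituting this expression into $\beta_k$ and using $c_1\neq 0$ then yields $\beta_k\to \lambda_1$.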
Let
\[\mathbf{v}=\begin{bmatrix}
a \\
b \\
c
\end{bmatrix}, \qquad \mathbf{v}_1=\begin{bmatrix}
1 \\
2 \\
0
\end{bmatrix}, \qquad \mathbf{v}_2=\begin{bmatrix}
2 \\
-1 \\
2
\end{bmatrix}.\]
Find the necessary and sufficient condition so that the vector $\mathbf{v}$ is a linear combination of the vectors $\mathbf{v}_1, \mathbf{v}_2$.
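A short computation (sketch): writing $\mathbf{v}=x\mathbf{v}_1+y\mathbf{v}_2$ componentwise gives $y=c/2$ from the third coordinate and $x=a-c$ from the first, and the second coordinate then requires
\[2(a-c)-\frac{c}{2}=b, \quad\text{that is,}\quad 4a-2b-5c=0,\]
which is the candidate necessary and sufficient condition.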
(a) For what value(s) of $a$ is the following set $S$ linearly dependent?
\[ S=\left \{\,\begin{bmatrix}
1 \\
2 \\
3 \\
a
\end{bmatrix}, \begin{bmatrix}
a \\
0 \\
-1 \\
2
\end{bmatrix}, \begin{bmatrix}
0 \\
0 \\
a^2 \\
7
\end{bmatrix}, \begin{bmatrix}
1 \\
a \\
1 \\
1
\end{bmatrix}, \begin{bmatrix}
2 \\
-2 \\
3 \\
a^3
\end{bmatrix} \, \right\}.\]
(b) Let $\{\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3\}$ be a set of nonzero vectors in $\R^m$ such that the dot product
\[\mathbf{v}_i\cdot \mathbf{v}_j=0\]
when $i\neq j$.
Prove that the set is linearly independent.
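Two observations that may help (sketch): for part (a), the five given vectors lie in $\R^4$, and any set of more than four vectors in $\R^4$ is automatically linearly dependent, regardless of the value of $a$; for part (b), taking the dot product of a relation $c_1\mathbf{v}_1+c_2\mathbf{v}_2+c_3\mathbf{v}_3=\mathbf{0}$ with $\mathbf{v}_i$ yields
\[c_i\,\mathbf{v}_i\cdot \mathbf{v}_i=c_i\|\mathbf{v}_i\|^2=0,\]
and since $\mathbf{v}_i\neq \mathbf{0}$ this forces $c_i=0$ for each $i$.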
Let $\mathbf{a}$ and $\mathbf{b}$ be vectors in $\R^n$ whose lengths are
\[\|\mathbf{a}\|=\|\mathbf{b}\|=1\]
and whose inner product is
\[\mathbf{a}\cdot \mathbf{b}=\mathbf{a}^{\trans}\mathbf{b}=-\frac{1}{2}.\]
Then determine the length $\|\mathbf{a}-\mathbf{b}\|$.
(Note that this length is the distance between $\mathbf{a}$ and $\mathbf{b}$.)
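A one-line computation (sketch) using the given data:
\[\|\mathbf{a}-\mathbf{b}\|^2=\|\mathbf{a}\|^2-2\,\mathbf{a}\cdot\mathbf{b}+\|\mathbf{b}\|^2=1-2\left(-\frac{1}{2}\right)+1=3, \qquad\text{so}\qquad \|\mathbf{a}-\mathbf{b}\|=\sqrt{3}.\]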
Let $\mathbf{u}$ and $\mathbf{v}$ be vectors in $\R^n$, and let $I$ be the $n \times n$ identity matrix. Suppose that the inner product of $\mathbf{u}$ and $\mathbf{v}$ satisfies
\[\mathbf{v}^{\trans}\mathbf{u}\neq -1.\]
Define the matrix
\[A=I+\mathbf{u}\mathbf{v}^{\trans}.\]
Prove that $A$ is invertible and the inverse matrix is given by the formula
\[A^{-1}=I-a\mathbf{u}\mathbf{v}^{\trans},\]
where
\[a=\frac{1}{1+\mathbf{v}^{\trans}\mathbf{u}}.\]
This formula is known as the Sherman-Morrison formula.
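A direct verification (sketch), treating $\mathbf{v}^{\trans}\mathbf{u}$ as a scalar:
\[A(I-a\mathbf{u}\mathbf{v}^{\trans})=I+\mathbf{u}\mathbf{v}^{\trans}-a\mathbf{u}\mathbf{v}^{\trans}-a\,\mathbf{u}(\mathbf{v}^{\trans}\mathbf{u})\mathbf{v}^{\trans}=I+\bigl(1-a(1+\mathbf{v}^{\trans}\mathbf{u})\bigr)\mathbf{u}\mathbf{v}^{\trans}=I,\]
since $a(1+\mathbf{v}^{\trans}\mathbf{u})=1$; the analogous computation on the other side shows $(I-a\mathbf{u}\mathbf{v}^{\trans})A=I$, so $A$ is invertible with the stated inverse.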
Suppose that a real symmetric matrix $A$ has two distinct eigenvalues $\alpha$ and $\beta$.
Show that any eigenvector corresponding to $\alpha$ is orthogonal to any eigenvector corresponding to $\beta$.
(Nagoya University, Linear Algebra Final Exam Problem)
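A standard argument (sketch): if $A\mathbf{x}=\alpha\mathbf{x}$ and $A\mathbf{y}=\beta\mathbf{y}$, then the symmetry of $A$ gives
\[\alpha\,(\mathbf{x}\cdot\mathbf{y})=(A\mathbf{x})^{\trans}\mathbf{y}=\mathbf{x}^{\trans}A\mathbf{y}=\beta\,(\mathbf{x}\cdot\mathbf{y}),\]
so $(\alpha-\beta)(\mathbf{x}\cdot\mathbf{y})=0$, and $\alpha\neq\beta$ forces $\mathbf{x}\cdot\mathbf{y}=0$.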
For a real number $0\leq \theta \leq \pi$, we define the real $3\times 3$ matrix $A$ by
\[A=\begin{bmatrix}
\cos\theta & -\sin\theta & 0 \\
\sin\theta &\cos\theta &0 \\
0 & 0 & 1
\end{bmatrix}.\]