Dot Products and Length of Vectors
Definition
Let $A$ be a square matrix.
 Two vectors $\mathbf{v}_1$ and $\mathbf{v}_2$ are orthogonal (perpendicular) if $\mathbf{v}_1\cdot \mathbf{v}_2=0$.
 Let $\mathbf{v}$ be an $n$-dimensional complex vector. Then the length of $\mathbf{v}$ is defined to be
\[\|\mathbf{v}\|=\sqrt{\bar{\mathbf{v}}^{\trans}\mathbf{v}}.\]  The distance between two vectors $\mathbf{v}_1, \mathbf{v}_2$ is the length $\|\mathbf{v}_1-\mathbf{v}_2\|$.
 $A$ is orthogonal if $A^{\trans}A=I=A A^{\trans}$.
 $A$ is normal if $A^*A=AA^*$. Here $A^*=\bar{A}^{\trans}$.
Summary
Let $A$ be a square matrix. Let $\mathbf{v} \in K^n$, where $K=\R$ or $\C$.
 $\|\mathbf{v}\|\geq 0$.
 $A$ is orthogonal if and only if its column vectors form an orthonormal set.
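These definitions can be sketched numerically; the following uses NumPy (a tool assumed here, not part of the original text):

```python
import numpy as np

# Length of a complex vector v: ||v|| = sqrt(conj(v)^T v).
v = np.array([3 + 4j, 0, 1j])
length = np.sqrt(np.vdot(v, v).real)  # np.vdot conjugates its first argument
print(length)  # sqrt(25 + 0 + 1) = sqrt(26)

# Orthogonality: v1 and v2 are orthogonal when v1 . v2 = 0.
v1 = np.array([1.0, 2.0, 0.0])
v2 = np.array([-2.0, 1.0, 5.0])
print(np.dot(v1, v2))  # 0.0

# An orthogonal matrix satisfies A^T A = I = A A^T; a rotation is one example.
t = 0.3
A = np.array([[np.cos(t), -np.sin(t)],
              [np.sin(t),  np.cos(t)]])
print(np.allclose(A.T @ A, np.eye(2)), np.allclose(A @ A.T, np.eye(2)))  # True True
```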
Problems

Let $\mathbf{u}_1, \mathbf{u}_2, \mathbf{u}_3$ be vectors in $\R^n$. Suppose that the vectors $\mathbf{u}_1$ and $\mathbf{u}_2$ are orthogonal, the norm of $\mathbf{u}_2$ is $4$, and $\mathbf{u}_2^{\trans}\mathbf{u}_3=7$. Find the value of the real number $a$ in $\mathbf{u}_1=\mathbf{u}_2+a\mathbf{u}_3$.
(The Ohio State University) 
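A numerical sketch of the computation behind this problem, using NumPy and hypothetical concrete vectors chosen to match the given data:

```python
import numpy as np

# Hypothetical concrete vectors matching the problem data:
# u1 is orthogonal to u2, ||u2|| = 4, and u2^T u3 = 7.
u2 = np.array([4.0, 0.0, 0.0])
u3 = np.array([7.0 / 4.0, 1.0, 0.0])        # so that u2 . u3 = 7

# From 0 = u1 . u2 = (u2 + a*u3) . u2 = ||u2||^2 + a*(u2 . u3) = 16 + 7a:
a = -16.0 / 7.0
u1 = u2 + a * u3
print(np.isclose(np.dot(u1, u2), 0.0))  # True: u1 is orthogonal to u2
```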
Let $\mathbf{a}$ and $\mathbf{b}$ be vectors in $\R^n$ such that their lengths are $\|\mathbf{a}\|=\|\mathbf{b}\|=1$ and the inner product is
\[\mathbf{a}\cdot \mathbf{b}=\mathbf{a}^{\trans}\mathbf{b}=\frac{1}{2}.\] Then determine the length $\|\mathbf{a}-\mathbf{b}\|$.

Let $A$ and $B$ be $n\times n$ skew-symmetric matrices. Namely, $A^{\trans}=-A$ and $B^{\trans}=-B$.
(a) Prove that $A+B$ is skew-symmetric.
(b) Prove that $cA$ is skew-symmetric for any scalar $c$.
(c) Let $P$ be an $n\times m$ matrix. Prove that $P^{\trans}AP$ is skew-symmetric.
(d) Suppose that $A$ is real skew-symmetric. Prove that $iA$ is a Hermitian matrix.
(e) Prove that if $AB=-BA$, then $AB$ is a skew-symmetric matrix.
(f) Let $\mathbf{v}$ be an $n$-dimensional column vector. Prove that $\mathbf{v}^{\trans}A\mathbf{v}=0$.
(g) Suppose that $A$ is a real skew-symmetric matrix and $A^2\mathbf{v}=\mathbf{0}$ for some vector $\mathbf{v}\in \R^n$. Then prove that $A\mathbf{v}=\mathbf{0}$.

Let $\mathbf{v}$ be a nonzero vector in an inner product space $V$ over $\R$. Suppose that $\{\mathbf{u}_1, \dots, \mathbf{u}_n\}$ is an orthonormal basis of $V$. Let $\theta_i$ be the angle between $\mathbf{v}$ and $\mathbf{u}_i$ for $i=1,\dots, n$. Prove that
\[\cos ^2\theta_1+\cdots+\cos^2 \theta_n=1.\] 
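A numerical sketch of this identity, using NumPy and a random orthonormal basis obtained from a QR factorization (an assumed construction):

```python
import numpy as np

# With an orthonormal basis u_1, ..., u_n (the columns of Q) and a nonzero v,
# cos(theta_i) = (v . u_i)/||v||, and the squared cosines sum to 1.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))  # columns: orthonormal basis
v = np.array([2.0, -1.0, 3.0])

cosines = Q.T @ v / np.linalg.norm(v)
print(np.isclose(np.sum(cosines**2), 1.0))  # True
```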
Suppose that $A$ is a real $n\times n$ matrix.
(a) Is it true that $A$ must commute with its transpose?
(b) Suppose that the columns of $A$ (considered as vectors) form an orthonormal set. Is it true that the rows of $A$ must also form an orthonormal set?
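Both parts can be probed numerically; the following NumPy sketch (not part of the original problem) exhibits a matrix that fails to commute with its transpose, and checks the column/row orthonormality claim:

```python
import numpy as np

# (a) A need not commute with its transpose:
B = np.array([[1.0, 1.0],
              [0.0, 1.0]])
print(np.allclose(B.T @ B, B @ B.T))  # False

# (b) Orthonormal columns of a square A force A^T A = I, and since a
# one-sided inverse of a square matrix is two-sided, A A^T = I as well,
# so the rows are orthonormal too.
rng = np.random.default_rng(1)
A, _ = np.linalg.qr(rng.standard_normal((4, 4)))  # columns are orthonormal
print(np.allclose(A.T @ A, np.eye(4)))  # True
print(np.allclose(A @ A.T, np.eye(4)))  # True
```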
(University of California, Berkeley)

Recall that a complex matrix is called Hermitian if $A^*=A$, where $A^*=\bar{A}^{\trans}$. Prove that every Hermitian matrix $A$ can be written as the sum $A=B+iC$, where $B$ is a real symmetric matrix and $C$ is a real skew-symmetric matrix.
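For the decomposition of a Hermitian matrix into real symmetric and real skew-symmetric parts, a NumPy sketch with a hypothetical example matrix (the natural candidates are $B=\operatorname{Re}A$ and $C=\operatorname{Im}A$):

```python
import numpy as np

# A hypothetical Hermitian matrix; split it as A = Re(A) + i*Im(A).
A = np.array([[2.0, 1 - 3j],
              [1 + 3j, 5.0]])
assert np.allclose(A, A.conj().T)   # A is Hermitian

B, C = A.real, A.imag
print(np.allclose(B, B.T))          # True: B is real symmetric
print(np.allclose(C, -C.T))         # True: C is real skew-symmetric
print(np.allclose(A, B + 1j * C))   # True: A = B + iC
```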
 (a) Prove that each complex $n\times n$ matrix $A$ can be written as $A=B+iC$, where $B$ and $C$ are Hermitian matrices.
(b) Write the complex matrix $A=\begin{bmatrix}
i & 6\\
2i& 1+i
\end{bmatrix}$ as a sum $A=B+iC$, where $B$ and $C$ are Hermitian matrices.

Let $\mathbf{a}, \mathbf{b}$ be vectors in $\R^n$. Prove the Cauchy-Schwarz inequality: $|\mathbf{a}\cdot \mathbf{b}|\leq \|\mathbf{a}\|\,\|\mathbf{b}\|$.
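For the $A=B+iC$ problem above, a NumPy sketch with the matrix of part (b) as printed; the standard split is $B=(A+A^*)/2$ and $C=(A-A^*)/(2i)$, both Hermitian:

```python
import numpy as np

# Split A into Hermitian parts: B = (A + A*)/2, C = (A - A*)/(2i).
A = np.array([[1j, 6],
              [2j, 1 + 1j]])
B = (A + A.conj().T) / 2
C = (A - A.conj().T) / 2j

print(np.allclose(B, B.conj().T))   # True: B is Hermitian
print(np.allclose(C, C.conj().T))   # True: C is Hermitian
print(np.allclose(A, B + 1j * C))   # True: A = B + iC
```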

Let $A_1, A_2, \dots, A_m$ be $n\times n$ Hermitian matrices. Show that if
\[A_1^2+A_2^2+\cdots+A_m^2=\calO,\] where $\calO$ is the $n \times n$ zero matrix, then we have $A_i=\calO$ for each $i=1,2, \dots, m$. 
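The key step is that for a Hermitian matrix $A$, $\tr(A^2)=\tr(AA^*)=\|A\|_F^2\geq 0$, with equality only for $A=\calO$; a NumPy sketch of that identity (notation and tool assumed here):

```python
import numpy as np

# For Hermitian A, trace(A^2) equals the squared Frobenius norm of A.
rng = np.random.default_rng(2)
M = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
A = (M + M.conj().T) / 2                     # a random Hermitian matrix

lhs = np.trace(A @ A).real
rhs = np.linalg.norm(A, 'fro') ** 2
print(np.isclose(lhs, rhs))  # True
```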
Find the inverse matrix of the matrix
\[A=\begin{bmatrix}
\frac{2}{7} & \frac{3}{7} & \frac{6}{7} \\[6pt] \frac{6}{7} &\frac{2}{7} &-\frac{3}{7} \\[6pt] \frac{3}{7} & -\frac{6}{7} & \frac{2}{7}
\end{bmatrix}.\] 
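Assuming $A$ is orthogonal (each column has unit length, and the section's theme suggests orthonormal columns), no row reduction is needed: $A^{-1}=A^{\trans}$. A NumPy sketch of the general fact, using a random orthogonal matrix:

```python
import numpy as np

# If the columns of Q form an orthonormal set, then Q^T Q = I, so the
# inverse of Q is simply its transpose.
rng = np.random.default_rng(3)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))  # a random orthogonal matrix
print(np.allclose(np.linalg.inv(Q), Q.T))  # True
```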
A complex square ($n\times n$) matrix $A$ is called normal if
\[A^* A=A A^*,\] where $A^*$ denotes the conjugate transpose of $A$, that is $A^*=\bar{A}^{\trans}$.
A matrix $A$ is said to be nilpotent if there exists a positive integer $k$ such that $A^k$ is the zero matrix.
(a) Prove that if $A$ is both normal and nilpotent, then $A$ is the zero matrix. You may use the fact that every normal matrix is diagonalizable.
(b) Give a proof of (a) without referring to eigenvalues and diagonalization.
(c) Let $A, B$ be $n\times n$ complex matrices. Prove that if $A$ is normal and $B$ is nilpotent such that $A+B=I$, then $A=I$, where $I$ is the $n\times n$ identity matrix.
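A NumPy consistency check for part (c) (a hypothetical example, not a proof): if $B$ is nilpotent and nonzero, then $A=I-B$ cannot be normal, in line with the claim that a normal $A$ forces $A=I$ and hence $B=\calO$:

```python
import numpy as np

# A nonzero nilpotent B, and A = I - B; the theorem predicts A is not normal.
B = np.array([[0.0, 1.0],
              [0.0, 0.0]])
assert np.allclose(B @ B, np.zeros((2, 2)))  # B is nilpotent: B^2 = O

A = np.eye(2) - B
print(np.allclose(A.conj().T @ A, A @ A.conj().T))  # False: A is not normal
```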