Dot Products and Length of Vectors



Let $A$ be a square matrix.

  1. Two vectors $\mathbf{v}_1$ and $\mathbf{v}_2$ are orthogonal (perpendicular) if $\mathbf{v}_1\cdot \mathbf{v}_2=0$.
  2. Let $\mathbf{v}$ be an $n$-dimensional complex vector. Then the length of $\mathbf{v}$ is defined to be
  \[\|\mathbf{v}\|=\sqrt{\mathbf{v}^*\mathbf{v}}=\sqrt{|v_1|^2+|v_2|^2+\cdots+|v_n|^2}.\]
  3. The distance between two vectors $\mathbf{v}_1, \mathbf{v}_2$ is the length $\|\mathbf{v}_1-\mathbf{v}_2\|$.
  4. $A$ is orthogonal if $A^{\trans}A=I=A A^{\trans}$.
  5. $A$ is normal if $A^*A=AA^*$. Here $A^*=\bar{A}^{\trans}$.
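The definitions above can be checked numerically. The vectors and matrices in the following sketch are hypothetical examples of my own choosing, not taken from the problems below.

```python
import numpy as np

# 1. Orthogonality: v1 . v2 = 3 - 2 - 1 = 0, so v1 and v2 are perpendicular.
v1 = np.array([1.0, 2.0, -1.0])
v2 = np.array([3.0, -1.0, 1.0])
print(np.dot(v1, v2))  # 0.0

# 2. Length of a complex vector: ||v|| = sqrt(v* v).
v = np.array([1.0 + 1.0j, 2.0])
length = np.sqrt(np.vdot(v, v).real)  # np.vdot conjugates its first argument
print(length)  # sqrt(6) ≈ 2.449...

# 4. An orthogonal matrix (rotation by 90 degrees): A^T A = I = A A^T.
A = np.array([[0.0, -1.0], [1.0, 0.0]])
print(np.allclose(A.T @ A, np.eye(2)))  # True

# 5. A normal matrix that is neither Hermitian nor unitary: A* A = A A*.
N = np.array([[1.0 + 1.0j, 1.0], [-1.0, 1.0 + 1.0j]])
print(np.allclose(N.conj().T @ N, N @ N.conj().T))  # True
```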

Let $A$ be a square matrix. Let $\mathbf{v} \in K^n$, where $K=\R$ or $\C$.

  1. $\|\mathbf{v}\|\geq 0$.
  2. $A$ is orthogonal if and only if its column vectors form an orthonormal set.
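Property 2 can be illustrated numerically: stacking an orthonormal set as the columns of a matrix $Q$ gives $Q^{\trans}Q=I$. The matrix below is a hypothetical example produced by a QR factorization of a random matrix.

```python
import numpy as np

rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))  # Q has orthonormal columns

# Entry (i, j) of the Gram matrix is the dot product of columns i and j,
# so orthonormal columns mean Q^T Q = I ...
gram = Q.T @ Q
print(np.allclose(gram, np.eye(4)))  # True

# ... which is exactly the statement that Q is an orthogonal matrix.
print(np.allclose(Q @ Q.T, np.eye(4)))  # True
```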



  1. Let $\mathbf{u}_1, \mathbf{u}_2, \mathbf{u}_3$ be vectors in $\R^n$. Suppose that the vectors $\mathbf{u}_1$ and $\mathbf{u}_2$ are orthogonal, the norm of $\mathbf{u}_2$ is $4$, and $\mathbf{u}_2^{\trans}\mathbf{u}_3=7$. Find the value of the real number $a$ in $\mathbf{u}_1=\mathbf{u}_2+a\mathbf{u}_3$.
    (The Ohio State University)

  2. Let $\mathbf{a}$ and $\mathbf{b}$ be vectors in $\R^n$ such that their lengths are $\|\mathbf{a}\|=\|\mathbf{b}\|=1$ and their inner product is
    \[\mathbf{a}\cdot \mathbf{b}=\mathbf{a}^{\trans}\mathbf{b}=-\frac{1}{2}.\] Then determine the length $\|\mathbf{a}-\mathbf{b}\|$.
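As a numeric sanity check for Problem 2, the sketch below picks a concrete pair of unit vectors in $\R^2$ with inner product $-\frac{1}{2}$ (these particular vectors are my own choice and not part of the problem) and evaluates $\|\mathbf{a}-\mathbf{b}\|$ directly.

```python
import numpy as np

a = np.array([1.0, 0.0])
b = np.array([-0.5, np.sqrt(3) / 2])  # unit vector at 120 degrees from a

# Confirm the hypotheses of the problem hold for this pair.
assert np.isclose(np.linalg.norm(a), 1) and np.isclose(np.linalg.norm(b), 1)
assert np.isclose(a @ b, -0.5)

print(np.linalg.norm(a - b))  # sqrt(3) ≈ 1.7320508...
```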

  3. Let $A$ and $B$ be $n\times n$ skew-symmetric matrices. Namely $A^{\trans}=-A$ and $B^{\trans}=-B$.
    (a) Prove that $A+B$ is skew-symmetric.
    (b) Prove that $cA$ is skew-symmetric for any scalar $c$.
    (c) Let $P$ be an $n\times m$ matrix. Prove that $P^{\trans}AP$ is skew-symmetric.
    (d) Suppose that $A$ is real skew-symmetric. Prove that $iA$ is a Hermitian matrix.
    (e) Prove that if $AB=-BA$, then $AB$ is a skew-symmetric matrix.
    (f) Let $\mathbf{v}$ be an $n$-dimensional column vector. Prove that $\mathbf{v}^{\trans}A\mathbf{v}=0$.
    (g) Suppose that $A$ is a real skew-symmetric matrix and $A^2\mathbf{v}=\mathbf{0}$ for some vector $\mathbf{v}\in \R^n$. Then prove that $A\mathbf{v}=\mathbf{0}$.
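Before proving the parts of Problem 3, one can spot-check them numerically on random skew-symmetric matrices. The sketch below covers parts (a)–(d) and (f); part (e) is omitted because producing a random pair with $AB=-BA$ takes more setup, and random trials are of course no substitute for the proofs.

```python
import numpy as np

rng = np.random.default_rng(1)

def skew(n):
    """Return a random n x n real skew-symmetric matrix."""
    M = rng.standard_normal((n, n))
    return M - M.T  # (M - M^T)^T = -(M - M^T)

A, B = skew(4), skew(4)
P = rng.standard_normal((4, 3))  # a 4 x 3 matrix, so P^T A P is 3 x 3

assert np.allclose((A + B).T, -(A + B))              # (a) sum is skew-symmetric
assert np.allclose((2.5 * A).T, -(2.5 * A))          # (b) scalar multiple
assert np.allclose((P.T @ A @ P).T, -(P.T @ A @ P))  # (c) congruence by P
H = 1j * A
assert np.allclose(H.conj().T, H)                    # (d) iA is Hermitian
v = rng.standard_normal(4)
assert np.isclose(v @ A @ v, 0)                      # (f) quadratic form vanishes
print("all checks passed")
```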

  4. Let $\mathbf{v}$ be a nonzero vector in an inner product space $V$ over $\R$. Suppose that $\{\mathbf{u}_1, \dots, \mathbf{u}_n\}$ is an orthonormal basis of $V$. Let $\theta_i$ be the angle between $\mathbf{v}$ and $\mathbf{u}_i$ for $i=1,\dots, n$. Prove that
    \[\cos ^2\theta_1+\cdots+\cos^2 \theta_n=1.\]
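The identity in Problem 4 can be checked numerically in $\R^5$ using an orthonormal basis obtained from a QR factorization; the random example below is an illustration, not a proof.

```python
import numpy as np

rng = np.random.default_rng(2)
U, _ = np.linalg.qr(rng.standard_normal((5, 5)))  # columns form an orthonormal basis
v = rng.standard_normal(5)

# cos(theta_i) = <v, u_i> / (||v|| ||u_i||), and each ||u_i|| = 1.
cos_thetas = (U.T @ v) / np.linalg.norm(v)
print(np.sum(cos_thetas**2))  # 1.0 up to floating-point error
```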
  5. Suppose that $A$ is a real $n\times n$ matrix.
    (a) Is it true that $A$ must commute with its transpose?
    (b) Suppose that the columns of $A$ (considered as vectors) form an orthonormal set. Is it true that the rows of $A$ must also form an orthonormal set?
    (University of California, Berkeley)
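For Problem 5, these are experiments one might run before committing to an answer; the matrices below are my own illustrative choices, not part of the problem.

```python
import numpy as np

# (a) A candidate matrix that fails to commute with its transpose:
A = np.array([[0.0, 1.0], [0.0, 0.0]])
print(np.allclose(A @ A.T, A.T @ A))  # False: A A^T = diag(1,0) but A^T A = diag(0,1)

# (b) A square matrix with orthonormal columns, checking its rows:
Q, _ = np.linalg.qr(np.random.default_rng(3).standard_normal((4, 4)))
print(np.allclose(Q.T @ Q, np.eye(4)))  # columns are orthonormal
print(np.allclose(Q @ Q.T, np.eye(4)))  # the rows turn out orthonormal as well
```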

  6. Recall that a complex matrix is called Hermitian if $A^*=A$, where $A^*=\bar{A}^{\trans}$. Prove that every Hermitian matrix $A$ can be written as the sum $A=B+iC$, where $B$ is a real symmetric matrix and $C$ is a real skew-symmetric matrix.
  7. (a) Prove that each complex $n\times n$ matrix $A$ can be written as $A=B+iC$, where $B$ and $C$ are Hermitian matrices.
    (b) Write the complex matrix $A=\begin{bmatrix}
    i & 6\\
    2-i& 1+i
    \end{bmatrix}$ as a sum $A=B+iC$, where $B$ and $C$ are Hermitian matrices.
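One natural construction to try for part (a) of Problem 7 is $B=\frac{1}{2}(A+A^*)$ and $C=\frac{1}{2i}(A-A^*)$; the sketch below verifies it numerically on the matrix from part (b) without carrying out the general proof.

```python
import numpy as np

A = np.array([[1j, 6], [2 - 1j, 1 + 1j]])

B = (A + A.conj().T) / 2    # candidate Hermitian part
C = (A - A.conj().T) / (2j) # candidate "anti-Hermitian part divided by i"

assert np.allclose(B.conj().T, B)  # B is Hermitian
assert np.allclose(C.conj().T, C)  # C is Hermitian
assert np.allclose(B + 1j * C, A)  # A = B + iC
print("decomposition verified")
```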

  8. Let $\mathbf{a}, \mathbf{b}$ be vectors in $\R^n$. Prove the Cauchy-Schwarz inequality: $|\mathbf{a}\cdot \mathbf{b}|\leq \|\mathbf{a}\|\,\|\mathbf{b}\|$.
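A quick numeric check of the Cauchy-Schwarz inequality on random vectors (random trials do not substitute for the proof asked for in Problem 8, but they are a useful sanity check):

```python
import numpy as np

rng = np.random.default_rng(4)
for _ in range(1000):
    a, b = rng.standard_normal(6), rng.standard_normal(6)
    # |a . b| <= ||a|| ||b||, with a small tolerance for floating point
    assert abs(a @ b) <= np.linalg.norm(a) * np.linalg.norm(b) + 1e-12
print("Cauchy-Schwarz held in all trials")
```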

  9. Let $A_1, A_2, \dots, A_m$ be $n\times n$ Hermitian matrices. Show that if
    \[A_1^2+A_2^2+\cdots+A_m^2=\calO,\] where $\calO$ is the $n \times n$ zero matrix, then we have $A_i=\calO$ for each $i=1,2, \dots, m$.

  10. Find the inverse matrix of the matrix
    \[A=\begin{bmatrix}
    \frac{2}{7} & \frac{3}{7} & \frac{6}{7} \\[6pt]
    \frac{6}{7} & \frac{2}{7} & -\frac{3}{7} \\[6pt]
    -\frac{3}{7} & \frac{6}{7} & -\frac{2}{7}
    \end{bmatrix}.\]
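The matrix in Problem 10 has orthonormal columns, so by the properties above one expects its inverse to be its transpose; a numeric sanity check:

```python
import numpy as np

A = np.array([[ 2,  3,  6],
              [ 6,  2, -3],
              [-3,  6, -2]]) / 7

assert np.allclose(A.T @ A, np.eye(3))     # A is orthogonal
assert np.allclose(np.linalg.inv(A), A.T)  # hence A^{-1} = A^T
print("inverse equals transpose")
```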
  11. A complex square ($n\times n$) matrix $A$ is called normal if
    \[A^* A=A A^*,\] where $A^*$ denotes the conjugate transpose of $A$, that is $A^*=\bar{A}^{\trans}$.
    A matrix $A$ is said to be nilpotent if there exists a positive integer $k$ such that $A^k$ is the zero matrix.
    (a) Prove that if $A$ is both normal and nilpotent, then $A$ is the zero matrix. You may use the fact that every normal matrix is diagonalizable.
    (b) Give a proof of (a) without referring to eigenvalues and diagonalization.
    (c) Let $A, B$ be $n\times n$ complex matrices. Prove that if $A$ is normal and $B$ is nilpotent such that $A+B=I$, then $A=I$, where $I$ is the $n\times n$ identity matrix.
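Consistent with part (a) of Problem 11, any nonzero nilpotent matrix should fail to be normal. A small illustration with a $2\times 2$ Jordan block (an example, not a proof):

```python
import numpy as np

N = np.array([[0.0, 1.0], [0.0, 0.0]])  # nilpotent: N^2 = 0

assert np.allclose(N @ N, np.zeros((2, 2)))
# N* N = diag(0,1) while N N* = diag(1,0), so N is not normal.
print(np.allclose(N.conj().T @ N, N @ N.conj().T))  # False
```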