Dot Products and Length of Vectors

Definition

Let $A$ be a square matrix.

  1. Two vectors $\mathbf{v}_1$ and $\mathbf{v}_2$ are orthogonal (perpendicular) if $\mathbf{v}_1\cdot \mathbf{v}_2=0$.
  2. Let $\mathbf{v}$ be an $n$-dimensional complex vector. Then the length of $\mathbf{v}$ is defined to be
    \[\|\mathbf{v}\|=\sqrt{\bar{\mathbf{v}}^{\trans}\mathbf{v}}.\]
  3. The distance between two vectors $\mathbf{v}_1, \mathbf{v}_2$ is the length $\|\mathbf{v}_1-\mathbf{v}_2\|$.
  4. $A$ is orthogonal if $A^{\trans}A=I=A A^{\trans}$.
  5. $A$ is normal if $A^*A=AA^*$. Here $A^*=\bar{A}^{\trans}$.
Summary

Let $A$ be a square matrix. Let $\mathbf{v} \in K^n$, where $K=\R$ or $\C$.

  1. $\|\mathbf{v}\|\geq 0$.
  2. $A$ is orthogonal if and only if its column vectors form an orthonormal set.
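The definitions above can be verified numerically. A minimal sketch using NumPy (the vectors and matrix here are illustrative, not taken from the problems below):

```python
import numpy as np

# Orthogonality: v1 . v2 = 0
v1 = np.array([1.0, 2.0, -1.0])
v2 = np.array([1.0, 0.0, 1.0])
print(np.dot(v1, v2))  # 0.0, so v1 and v2 are orthogonal

# Length of a complex vector: ||v|| = sqrt(conj(v)^T v).
# np.vdot conjugates its first argument, matching the definition.
v = np.array([1 + 1j, 2j])
length = np.sqrt(np.vdot(v, v).real)
print(length)  # sqrt(|1+i|^2 + |2i|^2) = sqrt(6)

# An orthogonal matrix satisfies Q^T Q = I; its columns are orthonormal.
Q = np.array([[0.0, 1.0], [1.0, 0.0]])
print(np.allclose(Q.T @ Q, np.eye(2)))  # True
```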


Problems

  1. For this problem, use the real vectors
\[ \mathbf{v}_1 = \begin{bmatrix} -1 \\ 0 \\ 2 \end{bmatrix} , \mathbf{v}_2 = \begin{bmatrix} 0 \\ 2 \\ -3 \end{bmatrix} , \mathbf{v}_3 = \begin{bmatrix} 2 \\ 2 \\ 3 \end{bmatrix} . \] Suppose that $\mathbf{v}_4$ is another vector which is orthogonal to both $\mathbf{v}_1$ and $\mathbf{v}_3$, and satisfies $\mathbf{v}_2 \cdot \mathbf{v}_4 = -3$. Calculate the following expressions:
    (a) $\mathbf{v}_1 \cdot \mathbf{v}_2 $.
    (b) $\mathbf{v}_3 \cdot \mathbf{v}_4$.
    (c) $( 2 \mathbf{v}_1 + 3 \mathbf{v}_2 - \mathbf{v}_3 ) \cdot \mathbf{v}_4 $.
    (d) $\| \mathbf{v}_1 \| , \, \| \mathbf{v}_2 \| , \, \| \mathbf{v}_3 \| $.
    (e) What is the distance between $\mathbf{v}_1$ and $\mathbf{v}_2$?
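Parts (a), (d), and (e) are direct computations; a quick NumPy check (assuming the standard real dot product) might look like:

```python
import numpy as np

v1 = np.array([-1, 0, 2])
v2 = np.array([0, 2, -3])
v3 = np.array([2, 2, 3])

print(np.dot(v1, v2))           # (a): (-1)(0) + (0)(2) + (2)(-3) = -6
print([np.linalg.norm(v) for v in (v1, v2, v3)])  # (d): sqrt(5), sqrt(13), sqrt(17)
print(np.linalg.norm(v1 - v2))  # (e): ||(-1, -2, 5)|| = sqrt(30)
```

Parts (b) and (c) do not require knowing $\mathbf{v}_4$ explicitly: orthogonality and bilinearity of the dot product are enough.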

  2. For this problem, use the complex vectors
\[ \mathbf{w}_1 = \begin{bmatrix} 1 + i \\ 1 - i \\ 0 \end{bmatrix} , \, \mathbf{w}_2 = \begin{bmatrix} -i \\ 0 \\ 2 - i \end{bmatrix} , \, \mathbf{w}_3 = \begin{bmatrix} 2+i \\ 1 - 3i \\ 2i \end{bmatrix} . \] Suppose $\mathbf{w}_4$ is another complex vector which is orthogonal to both $\mathbf{w}_2$ and $\mathbf{w}_3$, and satisfies $\mathbf{w}_1 \cdot \mathbf{w}_4 = 2i$ and $\| \mathbf{w}_4 \| = 3$. Calculate the following expressions:
    (a) $ \mathbf{w}_1 \cdot \mathbf{w}_2 $.
    (b) $ \mathbf{w}_1 \cdot \mathbf{w}_3 $.
    (c) $((2+i)\mathbf{w}_1 - (1+i)\mathbf{w}_2 ) \cdot \mathbf{w}_4$.
    (d) $\| \mathbf{w}_1 \| , \| \mathbf{w}_2 \|$, and $\| \mathbf{w}_3 \|$.
    (e) $\| 3 \mathbf{w}_4 \|$.
    (f) What is the distance between $\mathbf{w}_2$ and $\mathbf{w}_3$?
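For the complex parts, recall that the length definition $\|\mathbf{v}\|=\sqrt{\bar{\mathbf{v}}^{\trans}\mathbf{v}}$ conjugates the first factor; NumPy's `vdot` follows the same convention. A sketch, assuming that convention for the complex dot product:

```python
import numpy as np

w1 = np.array([1 + 1j, 1 - 1j, 0])
w2 = np.array([-1j, 0, 2 - 1j])
w3 = np.array([2 + 1j, 1 - 3j, 2j])

print(np.vdot(w1, w2))           # (a): conjugate-linear in the first slot
print(np.vdot(w1, w3))           # (b)
print([np.linalg.norm(w) for w in (w1, w2, w3)])  # (d): 2, sqrt(6), sqrt(19)
print(np.linalg.norm(w2 - w3))   # (f): sqrt(31)
```

Parts (c) and (e) again follow from the given data without computing $\mathbf{w}_4$ itself; for (e), use $\|c\,\mathbf{w}\| = |c|\,\|\mathbf{w}\|$.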

  3. Let $\mathbf{u}_1, \mathbf{u}_2, \mathbf{u}_3$ be vectors in $\R^n$. Suppose that the vectors $\mathbf{u}_1$ and $\mathbf{u}_2$ are orthogonal, the norm of $\mathbf{u}_2$ is $4$, and $\mathbf{u}_2^{\trans}\mathbf{u}_3=7$. Find the value of the real number $a$ in $\mathbf{u}_1=\mathbf{u}_2+a\mathbf{u}_3$.
    (The Ohio State University)

  4. Let $\mathbf{a}$ and $\mathbf{b}$ be vectors in $\R^n$ such that their lengths are $\|\mathbf{a}\|=\|\mathbf{b}\|=1$ and the inner product
    \[\mathbf{a}\cdot \mathbf{b}=\mathbf{a}^{\trans}\mathbf{b}=-\frac{1}{2}.\] Then determine the length $\|\mathbf{a}-\mathbf{b}\|$.
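A useful identity for this kind of problem, stated here as a hint rather than a full solution: expand the squared norm using bilinearity of the dot product.

```latex
\|\mathbf{a}-\mathbf{b}\|^2
  = (\mathbf{a}-\mathbf{b})\cdot(\mathbf{a}-\mathbf{b})
  = \|\mathbf{a}\|^2 - 2\,\mathbf{a}\cdot\mathbf{b} + \|\mathbf{b}\|^2.
```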

  5. Let $A$ and $B$ be $n\times n$ skew-symmetric matrices. Namely $A^{\trans}=-A$ and $B^{\trans}=-B$.
    (a) Prove that $A+B$ is skew-symmetric.
    (b) Prove that $cA$ is skew-symmetric for any scalar $c$.
    (c) Let $P$ be an $n\times m$ matrix. Prove that $P^{\trans}AP$ is skew-symmetric.
    (d) Suppose that $A$ is real skew-symmetric. Prove that $iA$ is a Hermitian matrix.
    (e) Prove that if $AB=-BA$, then $AB$ is a skew-symmetric matrix.
    (f) Let $\mathbf{v}$ be an $n$-dimensional column vector. Prove that $\mathbf{v}^{\trans}A\mathbf{v}=0$.
    (g) Suppose that $A$ is a real skew-symmetric matrix and $A^2\mathbf{v}=\mathbf{0}$ for some vector $\mathbf{v}\in \R^n$. Then prove that $A\mathbf{v}=\mathbf{0}$.
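Several parts of this problem can be sanity-checked numerically before attempting the proofs. A sketch with randomly generated skew-symmetric matrices (assuming NumPy; `M - M.T` is skew-symmetric for any square `M`):

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
A = M - M.T                      # A^T = -A, so A is skew-symmetric
N = rng.standard_normal((4, 4))
B = N - N.T

print(np.allclose((A + B).T, -(A + B)))  # (a): A + B is skew-symmetric
print(np.allclose((3 * A).T, -(3 * A)))  # (b): cA is skew-symmetric

P = rng.standard_normal((4, 2))          # P is n x m with n = 4, m = 2
S = P.T @ A @ P
print(np.allclose(S.T, -S))              # (c): P^T A P is skew-symmetric

v = rng.standard_normal(4)
print(abs(v @ A @ v) < 1e-12)            # (f): v^T A v = 0
```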

  6. Let $\mathbf{v}$ be a vector in an inner product space $V$ over $\R$. Suppose that $\{\mathbf{u}_1, \dots, \mathbf{u}_n\}$ is an orthonormal basis of $V$. Let $\theta_i$ be the angle between $\mathbf{v}$ and $\mathbf{u}_i$ for $i=1,\dots, n$. Prove that
    \[\cos ^2\theta_1+\cdots+\cos^2 \theta_n=1.\]
  7. Suppose that $A$ is a real $n\times n$ matrix.
    (a) Is it true that $A$ must commute with its transpose?
    (b) Suppose that the columns of $A$ (considered as vectors) form an orthonormal set. Is it true that the rows of $A$ must also form an orthonormal set?
    (University of California, Berkeley)

  8. Recall that a complex matrix is called Hermitian if $A^*=A$, where $A^*=\bar{A}^{\trans}$. Prove that every Hermitian matrix $A$ can be written as the sum $A=B+iC$, where $B$ is a real symmetric matrix and $C$ is a real skew-symmetric matrix.
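For a Hermitian $A$, the real and imaginary parts supply the decomposition directly: take $B=\operatorname{Re}A$ and $C=\operatorname{Im}A$. A quick check on an illustrative Hermitian matrix (assuming NumPy):

```python
import numpy as np

# An illustrative Hermitian matrix: A^* = A
A = np.array([[2, 1 - 1j],
              [1 + 1j, 3]])
assert np.allclose(A, A.conj().T)

B = A.real  # candidate real symmetric part
C = A.imag  # candidate real skew-symmetric part

print(np.allclose(B, B.T))         # True: B is symmetric
print(np.allclose(C, -C.T))        # True: C is skew-symmetric
print(np.allclose(A, B + 1j * C))  # True: A = B + iC
```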
  9. (a) Prove that each complex $n\times n$ matrix $A$ can be written as $A=B+iC$, where $B$ and $C$ are Hermitian matrices.
    (b) Write the complex matrix $A=\begin{bmatrix}
    i & 6\\
    2-i& 1+i
    \end{bmatrix}$ as a sum $A=B+iC$, where $B$ and $C$ are Hermitian matrices.
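Part (b) can be checked with the standard Hermitian/anti-Hermitian split $B=\tfrac{1}{2}(A+A^*)$ and $C=\tfrac{1}{2i}(A-A^*)$. A sketch for the given matrix (assuming NumPy):

```python
import numpy as np

A = np.array([[1j, 6],
              [2 - 1j, 1 + 1j]])

B = (A + A.conj().T) / 2   # Hermitian part
C = (A - A.conj().T) / 2j  # also Hermitian, and A = B + iC

print(np.allclose(B, B.conj().T))  # True: B is Hermitian
print(np.allclose(C, C.conj().T))  # True: C is Hermitian
print(np.allclose(A, B + 1j * C))  # True: A = B + iC
```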

  10. Let $\mathbf{a}, \mathbf{b}$ be vectors in $\R^n$. Prove the Cauchy-Schwarz inequality: $|\mathbf{a}\cdot \mathbf{b}|\leq \|\mathbf{a}\|\,\|\mathbf{b}\|$.
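A numerical illustration of the inequality on random vectors, not a proof, just a sanity check (assuming NumPy):

```python
import numpy as np

rng = np.random.default_rng(1)
for _ in range(1000):
    a = rng.standard_normal(5)
    b = rng.standard_normal(5)
    # |a . b| <= ||a|| ||b||, with equality iff a and b are parallel
    assert abs(np.dot(a, b)) <= np.linalg.norm(a) * np.linalg.norm(b) + 1e-12
print("Cauchy-Schwarz held on all samples")
```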

  11. Let $A_1, A_2, \dots, A_m$ be $n\times n$ Hermitian matrices. Show that if
    \[A_1^2+A_2^2+\cdots+A_m^2=\calO,\] where $\calO$ is the $n \times n$ zero matrix, then we have $A_i=\calO$ for each $i=1,2, \dots, m$.

  12. Find the inverse matrix of the matrix
    \[A=\begin{bmatrix}
    \frac{2}{7} & \frac{3}{7} & \frac{6}{7} \\[6 pt] \frac{6}{7} &\frac{2}{7} &-\frac{3}{7} \\[6pt] -\frac{3}{7} & \frac{6}{7} & -\frac{2}{7}
    \end{bmatrix}.\]
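If the columns of the given matrix happen to form an orthonormal set, the summary above yields the inverse with no row reduction at all. A numerical check (assuming NumPy):

```python
import numpy as np

A = np.array([[2, 3, 6],
              [6, 2, -3],
              [-3, 6, -2]]) / 7

# Each column has squared length (4 + 36 + 9)/49 = 1 and the
# pairwise dot products vanish, so A is orthogonal.
print(np.allclose(A.T @ A, np.eye(3)))     # True: A is orthogonal
print(np.allclose(np.linalg.inv(A), A.T))  # True: A^{-1} = A^T
```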
  13. A complex square ($n\times n$) matrix $A$ is called normal if
    \[A^* A=A A^*,\] where $A^*$ denotes the conjugate transpose of $A$, that is $A^*=\bar{A}^{\trans}$.
    A matrix $A$ is said to be nilpotent if there exists a positive integer $k$ such that $A^k$ is the zero matrix.
    (a) Prove that if $A$ is both normal and nilpotent, then $A$ is the zero matrix. You may use the fact that every normal matrix is diagonalizable.
    (b) Give a proof of (a) without referring to eigenvalues and diagonalization.
    (c) Let $A, B$ be $n\times n$ complex matrices. Prove that if $A$ is normal and $B$ is nilpotent such that $A+B=I$, then $A=I$, where $I$ is the $n\times n$ identity matrix.