Equivalent Conditions to be a Unitary Matrix


Problem 29

A complex matrix $A$ is called unitary if $\overline{A}^{\trans} A=I$.

The inner product $(\mathbf{x}, \mathbf{y})$ of complex vectors $\mathbf{x}$, $\mathbf{y}$ is defined by $(\mathbf{x}, \mathbf{y}):=\overline{\mathbf{x}}^{\trans} \mathbf{y}$. The length of a complex vector $\mathbf{x}$ is defined to be $||\mathbf{x}||:=\sqrt{(\mathbf{x}, \mathbf{x})}$.

Let $A$ be an $n \times n$ complex matrix. Prove that the following statements are equivalent.

(a) The matrix $A$ is unitary.

(b) $||A \mathbf{x}||=|| \mathbf{x}||$ for any $n$-dimensional complex vector $\mathbf{x}$.

(c) $(A\mathbf{x}, A\mathbf{y})=(\mathbf{x}, \mathbf{y})$ for any $n$-dimensional complex vectors $\mathbf{x}, \mathbf{y}$.
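Although it is not part of the problem, the definitions above can be illustrated numerically. The following minimal NumPy sketch computes the Hermitian inner product $(\mathbf{x}, \mathbf{y})=\overline{\mathbf{x}}^{\trans}\mathbf{y}$ and the length $||\mathbf{x}||$ for two small complex vectors; the function names inner and length are ad hoc choices for this example.

```python
import numpy as np

def inner(x, y):
    # Hermitian inner product (x, y) = conj(x)^T y
    return np.conj(x) @ y

def length(x):
    # ||x|| = sqrt((x, x)); note that (x, x) is real and nonnegative
    return np.sqrt(inner(x, x).real)

x = np.array([1 + 1j, 2 - 1j])
y = np.array([3j, 1 + 2j])

print(inner(x, y))  # (3+8j)
print(length(x))    # sqrt(7), about 2.6458
```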


Steps.

Try to show the implications (a)$\Rightarrow$ (b) $\Rightarrow$ (c) $\Rightarrow$ (a).

Proof.

We will show that (a)$\Rightarrow$ (b) $\Rightarrow$ (c) $\Rightarrow$ (a).


(a) $\Rightarrow$ (b).

Suppose that $A$ is unitary. Then, for any $n$-dimensional complex vector $\mathbf{x}$, we have
\begin{align*}
||A\mathbf{x}|| &=\sqrt{ {(\overline{A\mathbf{x}})^{\trans}} (A \mathbf{x})}
=\sqrt{ \overline{\mathbf{x}}^{\trans} \overline{A}^{\trans} A\mathbf{x} } \\
&= \sqrt{ \overline{\mathbf{x}}^{\trans} \mathbf{x} }=||\mathbf{x}||.
\end{align*}
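To see this implication concretely, one can check the length-preserving property numerically for a specific unitary matrix. The sketch below is only an illustration, not part of the proof; the $2\times 2$ matrix used is an assumed example of a unitary matrix.

```python
import numpy as np

# An assumed example of a unitary matrix: conj(A)^T A = I.
A = np.array([[1, 1j],
              [1j, 1]]) / np.sqrt(2)
print(np.allclose(np.conj(A).T @ A, np.eye(2)))  # True, so A is unitary

rng = np.random.default_rng(0)
x = rng.normal(size=2) + 1j * rng.normal(size=2)

# (a) => (b): ||Ax|| = ||x||
print(np.isclose(np.linalg.norm(A @ x), np.linalg.norm(x)))  # True
```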

(b) $\Rightarrow$ (c).

Suppose that the statement (b) is true.
Applying (b) to the vector $\mathbf{x}+\mathbf{y}$, we have $||A(\mathbf{x}+\mathbf{y})||^2=||\mathbf{x}+\mathbf{y}||^2$ for any $n$-dimensional complex vectors $\mathbf{x}$ and $\mathbf{y}$.
The left-hand side is
\begin{align*}
(A\mathbf{x}+A\mathbf{y} ,A\mathbf{x}+A\mathbf{y})&=(A\mathbf{x},A\mathbf{x})+(A\mathbf{x},A\mathbf{y})+(A\mathbf{y},A\mathbf{x})+(A\mathbf{y}, A\mathbf{y}) \\
&=||A\mathbf{x}||^2 + (A\mathbf{x},A\mathbf{y}) +\overline{(A\mathbf{x},A\mathbf{y})} + ||A\mathbf{y}||^2.
\end{align*}
The right-hand side is
\begin{align*}
(\mathbf{x}, \mathbf{x})+(\mathbf{x}, \mathbf{y})+(\mathbf{y}, \mathbf{x})+(\mathbf{y}, \mathbf{y})=||\mathbf{x}||^2+(\mathbf{x}, \mathbf{y}) +\overline{(\mathbf{x}, \mathbf{y})} +||\mathbf{y}||^2.
\end{align*}
By (b), we have $||A\mathbf{x}||^2=||\mathbf{x}||^2$ and $||A\mathbf{y}||^2=||\mathbf{y}||^2$. Canceling these terms, we obtain the relation
\begin{align*}
(A\mathbf{x},A\mathbf{y}) +\overline{(A\mathbf{x},A\mathbf{y})}=(\mathbf{x}, \mathbf{y}) +\overline{(\mathbf{x}, \mathbf{y})}. \tag{*}
\end{align*}
which holds for any $\mathbf{x}$ and $\mathbf{y}$.

Since $z+\overline{z}=2\Repart(z)$ for any complex number $z$, this implies the equality of the real parts: $\Repart (A\mathbf{x},A\mathbf{y})= \Repart (\mathbf{x}, \mathbf{y})$.
To show that the imaginary parts are also equal, we substitute $i\mathbf{x}$ for $\mathbf{x}$ in the equation (*). (Here $i=\sqrt{-1}$.)
Since $(A(i\mathbf{x}), A\mathbf{y})=-i(A\mathbf{x}, A\mathbf{y})$ and $(i\mathbf{x}, \mathbf{y})=-i(\mathbf{x}, \mathbf{y})$, we obtain
\begin{align*}
-i\left((A\mathbf{x},A\mathbf{y}) -\overline{(A\mathbf{x},A\mathbf{y})} \right)=-i\left( (\mathbf{x}, \mathbf{y}) -\overline{(\mathbf{x}, \mathbf{y})} \right).
\end{align*}
Since $z-\overline{z}=2i\Impart(z)$, it follows that $\Impart (A\mathbf{x},A\mathbf{y})=\Impart (\mathbf{x}, \mathbf{y})$.
Hence we have $(A\mathbf{x},A\mathbf{y})=(\mathbf{x}, \mathbf{y})$.
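The argument above is a polarization-type computation: both the real and the imaginary part of the inner product can be recovered from lengths alone, which is exactly why (b) forces (c). The following NumPy sketch (an illustration only, using the same substitution $\mathbf{x}\mapsto i\mathbf{x}$) verifies this for random complex vectors.

```python
import numpy as np

def inner(x, y):
    # Hermitian inner product (x, y) = conj(x)^T y
    return np.conj(x) @ y

def norm2(x):
    # squared length ||x||^2 = (x, x)
    return inner(x, x).real

rng = np.random.default_rng(1)
x = rng.normal(size=3) + 1j * rng.normal(size=3)
y = rng.normal(size=3) + 1j * rng.normal(size=3)

# ||x + y||^2 = ||x||^2 + ||y||^2 + 2 Re(x, y), so lengths determine Re(x, y).
re_part = 0.5 * (norm2(x + y) - norm2(x) - norm2(y))

# Replacing x by ix turns Re(x, y) into Im(x, y), so lengths determine Im(x, y) too.
im_part = 0.5 * (norm2(1j * x + y) - norm2(x) - norm2(y))

print(np.allclose(re_part + 1j * im_part, inner(x, y)))  # True
```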

(c) $\Rightarrow$ (a).

We give two proofs of this implication.

The First Proof.

By assumption (c), we have $(A\mathbf{x},A\mathbf{y})=(\mathbf{x}, \mathbf{y})$ for any $\mathbf{x}$ and $\mathbf{y}$. Equivalently, we have
\begin{align*}
\overline{\mathbf{x}}^{\trans}\overline{A}^{\trans}A \mathbf{y}=(\mathbf{x}, \mathbf{y}). \tag{**}
\end{align*}
Let $\mathbf{e}_i$ be the $n$-dimensional unit vector whose entries are all zero except that the $i$-th entry is $1$.
Taking $\mathbf{x}=\mathbf{e}_i$ and $\mathbf{y}=\mathbf{e}_j$, we see that the left-hand side of (**) is the $(i,j)$-entry of the matrix $\overline{A}^{\trans}A$, while the right-hand side of (**) is the Kronecker delta $\delta_{i,j}$, which is $1$ if $i=j$ and $0$ otherwise. Thus we have $\overline{A}^{\trans}A=I_n$, and hence $A$ is unitary.
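Numerically, the same observation can be mirrored by assembling the entries $(A\mathbf{e}_i, A\mathbf{e}_j)$ one by one and comparing the result with $\overline{A}^{\trans}A$. The sketch below is only an illustration and reuses the assumed unitary matrix from the earlier example.

```python
import numpy as np

A = np.array([[1, 1j],
              [1j, 1]]) / np.sqrt(2)   # assumed unitary example from above
n = A.shape[0]
E = np.eye(n)                          # columns are the standard basis vectors e_i

# The (i, j)-entry of conj(A)^T A equals (A e_i, A e_j).
G = np.array([[np.conj(A @ E[:, i]) @ (A @ E[:, j]) for j in range(n)]
              for i in range(n)])

print(np.allclose(G, np.conj(A).T @ A))  # True for any square A
print(np.allclose(G, np.eye(n)))         # True here, since A is unitary
```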

The Second Proof.

For any $\mathbf{x}$ and $\mathbf{y}$, we have
\begin{align*}
(\mathbf{x}, (\overline{A}^{\trans} A -I_n) \mathbf{y}) &= (\mathbf{x}, \overline{A}^{\trans}A \mathbf{y})-(\mathbf{x}, \mathbf{y}) \\
&=(A \mathbf{x}, A\mathbf{y})-(\mathbf{x}, \mathbf{y})=0.
\end{align*}
Since this holds for any $\mathbf{x}$, taking $\mathbf{x}=(\overline{A}^{\trans} A -I_n) \mathbf{y}$ gives $||(\overline{A}^{\trans} A -I_n) \mathbf{y}||^2=0$, and hence $(\overline{A}^{\trans} A -I_n) \mathbf{y}=\mathbf{0}$ for every $\mathbf{y}$. Therefore, we have $\overline{A}^{\trans} A =I_n$, and $A$ is unitary.
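The last step uses the fact that a matrix $B$ with $(\mathbf{x}, B\mathbf{y})=0$ for all $\mathbf{x}, \mathbf{y}$ must be the zero matrix, which follows from the choice $\mathbf{x}=B\mathbf{y}$. A minimal numerical sketch of this choice, with an arbitrary (assumed) matrix $B$:

```python
import numpy as np

rng = np.random.default_rng(2)
B = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))  # an arbitrary assumed matrix
y = rng.normal(size=3) + 1j * rng.normal(size=3)

# Choosing x = B y in (x, B y) produces the squared length ||B y||^2,
# so (x, B y) = 0 for all x and y forces B y = 0 for every y, i.e. B = O.
x = B @ y
print(np.isclose((np.conj(x) @ (B @ y)).real, np.linalg.norm(B @ y) ** 2))  # True
```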



