If the Column Vectors Form an Orthonormal Set, Do the Row Vectors Also Form an Orthonormal Set?

Linear Algebra exam problems and solutions at University of California, Berkeley

Problem 317

Suppose that $A$ is a real $n\times n$ matrix.

(a) Is it true that $A$ must commute with its transpose?

(b) Suppose that the columns of $A$ (considered as vectors) form an orthonormal set.
Is it true that the rows of $A$ must also form an orthonormal set?

(University of California, Berkeley, Linear Algebra Qualifying Exam)


Solution.

(a) Is it true that $A$ must commute with its transpose?

The answer is no.

We give a counterexample. Let
\[A=\begin{bmatrix}
1 & -1\\
0& 2
\end{bmatrix}.\] Then the transpose of $A$ is
\[A^{\trans}=\begin{bmatrix}
1 & 0\\
-1& 2
\end{bmatrix}.\] We compute
\[AA^{\trans}=\begin{bmatrix}
1 & -1\\
0& 2
\end{bmatrix}
\begin{bmatrix}
1 & 0\\
-1& 2
\end{bmatrix}
=
\begin{bmatrix}
2 & -2\\
-2& 4
\end{bmatrix},\] and
\[A^{\trans}A=
\begin{bmatrix}
1 & 0\\
-1& 2
\end{bmatrix}
\begin{bmatrix}
1 & -1\\
0& 2
\end{bmatrix}
=
\begin{bmatrix}
1 & -1\\
-1& 5
\end{bmatrix}.
\] Therefore, we see that
\[AA^{\trans}\neq A^{\trans} A,\] that is, $A$ does not commute with its transpose $A^{\trans}$.
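As a quick numerical sanity check (not part of the original solution), the two products for this counterexample can be computed with NumPy:

```python
import numpy as np

# The counterexample from part (a): a matrix that does not
# commute with its transpose.
A = np.array([[1, -1],
              [0,  2]])

print(A @ A.T)   # [[ 2 -2]
                 #  [-2  4]]
print(A.T @ A)   # [[ 1 -1]
                 #  [-1  5]]
print(np.array_equal(A @ A.T, A.T @ A))  # False
```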

(b) Is it true that the rows of $A$ must also form an orthonormal set?

The answer is yes.

Note that in general the column vectors of a matrix $M$ form an orthonormal set if and only if $M^{\trans}M=I$, where $I$ is the identity matrix. (A square matrix with this property is called an orthogonal matrix.)

Thus, by assumption we have $A^{\trans} A=I$. Let $B=A^{\trans}$.
Then the column vectors of $B$ are the row vectors of $A$. Hence it suffices to show that $B^{\trans}B=I$.

Since $A$ is a square matrix and $A^{\trans} A=I$, we know that $A$ is invertible with inverse $A^{-1}=A^{\trans}$.
In particular, we have $A^{\trans} A=A A^{\trans}=I$.

We have
\begin{align*}
B^{\trans}B=(A^{\trans})^{\trans}A^{\trans}=(AA^{\trans})^{\trans}=I^{\trans}=I.
\end{align*}
Thus, we obtain $B^{\trans}B=I$ and by the general fact stated above, the column vectors of $B$ form an orthonormal set.
Hence the row vectors of $A$ form an orthonormal set.
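The conclusion of part (b) can be illustrated numerically. A $2\times 2$ rotation matrix is a standard example of a matrix with orthonormal columns (the angle below is an arbitrary choice, not taken from the problem), and as the argument predicts, its rows are orthonormal as well:

```python
import numpy as np

# A rotation matrix has orthonormal columns, so A^T A = I.
# By the argument in part (b), A A^T = I must then hold too,
# i.e. the rows are also orthonormal.
theta = 0.7  # arbitrary angle
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

I = np.eye(2)
print(np.allclose(A.T @ A, I))  # True: columns orthonormal
print(np.allclose(A @ A.T, I))  # True: rows orthonormal
```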
