# If the Kernel of a Matrix $A$ is Trivial, then $A^T A$ is Invertible

## Problem 38

Let $A$ be an $m \times n$ real matrix.

Then the *kernel* of $A$ is defined as $\ker(A)=\{ x\in \R^n \mid Ax=0 \}$.

The kernel is also called the *null space* of $A$.

Suppose that $A$ is an $m \times n$ real matrix such that $\ker(A)=0$. Prove that $A^{\trans}A$ is invertible.

(*Stanford University Linear Algebra Exam*)


Both proofs show that $\ker(A^{\trans}A)=0$. The first proof is a more or less direct computation.

For the second proof, you need to recall the relation between the kernel of the transpose and the orthogonal complement of the image, namely $\ker(A^{\trans})=\im(A)^{\perp}$.

## 1st proof

Since the size of the transpose $A^{\trans}$ is $n\times m$, the product $A^{\trans}A$ is an $n\times n$ square matrix.

To show that it is invertible, we show that the kernel of $A^{\trans}A$ is trivial.

The result then follows since $A^{\trans}A$ is an injective linear transformation from $\R^n$ to $\R^n$, and any injective linear transformation from $\R^n$ to itself is an isomorphism. Hence $A^{\trans}A$ is invertible.

To show that the kernel of $A^{\trans}A$ is trivial, denote $A=[A_1 A_2\dots A_n]$, where $A_i$ is the $i$-th column vector of $A$.

Note that the column vectors $A_i$ are linearly independent since $\ker(A)=0$.

Then

\[A^{\trans}=\begin{bmatrix}
A_1^{\trans} \\
A_2^{\trans} \\
\vdots \\
A_n^{\trans}
\end{bmatrix}.\]

Now we suppose that $A^{\trans}Ax=0$ for $x=\begin{bmatrix}
x_1 \\
x_2 \\
\vdots \\
x_n
\end{bmatrix}\in \R^n$.

Then we have

\begin{align*}
0&=A^{\trans}A x\\[6pt]
&=\begin{bmatrix}
A_1^{\trans}A_1 & A_1^{\trans}A_2 & \cdots & A_1^{\trans}A_n \\
A_2^{\trans}A_1 & A_2^{\trans}A_2 & \cdots & A_2^{\trans}A_n \\
\vdots & \vdots & \ddots & \vdots \\
A_n^{\trans}A_1 & A_n^{\trans}A_2 & \cdots & A_n^{\trans}A_n
\end{bmatrix}x\\[6pt]
&=\begin{bmatrix}
A_1^{\trans}(x_1 A_1+\cdots +x_n A_n) \\
A_2^{\trans}(x_1 A_1+\cdots+x_n A_n) \\
\vdots \\
A_n^{\trans}(x_1 A_1+\cdots+x_n A_n)
\end{bmatrix}.
\end{align*}

Therefore we have $A_i^{\trans}(x_1 A_1+\cdots+x_n A_n)=0$ for all $i=1,\dots, n$.

Equivalently, we have the inner product $A_i\cdot Ax=0$ for all $i=1,\dots,n$.
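This identity can be checked numerically. The sketch below uses NumPy with made-up sizes $m=4$, $n=3$ (both the sizes and the random seed are illustrative assumptions, not part of the proof): the $i$-th entry of $A^{\trans}Ax$ equals the inner product $A_i\cdot Ax$.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical sizes for illustration: m = 4, n = 3.
A = rng.standard_normal((4, 3))
x = rng.standard_normal(3)

# The i-th entry of A^T A x ...
lhs = A.T @ (A @ x)
# ... equals the inner product of the i-th column A_i with Ax.
rhs = np.array([A[:, i] @ (A @ x) for i in range(3)])

print(np.allclose(lhs, rhs))  # True
```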

If $Ax=x_1 A_1+\cdots+x_n A_n\neq 0$, then $Ax$ is a nonzero vector in $\R^m$ orthogonal to each of $A_1, \dots, A_n$, and hence orthogonal to every vector in their span.

However, $Ax$ itself lies in that span, so we would have $Ax\cdot Ax=0$, contradicting $Ax\neq 0$. Thus, we must have $Ax=0$. Since the kernel of $A$ is trivial, this implies $x=0$. Therefore the kernel of $A^{\trans}A$ is trivial.
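As a numeric sanity check of the conclusion, here is a minimal NumPy sketch (the $5\times 3$ matrix below is a made-up example; its identity block forces the columns to be linearly independent, so $\ker(A)=0$) verifying that $A^{\trans}A$ is invertible:

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up 5 x 3 matrix with trivial kernel: the identity block
# guarantees the three columns are linearly independent.
A = np.vstack([np.eye(3), rng.standard_normal((2, 3))])

AtA = A.T @ A  # 3 x 3 square matrix

# Full rank (= n) means the kernel of A^T A is trivial, hence invertible.
print(np.linalg.matrix_rank(AtA))        # 3
print(abs(np.linalg.det(AtA)) > 1e-10)   # True
```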

## 2nd proof

In the second proof, we again show that the kernel of $A^{\trans}A$ is trivial.

Let $x\in \ker(A^{\trans}A)$.

Then we have $A^{\trans}(Ax)=0$ and thus $Ax \in \ker(A^{\trans})=\im(A)^{\perp}$.

On the other hand, clearly $Ax \in \im(A)$.

Thus $Ax \in \im(A)\cap \im(A)^{\perp}=\{0\}$.

So we have $Ax=0$, and $x=0$ since $\ker(A)=0$.

Therefore $\ker(A^{\trans}A)=0$, and since any injective homomorphism from $\R^n$ to itself is an isomorphism, we conclude that $A^{\trans}A$ is invertible.
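The key fact $\ker(A^{\trans})=\im(A)^{\perp}$ used above can also be illustrated numerically. In this hedged NumPy sketch (the matrix is a made-up example), the left-singular vectors of $A$ with zero singular value span $\ker(A^{\trans})$, and they turn out orthogonal to every column of $A$, i.e., to $\im(A)$:

```python
import numpy as np

rng = np.random.default_rng(2)

# Made-up 5 x 3 matrix with trivial kernel (identity block forces full column rank).
A = np.vstack([np.eye(3), rng.standard_normal((2, 3))])

# The last m - n = 2 left-singular vectors of A span ker(A^T).
U, s, Vt = np.linalg.svd(A)
ker_At = U[:, 3:]

# im(A) is spanned by the columns of A; check that ker(A^T) ⟂ im(A).
print(np.allclose(ker_At.T @ A, 0))  # True
```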
