# Subspaces of Symmetric, Skew-Symmetric Matrices

## Problem 143

Let $V$ be the vector space over $\R$ consisting of all $n\times n$ real matrices for some fixed integer $n$. Prove or disprove that the following subsets of $V$ are subspaces of $V$.

**(a)** The set $S$ consisting of all $n\times n$ symmetric matrices.

**(b)** The set $T$ consisting of all $n \times n$ skew-symmetric matrices.

**(c)** The set $U$ consisting of all $n\times n$ nonsingular matrices.


## Hint.

Recall that

- a matrix $A$ is symmetric if $A^{\trans}=A$.
- a matrix $A$ is skew-symmetric if $A^{\trans}=-A$.
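As an illustration of these definitions (not part of the proof), here is a small pure-Python check on concrete $2\times 2$ matrices, representing a matrix as a list of rows:

```python
def transpose(A):
    """Return the transpose of a matrix given as a list of rows."""
    return [list(col) for col in zip(*A)]

def is_symmetric(A):
    # A is symmetric if it equals its transpose.
    return transpose(A) == A

def is_skew_symmetric(A):
    # A is skew-symmetric if its transpose equals its negation.
    neg_A = [[-x for x in row] for row in A]
    return transpose(A) == neg_A

S_example = [[1, 2], [2, 3]]    # equals its own transpose
T_example = [[0, 5], [-5, 0]]   # transpose is its negation

print(is_symmetric(S_example))       # True
print(is_skew_symmetric(T_example))  # True
```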

## Proof.

To show that a subset $W$ of a vector space $V$ is a subspace, we need to check that

- the zero vector in $V$ is in $W$
- for any two vectors $u,v \in W$, we have $u+v \in W$
- for any scalar $c$ and any vector $u \in W$, we have $cu \in W$.

### (a) The set $S$ consisting of all $n\times n$ symmetric matrices.

We will prove that $S$ is a subspace of $V$. The zero vector $O$ in $V$ is the $n \times n$ zero matrix, and it is symmetric. Thus the zero vector $O\in S$ and condition 1 is met.

To check the second condition, take any $A, B \in S$, that is, $A, B$ are symmetric matrices.

To show that $A+B \in S$, we need to check that the matrix $A+B$ is symmetric.

We have

\begin{align*}
(A+B)^{\trans}=A^{\trans}+B^{\trans}=A+B
\end{align*}

since $A, B$ are symmetric. Thus $A+B$ is also symmetric, and $A+B \in S$. Condition 2 is also satisfied.

Finally, to check condition 3, let $A \in S$ and let $r\in \R$. We show that $rA \in S$, namely, we show that $rA$ is symmetric.

We have

\begin{align*}
(rA)^{\trans}=rA^{\trans}=rA
\end{align*}

since $A$ is symmetric. Thus $rA$ is symmetric and hence $rA \in S$.

Thus condition 3 is met.

By the subspace criteria, the subset $S$ is a subspace of the vector space $V$.
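The closure computations above can also be sanity-checked numerically. The following pure-Python sketch (an illustration on one concrete example, not a proof) verifies conditions 2 and 3 for a pair of symmetric $2\times 2$ matrices:

```python
def transpose(A):
    return [list(col) for col in zip(*A)]

def add(A, B):
    # Entrywise sum of two matrices of the same size.
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def scale(r, A):
    # Scalar multiple rA.
    return [[r * x for x in row] for row in A]

A = [[1, 4], [4, 2]]
B = [[0, -3], [-3, 7]]

assert transpose(A) == A and transpose(B) == B  # both symmetric
assert transpose(add(A, B)) == add(A, B)        # A + B is symmetric
assert transpose(scale(5, A)) == scale(5, A)    # 5A is symmetric
print("closure verified for this example")
```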

### (b) The set $T$ consisting of all $n \times n$ skew-symmetric matrices.

We will prove that $T$ is a subspace of $V$.

The zero vector $O$ in $V$ is the $n \times n$ zero matrix, and it is skew-symmetric because

\[O^{\trans}=O=-O.\]
Thus condition 1 is met.

For condition 2, take arbitrary elements $A, B \in T$. The matrices $A, B$ are skew-symmetric, namely, we have

\[A^{\trans}=-A \text{ and } B^{\trans}=-B. \tag{*}\]
We show that $A+B \in T$, or equivalently we show that the matrix $A+B$ is skew-symmetric.

We have

\begin{align*}
(A+B)^{\trans}=A^{\trans}+B^{\trans} \stackrel{*}{=} -A+(-B)=-(A+B).
\end{align*}

Therefore the matrix $A+B$ is skew-symmetric and condition 2 is met.

To prove the last condition, consider any $A \in T$ and $r \in \R$.

We show that $rA$ is skew-symmetric, and hence $rA \in T$.

Using the fact that $A$ is skew-symmetric ($A^{\trans}=-A$), we have

\[(rA)^{\trans}=rA^{\trans}=r(-A)=-rA.\]
Hence $rA$ is skew-symmetric and condition 3 is satisfied.

By the subspace criteria, the subset $T$ is a subspace of the vector space $V$.
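As with part (a), the closure conditions can be checked numerically on a concrete example. This pure-Python sketch (an illustration, not a proof) confirms that a sum and a scalar multiple of skew-symmetric matrices are again skew-symmetric:

```python
def transpose(A):
    return [list(col) for col in zip(*A)]

def neg(A):
    # Entrywise negation -A.
    return [[-x for x in row] for row in A]

def add(A, B):
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def scale(r, A):
    return [[r * x for x in row] for row in A]

A = [[0, 2], [-2, 0]]
B = [[0, -7], [7, 0]]

assert transpose(A) == neg(A) and transpose(B) == neg(B)  # both skew-symmetric
assert transpose(add(A, B)) == neg(add(A, B))             # A + B is skew-symmetric
assert transpose(scale(4, A)) == neg(scale(4, A))         # 4A is skew-symmetric
print("closure verified for this example")
```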

### (c) The set $U$ consisting of all $n\times n$ nonsingular matrices.

We claim that $U$ is not a subspace of $V$.

As the zero vector of $V$ is the $n \times n$ zero matrix, which is singular, the zero vector is not in $U$. Hence condition 1 is not met, and thus $U$ is not a subspace.

Another reason that $U$ is not a subspace is that $U$ is not closed under addition. For example, if $A$ is a nonsingular matrix (say, $A$ is the $n\times n$ identity matrix), then $-A$ is also a nonsingular matrix, but their sum $A+(-A)=O$ is singular, hence it is not in $U$.
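This counterexample can be made concrete for $n=2$: the identity matrix $I$ and $-I$ both have nonzero determinant, but their sum is the zero matrix, whose determinant is $0$. A short pure-Python illustration:

```python
def det2(A):
    """Determinant of a 2x2 matrix given as a list of rows."""
    return A[0][0] * A[1][1] - A[0][1] * A[1][0]

I = [[1, 0], [0, 1]]          # nonsingular: det = 1
neg_I = [[-1, 0], [0, -1]]    # nonsingular: det = 1

# Sum I + (-I) is the zero matrix, which is singular (det = 0).
O = [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(I, neg_I)]

print(det2(I), det2(neg_I), det2(O))  # 1 1 0
```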
