## The Intersection of Two Subspaces is also a Subspace

## Problem 595

Let $U$ and $V$ be subspaces of the $n$-dimensional vector space $\R^n$.

Prove that the intersection $U\cap V$ is also a subspace of $\R^n$.
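Before writing the proof, the subspace axioms can be sanity-checked numerically on a concrete pair of subspaces. The example below (the $xy$- and $xz$-planes in $\R^3$, an assumption chosen only for illustration) is not a proof, just a quick NumPy check that their intersection contains $\mathbf{0}$ and is closed under addition and scalar multiplication.

```python
import numpy as np

# Concrete example (an assumption for illustration): in R^3, take
#   U = the xy-plane = {x : x_3 = 0}
#   V = the xz-plane = {x : x_2 = 0}
# Their intersection is the x-axis.
def in_U(x):
    return np.isclose(x[2], 0.0)

def in_V(x):
    return np.isclose(x[1], 0.0)

def in_intersection(x):
    return in_U(x) and in_V(x)

# Check the subspace axioms on sample vectors from U ∩ V.
zero = np.zeros(3)
u = np.array([2.0, 0.0, 0.0])
w = np.array([-5.0, 0.0, 0.0])

closed_zero = in_intersection(zero)      # contains the zero vector
closed_add = in_intersection(u + w)      # closed under addition
closed_scale = in_intersection(3.7 * u)  # closed under scalar multiplication
print(closed_zero, closed_add, closed_scale)
```

The actual proof runs the same three checks symbolically: each defining condition of $U$ and of $V$ is preserved by $+$ and by scalars, so their conjunction is too.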

Is it possible that each element of an infinite group has a finite order?

If so, give an example. Otherwise, prove the non-existence of such a group.

We fix a nonzero vector $\mathbf{a}$ in $\R^3$ and define a map $T:\R^3\to \R^3$ by

\[T(\mathbf{v})=\mathbf{a}\times \mathbf{v}\]
for all $\mathbf{v}\in \R^3$.

Here the right-hand side is the cross product of $\mathbf{a}$ and $\mathbf{v}$.

**(a)** Prove that $T:\R^3\to \R^3$ is a linear transformation.

**(b)** Determine the eigenvalues and eigenvectors of $T$.
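For part (b), a numerical check is possible once a concrete $\mathbf{a}$ is fixed (the choice $\mathbf{a}=(1,2,2)$ below is an assumption; any nonzero vector works). The matrix of $T$ in the standard basis is the skew-symmetric cross-product matrix of $\mathbf{a}$, whose eigenvalues should come out as $0$ and $\pm i\|\mathbf{a}\|$.

```python
import numpy as np

# Hypothetical choice of the fixed vector: a = (1, 2, 2), so |a| = 3.
a = np.array([1.0, 2.0, 2.0])

# Matrix of T(v) = a x v in the standard basis: the skew-symmetric
# cross-product matrix of a.
K = np.array([[0.0,  -a[2],  a[1]],
              [a[2],  0.0,  -a[0]],
              [-a[1], a[0],  0.0]])

# Sanity check that K really computes the cross product.
v = np.array([3.0, -1.0, 4.0])
assert np.allclose(K @ v, np.cross(a, v))

# A real skew-symmetric 3x3 matrix has eigenvalues 0 and a purely imaginary
# pair; here they should be 0 and +-3i = +-i|a|.
eigs = np.linalg.eigvals(K)
moduli = np.sort(np.abs(eigs))
print(moduli)  # [0, 3, 3]

# Over R, the only eigenvalue is 0, with eigenvector a itself (a x a = 0).
assert np.allclose(K @ a, 0)
```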

Let $\R^n$ be an inner product space with inner product $\langle \mathbf{x}, \mathbf{y}\rangle=\mathbf{x}^{\trans}\mathbf{y}$ for $\mathbf{x}, \mathbf{y}\in \R^n$.

A linear transformation $T:\R^n \to \R^n$ is called an **orthogonal transformation** if for all $\mathbf{x}, \mathbf{y}\in \R^n$, it satisfies

\[\langle T(\mathbf{x}), T(\mathbf{y})\rangle=\langle\mathbf{x}, \mathbf{y} \rangle.\]

Prove that if $T:\R^n\to \R^n$ is an orthogonal transformation, then $T$ is an isomorphism.
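The claim can be illustrated numerically (again, not a proof). A hypothetical orthogonal transformation is obtained below from the $Q$ factor of a QR decomposition of a random matrix; the check confirms both that inner products are preserved and that $Q^{\trans}$ is an explicit inverse, which is the key step of the proof.

```python
import numpy as np

rng = np.random.default_rng(0)

# A hypothetical orthogonal transformation on R^4: the Q factor of a QR
# decomposition is an orthogonal matrix, so T(x) = Qx preserves inner products.
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))

x = rng.standard_normal(4)
y = rng.standard_normal(4)

# <T(x), T(y)> = <x, y>
preserves = np.isclose((Q @ x) @ (Q @ y), x @ y)

# Q^T Q = I, so Q^T is an explicit inverse of Q: T is bijective,
# hence an isomorphism.
invertible = np.allclose(Q.T @ Q, np.eye(4))
print(preserves, invertible)
```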

Let $S=\{\mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_k\}$ be a set of nonzero vectors in $\R^n$.

Suppose that $S$ is an orthogonal set.

**(a)** Show that $S$ is linearly independent.

**(b)** If $k=n$, then prove that $S$ is a basis for $\R^n$.
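The mechanism behind part (a) is that the Gram matrix $S^{\trans}S$ of an orthogonal set is diagonal with the (nonzero) squared lengths on the diagonal, hence invertible. A quick NumPy check on a hypothetical orthogonal set in $\R^3$:

```python
import numpy as np

# A hypothetical orthogonal set in R^3: nonzero, pairwise orthogonal,
# not necessarily unit length.
v1 = np.array([1.0, 1.0, 0.0])
v2 = np.array([1.0, -1.0, 0.0])
v3 = np.array([0.0, 0.0, 5.0])
S = np.column_stack([v1, v2, v3])

# The Gram matrix S^T S is diagonal with squared lengths on the diagonal,
# hence invertible, which forces linear independence.
G = S.T @ S
print(np.allclose(G, np.diag(np.diag(G))))  # True: G is diagonal

rank = np.linalg.matrix_rank(S)
print(rank)  # 3 = k: linearly independent, and a basis here since k = n
```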

Let $C[-1, 1]$ be the vector space over $\R$ of all continuous functions defined on the interval $[-1, 1]$. Let

\[V:=\{f(x)\in C[-1,1] \mid f(x)=a e^x+b e^{2x}+c e^{3x}, a, b, c\in \R\}\]
be a subset in $C[-1, 1]$.

**(a)** Prove that $V$ is a subspace of $C[-1, 1]$.

**(b)** Prove that the set $B=\{e^x, e^{2x}, e^{3x}\}$ is a basis of $V$.

**(c)** Prove that

\[B'=\{e^x-2e^{3x}, e^x+e^{2x}+2e^{3x}, 3e^{2x}+e^{3x}\}\]
is a basis for $V$.
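Once part (b) is established, part (c) reduces to a determinant computation: write the coordinates of the $B'$ vectors relative to $B$ as the columns of a matrix $M$; then $B'$ is a basis iff $M$ is nonsingular. A NumPy check of that determinant:

```python
import numpy as np

# Coordinates of the B' vectors relative to B = {e^x, e^{2x}, e^{3x}},
# read off from the given linear combinations (columns of M):
#   e^x - 2e^{3x}           -> (1, 0, -2)
#   e^x + e^{2x} + 2e^{3x}  -> (1, 1, 2)
#   3e^{2x} + e^{3x}        -> (0, 3, 1)
M = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 3.0],
              [-2.0, 2.0, 1.0]])

# B' is a basis of the 3-dimensional space V iff M is nonsingular.
det = np.linalg.det(M)
print(det)  # -11: nonzero, so B' is a basis
```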

Let $R$ be an integral domain and let $I$ be an ideal of $R$.

Is the quotient ring $R/I$ an integral domain?

Let $P_2$ be the vector space over $\R$ of all polynomials of degree $2$ or less.

Let $S=\{p_1(x), p_2(x), p_3(x)\}$, where

\[p_1(x)=x^2+1, \quad p_2(x)=6x^2+x+2, \quad p_3(x)=3x^2+x.\]

**(a)** Use the basis $B=\{x^2, x, 1\}$ of $P_2$ to prove that the set $S$ is a basis for $P_2$.

**(b)** Find the coordinate vector of $p(x)=x^2+2x+3\in P_2$ with respect to the basis $S$.
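Both parts come down to one linear system in the $B$-coordinates: the coordinate vectors of $p_1, p_2, p_3$ form the columns of a matrix $A$, part (a) asks that $\det A \neq 0$, and part (b) solves $A\mathbf{c} = [p]_B$. A NumPy check:

```python
import numpy as np

# Coordinates of p1, p2, p3 relative to B = {x^2, x, 1}, as columns.
A = np.array([[1.0, 6.0, 3.0],
              [0.0, 1.0, 1.0],
              [1.0, 2.0, 0.0]])

# (a) Nonzero determinant confirms S is a basis of P_2.
assert not np.isclose(np.linalg.det(A), 0.0)

# (b) Coordinate vector of p(x) = x^2 + 2x + 3: solve A c = [1, 2, 3]^T.
c = np.linalg.solve(A, np.array([1.0, 2.0, 3.0]))
print(c)  # [19, -8, 10]
```

So $[p]_S = (19, -8, 10)$, which one can verify directly: $19p_1(x) - 8p_2(x) + 10p_3(x) = x^2 + 2x + 3$.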

Let $A$ and $B$ be square matrices that commute with each other: $AB=BA$.

Assume that $A-B$ is a nilpotent matrix.

Then prove that the eigenvalues of $A$ and $B$ are the same.
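A small numerical instance of the claim (an assumption chosen for illustration, not the general proof): take a nilpotent $N$ with $N^2 = O$ and set $B = I + N$, $A = I + 2N$, so that $A$ and $B$ commute and $A - B = N$ is nilpotent.

```python
import numpy as np

# Hypothetical commuting pair with nilpotent difference:
# B = I + N, A = I + 2N, where N^2 = 0.
N = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.eye(2) + N
A = np.eye(2) + 2 * N

assert np.allclose(A @ B, B @ A)          # A and B commute
assert np.allclose((A - B) @ (A - B), 0)  # A - B is nilpotent

eig_A = np.sort(np.linalg.eigvals(A))
eig_B = np.sort(np.linalg.eigvals(B))
print(eig_A, eig_B)  # both are {1, 1}: the eigenvalues agree
```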

Let $V$ be the vector space over $\R$ of all real $2\times 2$ matrices.

Let $W$ be the subset of $V$ consisting of all symmetric matrices.

**(a)** Prove that $W$ is a subspace of $V$.

**(b)** Find a basis of $W$.

**(c)** Determine the dimension of $W$.
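For parts (b) and (c), the natural candidate basis consists of the three matrices below; flattening each to a vector in $\R^4$ lets NumPy confirm their independence with a rank computation (the spanning part follows since $\begin{bmatrix} a & b\\ b & c \end{bmatrix} = aE_1 + bE_2 + cE_3$).

```python
import numpy as np

# Candidate basis of W (symmetric 2x2 matrices), flattened to vectors in R^4
# so that linear independence becomes a rank computation.
E1 = np.array([[1.0, 0.0], [0.0, 0.0]])
E2 = np.array([[0.0, 1.0], [1.0, 0.0]])
E3 = np.array([[0.0, 0.0], [0.0, 1.0]])

S = np.column_stack([E.flatten() for E in (E1, E2, E3)])
dim = np.linalg.matrix_rank(S)
print(dim)  # 3: the three matrices are independent, so dim W = 3
```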

Consider the Hermitian matrix

\[A=\begin{bmatrix}
1 & i\\
-i & 1
\end{bmatrix}.\]

**(a)** Find the eigenvalues of $A$.

**(b)** For each eigenvalue of $A$, find the eigenvectors.

**(c)** Diagonalize the Hermitian matrix $A$ by a unitary matrix. Namely, find a diagonal matrix $D$ and a unitary matrix $U$ such that $U^{-1}AU=D$.
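All three parts can be cross-checked with `numpy.linalg.eigh`, which is designed for Hermitian matrices: it returns real eigenvalues and a unitary matrix of eigenvectors.

```python
import numpy as np

A = np.array([[1, 1j],
              [-1j, 1]])

# eigh handles Hermitian matrices: real eigenvalues in ascending order,
# and a matrix U whose columns are orthonormal eigenvectors.
eigvals, U = np.linalg.eigh(A)
print(eigvals)  # approximately [0, 2]

# U is unitary, and U^{-1} A U = U^* A U is the diagonal matrix D = diag(0, 2).
assert np.allclose(U.conj().T @ U, np.eye(2))
D = U.conj().T @ A @ U
assert np.allclose(D, np.diag(eigvals))
```

This matches the hand computation: the characteristic polynomial is $(1-\lambda)^2 - 1 = \lambda(\lambda - 2)$, giving eigenvalues $0$ and $2$.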

Prove that the matrix

\[A=\begin{bmatrix}
0 & 1\\
-1 & 0
\end{bmatrix}\]
is diagonalizable.

Prove, however, that $A$ cannot be diagonalized by a real nonsingular matrix.

That is, there is no real nonsingular matrix $S$ such that $S^{-1}AS$ is a diagonal matrix.
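The obstruction is visible numerically: over $\C$ the eigenvalues $\pm i$ are distinct (so $A$ is diagonalizable by a complex matrix), but the characteristic polynomial $x^2 + 1$ has no real root, so no real $S$ can work.

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [-1.0, 0.0]])

# Over C: two distinct eigenvalues +-i, so A is diagonalizable by a
# complex nonsingular matrix.
eigs = np.linalg.eigvals(A)
imag_parts = np.sort(eigs.imag)
print(imag_parts)  # [-1, 1]: the eigenvalues are -i and i

# Over R: the characteristic polynomial x^2 + 1 has no real root, so A has
# no real eigenvalue, and no real S can diagonalize it.
has_real_eigenvalue = any(np.isclose(l.imag, 0) for l in eigs)
print(has_real_eigenvalue)  # False
```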

Consider the $2\times 2$ complex matrix

\[A=\begin{bmatrix}
a & b-a\\
0 & b
\end{bmatrix}.\]

**(a)** Find the eigenvalues of $A$.

**(b)** For each eigenvalue of $A$, determine the eigenvectors.

**(c)** Diagonalize the matrix $A$.

**(d)** Using the result of the diagonalization, compute and simplify $A^k$ for each positive integer $k$.
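With concrete values for $a$ and $b$ (an assumption below; the interesting case is $a \neq b$, where $A$ has distinct eigenvalues and is diagonalizable), the closed form $A^k = \begin{bmatrix} a^k & b^k - a^k\\ 0 & b^k \end{bmatrix}$ obtained from the diagonalization can be checked against direct matrix powers:

```python
import numpy as np

# Concrete a, b (an assumption for illustration); a != b so that A has two
# distinct eigenvalues and is diagonalizable.
a, b = 2.0, 5.0
A = np.array([[a, b - a],
              [0.0, b]])

# Diagonalization A = P D P^{-1}, with eigenvector (1,0) for a and (1,1) for b.
P = np.array([[1.0, 1.0],
              [0.0, 1.0]])
D = np.diag([a, b])
assert np.allclose(A, P @ D @ np.linalg.inv(P))

# The closed form A^k = [[a^k, b^k - a^k], [0, b^k]] matches direct powering.
k = 7
Ak_formula = np.array([[a**k, b**k - a**k],
                       [0.0, b**k]])
Ak_power = np.linalg.matrix_power(A, k)
print(np.allclose(Ak_formula, Ak_power))
```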

A square matrix $A$ is called **nilpotent** if some power of $A$ is the zero matrix.

Namely, $A$ is nilpotent if there exists a positive integer $k$ such that $A^k=O$, where $O$ is the zero matrix.

Suppose that $A$ is a nilpotent matrix and let $B$ be an invertible matrix of the same size as $A$.

Is the matrix $B-A$ invertible? If so, prove it. Otherwise, give a counterexample.
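Since no commuting hypothesis is imposed here, one candidate counterexample (a specific pair chosen for illustration) can be verified numerically: $A$ nilpotent, $B$ invertible, yet $B - A$ singular.

```python
import numpy as np

# One possible counterexample when A and B need not commute:
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])   # nilpotent: A^2 = 0
B = np.array([[0.0, 1.0],
              [1.0, 0.0]])   # invertible: det B = -1

assert np.allclose(A @ A, 0)
assert not np.isclose(np.linalg.det(B), 0.0)

# Yet B - A = [[0, 0], [1, 0]] is singular.
det_diff = np.linalg.det(B - A)
print(det_diff)  # 0: B - A is not invertible
```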

Let $V$ be a vector space over a scalar field $K$.

Let $\mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_k$ be vectors in $V$ and consider the subset

\[W=\{a_1\mathbf{v}_1+a_2\mathbf{v}_2+\cdots+ a_k\mathbf{v}_k \mid a_1, a_2, \dots, a_k \in K \text{ and } a_1+a_2+\cdots+a_k=0\}.\]
So each element of $W$ is a linear combination of vectors $\mathbf{v}_1, \dots, \mathbf{v}_k$ such that the sum of the coefficients is zero.

Prove that $W$ is a subspace of $V$.

**(a)** Prove that the column vectors of every $3\times 5$ matrix $A$ are linearly dependent.

**(b)** Prove that the row vectors of every $5\times 3$ matrix $B$ are linearly dependent.
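Both parts rest on the same fact: the rank of a matrix is at most the smaller of its dimensions, so $5$ columns (or rows) living in a space of rank at most $3$ cannot be independent. A quick check on a random $3\times 5$ matrix:

```python
import numpy as np

rng = np.random.default_rng(1)

# Any 3x5 matrix has rank at most 3, so its 5 columns cannot be independent.
A = rng.integers(-5, 6, size=(3, 5)).astype(float)
rank = np.linalg.matrix_rank(A)
num_cols = A.shape[1]
print(rank, num_cols)  # rank <= 3 < 5 columns, so the columns are dependent
```

Part (b) is the same argument applied to $B^{\trans}$.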

Determine whether each of the following sets is a basis for $\R^3$.

**(a)** $S=\left\{\, \begin{bmatrix} 1 \\ 0 \\ -1 \end{bmatrix}, \begin{bmatrix} 2 \\ 1 \\ -1 \end{bmatrix}, \begin{bmatrix} -2 \\ 1 \\ 4 \end{bmatrix} \,\right\}$

**(b)** $S=\left\{\, \begin{bmatrix} 1 \\ 4 \\ 7 \end{bmatrix}, \begin{bmatrix} 2 \\ 5 \\ 8 \end{bmatrix}, \begin{bmatrix} 3 \\ 6 \\ 9 \end{bmatrix} \,\right\}$

**(c)** $S=\left\{\, \begin{bmatrix} 1 \\ 1 \\ 2 \end{bmatrix}, \begin{bmatrix} 0 \\ 1 \\ 7 \end{bmatrix} \,\right\}$

**(d)** $S=\left\{\, \begin{bmatrix} 1 \\ 2 \\ 5 \end{bmatrix}, \begin{bmatrix} 7 \\ 4 \\ 0 \end{bmatrix}, \begin{bmatrix} 3 \\ 8 \\ 6 \end{bmatrix}, \begin{bmatrix} -1 \\ 9 \\ 10 \end{bmatrix} \,\right\}$

Let $V$ be a subset of $\R^4$ consisting of vectors that are perpendicular to vectors $\mathbf{a}, \mathbf{b}$ and $\mathbf{c}$, where

\[\mathbf{a}=\begin{bmatrix} 1 \\ 0 \\ 1 \\ 0 \end{bmatrix}, \quad \mathbf{b}=\begin{bmatrix} 1 \\ 1 \\ 0 \\ 0 \end{bmatrix}, \quad \mathbf{c}=\begin{bmatrix} 0 \\ 1 \\ -1 \\ 0 \end{bmatrix}.\]

Namely,

\[V=\{\mathbf{x}\in \R^4 \mid \mathbf{a}^{\trans}\mathbf{x}=0, \mathbf{b}^{\trans}\mathbf{x}=0, \text{ and } \mathbf{c}^{\trans}\mathbf{x}=0\}.\]

**(a)** Prove that $V$ is a subspace of $\R^4$.

**(b)** Find a basis of $V$.

**(c)** Determine the dimension of $V$.
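Stacking $\mathbf{a}^{\trans}, \mathbf{b}^{\trans}, \mathbf{c}^{\trans}$ as the rows of a matrix, $V$ is exactly its null space; note the third row equals the second minus the first, so the rank is $2$ and rank-nullity gives $\dim V = 4 - 2 = 2$. A NumPy check (using the SVD to extract a null-space basis):

```python
import numpy as np

# Stack a, b, c as the rows of a coefficient matrix: V is its null space.
A = np.array([[1.0, 0.0, 1.0, 0.0],
              [1.0, 1.0, 0.0, 0.0],
              [0.0, 1.0, -1.0, 0.0]])

# Row 3 = row 2 - row 1, so rank A = 2 and, by rank-nullity, dim V = 4 - 2 = 2.
rank = np.linalg.matrix_rank(A)
dim_V = A.shape[1] - rank
print(rank, dim_V)  # 2, 2

# A null-space basis from the SVD: the right singular vectors belonging to
# the zero singular values.
_, s, Vt = np.linalg.svd(A)
null_basis = Vt[rank:]
assert np.allclose(A @ null_basis.T, 0)
```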

Let $V$ be a subspace of $\R^n$.

Suppose that $B=\{\mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_k\}$ is a basis of the subspace $V$.

Prove that every basis of $V$ consists of $k$ vectors in $V$.

Let $V$ be a subspace of $\R^n$.

Suppose that

\[S=\{\mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_m\}\]
is a spanning set for $V$.

Prove that any set of $m+1$ or more vectors in $V$ is linearly dependent.
