Dimension of the Sum of Two Subspaces
Problem 440
Let $U$ and $V$ be finite-dimensional subspaces of a vector space over a field $K$.
Then prove that
\[\dim(U+V) \leq \dim(U)+\dim(V).\]
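Before proving the bound, it may help to see it on a concrete example, where the inequality is strict. Take
\[U=\operatorname{span}\{\mathbf{e}_1, \mathbf{e}_2\}, \qquad V=\operatorname{span}\{\mathbf{e}_2, \mathbf{e}_3\}\]
in $\R^3$. Then $U+V=\R^3$, so
\[\dim(U+V)=3 < 4 = \dim(U)+\dim(V),\]
and the gap is accounted for by $\dim(U\cap V)=1$, consistent with the stronger identity $\dim(U+V)=\dim(U)+\dim(V)-\dim(U\cap V)$.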
Determine whether each of the following statements is True or False.
(a) If $A$ and $B$ are $n \times n$ matrices, and $P$ is an invertible $n \times n$ matrix such that $A=PBP^{-1}$, then $\det(A)=\det(B)$.
(b) If the characteristic polynomial of an $n \times n$ matrix $A$ is
\[p(\lambda)=(\lambda-1)^n+2,\]
then $A$ is invertible.
(c) If $A^2$ is an invertible $n\times n$ matrix, then $A^3$ is also invertible.
(d) If $A$ is a $3\times 3$ matrix such that $\det(A)=7$, then $\det(2A^{\trans}A^{-1})=2$.
(e) If $\mathbf{v}$ is an eigenvector of an $n \times n$ matrix $A$ with corresponding eigenvalue $\lambda_1$, and if $\mathbf{w}$ is an eigenvector of $A$ with corresponding eigenvalue $\lambda_2$, then $\mathbf{v}+\mathbf{w}$ is an eigenvector of $A$ with corresponding eigenvalue $\lambda_1+\lambda_2$.
(Stanford University, Linear Algebra Exam Problem)
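For part (a), a quick numerical sanity check (not a proof) can be run in Python. The matrices $B$ and $P$ below are arbitrary choices, and exact rational arithmetic avoids rounding issues:

```python
from fractions import Fraction as F

def det2(M):
    return M[0][0]*M[1][1] - M[0][1]*M[1][0]

def matmul2(X, Y):
    return [[sum(X[i][k]*Y[k][j] for k in range(2)) for j in range(2)] for i in range(2)]

def inv2(M):
    d = det2(M)
    return [[ M[1][1]/d, -M[0][1]/d],
            [-M[1][0]/d,  M[0][0]/d]]

B = [[F(1), F(2)], [F(3), F(4)]]
P = [[F(2), F(1)], [F(1), F(1)]]      # det P = 1, so P is invertible
A = matmul2(matmul2(P, B), inv2(P))   # A = P B P^{-1}
print(det2(A) == det2(B))  # True: similar matrices have equal determinants
```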
Let $\calF[0, 2\pi]$ be the vector space of all real-valued functions defined on the interval $[0, 2\pi]$.
Define the map $f:\R^2 \to \calF[0, 2\pi]$ by
\[\left(\, f\left(\, \begin{bmatrix}
\alpha \\
\beta
\end{bmatrix} \,\right) \,\right)(x):=\alpha \cos x + \beta \sin x.\]
We put
\[V:=\im f=\{\alpha \cos x + \beta \sin x \in \calF[0, 2\pi] \mid \alpha, \beta \in \R\}.\]
(a) Prove that the map $f$ is a linear transformation.
(b) Prove that the set $\{\cos x, \sin x\}$ is a basis of the vector space $V$.
(c) Prove that the kernel is trivial, that is, $\ker f=\{\mathbf{0}\}$.
(This yields an isomorphism of $\R^2$ and $V$.)
(d) Define a map $g:V \to V$ by
\[g(\alpha \cos x + \beta \sin x):=\frac{d}{dx}(\alpha \cos x+ \beta \sin x)=\beta \cos x -\alpha \sin x.\]
Prove that the map $g$ is a linear transformation.
(e) Find the matrix representation of the linear transformation $g$ with respect to the basis $\{\cos x, \sin x\}$.
(Kyoto University, Linear Algebra exam problem)
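The formula defining $g$ in part (d) can be sanity-checked numerically with a central difference (a sketch, not part of any proof; the sample points, coefficients, and step size are arbitrary choices):

```python
import math

def f_ab(alpha, beta):
    """The function x -> alpha*cos x + beta*sin x in V."""
    return lambda x: alpha * math.cos(x) + beta * math.sin(x)

def num_deriv(f, x, eps=1e-6):
    """Central-difference approximation to f'(x)."""
    return (f(x + eps) - f(x - eps)) / (2 * eps)

alpha, beta = 2.0, -3.0
g_img = f_ab(beta, -alpha)   # claimed derivative: beta*cos x - alpha*sin x
print(all(abs(num_deriv(f_ab(alpha, beta), x) - g_img(x)) < 1e-6
          for x in [0.0, 1.0, 2.5, 5.0]))  # True
```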
Let $P_3$ be the vector space of polynomials of degree $3$ or less with real coefficients.
(a) Prove that differentiation is a linear transformation. That is, prove that the map $T:P_3 \to P_3$ defined by
\[T\left(\, f(x) \,\right)=\frac{d}{dx} f(x)\]
for any $f(x)\in P_3$ is a linear transformation.
(b) Let $B=\{1, x, x^2, x^3\}$ be a basis of $P_3$. With respect to the basis $B$, find the matrix representation of the linear transformation $T$ in part (a).
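Since an element of $P_3$ is determined by its coordinate vector with respect to $B$, the linearity claim in part (a) can be illustrated on coefficient lists (a sketch with arbitrarily chosen polynomials, not a proof):

```python
# Represent p(x) = c0 + c1*x + c2*x^2 + c3*x^3 in P_3 by its coefficient list [c0, c1, c2, c3].
def deriv(c):
    """Coefficients of the derivative of c0 + c1 x + c2 x^2 + c3 x^3."""
    return [c[1], 2*c[2], 3*c[3], 0]

def add(p, q):   return [a + b for a, b in zip(p, q)]
def scale(k, p): return [k * a for a in p]

p, q, k = [1, -2, 0, 5], [3, 1, 4, -1], 7
# Linearity: T(p + q) = T(p) + T(q) and T(k p) = k T(p)
print(deriv(add(p, q)) == add(deriv(p), deriv(q)))  # True
print(deriv(scale(k, p)) == scale(k, deriv(p)))     # True
```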
Let $V$ be a vector space over a field $K$.
If $W_1$ and $W_2$ are subspaces of $V$, then prove that the subset
\[W_1+W_2:=\{\mathbf{x}+\mathbf{y} \mid \mathbf{x}\in W_1, \mathbf{y}\in W_2\}\]
is a subspace of the vector space $V$.
Let $A$ be an $n\times n$ idempotent matrix, that is, $A^2=A$. Then prove that $A$ is diagonalizable.
Let $T:\R^3 \to \R^3$ be a linear transformation and suppose that its matrix representation with respect to the standard basis is given by the matrix
\[A=\begin{bmatrix}
1 & 0 & 2 \\
0 &3 &0 \\
4 & 0 & 5
\end{bmatrix}.\]
(a) Prove that the linear transformation $T$ sends points on the $x$-$z$ plane to points on the $x$-$z$ plane.
(b) Prove that the restriction of $T$ on the $x$-$z$ plane is a linear transformation.
(c) Find the matrix representation of the linear transformation obtained in part (b) with respect to the standard basis
\[\left\{\, \begin{bmatrix}
1 \\
0 \\
0
\end{bmatrix}, \begin{bmatrix}
0 \\
0 \\
1
\end{bmatrix} \,\right\}\]
of the $x$-$z$ plane.
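The invariance claimed in part (a) can be checked numerically: points on the $x$-$z$ plane are exactly those with second coordinate $0$, and applying $A$ should preserve that (a sketch over a small grid of sample points):

```python
# Matrix of T from the problem statement.
A = [[1, 0, 2],
     [0, 3, 0],
     [4, 0, 5]]

def matvec(M, v):
    return [sum(M[i][k] * v[k] for k in range(3)) for i in range(3)]

# Check that T keeps vectors of the form (x, 0, z) on the x-z plane.
print(all(matvec(A, [x, 0, z])[1] == 0
          for x in range(-3, 4) for z in range(-3, 4)))  # True
```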
Let $W_1, W_2$ be subspaces of a vector space $V$. Then prove that $W_1 \cup W_2$ is a subspace of $V$ if and only if $W_1 \subset W_2$ or $W_2 \subset W_1$.
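To see why some containment is necessary, consider the coordinate axes in $\R^2$:
\[W_1=\{(x,0) \mid x \in \R\}, \qquad W_2=\{(0,y) \mid y \in \R\}.\]
Both $(1,0)$ and $(0,1)$ lie in $W_1 \cup W_2$, but their sum $(1,1)$ lies in neither axis, so $W_1 \cup W_2$ is not closed under addition and hence is not a subspace.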
A square matrix $A$ is called idempotent if $A^2=A$.
(a) Suppose $A$ is an $n \times n$ idempotent matrix and let $I$ be the $n\times n$ identity matrix. Prove that the matrix $I-A$ is an idempotent matrix.
(b) Assume that $A$ is an $n\times n$ nonzero idempotent matrix. Then determine all integers $k$ such that the matrix $I-kA$ is idempotent.
(c) Let $A$ and $B$ be $n\times n$ matrices satisfying
\[AB=A \text{ and } BA=B.\]
Then prove that $A$ is an idempotent matrix.
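Part (a) can be sanity-checked on a concrete idempotent matrix (the matrix below is a hypothetical choice, a projection onto the line $y=0$ along $(1,-1)$; a check, not a proof):

```python
# A hypothetical 2x2 idempotent matrix.
A = [[1, 1],
     [0, 0]]

def matmul(X, Y):
    return [[sum(X[i][k]*Y[k][j] for k in range(2)) for j in range(2)] for i in range(2)]

I = [[1, 0], [0, 1]]
IA = [[I[i][j] - A[i][j] for j in range(2)] for i in range(2)]

print(matmul(A, A) == A)     # True: A is idempotent
print(matmul(IA, IA) == IA)  # True: so is I - A
```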
(a) Prove that each complex $n\times n$ matrix $A$ can be written as
\[A=B+iC,\]
where $B$ and $C$ are Hermitian matrices.
(b) Write the complex matrix
\[A=\begin{bmatrix}
i & 6\\
2-i& 1+i
\end{bmatrix}\]
as a sum $A=B+iC$, where $B$ and $C$ are Hermitian matrices.
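One natural candidate pair (which the reader may wish to derive independently) is $B=\tfrac{1}{2}(A+A^*)$ and $C=\tfrac{1}{2i}(A-A^*)$. The script below checks these candidates against the matrix of part (b):

```python
def ctranspose(M):
    """Conjugate transpose A* of a 2x2 complex matrix."""
    return [[M[j][i].conjugate() for j in range(2)] for i in range(2)]

A = [[1j,   6+0j],
     [2-1j, 1+1j]]
As = ctranspose(A)
B = [[(A[i][j] + As[i][j]) / 2  for j in range(2)] for i in range(2)]
C = [[(A[i][j] - As[i][j]) / 2j for j in range(2)] for i in range(2)]

is_hermitian = lambda M: M == ctranspose(M)
print(is_hermitian(B) and is_hermitian(C))  # True
print(all(A[i][j] == B[i][j] + 1j*C[i][j] for i in range(2) for j in range(2)))  # True
```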
Let $A$ and $B$ be $n\times n$ matrices.
Suppose that $A$ and $B$ have the same eigenvalues $\lambda_1, \dots, \lambda_n$ with the same corresponding eigenvectors $\mathbf{x}_1, \dots, \mathbf{x}_n$.
Prove that if the eigenvectors $\mathbf{x}_1, \dots, \mathbf{x}_n$ are linearly independent, then $A=B$.
Determine all $2\times 2$ matrices $A$ such that $A$ has eigenvalues $2$ and $-1$ with corresponding eigenvectors
\[\begin{bmatrix}
1 \\
0
\end{bmatrix} \text{ and } \begin{bmatrix}
2 \\
1
\end{bmatrix},\]
respectively.
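Any proposed answer can be verified directly against the defining conditions $A\mathbf{v}=\lambda\mathbf{v}$ for each eigenpair. A sketch in Python with one candidate matrix (readers should determine for themselves whether it is the only one):

```python
def has_eigenpair(M, v, lam):
    """Check M v == lam v for a 2x2 integer matrix and 2-vector."""
    Mv = [M[0][0]*v[0] + M[0][1]*v[1],
          M[1][0]*v[0] + M[1][1]*v[1]]
    return Mv == [lam*v[0], lam*v[1]]

# Candidate matrix to illustrate the check.
A = [[2, -6],
     [0, -1]]
print(has_eigenpair(A, [1, 0], 2) and has_eigenpair(A, [2, 1], -1))  # True
```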
Find the inverse matrix of the matrix
\[A=\begin{bmatrix}
1 & 1 & 2 \\
9 &2 &0 \\
5 & 0 & 3
\end{bmatrix}\]
using the Cayley–Hamilton theorem.
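The method can first be rehearsed on a $2\times 2$ example, where Cayley–Hamilton reads $A^2-\operatorname{tr}(A)A+\det(A)I=O$, and hence $A^{-1}=\frac{1}{\det A}\left(\operatorname{tr}(A)I-A\right)$. The matrix below is a hypothetical stand-in, not the $3\times 3$ matrix of the problem:

```python
from fractions import Fraction as F

A = [[F(1), F(2)],
     [F(3), F(5)]]
tr  = A[0][0] + A[1][1]
det = A[0][0]*A[1][1] - A[0][1]*A[1][0]

# Cayley-Hamilton: A^2 - tr(A) A + det(A) I = O, so A^{-1} = (tr(A) I - A)/det(A).
Ainv = [[(tr*(i == j) - A[i][j]) / det for j in range(2)] for i in range(2)]

prod = [[sum(A[i][k]*Ainv[k][j] for k in range(2)) for j in range(2)] for i in range(2)]
print(prod == [[1, 0], [0, 1]])  # True: A Ainv = I
```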
(a) Let $A$ be a real orthogonal $n\times n$ matrix. Prove that the modulus (absolute value) of each eigenvalue of $A$ is $1$.
(b) Let $A$ be a real orthogonal $3\times 3$ matrix and suppose that the determinant of $A$ is $1$. Then prove that $A$ has $1$ as an eigenvalue.
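Part (b) can be checked on a concrete orthogonal matrix (a hypothetical choice: rotation by $90^\circ$ about the $z$-axis). Since $1$ is an eigenvalue exactly when $\det(A-I)=0$:

```python
# Rotation by 90 degrees about the z-axis: orthogonal with det = 1.
A = [[0, -1, 0],
     [1,  0, 0],
     [0,  0, 1]]

def det3(M):
    return (M[0][0]*(M[1][1]*M[2][2] - M[1][2]*M[2][1])
          - M[0][1]*(M[1][0]*M[2][2] - M[1][2]*M[2][0])
          + M[0][2]*(M[1][0]*M[2][1] - M[1][1]*M[2][0]))

AmI = [[A[i][j] - (i == j) for j in range(3)] for i in range(3)]
print(det3(A) == 1)    # True: det A = 1
print(det3(AmI) == 0)  # True: det(A - I) = 0, so 1 is an eigenvalue
```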
Let $n$ be an odd integer and let $A$ be an $n\times n$ real matrix.
Prove that the matrix $A$ has at least one real eigenvalue.
Let $A$ be an $n\times n$ matrix. Suppose that $\mathbf{y}$ is a nonzero row vector such that
\[\mathbf{y}A=\mathbf{y}.\]
(Here a row vector means a $1\times n$ matrix.)
Prove that there is a nonzero column vector $\mathbf{x}$ such that
\[A\mathbf{x}=\mathbf{x}.\]
(Here a column vector means an $n \times 1$ matrix.)
Recall that a complex matrix is called Hermitian if $A^*=A$, where $A^*=\bar{A}^{\trans}$.
Prove that every Hermitian matrix $A$ can be written as the sum
\[A=B+iC,\]
where $B$ is a real symmetric matrix and $C$ is a real skew-symmetric matrix.
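A candidate decomposition takes $B$ and $C$ to be the entrywise real and imaginary parts of $A$; the sketch below checks this on a hypothetical $2\times 2$ Hermitian matrix:

```python
A = [[2+0j, 1+3j],
     [1-3j, 5+0j]]   # a hypothetical Hermitian matrix: A* = A

B = [[A[i][j].real for j in range(2)] for i in range(2)]  # entrywise real part
C = [[A[i][j].imag for j in range(2)] for i in range(2)]  # entrywise imaginary part

print(all(B[i][j] ==  B[j][i] for i in range(2) for j in range(2)))  # True: B is symmetric
print(all(C[i][j] == -C[j][i] for i in range(2) for j in range(2)))  # True: C is skew-symmetric
print(all(A[i][j] == B[i][j] + 1j*C[i][j] for i in range(2) for j in range(2)))  # True: A = B + iC
```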
Let $A$ be an $n\times n$ real matrix.
Prove that if $\lambda$ is an eigenvalue of $A$, then its complex conjugate $\bar{\lambda}$ is also an eigenvalue of $A$.
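For a real $2\times 2$ matrix the eigenvalues are the roots of $t^2-\operatorname{tr}(A)t+\det(A)$, so the conjugate-pair phenomenon can be observed directly (a sketch with a hypothetical rotation matrix whose eigenvalues are $\pm i$):

```python
import cmath

A = [[0, -1],
     [1,  0]]   # rotation by 90 degrees
tr  = A[0][0] + A[1][1]
det = A[0][0]*A[1][1] - A[0][1]*A[1][0]
disc = cmath.sqrt(tr*tr - 4*det)
l1, l2 = (tr + disc) / 2, (tr - disc) / 2
print(l2 == l1.conjugate())  # True: non-real eigenvalues come in conjugate pairs
```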
Let $A$ be an $n\times n$ matrix. Suppose that $A$ has real eigenvalues $\lambda_1, \lambda_2, \dots, \lambda_n$ with corresponding eigenvectors $\mathbf{u}_1, \mathbf{u}_2, \dots, \mathbf{u}_n$.
Furthermore, suppose that
\[|\lambda_1| > |\lambda_2| \geq \cdots \geq |\lambda_n|.\]
Let
\[\mathbf{x}_0=c_1\mathbf{u}_1+c_2\mathbf{u}_2+\cdots+c_n\mathbf{u}_n\]
for some real numbers $c_1, c_2, \dots, c_n$ with $c_1\neq 0$.
Define
\[\mathbf{x}_{k+1}=A\mathbf{x}_k \text{ for } k=0, 1, 2,\dots\]
and let
\[\beta_k=\frac{\mathbf{x}_k\cdot \mathbf{x}_{k+1}}{\mathbf{x}_k \cdot \mathbf{x}_k}=\frac{\mathbf{x}_k^{\trans} \mathbf{x}_{k+1}}{\mathbf{x}_k^{\trans} \mathbf{x}_k}.\]
Prove that
\[\lim_{k\to \infty} \beta_k=\lambda_1.\]
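The iteration above is the power method with a Rayleigh-quotient-style estimate, and the claimed limit can be observed numerically. The sketch below uses a hypothetical symmetric matrix with eigenvalues $3$ and $1$ and eigenvectors $(1,1)$, $(1,-1)$, and a starting vector with $c_1 \neq 0$:

```python
A = [[2.0, 1.0],
     [1.0, 2.0]]   # eigenvalues 3 and 1, so |lambda_1| > |lambda_2|

def matvec(M, v):
    return [sum(M[i][k] * v[k] for k in range(2)) for i in range(2)]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

x = [1.0, 0.0]   # x0 = (u1 + u2)/2, so c1 != 0
beta = 0.0
for _ in range(60):
    x_next = matvec(A, x)
    beta = dot(x, x_next) / dot(x, x)
    x = x_next

print(abs(beta - 3.0) < 1e-9)  # True: beta_k converges to the dominant eigenvalue
```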