Prove that $\mathbf{v} \mathbf{v}^\trans$ is a Symmetric Matrix for any Vector $\mathbf{v}$
Problem 640
Let $\mathbf{v}$ be an $n \times 1$ column vector.
Prove that $\mathbf{v} \mathbf{v}^\trans$ is a symmetric matrix.
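Before writing the proof, a quick numerical sanity check can be reassuring. The following is a minimal sketch, assuming NumPy is available; the sample vector is an arbitrary choice, not part of the problem.

```python
import numpy as np

# An arbitrary sample column vector (any values would do).
v = np.array([[1.0], [-2.0], [3.0]])   # shape (3, 1)

M = v @ v.T                            # the n x n matrix v v^T
print(M)
print(np.allclose(M, M.T))             # expected: True, since (v v^T)^T = v v^T
```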
Let $\mathbf{v}$ be an $n \times 1$ column vector.
Prove that $\mathbf{v}^\trans \mathbf{v} = 0$ if and only if $\mathbf{v}$ is the zero vector $\mathbf{0}$.
Let $\mathbf{v}$ and $\mathbf{w}$ be two $n \times 1$ column vectors.
Prove that $\tr ( \mathbf{v} \mathbf{w}^\trans ) = \mathbf{v}^\trans \mathbf{w}$.
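A small numerical check of the identity (a sketch only, with arbitrarily chosen vectors and NumPy assumed available):

```python
import numpy as np

v = np.array([[1.0], [2.0], [-1.0]])
w = np.array([[3.0], [0.0], [4.0]])

lhs = np.trace(v @ w.T)        # tr(v w^T)
rhs = (v.T @ w).item()         # v^T w, a 1 x 1 matrix read off as a scalar
print(lhs, rhs)                # both should print -1.0 for these sample vectors
```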
Let $\mathbf{v}$ and $\mathbf{w}$ be two $n \times 1$ column vectors.
(a) Prove that $\mathbf{v}^\trans \mathbf{w} = \mathbf{w}^\trans \mathbf{v}$.
(b) Provide an example to show that $\mathbf{v} \mathbf{w}^\trans$ is not always equal to $\mathbf{w} \mathbf{v}^\trans$.
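For experimenting with candidate vectors in part (b), a minimal NumPy sketch (the sample vectors below are arbitrary, not part of the problem):

```python
import numpy as np

v = np.array([[1.0], [2.0]])
w = np.array([[0.0], [3.0]])

print((v.T @ w).item(), (w.T @ v).item())   # the two scalars in part (a)
print(np.allclose(v @ w.T, w @ v.T))        # compares the two outer products in part (b)
```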
Calculate the following expressions, using the matrices and the vector below (a quick numerical check is sketched after the list):
\[A = \begin{bmatrix} 2 & 3 \\ -5 & 1 \end{bmatrix}, \qquad B = \begin{bmatrix} 0 & -1 \\ 1 & -1 \end{bmatrix}, \qquad \mathbf{v} = \begin{bmatrix} 2 \\ -4 \end{bmatrix}\]
(a) $A B^\trans + \mathbf{v} \mathbf{v}^\trans$.
(b) $A \mathbf{v} - 2 \mathbf{v}$.
(c) $\mathbf{v}^{\trans} B$.
(d) $\mathbf{v}^\trans \mathbf{v} + \mathbf{v}^\trans B A^\trans \mathbf{v}$.
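A minimal NumPy sketch for checking hand computations of (a)-(d); it simply evaluates each expression with the given $A$, $B$, and $\mathbf{v}$.

```python
import numpy as np

A = np.array([[2.0, 3.0], [-5.0, 1.0]])
B = np.array([[0.0, -1.0], [1.0, -1.0]])
v = np.array([[2.0], [-4.0]])

print(A @ B.T + v @ v.T)                      # (a)
print(A @ v - 2 * v)                          # (b)
print(v.T @ B)                                # (c)
print((v.T @ v + v.T @ B @ A.T @ v).item())   # (d), a scalar
```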
Let $A$ and $B$ be $n \times n$ matrices, and $\mathbf{v}$ an $n \times 1$ column vector.
Use the matrix components to prove that $(A + B) \mathbf{v} = A\mathbf{v} + B\mathbf{v}$.
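The proof itself should be written out in components, but a random numerical check (a sketch, assuming NumPy; the dimension and seed are arbitrary) can confirm the statement before proving it:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
v = rng.standard_normal((n, 1))

print(np.allclose((A + B) @ v, A @ v + B @ v))   # expected: True
```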
Let $A$ and $B$ be $n \times n$ matrices.
Is it always true that $\tr(AB) = \tr(A) \tr(B)$?
If it is true, prove it. If not, give a counterexample.
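One way to explore the question numerically before committing to a proof or a counterexample (a sketch with randomly generated matrices, which are not part of the problem):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

print(np.trace(A @ B))             # tr(AB)
print(np.trace(A) * np.trace(B))   # tr(A) tr(B); compare the two values
```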
Let $A$ be an $n \times n$ matrix.
Is it true that $\tr ( A^\trans ) = \tr(A)$? If it is true, prove it. If not, give a counterexample.
Suppose that $B=\{\mathbf{v}_1, \mathbf{v}_2\}$ is a basis for $\R^2$. Let $S:=[\mathbf{v}_1, \mathbf{v}_2]$.
Note that as the column vectors of $S$ are linearly independent, the matrix $S$ is invertible.
Prove that for each vector $\mathbf{v} \in \R^2$, the vector $S^{-1}\mathbf{v}$ is the coordinate vector of $\mathbf{v}$ with respect to the basis $B$.
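A concrete check of the claim (a sketch with an arbitrarily chosen basis of $\R^2$, assuming NumPy): if $\mathbf{v} = c_1 \mathbf{v}_1 + c_2 \mathbf{v}_2$, then $S^{-1}\mathbf{v}$ should recover $(c_1, c_2)$.

```python
import numpy as np

v1 = np.array([[1.0], [1.0]])     # an arbitrary basis of R^2
v2 = np.array([[1.0], [-2.0]])
S = np.hstack([v1, v2])

c = np.array([[3.0], [-5.0]])     # intended coordinates (c1, c2)
v = S @ c                         # v = c1*v1 + c2*v2

print(np.linalg.solve(S, v))      # S^{-1} v, expected to reproduce c
```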
Let $A=\begin{bmatrix}
a & b\\
c& d
\end{bmatrix}$ be a $2\times 2$ matrix.
Express the eigenvalues of $A$ in terms of the trace and the determinant of $A$.
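As a numerical hint before working through the characteristic polynomial (a sketch with an arbitrary sample matrix, assuming NumPy), one can observe that the eigenvalues satisfy $\lambda_1+\lambda_2=\tr(A)$ and $\lambda_1\lambda_2=\det(A)$:

```python
import numpy as np

A = np.array([[2.0, 3.0], [1.0, 4.0]])   # an arbitrary 2 x 2 example
eig = np.linalg.eigvals(A)

print(eig.sum(), np.trace(A))            # sum of eigenvalues vs trace
print(eig.prod(), np.linalg.det(A))      # product of eigenvalues vs determinant
```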
Consider the matrix $A=\begin{bmatrix}
a & -b\\
b& a
\end{bmatrix}$, where $a$ and $b$ are real numbers and $b\neq 0$.
(a) Find all eigenvalues of $A$.
(b) For each eigenvalue of $A$, determine the eigenspace $E_{\lambda}$.
(c) Diagonalize the matrix $A$ by finding a nonsingular matrix $S$ and a diagonal matrix $D$ such that $S^{-1}AS=D$.
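A numerical sketch for checking a proposed answer to (c), using the sample values $a=1$, $b=2$ (which are not part of the problem); note that complex arithmetic may be needed, and NumPy handles it transparently.

```python
import numpy as np

a, b = 1.0, 2.0                          # sample values with b != 0
A = np.array([[a, -b], [b, a]])

eigvals, S = np.linalg.eig(A)            # columns of S are eigenvectors
D = np.diag(eigvals)

print(eigvals)                           # the eigenvalues of A
print(np.allclose(np.linalg.inv(S) @ A @ S, D))   # expected: True
```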
Diagonalize the $2\times 2$ matrix $A=\begin{bmatrix}
2 & -1\\
-1& 2
\end{bmatrix}$ by finding a nonsingular matrix $S$ and a diagonal matrix $D$ such that $S^{-1}AS=D$.
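One way to check a hand computation here is the same numerical sketch as above, specialized to this matrix (assuming NumPy):

```python
import numpy as np

A = np.array([[2.0, -1.0], [-1.0, 2.0]])

eigvals, S = np.linalg.eig(A)            # eigenvalues and eigenvectors of A
D = np.diag(eigvals)

print(eigvals)                           # the eigenvalues of A
print(np.allclose(np.linalg.inv(S) @ A @ S, D))   # expected: True
```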
Determine whether the function $T:\R^2 \to \R^3$ defined by
\[T\left(\, \begin{bmatrix}
x \\
y
\end{bmatrix} \,\right)
=
\begin{bmatrix}
x+y \\
x+1 \\
3y
\end{bmatrix}\]
is a linear transformation.
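One practical way to probe (but not prove) linearity is to test the defining properties on sample inputs; a minimal sketch, assuming NumPy, with arbitrarily chosen test vectors:

```python
import numpy as np

def T(u):
    # The map from the problem statement, acting on a length-2 array (x, y).
    x, y = u
    return np.array([x + y, x + 1, 3 * y])

u = np.array([1.0, 2.0])
w = np.array([-3.0, 0.5])

print(np.allclose(T(u + w), T(u) + T(w)))   # additivity on one pair of inputs
print(T(np.zeros(2)))                        # a linear map must send 0 to 0
```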
Let $A$ be an $n\times n$ matrix. Suppose that the sum of elements in each row of $A$ is zero.
Then prove that the matrix $A$ is singular.
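A quick numerical experiment (a sketch with a randomly generated matrix forced to have zero row sums, assuming NumPy) can suggest what to look for in the proof:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4))
A[:, -1] = -A[:, :-1].sum(axis=1)     # adjust the last column so each row sums to zero

print(A.sum(axis=1))                   # all (numerically) zero row sums
print(np.linalg.det(A))                # compare this value with zero
```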
Let $C[-2\pi, 2\pi]$ be the vector space of all real-valued continuous functions defined on the interval $[-2\pi, 2\pi]$.
Consider the subspace $W=\Span\{\sin^2(x), \cos^2(x)\}$ spanned by the functions $\sin^2(x)$ and $\cos^2(x)$.
(a) Prove that the set $B=\{\sin^2(x), \cos^2(x)\}$ is a basis for $W$.
(b) Prove that the set $\{\sin^2(x)-\cos^2(x), 1\}$ is a basis for $W$.
An $n\times n$ matrix $A$ is called orthogonal if $A^{\trans}A=I$.
Let $V$ be the vector space of all real $2\times 2$ matrices.
Consider the subset
\[W:=\{A\in V \mid \text{$A$ is an orthogonal matrix}\}.\]
Prove or disprove that $W$ is a subspace of $V$.
Let $A$ be a $2\times 2$ real symmetric matrix.
Prove that all the eigenvalues of $A$ are real numbers by considering the characteristic polynomial of $A$.
Let $A$ and $B$ be $n\times n$ matrices and assume that they commute: $AB=BA$.
Then prove that the matrices $A$ and $B$ share at least one common eigenvector.
Let $\calP_3$ be the vector space of all polynomials of degree $3$ or less.
Let
\[S=\{p_1(x), p_2(x), p_3(x), p_4(x)\},\]
where
\begin{align*}
p_1(x)&=1+3x+2x^2-x^3 & p_2(x)&=x+x^3\\
p_3(x)&=x+x^2-x^3 & p_4(x)&=3+8x+8x^3.
\end{align*}
(a) Find a basis $Q$ of the span $\Span(S)$ consisting of polynomials in $S$.
(b) For each polynomial in $S$ that is not in $Q$, find the coordinate vector with respect to the basis $Q$.
(The Ohio State University, Linear Algebra Midterm)
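For checking part (a) by machine, one can work with the coordinate vectors of the $p_i$ with respect to the standard basis $\{1, x, x^2, x^3\}$; the following is a sketch using SymPy's exact row reduction (the column order matches $p_1,\dots,p_4$).

```python
import sympy as sp

# Columns are the coefficient vectors of p1, ..., p4 in the basis {1, x, x^2, x^3}.
M = sp.Matrix([
    [1, 0, 0, 3],
    [3, 1, 1, 8],
    [2, 0, 1, 0],
    [-1, 1, -1, 8],
])

rref, pivots = M.rref()
print(rref)     # reduced row echelon form of the coefficient matrix
print(pivots)   # pivot columns indicate which p_i can form a basis of Span(S)
```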
Let $V$ be a vector space and $B$ be a basis for $V$.
Let $\mathbf{w}_1, \mathbf{w}_2, \mathbf{w}_3, \mathbf{w}_4, \mathbf{w}_5$ be vectors in $V$.
Suppose that $A$ is the matrix whose columns are the coordinate vectors of $\mathbf{w}_1, \mathbf{w}_2, \mathbf{w}_3, \mathbf{w}_4, \mathbf{w}_5$ with respect to the basis $B$.
After applying elementary row operations to $A$, we obtain the following matrix in reduced row echelon form:
\[\begin{bmatrix}
1 & 0 & 2 & 1 & 0 \\
0 & 1 & 3 & 0 & 1 \\
0 & 0 & 0 & 0 & 0 \\
0 & 0 & 0 & 0 & 0
\end{bmatrix}.\]
(a) What is the dimension of $V$?
(b) What is the dimension of $\Span\{\mathbf{w}_1, \mathbf{w}_2, \mathbf{w}_3, \mathbf{w}_4, \mathbf{w}_5\}$?
(The Ohio State University, Linear Algebra Midterm)