Find an Orthonormal Basis of $\R^3$ Containing a Given Vector
Problem 600
Let $\mathbf{v}_1=\begin{bmatrix}
2/3 \\ 2/3 \\ 1/3
\end{bmatrix}$ be a vector in $\R^3$.
Find an orthonormal basis for $\R^3$ containing the vector $\mathbf{v}_1$.
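
One way to sanity-check an answer numerically (a sketch, not part of the original problem set, assuming numpy) is to run Gram-Schmidt on $\mathbf{v}_1$ padded with two standard basis vectors; the padding vectors $\mathbf{e}_1, \mathbf{e}_2$ are an arbitrary choice for illustration.

```python
import numpy as np

# Start from v1, pad with standard basis vectors, and orthonormalize
# with classical Gram-Schmidt.
v1 = np.array([2/3, 2/3, 1/3])
candidates = [v1, np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])]

basis = []
for w in candidates:
    for u in basis:
        w = w - np.dot(u, w) * u   # remove components along earlier vectors
    if np.linalg.norm(w) > 1e-12:  # keep only vectors that survive
        basis.append(w / np.linalg.norm(w))

B = np.column_stack(basis)
print(B)
print(B.T @ B)  # should be (numerically) the 3x3 identity matrix
```
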

Let $A$ be a real symmetric matrix whose diagonal entries are all positive real numbers.
Is it true that all of the diagonal entries of the inverse matrix $A^{-1}$ are also positive?
If so, prove it. Otherwise, give a counterexample.
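
Before committing to a proof or a counterexample, the question can be probed experimentally. Here is a quick sketch, assuming numpy; the test matrix is an arbitrary symmetric matrix with positive diagonal entries, not necessarily the intended answer.

```python
import numpy as np

# An arbitrary symmetric matrix whose diagonal entries are positive.
A = np.array([[1.0, 2.0],
              [2.0, 1.0]])
Ainv = np.linalg.inv(A)
print(np.diag(Ainv))  # inspect the signs of the diagonal of the inverse
```
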
Let $R$ be a commutative ring with $1$.
Prove that if every proper ideal of $R$ is a prime ideal, then $R$ is a field.

Let $F:\R^2\to \R^2$ be the function that maps each vector in $\R^2$ to its reflection with respect to the $x$-axis.
Determine the formula for the function $F$ and prove that $F$ is a linear transformation.
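
A small numerical sketch, assuming numpy: it tests additivity and homogeneity of the reflection $F(x, y) = (x, -y)$ at randomly sampled points (the sample points and the scalar are arbitrary choices).

```python
import numpy as np

def F(v):
    # Reflection across the x-axis: keep x, negate y.
    return np.array([v[0], -v[1]])

rng = np.random.default_rng(0)
u, v = rng.standard_normal(2), rng.standard_normal(2)
c = 2.5
print(np.allclose(F(u + v), F(u) + F(v)))  # additivity
print(np.allclose(F(c * u), c * F(u)))     # homogeneity
```
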
Let
\[A=\begin{bmatrix}
a & b\\
-b& a
\end{bmatrix}\]
be a $2\times 2$ matrix, where $a, b$ are real numbers.
Suppose that $b\neq 0$.
Prove that the matrix $A$ does not have real eigenvalues.
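
A symbolic cross-check, assuming sympy, of the characteristic roots of $A$:

```python
from sympy import symbols, Matrix

a, b = symbols('a b', real=True)
A = Matrix([[a, b],
            [-b, a]])
# The roots of the characteristic polynomial (a - x)^2 + b^2 are
# a + I*b and a - I*b, which are non-real whenever b != 0.
print(A.eigenvals())
```
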
Let $U$ and $V$ be subspaces of the $n$-dimensional vector space $\R^n$.
Prove that the intersection $U\cap V$ is also a subspace of $\R^n$.

Is it possible that every element of an infinite group has finite order?
If so, give an example. Otherwise, prove that no such group exists.

We fix a nonzero vector $\mathbf{a}$ in $\R^3$ and define a map $T:\R^3\to \R^3$ by
\[T(\mathbf{v})=\mathbf{a}\times \mathbf{v}\]
for all $\mathbf{v}\in \R^3$.
Here the right-hand side is the cross product of $\mathbf{a}$ and $\mathbf{v}$.
(a) Prove that $T:\R^3\to \R^3$ is a linear transformation.
(b) Determine the eigenvalues and eigenvectors of $T$.
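
For part (b), a numerical sketch, assuming numpy; the vector $\mathbf{a} = (1, 2, 3)$ is an arbitrary choice. The matrix of $T$ with respect to the standard basis has columns $T(\mathbf{e}_1), T(\mathbf{e}_2), T(\mathbf{e}_3)$.

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])  # an arbitrary nonzero vector
e = np.eye(3)
# Column j of the matrix of T is the cross product a x e_j.
M = np.column_stack([np.cross(a, e[:, j]) for j in range(3)])
print(np.linalg.eigvals(M))  # expect 0 and a purely imaginary conjugate pair
```
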
Let $\R^n$ be an inner product space with inner product $\langle \mathbf{x}, \mathbf{y}\rangle=\mathbf{x}^{\trans}\mathbf{y}$ for $\mathbf{x}, \mathbf{y}\in \R^n$.
A linear transformation $T:\R^n \to \R^n$ is called an orthogonal transformation if for all $\mathbf{x}, \mathbf{y}\in \R^n$, it satisfies
\[\langle T(\mathbf{x}), T(\mathbf{y})\rangle=\langle\mathbf{x}, \mathbf{y} \rangle.\]
Prove that if $T:\R^n\to \R^n$ is an orthogonal transformation, then $T$ is an isomorphism.

Let $S=\{\mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_k\}$ be a set of nonzero vectors in $\R^n$.
Suppose that $S$ is an orthogonal set.
(a) Show that $S$ is linearly independent.
(b) If $k=n$, then prove that $S$ is a basis for $\R^n$.

Let $C[-1, 1]$ be the vector space over $\R$ of all continuous functions defined on the interval $[-1, 1]$. Let
\[V:=\{f(x)\in C[-1,1] \mid f(x)=a e^x+b e^{2x}+c e^{3x}, a, b, c\in \R\}\]
be a subset of $C[-1, 1]$.
(a) Prove that $V$ is a subspace of $C[-1, 1]$.
(b) Prove that the set $B=\{e^x, e^{2x}, e^{3x}\}$ is a basis of $V$.
(c) Prove that
\[B'=\{e^x-2e^{3x}, e^x+e^{2x}+2e^{3x}, 3e^{2x}+e^{3x}\}\]
is a basis for $V$.
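
For part (c), a numerical sketch, assuming numpy: in the coordinates of $B=\{e^x, e^{2x}, e^{3x}\}$, the three functions in $B'$ become the columns of the matrix below, and a nonzero determinant confirms that they are linearly independent.

```python
import numpy as np

# Columns: coordinate vectors of B' relative to B = {e^x, e^{2x}, e^{3x}}.
C = np.array([[ 1, 1, 0],   # coefficients of e^x
              [ 0, 1, 3],   # coefficients of e^{2x}
              [-2, 2, 1]])  # coefficients of e^{3x}
print(np.linalg.det(C))  # nonzero, so B' is linearly independent
```
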
Let $R$ be an integral domain and let $I$ be an ideal of $R$.
Is the quotient ring $R/I$ an integral domain?

Let $P_2$ be the vector space over $\R$ of all polynomials of degree $2$ or less.
Let $S=\{p_1(x), p_2(x), p_3(x)\}$, where
\[p_1(x)=x^2+1, \quad p_2(x)=6x^2+x+2, \quad p_3(x)=3x^2+x.\]
(a) Use the basis $B=\{x^2, x, 1\}$ of $P_2$ to prove that the set $S$ is a basis for $P_2$.
(b) Find the coordinate vector of $p(x)=x^2+2x+3\in P_2$ with respect to the basis $S$.
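
For part (b), a numerical sketch, assuming numpy: in the coordinates of $B = \{x^2, x, 1\}$, the coordinate vector $[p]_S$ solves the $3\times 3$ linear system whose coefficient columns are the coordinate vectors of $p_1, p_2, p_3$.

```python
import numpy as np

# Columns: p1, p2, p3 in the coordinates of B = {x^2, x, 1}.
M = np.array([[1, 6, 3],
              [0, 1, 1],
              [1, 2, 0]])
p = np.array([1, 2, 3])       # p(x) = x^2 + 2x + 3 in the same coordinates
print(np.linalg.solve(M, p))  # the coordinate vector of p with respect to S
```
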
Let $A$ and $B$ be square matrices that commute with each other: $AB=BA$.
Assume that $A-B$ is a nilpotent matrix.
Prove that the eigenvalues of $A$ and $B$ are the same.

Let $V$ be the vector space over $\R$ of all real $2\times 2$ matrices.
Let $W$ be the subset of $V$ consisting of all symmetric matrices.
(a) Prove that $W$ is a subspace of $V$.
(b) Find a basis of $W$.
(c) Determine the dimension of $W$.
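
A numerical cross-check for parts (b) and (c), assuming numpy; the three matrices below are a candidate basis (an assumption the proof would need to justify), and the rank computation confirms that they are linearly independent.

```python
import numpy as np

# Candidate spanning set for the symmetric 2x2 matrices.
E1 = np.array([[1, 0], [0, 0]])
E2 = np.array([[0, 0], [0, 1]])
E3 = np.array([[0, 1], [1, 0]])
M = np.column_stack([E.flatten() for E in (E1, E2, E3)])
print(np.linalg.matrix_rank(M))  # rank 3, so the three matrices are independent
```
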
Consider the Hermitian matrix
\[A=\begin{bmatrix}
1 & i\\
-i& 1
\end{bmatrix}.\]
(a) Find the eigenvalues of $A$.
(b) For each eigenvalue of $A$, find the eigenvectors.
(c) Diagonalize the Hermitian matrix $A$ by a unitary matrix. Namely, find a diagonal matrix $D$ and a unitary matrix $U$ such that $U^{-1}AU=D$.
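
A numerical sketch, assuming numpy: np.linalg.eigh is designed for Hermitian matrices and returns real eigenvalues together with a unitary matrix of eigenvectors, which gives candidates for $D$ and $U$.

```python
import numpy as np

A = np.array([[1, 1j],
              [-1j, 1]])
evals, U = np.linalg.eigh(A)  # eigh is specialized to Hermitian matrices
D = np.diag(evals)
print(evals)  # expect the real eigenvalues 0 and 2
# U is unitary, so U^{-1} = U^*; check that U^* A U is diagonal.
print(np.allclose(U.conj().T @ A @ U, D))
```
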
Prove that the matrix
\[A=\begin{bmatrix}
0 & 1\\
-1& 0
\end{bmatrix}\]
is diagonalizable.
Prove, however, that $A$ cannot be diagonalized by a real nonsingular matrix.
That is, there is no real nonsingular matrix $S$ such that $S^{-1}AS$ is a diagonal matrix.
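
A brief numerical illustration, assuming numpy, of why the diagonalization cannot be carried out over $\R$: the eigenvalues of $A$ are non-real.

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [-1.0, 0.0]])
print(np.linalg.eigvals(A))  # the non-real conjugate pair i and -i
```
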
Consider the $2\times 2$ complex matrix
\[A=\begin{bmatrix}
a & b-a\\
0& b
\end{bmatrix}.\]
(a) Find the eigenvalues of $A$.
(b) For each eigenvalue of $A$, determine the eigenvectors.
(c) Diagonalize the matrix $A$.
(d) Using the result of the diagonalization, compute and simplify $A^k$ for each positive integer $k$.
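
For part (d), a symbolic sketch, assuming sympy and that $a\neq b$, so the eigenvalues are distinct (if $a=b$, then $A$ is already diagonal):

```python
from sympy import symbols, Matrix, simplify

a, b, k = symbols('a b k')
A = Matrix([[a, b - a],
            [0, b]])
P, D = A.diagonalize()           # valid when the eigenvalues a, b are distinct
Dk = Matrix([[D[0, 0]**k, 0],
             [0, D[1, 1]**k]])   # D^k for a diagonal matrix
Ak = simplify(P * Dk * P.inv())  # A^k = P D^k P^{-1}
print(Ak)
```
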
A square matrix $A$ is called nilpotent if some power of $A$ is the zero matrix.
Namely, $A$ is nilpotent if there exists a positive integer $k$ such that $A^k=O$, where $O$ is the zero matrix.
Suppose that $A$ is a nilpotent matrix and let $B$ be an invertible matrix of the same size as $A$.
Is the matrix $B-A$ invertible? If so, prove it. Otherwise, give a counterexample.
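
A quick numerical experiment, assuming numpy; the pair below is an arbitrary test case, not necessarily the intended answer.

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [0.0, 0.0]])  # nilpotent: A @ A is the zero matrix
B = np.array([[0.0, 1.0],
              [1.0, 0.0]])  # invertible: det(B) = -1
print(np.linalg.det(B - A))  # a zero determinant means B - A is singular
```
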
Let $V$ be a vector space over a scalar field $K$.
Let $\mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_k$ be vectors in $V$ and consider the subset
\[W=\{a_1\mathbf{v}_1+a_2\mathbf{v}_2+\cdots+ a_k\mathbf{v}_k \mid a_1, a_2, \dots, a_k \in K \text{ and } a_1+a_2+\cdots+a_k=0\}.\]
So each element of $W$ is a linear combination of vectors $\mathbf{v}_1, \dots, \mathbf{v}_k$ such that the sum of the coefficients is zero.
Prove that $W$ is a subspace of $V$.