Consider the linear combination
\[c_1\mathbf{v}_1+c_2\mathbf{v}_2+\cdots +c_k \mathbf{v}_k=\mathbf{0}.\]
Our goal is to show that $c_1=c_2=\cdots=c_k=0$.
We compute the dot product of $\mathbf{v}_i$ and the above linear combination for each $i=1, 2, \dots, k$:
\begin{align*}
0&=\mathbf{v}_i\cdot \mathbf{0}\\
&=\mathbf{v}_i \cdot (c_1\mathbf{v}_1+c_2\mathbf{v}_2+\cdots +c_k \mathbf{v}_k)\\
&=c_1\mathbf{v}_i \cdot \mathbf{v}_1+c_2\mathbf{v}_i \cdot \mathbf{v}_2+\cdots +c_k \mathbf{v}_i \cdot\mathbf{v}_k.
\end{align*}
As $S$ is an orthogonal set, we have $\mathbf{v}_i\cdot \mathbf{v}_j=0$ if $i\neq j$.
Hence all terms but the $i$-th one are zero, and thus we have
\[0=c_i\mathbf{v}_i\cdot \mathbf{v}_i=c_i \|\mathbf{v}_i\|^2.\]
Since $\mathbf{v}_i$ is a nonzero vector, its length $\|\mathbf{v}_i\|$ is nonzero.
It follows that $c_i=0$.
As this computation holds for every $i=1, 2, \dots, k$, we conclude that $c_1=c_2=\cdots=c_k=0$.
Hence the set $S$ is linearly independent.
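As a quick numerical illustration of this fact (not a substitute for the proof), the sketch below uses a hypothetical orthogonal set of nonzero vectors in $\R^3$ and checks both hypotheses and the conclusion: the Gram matrix $S^{\trans}S$ is diagonal with the $\|\mathbf{v}_i\|^2$ on the diagonal, and the matrix with the $\mathbf{v}_i$ as columns has full column rank, so $S\mathbf{c}=\mathbf{0}$ forces $\mathbf{c}=\mathbf{0}$.

```python
import numpy as np

# A hypothetical orthogonal set in R^3: pairwise dot products are zero.
v1 = np.array([1.0, 1.0, 0.0])
v2 = np.array([1.0, -1.0, 0.0])
v3 = np.array([0.0, 0.0, 2.0])
S = np.column_stack([v1, v2, v3])

# Orthogonality: the Gram matrix S^T S is diagonal,
# with ||v_i||^2 on the diagonal.
gram = S.T @ S
assert np.allclose(gram, np.diag(np.diag(gram)))

# Linear independence: S c = 0 has only the trivial solution,
# i.e. S has full column rank.
assert np.linalg.matrix_rank(S) == 3
```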
(b) If $k=n$, then prove that $S$ is a basis for $\R^n$.
Suppose that $k=n$. Then by part (a), the set $S$ consists of $n$ linearly independent vectors in the $n$-dimensional vector space $\R^n$.
Thus, $S$ is also a spanning set of $\R^n$, and hence $S$ is a basis for $\R^n$.
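The spanning property can be seen concretely: for an orthogonal basis, the coordinates of any $\mathbf{x}\in\R^n$ are simply $c_i=\frac{\mathbf{x}\cdot\mathbf{v}_i}{\|\mathbf{v}_i\|^2}$ (take the dot product of $\mathbf{x}=c_1\mathbf{v}_1+\cdots+c_n\mathbf{v}_n$ with $\mathbf{v}_i$, as in part (a)). A small sketch with the same hypothetical orthogonal vectors:

```python
import numpy as np

# Hypothetical orthogonal basis of R^3 and an arbitrary vector x.
v1 = np.array([1.0, 1.0, 0.0])
v2 = np.array([1.0, -1.0, 0.0])
v3 = np.array([0.0, 0.0, 2.0])
x = np.array([3.0, -1.0, 5.0])

# For an orthogonal basis, the coordinate of x along v_i is
# c_i = (x . v_i) / ||v_i||^2.
coeffs = [np.dot(x, v) / np.dot(v, v) for v in (v1, v2, v3)]
recon = sum(c * v for c, v in zip(coeffs, (v1, v2, v3)))
assert np.allclose(recon, x)  # x is recovered, so the set spans R^3
```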
True or False Problems of Vector Spaces and Linear Transformations
These are True or False problems.
For each of the following statements, determine whether it is true or false.
Let $A$ be a $5\times 3$ matrix. Then the range of $A$ is a subspace in $\R^3$.
The function $f(x)=x^2+1$ is not in the vector space $C[-1,1]$ because […]
Linear Independent Vectors and the Vector Space Spanned By Them
Let $V$ be a vector space over a field $K$. Let $\mathbf{u}_1, \mathbf{u}_2, \dots, \mathbf{u}_n$ be linearly independent vectors in $V$. Let $U$ be the subspace of $V$ spanned by these vectors, that is, $U=\Span \{\mathbf{u}_1, \mathbf{u}_2, \dots, \mathbf{u}_n\}$.
Let […]
Inner Product, Norm, and Orthogonal Vectors
Let $\mathbf{u}_1, \mathbf{u}_2, \mathbf{u}_3$ be vectors in $\R^n$. Suppose that the vectors $\mathbf{u}_1$ and $\mathbf{u}_2$ are orthogonal, that the norm of $\mathbf{u}_2$ is $4$, and that $\mathbf{u}_2^{\trans}\mathbf{u}_3=7$. Find the value of the real number $a$ in […]
Orthogonality of Eigenvectors of a Symmetric Matrix Corresponding to Distinct Eigenvalues
Suppose that a real symmetric matrix $A$ has two distinct eigenvalues $\alpha$ and $\beta$.
Show that any eigenvector corresponding to $\alpha$ is orthogonal to any eigenvector corresponding to $\beta$.
(Nagoya University, Linear Algebra Final Exam Problem)
Hint.
Two […]
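A numerical sanity check of the statement (a hypothetical $2\times 2$ example, not a proof): the symmetric matrix below has distinct eigenvalues $1$ and $3$, and the corresponding eigenvectors come out orthogonal.

```python
import numpy as np

# A hypothetical real symmetric matrix with distinct eigenvalues 1 and 3.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh is NumPy's eigensolver for symmetric matrices;
# eigenvalues are returned in ascending order.
eigvals, eigvecs = np.linalg.eigh(A)
v_alpha = eigvecs[:, 0]  # eigenvector for alpha = 1
v_beta = eigvecs[:, 1]   # eigenvector for beta = 3

# Eigenvectors for distinct eigenvalues of a symmetric matrix
# are orthogonal.
assert abs(np.dot(v_alpha, v_beta)) < 1e-12
```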
Eigenvalues and Eigenvectors of The Cross Product Linear Transformation
We fix a nonzero vector $\mathbf{a}$ in $\R^3$ and define a map $T:\R^3\to \R^3$ by
\[T(\mathbf{v})=\mathbf{a}\times \mathbf{v}\]
for all $\mathbf{v}\in \R^3$.
Here the right-hand side is the cross product of $\mathbf{a}$ and $\mathbf{v}$.
(a) Prove that $T:\R^3\to \R^3$ is […]
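A short numerical sketch of this map (with a hypothetical choice of $\mathbf{a}$): the matrix of $T$ with respect to the standard basis has columns $T(\mathbf{e}_1), T(\mathbf{e}_2), T(\mathbf{e}_3)$; the vector $\mathbf{a}$ itself is an eigenvector with eigenvalue $0$ since $\mathbf{a}\times\mathbf{a}=\mathbf{0}$, and the remaining eigenvalues turn out to be purely imaginary with magnitude $\|\mathbf{a}\|$.

```python
import numpy as np

a = np.array([1.0, 2.0, 2.0])  # hypothetical fixed nonzero vector, ||a|| = 3

# Matrix of T(v) = a x v with respect to the standard basis:
# the columns are T(e1), T(e2), T(e3).
M = np.column_stack([np.cross(a, e) for e in np.eye(3)])

# a is an eigenvector with eigenvalue 0, since a x a = 0.
assert np.allclose(M @ a, np.zeros(3))

# The other eigenvalues are purely imaginary, +- i*||a||.
eigvals = np.linalg.eigvals(M)
assert np.allclose(eigvals.real, 0.0)
assert np.isclose(max(eigvals.imag), np.linalg.norm(a))
```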