Let $T: \R^n \to \R^m$ be a linear transformation.
Suppose that $S=\{\mathbf{x}_1, \mathbf{x}_2,\dots, \mathbf{x}_k\}$ is a subset of $\R^n$ such that $\{T(\mathbf{x}_1), T(\mathbf{x}_2), \dots, T(\mathbf{x}_k) \}$ is a linearly independent subset of $\R^m$.
Prove that $S$ is a linearly independent subset of $\R^n$.

Vectors $\mathbf{x}_1, \mathbf{x}_2,\dots, \mathbf{x}_k$ are linearly independent
if and only if the only solution to the vector equation
\[c_1\mathbf{x}_1+c_2\mathbf{x}_2+\cdots+c_k\mathbf{x}_k=\mathbf{0}_n\]
is $c_1=c_2=\cdots=c_k=0$.
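As a quick numerical sanity check (a NumPy sketch; the vectors below are made-up examples, not from the problem), linear independence of finitely many vectors can be tested by comparing the rank of the matrix whose columns are those vectors with the number of vectors:

```python
import numpy as np

# Hypothetical example vectors in R^3 (not from the original post).
x1 = np.array([1.0, 0.0, 2.0])
x2 = np.array([0.0, 1.0, 1.0])
x3 = np.array([1.0, 1.0, 3.0])  # x3 = x1 + x2, so the set is dependent

# Stack the vectors as columns; they are linearly independent
# if and only if the rank equals the number of columns.
X = np.column_stack([x1, x2, x3])
print(np.linalg.matrix_rank(X))  # 2 < 3, so the set is dependent
```

Dropping $\mathbf{x}_3$ leaves $\{\mathbf{x}_1, \mathbf{x}_2\}$, whose matrix has rank $2$, confirming independence.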

A linear transformation $T:\R^n \to \R^m$ is a map such that

$T(\mathbf{v}+\mathbf{w})=T(\mathbf{v})+T(\mathbf{w})$ for all $\mathbf{v}, \mathbf{w} \in \R^n$.

$T(c\mathbf{v})=cT(\mathbf{v})$ for all $c \in \R$ and $\mathbf{v}\in \R^n$.

Proof.

Consider a linear combination of vectors in $S$
\[c_1\mathbf{x}_1+c_2\mathbf{x}_2+\cdots+c_k\mathbf{x}_k=\mathbf{0}_n,\]
where $\mathbf{0}_n$ is the $n$-dimensional zero vector.
To show that $S$ is linearly independent, we need to show that the coefficients $c_i$ are all zero.

Applying $T$ to both sides, and noting that a linear transformation sends the zero vector to the zero vector (since $T(\mathbf{0}_n)=T(0\cdot \mathbf{0}_n)=0\cdot T(\mathbf{0}_n)=\mathbf{0}_m$), we have
\begin{align*}
\mathbf{0}_m&=T(\mathbf{0}_n)=T(c_1\mathbf{x}_1+c_2\mathbf{x}_2+\cdots+c_k\mathbf{x}_k)\\
&= c_1T(\mathbf{x}_1)+c_2T(\mathbf{x}_2)+\cdots+c_k T(\mathbf{x}_k).
\end{align*}
In the last step, we used the linearity of $T$.

Since the vectors $T(\mathbf{x}_1), T(\mathbf{x}_2), \dots, T(\mathbf{x}_k)$ are linearly independent, the coefficients of this linear combination must all be zero.
Thus we have $c_1=c_2=\cdots=c_k=0$, and hence the set $S$ is linearly independent.
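The result above can be illustrated numerically (a hedged sketch; the matrix $A$ and vectors below are hypothetical examples, not part of the proof): take $T(\mathbf{x})=A\mathbf{x}$ for a $2\times 3$ matrix $A$, so $T:\R^3 \to \R^2$, and check that vectors with linearly independent images are themselves linearly independent.

```python
import numpy as np

# Hypothetical linear transformation T: R^3 -> R^2 given by T(x) = A x.
A = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])

# Example vectors in R^3.
x1 = np.array([1.0, 0.0, 5.0])
x2 = np.array([0.0, 1.0, -2.0])

# Images T(x1), T(x2) in R^2, stacked as columns.
TX = np.column_stack([A @ x1, A @ x2])
X = np.column_stack([x1, x2])

# The images are linearly independent (full column rank)...
print(np.linalg.matrix_rank(TX))  # 2
# ...so by the theorem the preimages x1, x2 must be as well.
print(np.linalg.matrix_rank(X))   # 2
```

Note that the converse fails: $\mathbf{x}_1$ and $\mathbf{x}_1 + 5\mathbf{e}_3$ are independent here, but both map to the same image under this $A$.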

A Linear Transformation from a Vector Space over the Rational Numbers to Itself
Let $\Q$ denote the set of rational numbers (i.e., fractions of integers). Let $V$ denote the set of numbers of the form $x+y \sqrt{2}$, where $x,y \in \Q$. You may take for granted that the set $V$ is a vector space over the field $\Q$.
(a) Show that $B=\{1, \sqrt{2}\}$ is a basis for the […]

If Vectors are Linearly Dependent, then What Happens When We Add One More Vector?
Suppose that $\mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_r$ are linearly dependent $n$-dimensional real vectors.
For any vector $\mathbf{v}_{r+1} \in \R^n$, determine whether the vectors $\mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_r, \mathbf{v}_{r+1}$ are linearly […]

Two Subspaces Intersecting Trivially, and the Direct Sum of Vector Spaces.
Let $V$ and $W$ be subspaces of $\R^n$ such that $V \cap W =\{\mathbf{0}\}$ and $\dim(V)+\dim(W)=n$.
(a) If $\mathbf{v}+\mathbf{w}=\mathbf{0}$, where $\mathbf{v}\in V$ and $\mathbf{w}\in W$, then show that $\mathbf{v}=\mathbf{0}$ and $\mathbf{w}=\mathbf{0}$.
(b) If $B_1$ is a […]

Linear Combination of Eigenvectors is Not an Eigenvector
Suppose that $\lambda$ and $\mu$ are two distinct eigenvalues of a square matrix $A$ and let $\mathbf{x}$ and $\mathbf{y}$ be eigenvectors corresponding to $\lambda$ and $\mu$, respectively.
If $a$ and $b$ are nonzero numbers, then prove that $a \mathbf{x}+b\mathbf{y}$ is not an […]

Compute Determinant of a Matrix Using Linearly Independent Vectors
Let $A$ be a $3 \times 3$ matrix.
Let $\mathbf{x}, \mathbf{y}, \mathbf{z}$ be linearly independent $3$-dimensional vectors. Suppose that we have
\[A\mathbf{x}=\begin{bmatrix}
1 \\
0 \\
1
\end{bmatrix}, A\mathbf{y}=\begin{bmatrix}
0 \\
1 \\
0
[…]

Dual Vector Space and Dual Basis, Some Equality
Let $V$ be a finite dimensional vector space over a field $k$ and let $V^*=\Hom(V, k)$ be the dual vector space of $V$.
Let $\{v_i\}_{i=1}^n$ be a basis of $V$ and let $\{v^i\}_{i=1}^n$ be the dual basis of $V^*$. Then prove that
\[x=\sum_{i=1}^nv^i(x)v_i\]
for any vector $x\in […]
