Let $A$ be an $n\times n$ nonsingular matrix. Let $\mathbf{v}, \mathbf{w}$ be linearly independent vectors in $\R^n$. Prove that the vectors $A\mathbf{v}$ and $A\mathbf{w}$ are linearly independent.

Suppose that we have a linear combination
\[c_1(A\mathbf{v})+c_2(A\mathbf{w})=\mathbf{0},\]
where $c_1, c_2$ are scalars.
Our goal is to show that $c_1=c_2=0$.
Factoring out $A$, we have
\[A(c_1\mathbf{v}+c_2\mathbf{w})=\mathbf{0}.\]

Note that since $A$ is a nonsingular matrix, the equation $A\mathbf{x}=\mathbf{0}$ has only the zero solution $\mathbf{x}=\mathbf{0}$.
The above equality yields that $c_1\mathbf{v}+c_2\mathbf{w}$ is a solution to $A\mathbf{x}=\mathbf{0}$.
Hence, we have
\[c_1\mathbf{v}+c_2\mathbf{w}=\mathbf{0}.\]

By assumption, the vectors $\mathbf{v}$ and $\mathbf{w}$ are linearly independent, and this implies that $c_1=c_2=0$.

We have shown that whenever we have $c_1(A\mathbf{v})+c_2(A\mathbf{w})=\mathbf{0}$, we must have $c_1=c_2=0$. This yields that the vectors $A\mathbf{v}$ and $A\mathbf{w}$ are linearly independent.
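The argument above can be checked numerically. The following sketch (the matrix and vectors are hypothetical examples, not part of the proof) stacks $A\mathbf{v}$ and $A\mathbf{w}$ as columns and computes the rank: rank $2$ means the two vectors are linearly independent.

```python
import numpy as np

# Hypothetical example: a nonsingular A and independent v, w.
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])        # det(A) = 1, so A is nonsingular
v = np.array([1.0, 0.0])
w = np.array([0.0, 1.0])          # v, w are linearly independent

# Columns of M are Av and Aw; rank 2 means they are independent.
M = np.column_stack((A @ v, A @ w))
print(np.linalg.matrix_rank(M))   # prints 2
```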

Common Mistake

This is a midterm exam problem of Linear Algebra at the Ohio State University.

One common mistake is to ignore the logical structure of the proof and simply write down familiar facts.
For example, some students started with $c_1\mathbf{v}+c_2\mathbf{w}=\mathbf{0}$ and deduced $c_1=c_2=0$ from the linear independence of $\mathbf{v}$ and $\mathbf{w}$.
Multiplying by $A$, they then obtained $c_1A\mathbf{v}+c_2A\mathbf{w}=\mathbf{0}$ with $c_1=c_2=0$.

This argument is wrong because it starts from a linear combination of the wrong vectors.
There is nothing wrong with the individual mathematical operations; however, the argument does not prove that the vectors $A\mathbf{v}$ and $A\mathbf{w}$ are linearly independent.

To prove linear independence, we must first assume that $c_1A\mathbf{v}+c_2A\mathbf{w}=\mathbf{0}$ and then deduce that $c_1=c_2=0$.
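The nonsingularity hypothesis is essential. A short numerical sketch (again with a hypothetical matrix, not from the original problem) shows that a singular $A$ can map linearly independent vectors to linearly dependent ones:

```python
import numpy as np

# Hypothetical example: a singular A collapses independent inputs.
A = np.array([[1.0, 1.0],
              [1.0, 1.0]])        # det(A) = 0, so A is singular
v = np.array([1.0, 0.0])
w = np.array([0.0, 1.0])          # v, w are linearly independent

# Av = Aw = (1, 1), so the columns of M are linearly dependent.
M = np.column_stack((A @ v, A @ w))
print(np.linalg.matrix_rank(M))   # prints 1
```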
