If $\mathbf{v}, \mathbf{w}$ are Linearly Independent Vectors and $A$ is Nonsingular, then $A\mathbf{v}, A\mathbf{w}$ are Linearly Independent


Problem 700

Let $A$ be an $n\times n$ nonsingular matrix. Let $\mathbf{v}, \mathbf{w}$ be linearly independent vectors in $\R^n$. Prove that the vectors $A\mathbf{v}$ and $A\mathbf{w}$ are linearly independent.

 

Proof.

Suppose that we have a linear combination
\[c_1(A\mathbf{v})+c_2(A\mathbf{w})=\mathbf{0},\] where $c_1, c_2$ are scalars.
Our goal is to show that $c_1=c_2=0$.
Factoring out $A$, we have
\[A(c_1\mathbf{v}+c_2\mathbf{w})=\mathbf{0}.\]


Note that since $A$ is a nonsingular matrix, the equation $A\mathbf{x}=\mathbf{0}$ has only the zero solution $\mathbf{x}=\mathbf{0}$.
The above equality yields that $c_1\mathbf{v}+c_2\mathbf{w}$ is a solution to $A\mathbf{x}=\mathbf{0}$.
Hence, we have
\[c_1\mathbf{v}+c_2\mathbf{w}=\mathbf{0}.\]

By assumption, the vectors $\mathbf{v}$ and $\mathbf{w}$ are linearly independent, and this implies that $c_1=c_2=0$.

We have shown that whenever $c_1(A\mathbf{v})+c_2(A\mathbf{w})=\mathbf{0}$, we must have $c_1=c_2=0$. Hence the vectors $A\mathbf{v}$ and $A\mathbf{w}$ are linearly independent.
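As a quick sanity check (this is an illustration, not part of the proof), the statement can be verified numerically for a concrete choice of matrix and vectors. The matrix $A$ and the vectors $\mathbf{v}, \mathbf{w}$ below are arbitrary examples, not taken from the problem.

```python
import numpy as np

# An arbitrary nonsingular 2x2 matrix (det = 1*4 - 2*3 = -2, which is nonzero).
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# Two linearly independent vectors in R^2.
v = np.array([1.0, 0.0])
w = np.array([1.0, 1.0])

# Stack v, w (and Av, Aw) as columns and compare ranks:
# rank 2 means the two columns are linearly independent.
rank_vw = np.linalg.matrix_rank(np.column_stack([v, w]))
rank_AvAw = np.linalg.matrix_rank(np.column_stack([A @ v, A @ w]))

print(rank_vw, rank_AvAw)  # expected output: 2 2
```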

Common Mistake

This is a midterm exam problem of Linear Algebra at the Ohio State University.


One common mistake is to ignore the logic of the statement and write down whatever you know.
For example, some students started with $c_1\mathbf{v}+c_2\mathbf{w}=\mathbf{0}$, concluded from the linear independence of $\mathbf{v}$ and $\mathbf{w}$ that $c_1=c_2=0$, and then multiplied by $A$ to obtain $c_1A\mathbf{v}+c_2A\mathbf{w}=\mathbf{0}$ with $c_1=c_2=0$.

This argument is totally wrong.

The above argument is wrong because it starts with different vectors from the ones we want to prove are linearly independent.
There is nothing wrong with the mathematical operations in that argument; however, it does not prove that the vectors $A\mathbf{v}$ and $A\mathbf{w}$ are linearly independent.

We should first assume that $c_1A\mathbf{v}+c_2A\mathbf{w}=\mathbf{0}$ and prove that $c_1=c_2=0$.
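As an aside (not part of the exam problem), the nonsingularity assumption is essential: a singular matrix can send linearly independent vectors to linearly dependent ones. Here is a minimal sketch with an arbitrarily chosen singular matrix.

```python
import numpy as np

# An arbitrary singular matrix: the second row is twice the first, so det = 0.
A_singular = np.array([[1.0, 1.0],
                       [2.0, 2.0]])

# The standard basis vectors are linearly independent...
v = np.array([1.0, 0.0])
w = np.array([0.0, 1.0])

# ...but their images A*v = [1, 2] and A*w = [1, 2] coincide,
# so the pair (A*v, A*w) is linearly dependent (rank 1).
images = np.column_stack([A_singular @ v, A_singular @ w])
print(np.linalg.matrix_rank(images))  # expected output: 1
```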

