Linear Combination and Linear Independence
Definition
- The expression $c_1\mathbf{v}_1+c_2\mathbf{v}_2+\cdots+c_k\mathbf{v}_k$ is called a linear combination of vectors $\mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_k\in \R^n$, where $c_1, c_2, \dots, c_k$ are scalars in $\R$.
- A set of vectors $\{\mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_k\}$ is said to be linearly independent if the only scalars $c_1, c_2, \dots, c_k$ satisfying $c_1\mathbf{v}_1+c_2\mathbf{v}_2+\cdots+c_k\mathbf{v}_k=\mathbf{0}$ are $c_1=c_2=\cdots=c_k=0$. We also say that the vectors $\mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_k$ are linearly independent.
- If a set of vectors is not linearly independent, it is said to be linearly dependent.
Summary
- A set of $n$-dimensional vectors $\{\mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_k\}$ is linearly dependent if $k > n$. (If there are more vectors than the dimension, then the vectors are automatically linearly dependent.)
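These definitions translate directly into a rank computation: stacking the vectors as the columns of a matrix, the set is linearly independent exactly when that matrix has full column rank. A minimal sketch, assuming NumPy is available:

```python
import numpy as np

# A set of k vectors is linearly independent exactly when the matrix
# having them as columns has rank k (full column rank).
def is_linearly_independent(vectors):
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) == A.shape[1]

# The standard basis of R^3 is independent; four vectors in R^3
# (k = 4 > n = 3) must be dependent by the summary fact above.
e1, e2, e3 = np.eye(3)
print(is_linearly_independent([e1, e2, e3]))           # True
print(is_linearly_independent([e1, e2, e3, e1 + e2]))  # False
```

Note that `matrix_rank` uses a numerical tolerance, so this is a floating-point check rather than an exact symbolic one.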
Problems
-
Express the vector $\mathbf{b}=\begin{bmatrix}
2 \\
13 \\
6
\end{bmatrix}$ as a linear combination of the vectors
\[\mathbf{v}_1=\begin{bmatrix}
1 \\
5 \\
-1
\end{bmatrix},
\mathbf{v}_2=
\begin{bmatrix}
1 \\
2 \\
1
\end{bmatrix},
\mathbf{v}_3=
\begin{bmatrix}
1 \\
4 \\
3
\end{bmatrix}.\] (The Ohio State University, Linear Algebra Exam)
-
Write the vector $\begin{bmatrix} 1 \\ 3 \\ -1 \end{bmatrix}$ as a linear combination of the vectors $\begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix}$, $\begin{bmatrix} 2 \\ -2 \\ 1 \end{bmatrix}$, $\begin{bmatrix} 2 \\ 0 \\ 4 \end{bmatrix}$.
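Expressing a vector as a linear combination, as in the problems above, amounts to solving the linear system whose coefficient matrix has the given vectors as columns. A hedged numerical sketch for the first problem, assuming NumPy:

```python
import numpy as np

# Columns are v1, v2, v3 from the first problem; solving A c = b gives
# the coefficients of the linear combination (here A is invertible).
A = np.column_stack(([1, 5, -1], [1, 2, 1], [1, 4, 3]))
b = np.array([2, 13, 6])
c = np.linalg.solve(A, b)
print(c)      # coefficients c1, c2, c3
print(A @ c)  # reproduces b
```

This only works as written when the coefficient matrix is square and invertible; the rank-based consistency test in the last problem of this list covers the general case.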
-
For what value(s) of $a$ is the following set $S$ linearly dependent?
\[ S=\left \{\,\begin{bmatrix}
1 \\
2 \\
3 \\
a
\end{bmatrix}, \begin{bmatrix}
a \\
0 \\
-1 \\
2
\end{bmatrix}, \begin{bmatrix}
0 \\
0 \\
a^2 \\
7
\end{bmatrix}, \begin{bmatrix}
1 \\
a \\
1 \\
1
\end{bmatrix}, \begin{bmatrix}
2 \\
-2 \\
3 \\
a^3
\end{bmatrix} \, \right\}.\]
-
Prove that any set of vectors which contains the zero vector is linearly dependent.
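As an aside on the set $S$ defined earlier: it contains five vectors in $\R^4$, so the summary fact ($k > n$) makes it linearly dependent for every value of $a$. A quick numeric spot-check over a few sample values, assuming NumPy:

```python
import numpy as np

# S has five vectors in R^4, so its rank is at most 4 < 5 and the set
# is linearly dependent no matter what a is; spot-check sample values.
def rank_of_S(a):
    cols = [[1, 2, 3, a], [a, 0, -1, 2], [0, 0, a**2, 7],
            [1, a, 1, 1], [2, -2, 3, a**3]]
    return np.linalg.matrix_rank(np.column_stack(cols))

print(all(rank_of_S(a) < 5 for a in range(-3, 4)))  # True
```

The spot-check illustrates the general argument but does not replace it; the proof is the $k > n$ fact itself.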
- Let $\{\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3\}$ be a set of nonzero vectors in $\R^m$ such that the dot product $\mathbf{v}_i\cdot \mathbf{v}_j=0$ whenever $i\neq j$. Prove that the set is linearly independent.
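The orthogonality statement can be sanity-checked numerically: pairwise-orthogonal nonzero vectors have a diagonal Gram matrix and full rank. A small illustration with hypothetical sample vectors, assuming NumPy:

```python
import numpy as np

# Columns are three pairwise-orthogonal nonzero vectors in R^3.
V = np.column_stack(([1.0, 1.0, 0.0], [1.0, -1.0, 0.0], [0.0, 0.0, 2.0]))
G = V.T @ V  # Gram matrix of dot products v_i . v_j

print(np.allclose(G, np.diag(np.diag(G))))  # True: off-diagonals vanish
print(np.linalg.matrix_rank(V) == 3)        # True: linearly independent
```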
-
Determine whether the following set of vectors is linearly independent or linearly dependent. If the set is linearly dependent, express one vector in the set as a linear combination of the others.
\[\left\{\, \begin{bmatrix}
1 \\
0 \\
-1 \\
0
\end{bmatrix}, \begin{bmatrix}
1 \\
2 \\
3 \\
4
\end{bmatrix}, \begin{bmatrix}
-1 \\
-2 \\
0 \\
1
\end{bmatrix},
\begin{bmatrix}
-2 \\
-2 \\
7 \\
11
\end{bmatrix}\, \right\}.\]
-
Find the value(s) of $h$ for which the following set of vectors
\[\left \{ \mathbf{v}_1=\begin{bmatrix}
1 \\
0 \\
0
\end{bmatrix}, \mathbf{v}_2=\begin{bmatrix}
h \\
1 \\
-h
\end{bmatrix}, \mathbf{v}_3=\begin{bmatrix}
1 \\
2h \\
3h+1
\end{bmatrix}\right\}\] is linearly independent.
(Boston College)
-
Let
\[\mathbf{v}_1=\begin{bmatrix}
1 \\
2 \\
0
\end{bmatrix}, \mathbf{v}_2=\begin{bmatrix}
1 \\
a \\
5
\end{bmatrix}, \mathbf{v}_3=\begin{bmatrix}
0 \\
4 \\
b
\end{bmatrix}\] be vectors in $\R^3$. Determine a condition on the scalars $a, b$ so that the set of vectors $\{\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3\}$ is linearly dependent.
-
Determine conditions on the scalars $a, b$ so that the following set $S$ of vectors is linearly dependent.
\begin{align*}
S=\{\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3\},
\end{align*}
where
\[\mathbf{v}_1=\begin{bmatrix}
1 \\
3 \\
1
\end{bmatrix}, \mathbf{v}_2=\begin{bmatrix}
1 \\
a \\
4
\end{bmatrix}, \mathbf{v}_3=\begin{bmatrix}
0 \\
2 \\
b
\end{bmatrix}.\]
-
Let $A$ be a $3\times 3$ matrix and let $\mathbf{v}=\begin{bmatrix}
1 \\
2 \\
-1
\end{bmatrix}$ and $\mathbf{w}=\begin{bmatrix}
2 \\
-1 \\
3
\end{bmatrix}$. Suppose that $A\mathbf{v}=-\mathbf{v}$ and $A\mathbf{w}=2\mathbf{w}$.
Then find the vector $A^5\begin{bmatrix}
-1 \\
8 \\
-9
\end{bmatrix}$.
-
(a) Prove that the column vectors of every $3\times 5$ matrix $A$ are linearly dependent.
(b) Prove that the row vectors of every $5\times 3$ matrix $B$ are linearly dependent.
-
Suppose $M$ is an $n \times n$ upper-triangular matrix. If the diagonal entries of $M$ are all nonzero, prove that its column vectors are linearly independent. Does the conclusion hold if we do not assume that $M$ has nonzero diagonal entries?
- Suppose that an $n \times m$ matrix $M$ has column vectors $\mathbf{b}_1, \dots, \mathbf{b}_m$. Prove that a vector $\mathbf{v} \in \R^n$ can be written as a linear combination of the column vectors if and only if there is a vector $\mathbf{x}$ that solves the equation $M \mathbf{x} = \mathbf{v}$.
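The last equivalence suggests a numeric membership test: $\mathbf{v}$ lies in the span of the columns of $M$ exactly when appending $\mathbf{v}$ as an extra column does not increase the rank. A hedged sketch, assuming NumPy:

```python
import numpy as np

# v is a linear combination of the columns of M iff M x = v is
# consistent, i.e. rank(M) == rank of the augmented matrix [M | v].
def in_column_span(M, v):
    augmented = np.column_stack((M, v))
    return np.linalg.matrix_rank(M) == np.linalg.matrix_rank(augmented)

# Columns of M span the xy-plane inside R^3.
M = np.array([[1, 0],
              [0, 1],
              [0, 0]])
print(in_column_span(M, np.array([3, 4, 0])))  # True
print(in_column_span(M, np.array([0, 0, 1])))  # False
```

This rank comparison is the numerical form of the consistency criterion for linear systems; like all floating-point rank tests, it depends on `matrix_rank`'s tolerance.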