Suppose that $\mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_r$ are linearly dependent $n$-dimensional real vectors.

For any vector $\mathbf{v}_{r+1} \in \R^n$, determine whether the vectors $\mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_r, \mathbf{v}_{r+1}$ are linearly independent or linearly dependent.

We claim that the vectors $\mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_r, \mathbf{v}_{r+1}$ are linearly dependent.
Since the vectors $\mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_r$ are linearly dependent, there exist scalars (real numbers) $a_1, a_2, \dots, a_r$ such that
\[a_1 \mathbf{v}_1+a_2\mathbf{v}_2+\cdots +a_r\mathbf{v}_r=\mathbf{0} \tag{*}\]
and not all of $a_1, \dots, a_r$ are zero, that is, $(a_1, \dots, a_r) \neq (0, \dots, 0)$.

Consider the equation
\[x_1\mathbf{v}_1+x_2 \mathbf{v}_2+\cdots +x_r \mathbf{v}_r+x_{r+1} \mathbf{v}_{r+1}=\mathbf{0}.\]
If this equation has a nonzero solution $(x_1, \dots, x_r, x_{r+1})$, then the vectors $\mathbf{v}_1, \dots, \mathbf{v}_{r+1}$ are linearly dependent.

In fact,
\[(x_1,x_2,\dots, x_r, x_{r+1})=(a_1, a_2, \dots, a_r, 0)\]
is a nonzero solution of the above equation.
To see this, first note that since not all of $a_1, a_2, \dots, a_r$ are zero, we have
\[(a_1, a_2, \dots, a_r, 0)\neq (0, 0, \dots, 0, 0).\]

Plugging these values into the equation, we obtain
\begin{align*}
&a_1 \mathbf{v}_1+a_2\mathbf{v}_2+\cdots +a_r\mathbf{v}_r+0\mathbf{v}_{r+1}\\
&=a_1 \mathbf{v}_1+a_2\mathbf{v}_2+\cdots +a_r\mathbf{v}_r=\mathbf{0} \text{ by (*).}
\end{align*}
Therefore, we conclude that the vectors $\mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_r, \mathbf{v}_{r+1}$ are linearly dependent.
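As a numerical sanity check of the claim (a sketch with made-up vectors, not part of the proof), we can verify with NumPy that appending an arbitrary vector to a dependent pair leaves the set dependent. Here $\mathbf{v}_2 = 2\mathbf{v}_1$, so $(a_1, a_2) = (2, -1)$ is a nonzero solution, and the vectors and the extra vector $\mathbf{v}_3$ are chosen only for illustration.

```python
import numpy as np

# v1 and v2 are linearly dependent: 2*v1 - v2 = 0.
v1 = np.array([1.0, 2.0, 3.0])
v2 = np.array([2.0, 4.0, 6.0])
v3 = np.array([5.0, -1.0, 7.0])  # an arbitrary extra vector

# Put v1, v2, v3 as columns of M; the three vectors are linearly
# independent if and only if rank(M) == 3.
M = np.column_stack([v1, v2, v3])
print(np.linalg.matrix_rank(M))  # rank is 2 < 3, so the set is dependent

# The nonzero solution (x1, x2, x3) = (a1, a2, 0) = (2, -1, 0)
# from the proof indeed gives the zero vector:
print(2 * v1 - 1 * v2 + 0 * v3)
```

The rank computation confirms the general argument: adding a column to a matrix whose columns are already dependent can never make the column set independent.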

Linearly Dependent/Independent Vectors of Polynomials
Let $p_1(x), p_2(x), p_3(x), p_4(x)$ be (real) polynomials of degree at most $3$. Which (if any) of the following two conditions is sufficient for the conclusion that these polynomials are linearly dependent?
(a) Each of the polynomials has the value $0$ at $x=1$. Namely, $p_i(1)=0$ […]

Determine Conditions on Scalars so that the Set of Vectors is Linearly Dependent
Determine conditions on the scalars $a, b$ so that the following set $S$ of vectors is linearly dependent.
\begin{align*}
S=\{\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3\},
\end{align*}
where
\[\mathbf{v}_1=\begin{bmatrix}
1 \\
3 \\
1
\end{bmatrix}, […]

Two Subspaces Intersecting Trivially, and the Direct Sum of Vector Spaces
Let $V$ and $W$ be subspaces of $\R^n$ such that $V \cap W =\{\mathbf{0}\}$ and $\dim(V)+\dim(W)=n$.
(a) If $\mathbf{v}+\mathbf{w}=\mathbf{0}$, where $\mathbf{v}\in V$ and $\mathbf{w}\in W$, then show that $\mathbf{v}=\mathbf{0}$ and $\mathbf{w}=\mathbf{0}$.
(b) If $B_1$ is a […]

Any Vector is a Linear Combination of Basis Vectors Uniquely
Let $B=\{\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3\}$ be a basis for a vector space $V$ over a scalar field $K$. Then show that any vector $\mathbf{v}\in V$ can be written uniquely as
\[\mathbf{v}=c_1\mathbf{v}_1+c_2\mathbf{v}_2+c_3\mathbf{v}_3,\]
where $c_1, c_2, c_3$ are […]

Compute Determinant of a Matrix Using Linearly Independent Vectors
Let $A$ be a $3 \times 3$ matrix.
Let $\mathbf{x}, \mathbf{y}, \mathbf{z}$ be linearly independent $3$-dimensional vectors. Suppose that we have
\[A\mathbf{x}=\begin{bmatrix}
1 \\
0 \\
1
\end{bmatrix}, A\mathbf{y}=\begin{bmatrix}
0 \\
1 \\
0
[…]

Linear Combination of Eigenvectors is Not an Eigenvector
Suppose that $\lambda$ and $\mu$ are two distinct eigenvalues of a square matrix $A$ and let $\mathbf{x}$ and $\mathbf{y}$ be eigenvectors corresponding to $\lambda$ and $\mu$, respectively.
If $a$ and $b$ are nonzero numbers, then prove that $a \mathbf{x}+b\mathbf{y}$ is not an […]

Show the Subset of the Vector Space of Polynomials is a Subspace and Find its Basis
Let $P_3$ be the vector space over $\R$ of all polynomials of degree three or less with real coefficients.
Let $W$ be the following subset of $P_3$.
\[W=\{p(x) \in P_3 \mid p'(-1)=0 \text{ and } p^{\prime\prime}(1)=0\}.\]
Here $p'(x)$ is the first derivative of $p(x)$ and […]