Let $M$ be an $n \times n$ upper-triangular matrix. If the diagonal entries of $M$ are all non-zero, then prove that the column vectors of $M$ are linearly independent.

Let $\mathbf{x}$ denote an arbitrary column vector of length $n$, and let $\mathbf{0}$ denote the zero vector of the same size.

The columns of $M$ are linearly independent if and only if the only solution to the equation $ M \mathbf{x} = \mathbf{0} $ is the vector $\mathbf{x} = \mathbf{0}$.

The equation $M \mathbf{x} = \mathbf{0}$ is equivalent to a system of $n$ linear equations in the $n$ unknowns $x_1, \dots, x_n$.
To solve this system, consider the augmented matrix $\left[\begin{array}{c|c} M & \mathbf{0} \end{array}\right]$.

Because $M$ is upper-triangular, we can use back-substitution to solve. The bottom row of the augmented matrix gives the equation $m_{n, n} x_n = 0$.
By assumption, $m_{n, n} \neq 0$ because it is a diagonal entry. Thus we must have that $x_n=0$.

Next, the second-to-last row in the augmented matrix gives the equation $m_{n-1, n-1} x_{n-1} + m_{n-1, n} x_n = 0$. Because $x_n = 0$ and $m_{n-1, n-1} \neq 0$, we must have that $x_{n-1} = 0$.

We continue working backward in this way to see that $x_i = 0$ for all $1 \leq i \leq n$. Thus $\mathbf{x} = \mathbf{0}$, and so the columns of $M$ must be linearly independent.
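The back-substitution argument above can be sketched numerically. The following is a minimal illustration (not part of the original solution) that solves $M\mathbf{x} = \mathbf{0}$ from the bottom row upward for a sample upper-triangular matrix with non-zero diagonal entries; the matrix `M` below is an assumed example chosen for demonstration.

```python
import numpy as np

def back_substitute(M, b):
    """Solve M x = b for an upper-triangular M with non-zero diagonal
    entries, working from the bottom row upward."""
    n = M.shape[0]
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        # Row i reads: m_{i,i} x_i + sum_{j > i} m_{i,j} x_j = b_i.
        # Since m_{i,i} != 0, we can solve for x_i.
        x[i] = (b[i] - M[i, i + 1:] @ x[i + 1:]) / M[i, i]
    return x

# An example upper-triangular matrix with non-zero diagonal entries
M = np.array([[2.0, 3.0, 1.0],
              [0.0, 5.0, 4.0],
              [0.0, 0.0, 7.0]])

x = back_substitute(M, np.zeros(3))
print(x)  # the only solution of M x = 0 is the zero vector: [0. 0. 0.]
```

Each pass of the loop mirrors one step of the proof: the bottom row forces $x_n = 0$, and each row above then forces the next variable to be zero.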

Does the conclusion hold if we do not assume that $M$ has non-zero diagonal entries?

No. If some diagonal entry of $M$ is zero, then the columns may be linearly dependent. Consider the simple example
\[M = \begin{bmatrix} 1 & 1 \\ 0 & 0 \end{bmatrix}.\]
Here the diagonal entry $m_{2,2} = 0$, and the two columns of $M$ are equal, hence linearly dependent.
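As a quick numerical check of this counterexample (an illustration, not part of the original solution), the matrix has rank $1 < 2$, and $\mathbf{x} = (1, -1)^{\mathsf{T}}$ is a non-trivial solution of $M\mathbf{x} = \mathbf{0}$:

```python
import numpy as np

# The counterexample matrix, which has a zero diagonal entry
M = np.array([[1.0, 1.0],
              [0.0, 0.0]])

# Both columns are equal, so the rank is 1, less than the
# number of columns -- the columns are linearly dependent.
print(np.linalg.matrix_rank(M))  # 1

# A non-trivial solution of M x = 0 exhibiting the dependence
x = np.array([1.0, -1.0])
print(M @ x)  # [0. 0.]
```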
