Tagged: linear algebra

A Matrix is Invertible If and Only If It is Nonsingular

Problem 26

In this problem, we will show that the concept of nonsingularity of a matrix is equivalent to the concept of invertibility.
That is, we will prove that:

A matrix $A$ is nonsingular if and only if $A$ is invertible.

(a) Show that if $A$ is invertible, then $A$ is nonsingular.


(b) Let $A, B, C$ be $n\times n$ matrices such that $AB=C$.
Prove that if either $A$ or $B$ is singular, then so is $C$.


(c) Show that if $A$ is nonsingular, then $A$ is invertible.

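Before working through the proof, it can help to see the claims in action. The following is a quick NumPy experiment (an illustration with random matrices, not a proof) of parts (a) and (b); the specific matrices are just convenient random examples.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Part (a), numerically: a generic random matrix is invertible, and its
# null space is trivial (full rank), so Ax = 0 forces x = 0.
A = rng.standard_normal((n, n))
print(np.linalg.matrix_rank(A))           # 4

# Part (b), numerically: if B is singular, then so is C = AB.
B = rng.standard_normal((n, n))
B[:, -1] = B[:, 0] + B[:, 1]              # force a column relation, making B singular
C = A @ B
print(np.linalg.matrix_rank(B), np.linalg.matrix_rank(C))   # 3 3
```
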
Read solution


Properties of Nonsingular and Singular Matrices

Problem 25

An $n \times n$ matrix $A$ is called nonsingular if the only solution of the equation $A \mathbf{x}=\mathbf{0}$ is the zero vector $\mathbf{x}=\mathbf{0}$.
Otherwise $A$ is called singular.

(a) Show that if $A$ and $B$ are $n\times n$ nonsingular matrices, then the product $AB$ is also nonsingular.

(b) Show that if $A$ is nonsingular, then the column vectors of $A$ are linearly independent.

(c) Show that an $n \times n$ matrix $A$ is nonsingular if and only if the equation $A\mathbf{x}=\mathbf{b}$ has a unique solution for any vector $\mathbf{b}\in \R^n$.

Restriction
Do not use the fact that a matrix is nonsingular if and only if the matrix is invertible.

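Here is a small NumPy sanity check (random examples only, not a proof, and it does not use invertibility) illustrating parts (a)–(c).

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3

# Two generic random matrices are (with probability 1) nonsingular.
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))

# Part (a): the product AB again has a trivial null space (full rank).
print(np.linalg.matrix_rank(A @ B))       # 3

# Part (b): the columns of A are linearly independent (full column rank).
print(np.linalg.matrix_rank(A) == n)      # True

# Part (c): Ax = b has a (unique) solution for an arbitrary right-hand side b.
b = rng.standard_normal(n)
x = np.linalg.solve(A, b)
print(np.allclose(A @ x, b))              # True
```
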
Read solution


Questions About the Trace of a Matrix

Problem 19

Let $A=(a_{i j})$ and $B=(b_{i j})$ be $n\times n$ real matrices for some $n \in \N$. Then answer the following questions about the trace of a matrix.

(a) Express $\tr(AB^{\trans})$ in terms of the entries of the matrices $A$ and $B$. Here $B^{\trans}$ is the transpose matrix of $B$.

(b) Show that $\tr(AA^{\trans})$ is the sum of the squares of the entries of $A$.

(c) Show that if $A$ is a nonzero symmetric matrix, then $\tr(A^2)>0$.

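A quick NumPy check (random matrices, an illustration rather than a proof) of the identities asserted in parts (a)–(c):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

# Part (a): tr(A B^T) equals the sum of the entrywise products a_ij * b_ij.
print(np.isclose(np.trace(A @ B.T), np.sum(A * B)))      # True

# Part (b): tr(A A^T) equals the sum of the squares of the entries of A.
print(np.isclose(np.trace(A @ A.T), np.sum(A**2)))       # True

# Part (c): for a nonzero symmetric matrix S, tr(S^2) is positive.
S = A + A.T                                              # symmetric, generically nonzero
print(np.trace(S @ S) > 0)                               # True
```
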
Read solution


Linearly Dependent/Independent Vectors of Polynomials

Problem 15

Let $p_1(x), p_2(x), p_3(x), p_4(x)$ be (real) polynomials of degree at most $3$. Which (if any) of the following two conditions is sufficient for the conclusion that these polynomials are linearly dependent?

(a) Each of the polynomials has the value $0$ at $1$. Namely, $p_i(1)=0$ for $i=1,2,3,4$.

(b) Each of the polynomials has the value $1$ at $0$. Namely, $p_i(0)=1$ for $i=1,2,3,4$.

(University of California, Berkeley)

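One concrete way to experiment with this problem is to encode each polynomial $c_0+c_1x+c_2x^2+c_3x^3$ by its coefficient vector in $\R^4$, so that linear dependence of the polynomials becomes linear dependence of those vectors. The NumPy sketch below tests one particular family satisfying condition (a); it is only an experiment with a single example, not a proof of sufficiency.

```python
import numpy as np

# Coefficient vectors [c0, c1, c2, c3] for c0 + c1*x + c2*x^2 + c3*x^3.
# Rank of the 4 x 4 coefficient matrix < 4 means the polynomials are dependent.
family_a = np.array([
    [-1, 1, 0, 0],    # x - 1       (vanishes at x = 1)
    [-1, 0, 1, 0],    # x^2 - 1
    [-1, 0, 0, 1],    # x^3 - 1
    [0, -1, 1, 0],    # x^2 - x
])
print(np.linalg.matrix_rank(family_a))    # 3 < 4: this particular family is dependent
```
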
Read solution


Transpose of a Matrix and Eigenvalues and Related Questions

Problem 12

Let $A$ be an $n \times n$ real matrix. Prove the following.

(a) The matrix $AA^{\trans}$ is a symmetric matrix.

(b) The set of eigenvalues of $A$ and the set of eigenvalues of $A^{\trans}$ are equal.

(c) The matrix $AA^{\trans}$ is non-negative definite.

(An $n\times n$ matrix $B$ is called non-negative definite if for any $n$-dimensional vector $\mathbf{x}$, we have $\mathbf{x}^{\trans}B \mathbf{x} \geq 0$.)

(d) All the eigenvalues of $AA^{\trans}$ are non-negative.

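The following NumPy experiment (random matrices, checks up to floating-point round-off, not a proof) illustrates all four parts.

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((4, 4))
M = A @ A.T

# Part (a): A A^T is symmetric.
print(np.allclose(M, M.T))                                     # True

# Part (b): A and A^T have the same eigenvalues (up to ordering and round-off).
print(np.allclose(np.sort_complex(np.linalg.eigvals(A)),
                  np.sort_complex(np.linalg.eigvals(A.T))))    # True

# Part (c): x^T (A A^T) x >= 0 for an arbitrary vector x.
x = rng.standard_normal(4)
print(x @ M @ x >= 0)                                          # True

# Part (d): all eigenvalues of A A^T are non-negative (up to round-off).
print(np.all(np.linalg.eigvalsh(M) >= -1e-10))                 # True
```
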
Read solution


Determinant/Trace and Eigenvalues of a Matrix

Problem 9

Let $A$ be an $n\times n$ matrix and let $\lambda_1, \dots, \lambda_n$ be its eigenvalues.
Show that

(1) $$\det(A)=\prod_{i=1}^n \lambda_i$$

(2) $$\tr(A)=\sum_{i=1}^n \lambda_i$$

Here $\det(A)$ is the determinant of the matrix $A$ and $\tr(A)$ is the trace of the matrix $A$.

Namely, prove that (1) the determinant of $A$ is the product of its eigenvalues, and (2) the trace of $A$ is the sum of the eigenvalues.
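As a quick numerical illustration (a random-matrix check, not a substitute for the proof), both identities can be verified with NumPy; for a real matrix the complex eigenvalues come in conjugate pairs, so the product and the sum are real up to round-off.

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((5, 5))
eigenvalues = np.linalg.eigvals(A)        # possibly complex, in conjugate pairs

# (1) det(A) equals the product of the eigenvalues.
print(np.isclose(np.linalg.det(A), np.prod(eigenvalues).real))   # True

# (2) tr(A) equals the sum of the eigenvalues.
print(np.isclose(np.trace(A), np.sum(eigenvalues).real))         # True
```
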
Read solution


Powers of a Diagonal Matrix

Problem 7

Let $A=\begin{bmatrix}
a & 0\\
0& b
\end{bmatrix}$.
Show that

(1) $A^n=\begin{bmatrix}
a^n & 0\\
0& b^n
\end{bmatrix}$ for any $n \in \N$.

(2) Let $B=S^{-1}AS$, where $S$ is an invertible $2 \times 2$ matrix.
Show that $B^n=S^{-1}A^n S$ for any $n \in \N$.

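Here is a short NumPy sketch (one particular choice of $a$, $b$, $n$, and a random $S$; an illustration, not a proof) checking both claims.

```python
import numpy as np

rng = np.random.default_rng(5)
a, b, n = 2.0, -3.0, 5

A = np.diag([a, b])
S = rng.standard_normal((2, 2))           # a generic random S is invertible
B = np.linalg.inv(S) @ A @ S

# (1) A^n is the diagonal matrix with entries a^n and b^n.
print(np.allclose(np.linalg.matrix_power(A, n), np.diag([a**n, b**n])))   # True

# (2) B^n = S^{-1} A^n S.
print(np.allclose(np.linalg.matrix_power(B, n),
                  np.linalg.inv(S) @ np.linalg.matrix_power(A, n) @ S))   # True
```
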
Read solution
