10 True or False Problems about Nonsingular / Invertible Matrices
Problem 500
10 questions about nonsingular matrices, invertible matrices, and linearly independent vectors.
The quiz is designed to test your understanding of the basic properties of these topics.
You can take the quiz as many times as you like.
The solutions will be given after you complete all 10 problems.
Click the View questions button to see the solutions.
Notations: $I$ denotes an identity matrix and $O$ denotes a zero matrix.
The sizes of these matrices should be determined from the context.
Determine whether each of the following statements is True or False.
Question 1 of 10
True or False. Suppose that $A$ and $B$ are nonsingular $n\times n$ matrices. Then $A+B$ is nonsingular.
False. For example, let $A=I$ and $B=-I$. Then both matrices are nonsingular but $A+B=O$ is singular.
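As a quick numerical sanity check, here is a plain-Python sketch of this counterexample using $2\times 2$ matrices (the `det2` and `add2` helpers are illustrative, not part of the original problem):

```python
# A 2x2 matrix is nonsingular iff its determinant is nonzero.
def det2(m):
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

def add2(a, b):
    return [[a[i][j] + b[i][j] for j in range(2)] for i in range(2)]

I = [[1, 0], [0, 1]]        # A = I
negI = [[-1, 0], [0, -1]]   # B = -I

assert det2(I) != 0                 # A is nonsingular
assert det2(negI) != 0              # B is nonsingular
assert det2(add2(I, negI)) == 0     # but A + B = O is singular
```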

Question 2 of 10
True or False. If a square matrix has no zero rows or columns, then it has an inverse matrix.
False. As a counterexample, consider
\[A=\begin{bmatrix}
1 & 1\\
1 & 1
\end{bmatrix}.\]
Then $A$ has no zero rows or columns, yet it does not have an inverse matrix since the determinant of $A$ is zero.
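The determinant computation for this counterexample can be verified in a couple of lines of plain Python (the `det2` helper is illustrative):

```python
# A 2x2 matrix is invertible iff its determinant is nonzero.
def det2(m):
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

A = [[1, 1],
     [1, 1]]           # no zero row and no zero column
assert det2(A) == 0    # yet det(A) = 1*1 - 1*1 = 0, so A has no inverse
```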
Question 3 of 10
True or False. Let $A$ be an $m \times n$ matrix.
If the equation $A\mathbf{x}=\mathbf{0}$ has only the trivial solution $\mathbf{x}\in \R^n$, then the columns of $A$ are linearly independent.
True. Write
\[A=\begin{bmatrix}
A_1 & A_2 &\dots & A_n \\
\end{bmatrix},\]
where $A_i$ is the $i$th column vector of $A$ for $i=1, \dots, n$.
Suppose that we have a linear combination
\[c_1A_1+c_2A_2+\cdots+c_n A_n=\mathbf{0}\]
for some scalars $c_1, c_2, \dots, c_n$.
Then we can write it as
\[\begin{bmatrix}
A_1 & A_2 &\dots & A_n \\
\end{bmatrix}\begin{bmatrix}
c_1 \\
c_2 \\
\vdots \\
c_n
\end{bmatrix}=\mathbf{0}.\]
Since $A\mathbf{x}=\mathbf{0}$ has only the trivial solution, we must have
\[\begin{bmatrix}
c_1 \\
c_2 \\
\vdots \\
c_n
\end{bmatrix}=\mathbf{0}.\]
Hence $c_1=c_2=\cdots=c_n=0$, and the column vectors $A_1, A_2, \dots, A_n$ are linearly independent.
Question 4 of 10
True or False. Let $A$ be an $m \times n$ matrix.
If the equation $A\mathbf{x}=\mathbf{0}$ has only the trivial solution $\mathbf{x}\in \R^n$, then the rows of $A$ are linearly independent.
False. As a counterexample, consider the $3\times 2$ matrix
\[A=\begin{bmatrix}
1 & 0 \\
1 & 0 \\
0 & 1
\end{bmatrix}.\]
Then the equation $A\mathbf{x}=\mathbf{0}$ has only the trivial solution $\mathbf{x}=\mathbf{0}$, and yet the first and second rows of $A$ are linearly dependent.
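Both claims about this counterexample can be checked concretely in plain Python: the first two rows coincide (so the rows are dependent), while a nonzero $2\times 2$ minor certifies that the two columns are independent, which is equivalent to $A\mathbf{x}=\mathbf{0}$ having only the trivial solution. The helpers are illustrative.

```python
def det2(m):
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

A = [[1, 0],
     [1, 0],
     [0, 1]]

# Rows 1 and 2 coincide, so the rows are linearly dependent.
assert A[0] == A[1]

# The 2x2 minor from rows 1 and 3 is nonzero, so the two columns are
# linearly independent; hence Ax = 0 has only the trivial solution.
minor = [[A[0][0], A[0][1]],
         [A[2][0], A[2][1]]]
assert det2(minor) != 0
```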
Question 5 of 10
True or False. The row echelon form of an invertible $3\times 3$ matrix is invertible.
True. A row echelon form of an invertible $3\times 3$ matrix is upper triangular with a pivot in every row, so its diagonal entries are nonzero and its determinant is nonzero. In particular, the reduced row echelon form of an invertible matrix is the identity matrix, which is invertible.

Question 6 of 10
True or False. There is a nonzero nonsingular matrix $A$ such that $A^2=O$.
False. Suppose $A$ is nonsingular such that $A^2=O$.
Since $A$ is nonsingular, it is invertible.
Hence we have
\begin{align*}
A=A^{-1}A^2=A^{-1}O=O,
\end{align*}
and the matrix $A$ must be the zero matrix. Hence no nonzero nonsingular matrix satisfies $A^2=O$.
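Note that nonzero matrices with $A^2=O$ do exist (nilpotent matrices), but, consistent with the argument above, they are necessarily singular. A plain-Python sketch with an illustrative $2\times 2$ nilpotent matrix:

```python
def matmul2(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def det2(m):
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

N = [[0, 1],
     [0, 0]]                                # a nonzero nilpotent matrix
assert matmul2(N, N) == [[0, 0], [0, 0]]    # N^2 = O
assert det2(N) == 0                         # and indeed N is singular
```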
Question 7 of 10
True or False. If $A$ and $B$ are invertible $n\times n$ matrices, then $AB=BA$.
False. For example, consider
\[A=\begin{bmatrix}
1 & 1\\
0& 1
\end{bmatrix} \text{ and } B=\begin{bmatrix}
1 & 0\\
0& 2
\end{bmatrix}.\]
Then these matrices are invertible as their determinants are $\det(A)=1\neq 0$ and $\det(B)=2\neq 0$.
However, they do not commute since
\begin{align*}
AB=\begin{bmatrix}
1 & 2\\
0& 2
\end{bmatrix} \text{ and } BA=\begin{bmatrix}
1 & 1\\
0& 2
\end{bmatrix}.
\end{align*}
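The two products above can be verified mechanically in plain Python (the `matmul2` helper is illustrative):

```python
def matmul2(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[1, 1], [0, 1]]
B = [[1, 0], [0, 2]]

AB = matmul2(A, B)
BA = matmul2(B, A)
assert AB == [[1, 2], [0, 2]]
assert BA == [[1, 1], [0, 2]]
assert AB != BA   # A and B do not commute
```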
Question 8 of 10
True or False. If $A$ and $B$ are $n\times n$ nonsingular matrices such that $A^2=I$ and $B^2=I$, then $(AB)^{-1}=BA$.
True. Note that we have
\[A=A^{-1}A^2=A^{-1}I=A^{-1}.\]
Similarly, we have $B^{-1}=B$. It follows that
\begin{align*}
(AB)^{-1}=B^{-1}A^{-1}=BA.
\end{align*}
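Here is a plain-Python check with two illustrative involutions (the particular choices of $A$ and $B$ are assumptions for the sketch): since $(AB)(BA)=A(BB)A=AIA=A^2=I$, the product $BA$ is indeed the inverse of $AB$.

```python
def matmul2(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

I = [[1, 0], [0, 1]]
A = [[1, 0], [0, -1]]   # an involution: A^2 = I (illustrative choice)
B = [[0, 1], [1, 0]]    # another involution: B^2 = I (illustrative choice)

assert matmul2(A, A) == I
assert matmul2(B, B) == I
# (AB)(BA) = A (B B) A = A I A = A^2 = I, so BA is the inverse of AB
assert matmul2(matmul2(A, B), matmul2(B, A)) == I
```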
Question 9 of 10
True or False. If $A$ is an $m \times n$ matrix such that $A\mathbf{x}=\mathbf{0}$ for every vector $\mathbf{x}$ in $\R^n$, then $A$ is the $m\times n$ zero matrix.
True. For each $i=1, 2, \dots, n$, let $\mathbf{e}_i\in \R^n$ be the $n$-dimensional vector whose $i$th entry is $1$ and whose other entries are $0$.
Then $A\mathbf{e}_i$ is the $i$th column vector of the matrix $A$.
Since by assumption $A\mathbf{e}_i=\mathbf{0}$, we see that the $i$th column of $A$ is the zero vector.
As this holds for every $i=1, \dots, n$, we conclude that $A$ is the zero matrix.
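The key step, that $A\mathbf{e}_i$ picks out the $i$th column of $A$, can be demonstrated in plain Python with an arbitrary illustrative matrix:

```python
def apply(A, x):
    # matrix-vector product
    return [sum(A[i][j] * x[j] for j in range(len(x))) for i in range(len(A))]

A = [[1, 2, 3],
     [4, 5, 6]]   # an arbitrary 2x3 matrix (illustrative choice)

# A e_i equals the i-th column of A
for i in range(3):
    e = [1 if j == i else 0 for j in range(3)]
    assert apply(A, e) == [A[0][i], A[1][i]]
```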
Question 10 of 10
True or False. Let $A$ be a $2 \times 2$ nonsingular matrix and let $\mathbf{v}_1$ and $\mathbf{v}_2$ be linearly independent vectors in $\R^2$.
Then the vectors $A\mathbf{v}_1, A\mathbf{v}_2$ are linearly independent vectors in $\R^2$.
True. Consider a linear combination
\[c_1(A\mathbf{v}_1)+c_2(A\mathbf{v}_2)=\mathbf{0},\]
where $c_1, c_2$ are scalars. It yields
\[A(c_1\mathbf{v}_1+c_2\mathbf{v}_2)=\mathbf{0}.\]
Since $A$ is nonsingular, we have
\[c_1\mathbf{v}_1+c_2\mathbf{v}_2=\mathbf{0}.\]
Because the vectors $\mathbf{v}_1, \mathbf{v}_2$ are linearly independent, we have
\[c_1=c_2=0.\]
Thus, the vectors $A\mathbf{v}_1, A\mathbf{v}_2$ are linearly independent.
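A concrete instance can be checked in plain Python: two vectors in $\R^2$ are independent exactly when the matrix with those vectors as columns has nonzero determinant. The particular $A$, $\mathbf{v}_1$, $\mathbf{v}_2$ below are illustrative choices.

```python
def det2(m):
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

def apply2(a, v):
    return [a[0][0] * v[0] + a[0][1] * v[1],
            a[1][0] * v[0] + a[1][1] * v[1]]

A = [[1, 1], [0, 1]]        # nonsingular: det(A) = 1 (illustrative choice)
v1, v2 = [1, 0], [0, 1]     # linearly independent

w1, w2 = apply2(A, v1), apply2(A, v2)
# w1, w2 are independent iff det of the matrix with columns w1, w2 is nonzero
cols = [[w1[0], w2[0]],
        [w1[1], w2[1]]]
assert det2(cols) != 0
```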
“The row echelon form of a 3×3 matrix” – I don’t think this is always invertible.
If
\[A=\begin{bmatrix}
1 & 1 & 0 \\
1 & 1 & 0 \\
1 & 1 & 0
\end{bmatrix},\]
then the row echelon form of $A$ is
\[\begin{bmatrix}
1 & 1 & 0 \\
0 & 0 & 0 \\
0 & 0 & 0
\end{bmatrix}.\]
If I am wrong on that, please let me know.
Dear john kilbourne,
Thank you for your comment. You are right. In the problem, “invertible” was missing.
“The row echelon form of a 3×3 matrix” was a mistake. The correct problem should be:
“The row echelon form of an invertible 3×3 matrix”.
Thank you for pointing this out.