Linearly Independent Vectors, Invertible Matrix, and Expression of a Vector as a Linear Combination


Problem 66

Consider the matrix
\[A=\begin{bmatrix}
1 & 2 & 1 \\
2 &5 &4 \\
1 & 1 & 0
\end{bmatrix}.\]


(a) Calculate the inverse matrix $A^{-1}$. If you think the matrix $A$ is not invertible, then explain why.


(b) Are the vectors
\[ \mathbf{A}_1=\begin{bmatrix}
1 \\
2 \\
1
\end{bmatrix}, \mathbf{A}_2=\begin{bmatrix}
2 \\
5 \\
1
\end{bmatrix},
\text{ and } \mathbf{A}_3=\begin{bmatrix}
1 \\
4 \\
0
\end{bmatrix}\] linearly independent?


(c) Write the vector $\mathbf{b}=\begin{bmatrix}
1 \\
1 \\
1
\end{bmatrix}$ as a linear combination of $\mathbf{A}_1$, $\mathbf{A}_2$, and $\mathbf{A}_3$.

(The Ohio State University, Linear Algebra Exam)



Hint.

  1. For (a), consider the augmented matrix $[A|I]$ and reduce it.
  2. Note that the given vectors are the column vectors of the matrix $A$.
  3. Use the inverse matrix $A^{-1}$ to solve a system.

Solution.

(a) Calculate the inverse matrix $A^{-1}$

We consider the augmented matrix
\[ \left[\begin{array}{rrr|rrr}
1 & 2 & 1 & 1 &0 & 0 \\
2 & 5 & 4 & 0 & 1 & 0 \\
1 & 1 & 0 & 0 & 0 & 1 \\
\end{array} \right] \] and reduce this matrix using the elementary row operations as follows.
\begin{align*}
&\left[ \begin{array}{rrr|rrr}
1 & 2 & 1 & 1 &0 & 0 \\
2 & 5 & 4 & 0 & 1 & 0 \\
1 & 1 & 0 & 0 & 0 & 1 \\
\end{array} \right] \xrightarrow[R_3-R_1]{R_2-2R_1}
\left[\begin{array}{rrr|rrr}
1 & 2 & 1 & 1 &0 & 0 \\
0 & 1 & 2 & -2 & 1 & 0 \\
0 & -1 & -1 & -1 & 0 & 1 \\
\end{array} \right] \xrightarrow[R_3+R_2]{R_1-2R_2} \\[6pt] &
\left[\begin{array}{rrr|rrr}
1 & 0 & -3 & 5 &-2 & 0 \\
0 & 1 & 2 & -2 & 1 & 0 \\
0 & 0 & 1 & -3 & 1 & 1 \\
\end{array} \right] \xrightarrow[R_2-2R_3]{R_1+3R_3} \left[\begin{array}{rrr|rrr}
1 & 0 & 0 & -4 &1 & 3 \\
0 & 1 & 0 & 4 & -1 & -2 \\
0 & 0 & 1 & -3 & 1 & 1 \\
\end{array} \right].
\end{align*}

Since the left $3 \times 3$ part of the last matrix is the identity matrix, the inverse matrix of $A$ is
\[A^{-1}=\begin{bmatrix}
-4 & 1 & 3 \\
4 &-1 &-2 \\
-3 & 1 & 1
\end{bmatrix}.\]
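
As an optional aside (not part of the exam solution), the Gauss–Jordan computation above can be double-checked numerically. The minimal Python sketch below assumes NumPy is available and simply confirms that `numpy.linalg.inv` reproduces the same matrix.

```python
# Optional numerical check of the inverse found by row reduction above.
import numpy as np

A = np.array([[1, 2, 1],
              [2, 5, 4],
              [1, 1, 0]], dtype=float)

A_inv = np.linalg.inv(A)
print(A_inv)
# Expected, up to floating-point rounding:
# [[-4.  1.  3.]
#  [ 4. -1. -2.]
#  [-3.  1.  1.]]

# Sanity check: A times its inverse should be the 3x3 identity matrix.
assert np.allclose(A @ A_inv, np.eye(3))
```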

(b) Are the Vectors Linearly Independent?

To check whether $\mathbf{A}_1, \mathbf{A}_2, \mathbf{A}_3$ are linearly independent, we consider the equation
\[x_1\mathbf{A}_1+x_2\mathbf{A}_2+x_3\mathbf{A}_3=\mathbf{0}.\] If this equation has only the zero solution, then the vectors are linearly independent.

This equation can be written as the matrix equation
\[A\mathbf{x}=\mathbf{0},\] where $\mathbf{x}=\begin{bmatrix}
x_1 \\
x_2 \\
x_3
\end{bmatrix}$.
By part (a), the inverse matrix $A^{-1}$ exists. Multiplying both sides of $A\mathbf{x}=\mathbf{0}$ by $A^{-1}$ on the left gives $\mathbf{x}=A^{-1}\mathbf{0}=\mathbf{0}$. Thus the only solution is $\mathbf{x}=\mathbf{0}$, and the vectors $\mathbf{A}_1, \mathbf{A}_2, \mathbf{A}_3$ are linearly independent.

Another Solution of (b)

Note that $\mathbf{A}_1, \mathbf{A}_2, \mathbf{A}_3$ are the column vectors of the matrix $A$. We proved in part (a) that $A$ is invertible, and the column vectors of an invertible matrix are linearly independent. Thus the vectors $\mathbf{A}_1, \mathbf{A}_2, \mathbf{A}_3$ are linearly independent.
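
Either argument can also be verified numerically. The short sketch below (again an aside, assuming NumPy) uses the standard fact that three vectors in $\mathbb{R}^3$ are linearly independent exactly when the matrix having them as columns has rank $3$, equivalently a nonzero determinant.

```python
# Optional check of part (b): the columns of A are linearly independent
# precisely when A has full rank (equivalently, nonzero determinant).
import numpy as np

A1 = np.array([1, 2, 1], dtype=float)
A2 = np.array([2, 5, 1], dtype=float)
A3 = np.array([1, 4, 0], dtype=float)

A = np.column_stack([A1, A2, A3])  # this is the matrix A from part (a)

print(np.linalg.matrix_rank(A))  # 3, so the three vectors are linearly independent
print(np.linalg.det(A))          # 1.0 (up to rounding), in particular nonzero
```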

(c) Write the vector $\mathbf{b}$ as a linear combination of $\mathbf{A}_1$, $\mathbf{A}_2$, and $\mathbf{A}_3$

We want to solve the vector equation
\[x_1\mathbf{A}_1+x_2\mathbf{A}_2+x_3\mathbf{A}_3=\mathbf{b}.\] This can be written as the matrix equation
\[A\mathbf{x}=\mathbf{b}.\]

Since $A$ is invertible, we have
\[\mathbf{x}=A^{-1}\mathbf{b}=\begin{bmatrix}
-4 & 1 & 3 \\
4 &-1 &-2 \\
-3 & 1 & 1
\end{bmatrix}
\begin{bmatrix}
1 \\
1 \\
1
\end{bmatrix}=\begin{bmatrix}
0 \\
1 \\
-1
\end{bmatrix}.\]

Thus $x_1=0$, $x_2=1$, and $x_3=-1$, so the linear combination is
\[\mathbf{b}=\begin{bmatrix}
1 \\
1 \\
1
\end{bmatrix}=\mathbf{A}_2-\mathbf{A}_3.\]
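
As a final optional check (not part of the exam solution), the coefficients can be recovered by solving $A\mathbf{x}=\mathbf{b}$ directly; the sketch below again assumes NumPy.

```python
# Optional check of part (c): solve A x = b and confirm the coefficients (0, 1, -1).
import numpy as np

A = np.array([[1, 2, 1],
              [2, 5, 4],
              [1, 1, 0]], dtype=float)
b = np.array([1, 1, 1], dtype=float)

x = np.linalg.solve(A, b)  # same as A^{-1} b, since A is invertible
print(x)                   # [ 0.  1. -1.]

# Confirm the resulting linear combination: b = 0*A1 + 1*A2 - 1*A3 = A2 - A3.
A1, A2, A3 = A[:, 0], A[:, 1], A[:, 2]
assert np.allclose(A2 - A3, b)
```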

