Tagged: linear algebra

Quiz 4: Inverse Matrix / Nonsingular Matrix Satisfying a Relation

Problem 289

(a) Find the inverse matrix of
\[A=\begin{bmatrix}
1 & 0 & 1 \\
1 &0 &0 \\
2 & 1 & 1
\end{bmatrix}\] if it exists. If you think there is no inverse matrix of $A$, then give a reason.

(b) Find a nonsingular $2\times 2$ matrix $A$ such that
\[A^3=A^2B-3A^2,\] where
\[B=\begin{bmatrix}
4 & 1\\
2& 6
\end{bmatrix}.\] Verify that the matrix $A$ you obtained is actually a nonsingular matrix.

(The Ohio State University, Linear Algebra Midterm Exam Problem)
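As a rough numerical sanity check (a SymPy sketch, not the graded solution), one can test part (a) by computing the determinant and inverse directly, and part (b) by trying the candidate obtained from cancelling $A^2$ in $A^3=A^2(B-3I)$, a step that is only valid when $A$ is nonsingular:

```python
from sympy import Matrix, eye

# Part (a): a nonzero determinant means A is invertible, and .inv() returns A^{-1}.
A = Matrix([[1, 0, 1],
            [1, 0, 0],
            [2, 1, 1]])
print(A.det())
print(A.inv())

# Part (b): if A is nonsingular, cancelling A^2 in A^3 = A^2(B - 3I)
# suggests the candidate A = B - 3I; verify the relation and the determinant.
B = Matrix([[4, 1],
            [2, 6]])
candidate = B - 3 * eye(2)
print(candidate**3 == candidate**2 * B - 3 * candidate**2)
print(candidate.det())
```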
 
Read solution


Summary: Possibilities for the Solution Set of a System of Linear Equations

Problem 288

In this post, we summarize theorems about the possibilities for the solution set of a system of linear equations and solve the following problems.

Determine all possibilities for the solution set of the system of linear equations described below.

(a) A homogeneous system of $3$ equations in $5$ unknowns.

(b) A homogeneous system of $5$ equations in $4$ unknowns.

(c) A system of $5$ equations in $4$ unknowns.

(d) A system of $2$ equations in $3$ unknowns that has $x_1=1, x_2=-5, x_3=0$ as a solution.

(e) A homogeneous system of $4$ equations in $4$ unknowns.

(f) A homogeneous system of $3$ equations in $4$ unknowns.

(g) A homogeneous system that has $x_1=3, x_2=-2, x_3=1$ as a solution.

(h) A homogeneous system of $5$ equations in $3$ unknowns and the rank of the system is $3$.

(i) A system of $3$ equations in $2$ unknowns and the rank of the system is $2$.

(j) A homogeneous system of $4$ equations in $3$ unknowns and the rank of the system is $2$.
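For intuition on these cases, the key tool is the rank of the coefficient matrix compared with the number of unknowns. Below is a small SymPy illustration of case (a) with an arbitrary sample coefficient matrix (my own choice); it is not a substitute for the general argument:

```python
from sympy import Matrix

# Case (a): 3 homogeneous equations in 5 unknowns.  The rank is at most 3,
# so the null space has dimension at least 5 - 3 = 2, giving infinitely
# many solutions in addition to the trivial one.
A = Matrix([[1, 2, 0, -1, 3],
            [0, 1, 1,  2, 0],
            [2, 5, 1,  0, 6]])
print(A.rank())            # at most 3
print(len(A.nullspace()))  # number of free directions = 5 - rank
```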
 
Read solution


Basis For Subspace Consisting of Matrices Commute With a Given Diagonal Matrix

Problem 287

Let $V$ be the vector space of all $3\times 3$ real matrices.
Let $A$ be the matrix given below, and define
\[W=\{M\in V \mid AM=MA\}.\] That is, $W$ consists of matrices that commute with $A$.
Then $W$ is a subspace of $V$.

Determine which matrices are in the subspace $W$ and find the dimension of $W$.

(a) \[A=\begin{bmatrix}
a & 0 & 0 \\
0 &b &0 \\
0 & 0 & c
\end{bmatrix},\] where $a, b, c$ are distinct real numbers.

(b) \[A=\begin{bmatrix}
a & 0 & 0 \\
0 &a &0 \\
0 & 0 & b
\end{bmatrix},\] where $a, b$ are distinct real numbers.
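A quick way to experiment with part (a) is to impose $AM=MA$ on a general $3\times 3$ matrix symbolically and count the remaining free parameters. The sketch below (SymPy assumed) uses the sample values $1, 2, 3$ in place of the distinct numbers $a, b, c$:

```python
from sympy import Matrix, diag, symbols, solve

# Entries of a general 3x3 matrix M; impose the linear system AM = MA.
m = symbols('m1:10')
M = Matrix(3, 3, list(m))
A = diag(1, 2, 3)                      # stands in for distinct a, b, c
constraints = solve(list(A*M - M*A), m, dict=True)[0]
free = [entry for entry in m if entry not in constraints]
print(len(free))                       # number of free parameters = dim W
```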

 
Read solution


Linearly Independent Vectors $\mathbf{v}_1, \mathbf{v}_2$ and Linearly Independent Vectors $A\mathbf{v}_1, A\mathbf{v}_2$ for a Nonsingular Matrix

Problem 284

Let $\mathbf{v}_1$ and $\mathbf{v}_2$ be $2$-dimensional vectors and let $A$ be a $2\times 2$ matrix.

(a) Show that if $\mathbf{v}_1, \mathbf{v}_2$ are linearly dependent vectors, then the vectors $A\mathbf{v}_1, A\mathbf{v}_2$ are also linearly dependent.

(b) If $\mathbf{v}_1, \mathbf{v}_2$ are linearly independent vectors, can we conclude that the vectors $A\mathbf{v}_1, A\mathbf{v}_2$ are also linearly independent?

(c) If $\mathbf{v}_1, \mathbf{v}_2$ are linearly independent vectors and $A$ is nonsingular, then show that the vectors $A\mathbf{v}_1, A\mathbf{v}_2$ are also linearly independent.
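For part (b), a single numerical experiment already hints at the answer. The sketch below (SymPy; the matrix and vectors are my own sample choices) applies a singular $A$ to two independent vectors and checks the rank of the resulting columns:

```python
from sympy import Matrix

A  = Matrix([[1, 1],
             [1, 1]])        # a singular 2x2 matrix (det = 0)
v1 = Matrix([1, 0])
v2 = Matrix([0, 1])          # v1, v2 are linearly independent
images = Matrix.hstack(A*v1, A*v2)
print(images.rank())         # rank < 2 means A*v1, A*v2 are linearly dependent
```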

 
Read solution


Dual Vector Space and Dual Basis, Some Equality

Problem 282

Let $V$ be a finite dimensional vector space over a field $k$ and let $V^*=\Hom(V, k)$ be the dual vector space of $V$.
Let $\{v_i\}_{i=1}^n$ be a basis of $V$ and let $\{v^i\}_{i=1}^n$ be the dual basis of $V^*$. Then prove that
\[x=\sum_{i=1}^nv^i(x)v_i\] for any vector $x\in V$.
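A concrete coordinate check (SymPy; taking $V=\R^3$ with an arbitrary sample basis and vector of my own choosing): if the basis vectors are the columns of $P$, then the dual basis functionals act as the rows of $P^{-1}$, and the sum in the statement reproduces $x$:

```python
from sympy import Matrix, zeros

# Columns of P are the basis vectors v_1, v_2, v_3; the i-th dual basis
# functional sends x to the i-th entry of P**(-1) * x.
P = Matrix([[1, 1, 0],
            [0, 1, 1],
            [1, 0, 1]])
Pinv = P.inv()
x = Matrix([3, -1, 2])
reconstructed = zeros(3, 1)
for i in range(3):
    reconstructed += (Pinv.row(i) * x)[0] * P.col(i)   # v^i(x) * v_i
print(reconstructed == x)
```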

 
Read solution


Quiz 3. Condition that Vectors are Linearly Dependent / Orthogonal Vectors are Linearly Independent

Problem 281

(a) For what value(s) of $a$ is the following set $S$ linearly dependent?
\[ S=\left \{\,\begin{bmatrix}
1 \\
2 \\
3 \\
a
\end{bmatrix}, \begin{bmatrix}
a \\
0 \\
-1 \\
2
\end{bmatrix}, \begin{bmatrix}
0 \\
0 \\
a^2 \\
7
\end{bmatrix}, \begin{bmatrix}
1 \\
a \\
1 \\
1
\end{bmatrix}, \begin{bmatrix}
2 \\
-2 \\
3 \\
a^3
\end{bmatrix} \, \right\}.\]

(b) Let $\{\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3\}$ be a set of nonzero vectors in $\R^m$ such that the dot product
\[\mathbf{v}_i\cdot \mathbf{v}_j=0\] when $i\neq j$.
Prove that the set is linearly independent.
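For part (a) it can help to see the shape of the data first: the five vectors form the columns of a $4\times 5$ matrix, so its rank is at most $4$. A SymPy sketch with a few sample values of $a$ (my own choices):

```python
from sympy import Matrix, symbols

a = symbols('a')
# Columns are the five vectors of S, with the parameter a kept symbolic.
S = Matrix([[1,  a, 0,    1, 2],
            [2,  0, 0,    a, -2],
            [3, -1, a**2, 1, 3],
            [a,  2, 7,    1, a**3]])
for value in [0, 1, -2]:
    print(value, S.subs(a, value).rank())   # never more than 4 pivots for 5 columns
```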

 
Read solution


Find a Nonsingular Matrix Satisfying Some Relation

Problem 280

Determine whether there exists a nonsingular matrix $A$ satisfying
\[A^2=AB+2A,\] where $B$ is each of the following matrices.
If such a nonsingular matrix $A$ exists, find the inverse matrix $A^{-1}$.

(a) \[B=\begin{bmatrix}
-1 & 1 & -1 \\
0 &-1 &0 \\
1 & 2 & -2
\end{bmatrix}\]

(b) \[B=\begin{bmatrix}
-1 & 1 & -1 \\
0 &-1 &0 \\
2 & 1 & -4
\end{bmatrix}.\]
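A useful starting point (hedged, not the full argument): if a nonsingular $A$ satisfies $A^2=A(B+2I)$, then cancelling $A$ on the left forces $A=B+2I$, so the determinant of $B+2I$ decides each case. A SymPy check:

```python
from sympy import Matrix, eye

for B in [Matrix([[-1, 1, -1], [0, -1, 0], [1, 2, -2]]),   # case (a)
          Matrix([[-1, 1, -1], [0, -1, 0], [2, 1, -4]])]:  # case (b)
    candidate = B + 2 * eye(3)
    print(candidate.det())   # nonzero means the candidate is nonsingular
```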

 
Read solution


Determine Linearly Independent or Linearly Dependent. Express as a Linear Combination

Problem 277

Determine whether the following set of vectors is linearly independent or linearly dependent. If the set is linearly dependent, express one vector in the set as a linear combination of the others.
\[\left\{\, \begin{bmatrix}
1 \\
0 \\
-1 \\
0
\end{bmatrix}, \begin{bmatrix}
1 \\
2 \\
3 \\
4
\end{bmatrix}, \begin{bmatrix}
-1 \\
-2 \\
0 \\
1
\end{bmatrix},
\begin{bmatrix}
-2 \\
-2 \\
7 \\
11
\end{bmatrix}\, \right\}.\]
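One routine way to check this (a SymPy sketch, with the vectors placed as columns): the pivot columns of the reduced row echelon form give an independent subset, and the entries of any non-pivot column give the coefficients of a linear combination:

```python
from sympy import Matrix

M = Matrix([[ 1, 1, -1, -2],
            [ 0, 2, -2, -2],
            [-1, 3,  0,  7],
            [ 0, 4,  1, 11]])
rref, pivots = M.rref()
print(pivots)   # indices of the pivot (independent) columns
print(rref)     # non-pivot columns list coefficients w.r.t. the pivot columns
```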

 
Read solution


Linear Transformation, Basis For the Range, Rank, and Nullity, Not Injective

Problem 276

Let $V$ be the vector space of all $2\times 2$ real matrices and let $P_3$ be the vector space of all polynomials of degree $3$ or less with real coefficients.
Let $T: P_3 \to V$ be the linear transformation defined by
\[T(a_0+a_1x+a_2x^2+a_3x^3)=\begin{bmatrix}
a_0+a_2 & -a_0+a_3\\
a_1-a_2 & -a_1-a_3
\end{bmatrix}\] for any polynomial $a_0+a_1x+a_2x^2+a_3x^3 \in P_3$.
Find a basis for the range of $T$, $\calR(T)$, and determine the rank of $T$, $\rk(T)$, and the nullity of $T$, $\nullity(T)$.
Also, prove that $T$ is not injective.
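A coordinate version of the problem (a sketch; the bases $\{1, x, x^2, x^3\}$ of $P_3$ and $\{E_{11}, E_{12}, E_{21}, E_{22}\}$ of $V$ are my choice) turns $T$ into a $4\times 4$ matrix whose rank and nullity agree with those of $T$:

```python
from sympy import Matrix

# Columns: coordinate vectors of T(1), T(x), T(x^2), T(x^3).
MT = Matrix([[ 1,  0,  1,  0],   # entry a_0 + a_2
             [-1,  0,  0,  1],   # entry -a_0 + a_3
             [ 0,  1, -1,  0],   # entry a_1 - a_2
             [ 0, -1,  0, -1]])  # entry -a_1 - a_3
print(MT.rank())              # rank of T
print(MT.cols - MT.rank())    # nullity of T; positive nullity means T is not injective
```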

 
Read solution


The Inverse Matrix of an Upper Triangular Matrix with Variables

Problem 275

Let $A$ be the following $3\times 3$ upper triangular matrix.
\[A=\begin{bmatrix}
1 & x & y \\
0 &1 &z \\
0 & 0 & 1
\end{bmatrix},\] where $x, y, z$ are some real numbers.

Determine whether the matrix $A$ is invertible or not. If it is invertible, then find the inverse matrix $A^{-1}$.
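A symbolic check with SymPy (a sketch, not the written-out computation): the determinant of a triangular matrix is the product of its diagonal entries, and the inverse can be computed with $x, y, z$ kept symbolic:

```python
from sympy import Matrix, symbols

x, y, z = symbols('x y z')
A = Matrix([[1, x, y],
            [0, 1, z],
            [0, 0, 1]])
print(A.det())    # product of the diagonal entries, independent of x, y, z
print(A.inv())    # inverse with symbolic entries
```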

 
Read solution


Quiz 2. The Vector Form For the General Solution / Transpose Matrices. Math 2568 Spring 2017.

Problem 273

(a) The given matrix is the augmented matrix for a system of linear equations.
Give the vector form for the general solution.
\[ \left[\begin{array}{rrrrr|r}
1 & 0 & -1 & 0 &-2 & 0 \\
0 & 1 & 2 & 0 & -1 & 0 \\
0 & 0 & 0 & 1 & 1 & 0 \\
\end{array} \right].\]

(b) Let
\[A=\begin{bmatrix}
1 & 2 & 3 \\
4 &5 &6
\end{bmatrix}, B=\begin{bmatrix}
1 & 0 & 1 \\
0 &1 &0
\end{bmatrix}, C=\begin{bmatrix}
1 & 2\\
0& 6
\end{bmatrix}, \mathbf{v}=\begin{bmatrix}
0 \\
1 \\
0
\end{bmatrix}.\] Then compute and simplify the following expression.
\[\mathbf{v}^{\trans}\left( A^{\trans}-(A-B)^{\trans}\right)C.\]
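A quick machine check of both parts (SymPy; the nullspace call below applies to part (a) because the right-hand column of the augmented matrix is zero, so the system is homogeneous):

```python
from sympy import Matrix

# (a) General solution: spanning vectors of the null space of the
# coefficient matrix (the last column of the augmented matrix is zero).
aug = Matrix([[1, 0, -1, 0, -2, 0],
              [0, 1,  2, 0, -1, 0],
              [0, 0,  0, 1,  1, 0]])
coeff = aug[:, :5]                 # drop the zero right-hand column
for v in coeff.nullspace():
    print(v.T)

# (b) Direct evaluation of the expression.
A = Matrix([[1, 2, 3], [4, 5, 6]])
B = Matrix([[1, 0, 1], [0, 1, 0]])
C = Matrix([[1, 2], [0, 6]])
v = Matrix([0, 1, 0])
print(v.T * (A.T - (A - B).T) * C)
```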

 
Read solution


Prove a Given Subset is a Subspace and Find a Basis and Dimension

Problem 270

Let
\[A=\begin{bmatrix}
4 & 1\\
3& 2
\end{bmatrix}\] and consider the following subset $V$ of the 2-dimensional vector space $\R^2$.
\[V=\{\mathbf{x}\in \R^2 \mid A\mathbf{x}=5\mathbf{x}\}.\]

(a) Prove that the subset $V$ is a subspace of $\R^2$.

(b) Find a basis for $V$ and determine the dimension of $V$.
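Since $A\mathbf{x}=5\mathbf{x}$ is equivalent to $(A-5I)\mathbf{x}=\mathbf{0}$, the set $V$ is the null space of $A-5I$. A short SymPy check of part (b):

```python
from sympy import Matrix, eye

A = Matrix([[4, 1],
            [3, 2]])
basis = (A - 5 * eye(2)).nullspace()   # V = null space of A - 5I
print(basis)        # basis vectors of V
print(len(basis))   # dimension of V
```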

 
Read solution


Vector Form for the General Solution of a System of Linear Equations

Problem 267

Solve the following system of linear equations by transforming its augmented matrix to reduced row echelon form (Gauss-Jordan elimination).

Find the vector form for the general solution.
\begin{align*}
x_1-x_3-3x_5&=1\\
3x_1+x_2-x_3+x_4-9x_5&=3\\
x_1-x_3+x_4-2x_5&=1.
\end{align*}
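The elimination itself can be checked mechanically (a SymPy sketch): row-reduce the augmented matrix and read off the pivot and free variables:

```python
from sympy import Matrix

aug = Matrix([[1, 0, -1, 0, -3, 1],
              [3, 1, -1, 1, -9, 3],
              [1, 0, -1, 1, -2, 1]])
rref, pivots = aug.rref()
print(rref)      # reduced row echelon form of the augmented matrix
print(pivots)    # pivot columns; the remaining variables are free
```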

 
Read solution


Invertible Matrix Satisfying a Quadratic Polynomial

Problem 266

Let $A$ be an $n \times n$ matrix satisfying
\[A^2+c_1A+c_0I=O,\] where $c_0, c_1$ are scalars, $I$ is the $n\times n$ identity matrix, and $O$ is the $n\times n$ zero matrix.

Prove that if $c_0\neq 0$, then the matrix $A$ is invertible (nonsingular).
How about the converse? Namely, is it true that if $c_0=0$, then the matrix $A$ is not invertible?
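A numerical sanity check of the first claim (the matrix below is my own example, chosen via its characteristic polynomial so that $A^2-5A+6I=O$, i.e. $c_1=-5$, $c_0=6$): rewriting the relation as $A(A+c_1I)=-c_0I$ suggests an explicit inverse.

```python
from sympy import Matrix, eye, Rational

A = Matrix([[2, 1],
            [0, 3]])
c1, c0 = -5, 6
print(A**2 + c1 * A + c0 * eye(2))            # the zero matrix, as required
candidate = -Rational(1, c0) * (A + c1 * eye(2))
print(candidate == A.inv())                   # the candidate equals A^{-1}
```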

 
Read solution


Idempotent Matrices. 2007 University of Tokyo Entrance Exam Problem

Problem 265

For a real number $a$, consider $2\times 2$ matrices $A, P, Q$ satisfying the following five conditions.

  1. $A=aP+(a+1)Q$
  2. $P^2=P$
  3. $Q^2=Q$
  4. $PQ=O$
  5. $QP=O$,

where $O$ is the $2\times 2$ zero matrix.
Then solve the following problems.


(a) Prove that $(P+Q)A=A$.


(b) Suppose $a$ is a positive real number and let
\[ A=\begin{bmatrix}
a & 0\\
1& a+1
\end{bmatrix}.\] Then find all matrices $P, Q$ satisfying conditions (1)-(5).


(c) Let $n$ be an integer greater than $1$. For any integer $k$, $2\leq k \leq n$, we define the matrix
\[A_k=\begin{bmatrix}
k & 0\\
1& k+1
\end{bmatrix}.\] Then calculate and simplify the matrix product
\[A_nA_{n-1}A_{n-2}\cdots A_2.\]

(Tokyo University Entrance Exam 2007)
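For part (c), computing the product for a few small $n$ can help in guessing a closed form, which the actual solution must still prove. A SymPy sketch:

```python
from sympy import Matrix, eye

def A(k):
    return Matrix([[k, 0], [1, k + 1]])

for n in [2, 3, 4, 5]:
    product = eye(2)
    for k in range(n, 1, -1):   # multiply A_n * A_{n-1} * ... * A_2
        product *= A(k)
    print(n, product)
```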
 
Read solution
