The following problems are Midterm 1 problems of Linear Algebra (Math 2568) at the Ohio State University in Autumn 2017.
There were 9 problems that covered Chapter 1 of our textbook (Johnson, Riess, Arnold).
The time limit was 55 minutes.

This post is Part 3 and contains Problems 7, 8, and 9.
Check out Part 1 and Part 2 for the rest of the exam problems.

Problem 7. Let $A=\begin{bmatrix}
-3 & -4\\
8& 9
\end{bmatrix}$ and $\mathbf{v}=\begin{bmatrix}
-1 \\
2
\end{bmatrix}$.

(a) Calculate $A\mathbf{v}$ and find the number $\lambda$ such that $A\mathbf{v}=\lambda \mathbf{v}$.

(b) Without forming $A^3$, calculate the vector $A^3\mathbf{v}$.
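A quick way to sanity-check both parts numerically (a throwaway sketch; `matvec` is an ad hoc helper, not from the textbook): once $\lambda$ is found in part (a), $A^3\mathbf{v}=\lambda^3\mathbf{v}$, which repeated matrix-vector products confirm without ever forming $A^3$.

```python
# Ad hoc helper: multiply a matrix (list of rows) by a vector.
def matvec(M, x):
    return [sum(M[i][j] * x[j] for j in range(len(x))) for i in range(len(M))]

A = [[-3, -4], [8, 9]]
v = [-1, 2]

Av = matvec(A, v)                 # part (a): compare Av with v to read off lambda
A3v = matvec(A, matvec(A, Av))    # part (b): three products, A^3 is never formed
```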

Problem 8. Prove that if $A$ and $B$ are $n\times n$ nonsingular matrices, then the product $AB$ is also nonsingular.
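One standard route, sketched here as a hint rather than a full write-up (assuming nonsingularity has been identified with invertibility in the course): exhibit an explicit inverse for the product. Since $A^{-1}$ and $B^{-1}$ exist,
\[(AB)(B^{-1}A^{-1})=A(BB^{-1})A^{-1}=AA^{-1}=I_n,\]
and similarly $(B^{-1}A^{-1})(AB)=I_n$, so $B^{-1}A^{-1}$ is an inverse of $AB$.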

Problem 9.
Determine whether each of the following statements is true or false.

(a) There is a $3\times 3$ homogeneous system that has exactly three solutions.

(b) If $A$ and $B$ are $n\times n$ symmetric matrices, then the sum $A+B$ is also symmetric.

(c) If $n$-dimensional vectors $\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3$ are linearly dependent, then the vectors $\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3, \mathbf{v}_4$ are also linearly dependent for any $n$-dimensional vector $\mathbf{v}_4$.

(d) If the coefficient matrix of a system of linear equations is singular, then the system is inconsistent.


This post is Part 2 and contains Problems 4, 5, and 6.
Check out Part 1 and Part 3 for the rest of the exam problems.

Problem 4. Let
\[\mathbf{a}_1=\begin{bmatrix}
1 \\
2 \\
3
\end{bmatrix}, \mathbf{a}_2=\begin{bmatrix}
2 \\
-1 \\
4
\end{bmatrix}, \mathbf{b}=\begin{bmatrix}
0 \\
a \\
2
\end{bmatrix}.\]

Find all the values for $a$ so that the vector $\mathbf{b}$ is a linear combination of vectors $\mathbf{a}_1$ and $\mathbf{a}_2$.

Problem 5.
Find the inverse matrix of
\[A=\begin{bmatrix}
0 & 0 & 2 & 0 \\
0 &1 & 0 & 0 \\
1 & 0 & 0 & 0 \\
1 & 0 & 0 & 1
\end{bmatrix}\]
if it exists. If you think there is no inverse matrix of $A$, then give a reason.
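For checking a hand computation, here is a small Gauss-Jordan sketch over exact rationals (the `inverse` helper is ad hoc, written just for this check; it returns `None` when no inverse exists):

```python
from fractions import Fraction

def inverse(A):
    """Gauss-Jordan elimination on [A | I] over the rationals.

    Returns the inverse as a list of rows, or None if A is singular."""
    n = len(A)
    M = [[Fraction(x) for x in row] + [Fraction(1 if i == j else 0) for j in range(n)]
         for i, row in enumerate(A)]
    for c in range(n):
        piv = next((i for i in range(c, n) if M[i][c] != 0), None)
        if piv is None:
            return None  # no pivot in this column: A is singular
        M[c], M[piv] = M[piv], M[c]
        pv = M[c][c]
        M[c] = [x / pv for x in M[c]]
        for i in range(n):
            if i != c and M[i][c] != 0:
                f = M[i][c]
                M[i] = [a - f * b for a, b in zip(M[i], M[c])]
    return [row[n:] for row in M]

A = [[0, 0, 2, 0],
     [0, 1, 0, 0],
     [1, 0, 0, 0],
     [1, 0, 0, 1]]
Ainv = inverse(A)  # non-None, so A is invertible
```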

Problem 6.
Consider the system of linear equations
\begin{align*}
3x_1+2x_2&=1\\
5x_1+3x_2&=2.
\end{align*}

(a) Find the coefficient matrix $A$ of the system.

(b) Find the inverse matrix of the coefficient matrix $A$.

(c) Using the inverse matrix of $A$, find the solution of the system.

(Linear Algebra Midterm Exam 1, the Ohio State University)
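A sketch of parts (a)-(c) using the adjugate formula for a $2\times 2$ inverse, $\begin{bmatrix} a & b \\ c & d \end{bmatrix}^{-1}=\frac{1}{ad-bc}\begin{bmatrix} d & -b \\ -c & a \end{bmatrix}$, with exact rational arithmetic:

```python
from fractions import Fraction

A = [[3, 2], [5, 3]]                         # (a) coefficient matrix
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]  # nonzero, so A is invertible
Ainv = [[Fraction(A[1][1], det), Fraction(-A[0][1], det)],   # (b) adjugate / det
        [Fraction(-A[1][0], det), Fraction(A[0][0], det)]]
b = [1, 2]
x = [Ainv[0][0] * b[0] + Ainv[0][1] * b[1],  # (c) x = A^{-1} b
     Ainv[1][0] * b[0] + Ainv[1][1] * b[1]]
```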

Suppose that the vectors
\[\mathbf{v}_1=\begin{bmatrix}
-2 \\
1 \\
0 \\
0 \\
0
\end{bmatrix}, \qquad \mathbf{v}_2=\begin{bmatrix}
-4 \\
0 \\
-3 \\
-2 \\
1
\end{bmatrix}\]
form a basis for the null space of a $4\times 5$ matrix $A$. Find a vector $\mathbf{x}$ such that
\[\mathbf{x}\neq\mathbf{0}, \quad \mathbf{x}\neq \mathbf{v}_1, \quad \mathbf{x}\neq \mathbf{v}_2,\]
and
\[A\mathbf{x}=\mathbf{0}.\]

(Stanford University, Linear Algebra Exam Problem)
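A minimal sketch of one valid answer: the null space is a subspace, so every linear combination of $\mathbf{v}_1$ and $\mathbf{v}_2$ also satisfies $A\mathbf{x}=\mathbf{0}$; taking $\mathbf{x}=\mathbf{v}_1+\mathbf{v}_2$ gives a vector distinct from $\mathbf{0}$, $\mathbf{v}_1$, and $\mathbf{v}_2$.

```python
v1 = [-2, 1, 0, 0, 0]
v2 = [-4, 0, -3, -2, 1]

# v1 + v2 lies in the null space (a subspace is closed under addition)
x = [p + q for p, q in zip(v1, v2)]
```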

Let $A$ be a $3\times 3$ matrix. Suppose that $A$ has eigenvalues $2$ and $-1$, and suppose that $\mathbf{u}$ and $\mathbf{v}$ are eigenvectors corresponding to $2$ and $-1$, respectively, where
\[\mathbf{u}=\begin{bmatrix}
1 \\
0 \\
-1
\end{bmatrix} \text{ and } \mathbf{v}=\begin{bmatrix}
2 \\
1 \\
0
\end{bmatrix}.\]
Then compute $A^5\mathbf{w}$, where
\[\mathbf{w}=\begin{bmatrix}
7 \\
2 \\
-3
\end{bmatrix}.\]
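A sketch of the usual eigenvector method: write $\mathbf{w}=a\mathbf{u}+b\mathbf{v}$, solve for the coefficients coordinate-wise, and then $A^5\mathbf{w}=a\,2^5\,\mathbf{u}+b\,(-1)^5\,\mathbf{v}$.

```python
u = [1, 0, -1]
v = [2, 1, 0]

# Coefficients from solving w = a*u + b*v entry by entry.
a, b = 3, 2
w = [a * ui + b * vi for ui, vi in zip(u, v)]          # reproduces [7, 2, -3]

# Each eigenvector is scaled by the fifth power of its eigenvalue.
A5w = [a * 2**5 * ui + b * (-1)**5 * vi for ui, vi in zip(u, v)]
```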

Let $V$ be a vector space over a scalar field $K$.
Let $S=\{\mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_n\}$ be a set of vectors in $V$, where $n \geq 2$.

Then prove that the set $S$ is linearly dependent if and only if at least one of the vectors in $S$ can be written as a linear combination of the remaining vectors in $S$.

Let $n$ be a positive integer and let $T:\R^n \to \R$ be a non-zero linear transformation.
Prove the following.

(a) The nullity of $T$ is $n-1$. That is, the dimension of the null space of $T$ is $n-1$.

(b) Let $B=\{\mathbf{v}_1, \cdots, \mathbf{v}_{n-1}\}$ be a basis of the null space $\calN(T)$ of $T$.
Let $\mathbf{w}$ be an $n$-dimensional vector that is not in $\calN(T)$. Then
\[B'=\{\mathbf{v}_1, \cdots, \mathbf{v}_{n-1}, \mathbf{w}\}\]
is a basis of $\R^n$.

(c) Each vector $\mathbf{u}\in \R^n$ can be expressed as
\[\mathbf{u}=\mathbf{v}+\frac{T(\mathbf{u})}{T(\mathbf{w})}\mathbf{w}\]
for some vector $\mathbf{v}\in \calN(T)$.

Let $T$ be the linear transformation from $\R^3$ to itself satisfying the following relations.
\begin{align*}
T\left(\, \begin{bmatrix}
1 \\
1 \\
1
\end{bmatrix} \,\right)
=\begin{bmatrix}
1 \\
0 \\
1
\end{bmatrix}, \qquad T\left(\, \begin{bmatrix}
2 \\
3 \\
5
\end{bmatrix} \, \right) =
\begin{bmatrix}
0 \\
2 \\
-1
\end{bmatrix}, \qquad
T \left( \, \begin{bmatrix}
0 \\
1 \\
2
\end{bmatrix} \, \right)=
\begin{bmatrix}
1 \\
0 \\
0
\end{bmatrix}.
\end{align*}
Then for any vector
\[\mathbf{x}=\begin{bmatrix}
x \\
y \\
z
\end{bmatrix}\in \R^3,\]
find the formula for $T(\mathbf{x})$.
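One possible approach, sketched under the observation that the three given input vectors are linearly independent: let $B$ be the matrix whose columns are the three inputs and $C$ the matrix whose columns are the corresponding outputs. If $M$ is the (unknown) standard matrix of $T$, the three relations say exactly that $MB=C$, so
\[M=CB^{-1}, \qquad T(\mathbf{x})=CB^{-1}\mathbf{x}.\]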

Let
\[\mathbf{v}=\begin{bmatrix}
a \\
b \\
c
\end{bmatrix}, \qquad \mathbf{v}_1=\begin{bmatrix}
1 \\
2 \\
0
\end{bmatrix}, \qquad \mathbf{v}_2=\begin{bmatrix}
2 \\
-1 \\
2
\end{bmatrix}.\]
Find a necessary and sufficient condition on $a, b, c$ so that the vector $\mathbf{v}$ is a linear combination of the vectors $\mathbf{v}_1$ and $\mathbf{v}_2$.
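One way to package such a condition, offered as a hint: since $\mathbf{v}_1$ and $\mathbf{v}_2$ are linearly independent, $\mathbf{v}$ is a linear combination of them if and only if the three vectors are linearly dependent, that is,
\[\det\begin{bmatrix}
1 & 2 & a \\
2 & -1 & b \\
0 & 2 & c
\end{bmatrix}=0.\]
Expanding the determinant turns this into a single linear equation in $a$, $b$, $c$.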

Let $V$ be the subset of the vector space $\R^n$ consisting only of the zero vector; namely, $V=\{\mathbf{0}\}$.
Then prove that $V$ is a subspace of $\R^n$.

Let $\mathbf{v}_1$ and $\mathbf{v}_2$ be $2$-dimensional vectors and let $A$ be a $2\times 2$ matrix.

(a) Show that if $\mathbf{v}_1, \mathbf{v}_2$ are linearly dependent vectors, then the vectors $A\mathbf{v}_1, A\mathbf{v}_2$ are also linearly dependent.

(b) If $\mathbf{v}_1, \mathbf{v}_2$ are linearly independent vectors, can we conclude that the vectors $A\mathbf{v}_1, A\mathbf{v}_2$ are also linearly independent?

(c) If $\mathbf{v}_1, \mathbf{v}_2$ are linearly independent vectors and $A$ is nonsingular, then show that the vectors $A\mathbf{v}_1, A\mathbf{v}_2$ are also linearly independent.
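For part (b), one singular matrix already settles the question in the negative; a minimal sketch (the particular $A$, $\mathbf{v}_1$, $\mathbf{v}_2$ below are illustrative choices, not from the problem):

```python
def matvec(M, x):
    """Multiply a 2x2 matrix (list of rows) by a 2-vector."""
    return [sum(M[i][j] * x[j] for j in range(2)) for i in range(2)]

v1 = [1, 0]          # the standard basis vectors: linearly independent
v2 = [0, 1]
A = [[1, 0],
     [0, 0]]         # singular: it sends v2 to the zero vector

Av1 = matvec(A, v1)
Av2 = matvec(A, v2)  # the zero vector, so {Av1, Av2} is linearly dependent
```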

Let $V$ be a finite dimensional vector space over a field $k$ and let $V^*=\Hom(V, k)$ be the dual vector space of $V$.
Let $\{v_i\}_{i=1}^n$ be a basis of $V$ and let $\{v^i\}_{i=1}^n$ be the dual basis of $V^*$. Then prove that
\[x=\sum_{i=1}^nv^i(x)v_i\]
for any vector $x\in V$.

(a) For what value(s) of $a$ is the following set $S$ linearly dependent?
\[ S=\left \{\,\begin{bmatrix}
1 \\
2 \\
3 \\
a
\end{bmatrix}, \begin{bmatrix}
a \\
0 \\
-1 \\
2
\end{bmatrix}, \begin{bmatrix}
0 \\
0 \\
a^2 \\
7
\end{bmatrix}, \begin{bmatrix}
1 \\
a \\
1 \\
1
\end{bmatrix}, \begin{bmatrix}
2 \\
-2 \\
3 \\
a^3
\end{bmatrix} \, \right\}.\]

(b) Let $\{\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3\}$ be a set of nonzero vectors in $\R^m$ such that the dot product
\[\mathbf{v}_i\cdot \mathbf{v}_j=0\]
when $i\neq j$.
Prove that the set is linearly independent.
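A standard device for part (b), sketched: suppose
\[c_1\mathbf{v}_1+c_2\mathbf{v}_2+c_3\mathbf{v}_3=\mathbf{0}\]
and take the dot product of both sides with $\mathbf{v}_i$. By orthogonality the other two terms vanish, leaving $c_i(\mathbf{v}_i\cdot \mathbf{v}_i)=0$; since $\mathbf{v}_i$ is nonzero, $\mathbf{v}_i\cdot\mathbf{v}_i\neq 0$, and hence $c_i=0$ for each $i$.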

Determine conditions on the scalars $a, b$ so that the following set $S$ of vectors is linearly dependent.
\begin{align*}
S=\{\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3\},
\end{align*}
where
\[\mathbf{v}_1=\begin{bmatrix}
1 \\
3 \\
1
\end{bmatrix}, \mathbf{v}_2=\begin{bmatrix}
1 \\
a \\
4
\end{bmatrix}, \mathbf{v}_3=\begin{bmatrix}
0 \\
2 \\
b
\end{bmatrix}.\]

Determine whether the following set of vectors is linearly independent or linearly dependent. If the set is linearly dependent, express one vector in the set as a linear combination of the others.
\[\left\{\, \begin{bmatrix}
1 \\
0 \\
-1 \\
0
\end{bmatrix}, \begin{bmatrix}
1 \\
2 \\
3 \\
4
\end{bmatrix}, \begin{bmatrix}
-1 \\
-2 \\
0 \\
1
\end{bmatrix},
\begin{bmatrix}
-2 \\
-2 \\
7 \\
11
\end{bmatrix}\, \right\}.\]
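For readers who want to check their elimination, a small ad hoc sketch over exact rationals (`rank` is a throwaway helper, not from any library): placing the four vectors as rows, the set is linearly dependent exactly when the rank is less than $4$.

```python
from fractions import Fraction

def rank(rows):
    """Return the rank of a matrix (list of rows) via Gaussian elimination."""
    M = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for c in range(len(M[0])):
        piv = next((i for i in range(r, len(M)) if M[i][c] != 0), None)
        if piv is None:
            continue  # no pivot in this column
        M[r], M[piv] = M[piv], M[r]
        pv = M[r][c]
        M[r] = [x / pv for x in M[r]]
        for i in range(len(M)):
            if i != r and M[i][c] != 0:
                f = M[i][c]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
        if r == len(M):
            break
    return r

vecs = [[1, 0, -1, 0], [1, 2, 3, 4], [-1, -2, 0, 1], [-2, -2, 7, 11]]
dependent = rank(vecs) < len(vecs)  # True: a nontrivial relation exists
```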

Suppose that $\lambda$ and $\mu$ are two distinct eigenvalues of a square matrix $A$ and let $\mathbf{x}$ and $\mathbf{y}$ be eigenvectors corresponding to $\lambda$ and $\mu$, respectively.
If $a$ and $b$ are nonzero numbers, then prove that $a \mathbf{x}+b\mathbf{y}$ is not an eigenvector of $A$ (corresponding to any eigenvalue of $A$).

Let $A$ be an $n\times n$ matrix. Suppose that $\lambda_1, \lambda_2$ are distinct eigenvalues of the matrix $A$ and let $\mathbf{v}_1, \mathbf{v}_2$ be eigenvectors corresponding to $\lambda_1, \lambda_2$, respectively.

Show that the vectors $\mathbf{v}_1, \mathbf{v}_2$ are linearly independent.