A Linear Transformation Preserves Exactly Two Lines If and Only If There are Two Real Non-Zero Eigenvalues
Problem 472
Let $T:\R^2 \to \R^2$ be a linear transformation and let $A$ be the matrix representation of $T$ with respect to the standard basis of $\R^2$.
Prove that the following two statements are equivalent.
(a) There are exactly two distinct lines $L_1, L_2$ in $\R^2$ passing through the origin that are mapped onto themselves:
\[T(L_1)=L_1 \text{ and } T(L_2)=L_2.\]
(b) The matrix $A$ has two distinct nonzero real eigenvalues.
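Before the proof, a concrete sanity check may help. The matrix $\begin{pmatrix} 2 & 0 \\ 0 & 3 \end{pmatrix}$ scales the $x$-axis by $2$ and the $y$-axis by $3$, so exactly those two lines are preserved, matching its eigenvalues $2, 3$. A rotation by $90^\circ$ preserves no line through the origin and has no real eigenvalues. Here is a minimal pure-Python sketch (the helper `eigenvalues_2x2` is ours, not from the post) that computes the real eigenvalues of a $2\times 2$ matrix from its characteristic polynomial:

```python
import math

def eigenvalues_2x2(a, b, c, d):
    """Real eigenvalues of [[a, b], [c, d]], found as the roots of the
    characteristic polynomial t^2 - (a + d) t + (ad - bc) = 0."""
    tr, det = a + d, a * d - b * c
    disc = tr * tr - 4 * det
    if disc < 0:
        return []  # complex eigenvalues: no invariant line
    r = math.sqrt(disc)
    return sorted({(tr - r) / 2, (tr + r) / 2})

# Diagonal scaling: two distinct nonzero real eigenvalues.
print(eigenvalues_2x2(2, 0, 0, 3))   # [2.0, 3.0]

# Rotation by 90 degrees, [[0, -1], [1, 0]]: no real eigenvalues.
print(eigenvalues_2x2(0, -1, 1, 0))  # []
```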

Proof.
(a) $\implies$ (b).
Suppose that statement (a) holds.
That is, there are exactly two distinct lines $L_1, L_2$ in $\R^2$ passing through the origin that are mapped onto themselves, and no other line passing through the origin is mapped onto itself.
A line passing through the origin in $\R^2$ is a one-dimensional subspace in $\R^2$. Thus, it is spanned by a single nonzero vector.
We have
\[L_1=\Span(\mathbf{v}_1) \text{ and } L_2=\Span(\mathbf{v}_2)\]
for some nonzero vectors $\mathbf{v}_1, \mathbf{v}_2$.
Since $T(L_1)=L_1$, we have $A\mathbf{v}_1\in L_1=\Span(\mathbf{v}_1)$.
Hence there exists $\lambda_1 \in \R$ such that
\[A\mathbf{v}_1=\lambda_1\mathbf{v}_1.\]
Since $\mathbf{v}_1$ is a nonzero vector, this means that $\lambda_1$ is an eigenvalue of $A$.
We claim that $\lambda_1\neq 0$. Otherwise, we have $A\mathbf{v}_1=\mathbf{0}$.
Any vector in $L_1$ is of the form $\mathbf{v}=a\mathbf{v}_1$ for some $a\in \R$. So we have
\[A\mathbf{v}=A(a\mathbf{v}_1)=aA\mathbf{v}_1=\mathbf{0},\]
and this yields that $T(L_1)=\{\mathbf{0}\}$, a contradiction.
Hence $\lambda_1$ is a nonzero real eigenvalue of $A$.
By the same argument, there is a nonzero real eigenvalue $\lambda_2$ such that $A\mathbf{v}_2=\lambda_2 \mathbf{v}_2$.
It remains to show that $\lambda_1 \neq \lambda_2$.
Assume on the contrary that $\lambda:=\lambda_1=\lambda_2$.
Since $\mathbf{v}_1, \mathbf{v}_2$ span two distinct lines through the origin, they are linearly independent and hence form a basis of $\R^2$.
Hence any vector $\mathbf{v}\in \R^2$ can be written as a linear combination
\[\mathbf{v}=a\mathbf{v}_1+b\mathbf{v}_2\]
for some $a, b \in \R$.
Then we have
\begin{align*}
A\mathbf{v}&=A(a\mathbf{v}_1+b\mathbf{v}_2)\\
&=aA\mathbf{v}_1+bA\mathbf{v}_2\\
&=a\lambda \mathbf{v}_1+b\lambda \mathbf{v}_2\\
&=\lambda (a\mathbf{v}_1+b\mathbf{v}_2)\\
&=\lambda \mathbf{v}.
\end{align*}
Since $\lambda \neq 0$, this implies that every line through the origin, spanned by any nonzero vector $\mathbf{v}$, is mapped onto itself by the linear transformation $T$.
This contradicts our assumption that $L_1, L_2$ are the only such lines.
Therefore, $\lambda_1 \neq \lambda_2$, and we have proved that $A$ has two distinct nonzero real eigenvalues.
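The key step above was that $\lambda_1 = \lambda_2 = \lambda$ forces $A = \lambda I$ on a basis, so every line through the origin becomes invariant. The degenerate case is easy to check numerically; the helpers `apply` and `parallel` below are ours, written just for this sketch:

```python
def apply(A, v):
    """Multiply a 2x2 matrix (nested lists) by a vector (x, y)."""
    (a, b), (c, d) = A
    x, y = v
    return (a * x + b * y, c * x + d * y)

def parallel(u, v):
    """True when u and v span the same line (2D cross product is zero)."""
    return u[0] * v[1] - u[1] * v[0] == 0

A = [[2, 0], [0, 2]]  # both eigenvalues equal 2: A is the scalar matrix 2I
# Every direction is an eigenvector, so every line maps onto itself,
# contradicting the assumption that exactly two lines are preserved.
for v in [(1, 0), (0, 1), (1, 1), (3, -5)]:
    assert parallel(apply(A, v), v)
print("every line through the origin is invariant under 2I")
```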
(b) $\implies$ (a).
Now we suppose that statement (b) holds.
Namely, we suppose that there are two distinct nonzero real eigenvalues of $A$. Let us call them $\lambda_1, \lambda_2$ and let $\mathbf{v}_1, \mathbf{v}_2$ be eigenvectors corresponding to $\lambda_1, \lambda_2$, respectively.
In general, eigenvectors corresponding to distinct eigenvalues are linearly independent.
Thus, $\mathbf{v}_1, \mathbf{v}_2$ are linearly independent.
Hence the lines $L_1, L_2$ spanned by $\mathbf{v}_1, \mathbf{v}_2$ are distinct.
Each vector on $L_1$ is of the form $\mathbf{v}=a\mathbf{v}_1$ for some $a \in \R$. Hence we have
\begin{align*}
A\mathbf{v}=A(a\mathbf{v}_1)=aA\mathbf{v}_1=a\lambda_1\mathbf{v}_1=\lambda_1 \mathbf{v}.
\end{align*}
It follows that each vector $\mathbf{v}\in L_1$ is mapped into $L_1$, so $T(L_1)\subset L_1$.
(Since $\lambda_1 \neq 0$, the image $T(L_1)$ is one-dimensional, hence $T(L_1)=L_1$.)
Similarly, we have $T(L_2)=L_2$.
Finally, we show that no other lines passing through the origin are mapped onto themselves by $T$.
Assume that we have a line $L=\Span(\mathbf{w})$ such that $T(L)=L$.
Then since we have $A\mathbf{w}\in L=\Span(\mathbf{w})$, there is $\mu\in \R$ such that
\[A\mathbf{w}=\mu\mathbf{w}. \tag{*}\]
Since $\{\mathbf{v}_1, \mathbf{v}_2\}$ is a basis of $\R^2$, we have
\[\mathbf{w}=a\mathbf{v}_1+b\mathbf{v}_2\]
for some $a, b\in \R$.
Then the left hand side of (*) becomes
\begin{align*}
A\mathbf{w}&=A(a\mathbf{v}_1+b\mathbf{v}_2)\\
&=aA\mathbf{v}_1+bA\mathbf{v}_2\\
&=a\lambda_1 \mathbf{v}_1+b\lambda_2 \mathbf{v}_2.
\end{align*}
Thus the relation (*) yields that
\begin{align*}
a\lambda_1 \mathbf{v}_1+b\lambda_2 \mathbf{v}_2&=\mu a\mathbf{v}_1+\mu b\mathbf{v}_2\\
a(\lambda_1-\mu )\mathbf{v}_1+b(\lambda_2-\mu)\mathbf{v}_2&=\mathbf{0}.
\end{align*}
Since $\mathbf{v}_1, \mathbf{v}_2$ are linearly independent, we have
\[a(\lambda_1-\mu )=0 \text{ and } b(\lambda_2-\mu)=0.\]
The equality $a(\lambda_1-\mu )=0$ implies that either $a=0$ or $\lambda_1=\mu$.
If $a=0$, then we have $\mathbf{w}=b \mathbf{v}_2$ with $b \neq 0$ (as $\mathbf{w}$ is nonzero), and hence $L=L_2$.
If $\mu=\lambda_1$, then the second equality $b(\lambda_2-\mu)=0$ implies that $b=0$ since $\lambda_1\neq \lambda_2$.
Then we have $\mathbf{w}=a\mathbf{v}_1$ with $a \neq 0$, and hence $L=L_1$.
In either case, $L$ must be $L_1$ or $L_2$.
This completes the proof.
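The direction (b) $\implies$ (a) can also be verified numerically on a concrete matrix. Take the hypothetical example $A=\begin{pmatrix} 2 & 1 \\ 0 & 3 \end{pmatrix}$, with eigenvalues $2, 3$ and eigenvectors $(1,0)^{\trans}, (1,1)^{\trans}$; the helper `line_is_invariant` below is our own sketch, not part of the post:

```python
def apply(A, v):
    """Multiply a 2x2 matrix (nested lists) by a vector (x, y)."""
    (a, b), (c, d) = A
    x, y = v
    return (a * x + b * y, c * x + d * y)

def line_is_invariant(A, v):
    """Span(v) satisfies T(L) = L iff A v is parallel to v
    (and A v is nonzero, which holds here since det A != 0)."""
    w = apply(A, v)
    return v[0] * w[1] - v[1] * w[0] == 0

A = [[2, 1], [0, 3]]     # eigenvalues 2 and 3
v1, v2 = (1, 0), (1, 1)  # corresponding eigenvectors
assert line_is_invariant(A, v1)
assert line_is_invariant(A, v2)
assert not line_is_invariant(A, (0, 1))  # any other line is moved
print("exactly the two eigenvector lines are preserved")
```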