<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	
	xmlns:georss="http://www.georss.org/georss"
	xmlns:geo="http://www.w3.org/2003/01/geo/wgs84_pos#"
	>

<channel>
	<title>triangularizable matrix &#8211; Problems in Mathematics</title>
	<atom:link href="https://yutsumura.com/tag/triangularizable-matrix/feed/" rel="self" type="application/rss+xml" />
	<link>https://yutsumura.com</link>
	<description></description>
	<lastBuildDate>Sat, 12 Aug 2017 05:01:47 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=5.3.6</generator>

<image>
	<url>https://i2.wp.com/yutsumura.com/wp-content/uploads/2016/12/cropped-question-logo.jpg?fit=32%2C32&#038;ssl=1</url>
	<title>triangularizable matrix &#8211; Problems in Mathematics</title>
	<link>https://yutsumura.com</link>
	<width>32</width>
	<height>32</height>
</image> 
<site xmlns="com-wordpress:feed-additions:1">114989322</site>	<item>
		<title>Nilpotent Matrix and Eigenvalues of the Matrix</title>
		<link>https://yutsumura.com/nilpotent-matrix-and-eigenvalues-of-the-matrix/</link>
				<comments>https://yutsumura.com/nilpotent-matrix-and-eigenvalues-of-the-matrix/#comments</comments>
				<pubDate>Fri, 22 Jul 2016 22:52:47 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[Cayley-Hamilton theorem]]></category>
		<category><![CDATA[eigenvalue]]></category>
		<category><![CDATA[eigenvector]]></category>
		<category><![CDATA[Jordan canonical form]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[nilpotent matrix]]></category>
		<category><![CDATA[triangularizable matrix]]></category>
		<category><![CDATA[upper triangular matrix]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=98</guid>
				<description><![CDATA[<p>An $n\times n$ matrix $A$ is called nilpotent if $A^k=O$, where $O$ is the $n\times n$ zero matrix. Prove the following. (a) The matrix $A$ is nilpotent if and only if all the eigenvalues&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/nilpotent-matrix-and-eigenvalues-of-the-matrix/" target="_blank">Nilpotent Matrix and Eigenvalues of the Matrix</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 11</h2>
<p>An $n\times n$ matrix $A$ is called <strong>nilpotent</strong> if $A^k=O$ for some positive integer $k$, where $O$ is the $n\times n$ zero matrix.<br />
Prove the following.</p>
<p><strong>(a)</strong> The matrix $A$ is nilpotent if and only if all the eigenvalues of $A$ are zero.</p>
<p><strong>(b)</strong> The matrix $A$ is nilpotent if and only if $A^n=O$.<br />
<span id="more-98"></span></p>

<h2>Hint.</h2>
<p>Hint for (a)</p>
<ol>
<li>$(\Rightarrow)$ Consider $A \mathbf{x}=\lambda \mathbf{x}$, where $\lambda$ is an eigenvalue of $A$ and $\mathbf{x}$ is an eigenvector corresponding to $\lambda$.</li>
<li>$(\Leftarrow)$ Consider triangulation or Jordan normal/canonical form of $A$. Or use Cayley-Hamilton theorem.</li>
</ol>
<h2> Proof of (a). </h2>
<p>$(\Rightarrow)$<br />
Suppose the matrix $A$ is nilpotent, namely, there exists $k \in \mathbb{N}$ such that $A^k=O$. Let $\lambda$ be an eigenvalue of $A$ and let $\mathbf{x}$ be an eigenvector corresponding to the eigenvalue $\lambda$.<br />
Then they satisfy the equality $A\mathbf{x}=\lambda \mathbf{x}$. Multiplying this equality by $A$ on the left, we have<br />
\[A^2\mathbf{x}=\lambda A\mathbf{x}=\lambda^2 \mathbf{x} .\]
Repeatedly multiplying by $A$ on the left, we obtain $A^k \mathbf{x}=\lambda^k \mathbf{x}$. (To prove this statement formally, use mathematical induction on $k$.)<br />
Now since $A^k=O$, we get $\lambda^k \mathbf{x}=\mathbf{0}$, the $n$-dimensional zero vector.<br />
Since $\mathbf{x}$ is an eigenvector, and hence nonzero by definition, we obtain $\lambda^k=0$, and thus $\lambda=0$.</p>
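<p>As a quick numerical sanity check (not part of the proof), one can verify this with NumPy on a concrete, hypothetical nilpotent matrix:</p>

```python
import numpy as np

# A strictly upper triangular matrix is nilpotent: here A^3 = O.
A = np.array([[0.0, 1.0, 2.0],
              [0.0, 0.0, 3.0],
              [0.0, 0.0, 0.0]])

# Confirm nilpotency, then confirm all eigenvalues vanish (numerically).
assert np.allclose(np.linalg.matrix_power(A, 3), 0)
eigenvalues = np.linalg.eigvals(A)
assert np.allclose(eigenvalues, 0)
```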
<hr />
<p>$(\Leftarrow)$<br />
Now we assume that all the eigenvalues of the matrix $A$ are zero.<br />
We prove that $A$ is nilpotent.<br />
There exists an invertible $n\times n$ matrix $P$ such that $P^{-1} A P$ is an upper triangular matrix whose diagonal entries are eigenvalues of $A$.<br />
(This is always possible. Study a triangularizable matrix or Jordan normal/canonical form.)</p>
<p>Hence we have<br />
\[P^{-1} A P= \begin{bmatrix}<br />
0 &amp; * &amp; \cdots &amp; * \\<br />
0 &amp; 0 &amp; \cdots &amp; * \\<br />
\vdots &amp; \vdots &amp; \ddots &amp; \vdots \\<br />
0 &amp; 0 &amp; \cdots &amp; 0<br />
\end{bmatrix}.<br />
\]</p>
<p>Since $P^{-1}AP$ is strictly upper triangular, we have $(P^{-1}AP)^n=O$. This implies $P^{-1} A^n P=O$, and thus $A^n=POP^{-1}=O$.<br />
Therefore the matrix $A$ is nilpotent.</p>
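<p>This step is easy to check numerically; a minimal NumPy sketch (both matrices below are hypothetical examples): a strictly upper triangular $T$ satisfies $T^n=O$, and the same then holds for any matrix similar to $T$.</p>

```python
import numpy as np

n = 3
# Strictly upper triangular matrix: zero diagonal, arbitrary entries above it.
T = np.triu(np.arange(1.0, n * n + 1).reshape(n, n), k=1)

# An invertible matrix P (det = 3) and a matrix A similar to T.
P = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])
A = P @ T @ np.linalg.inv(P)

# T^n = O, and (P T P^{-1})^n = P T^n P^{-1} = O.
assert np.allclose(np.linalg.matrix_power(T, n), 0)
assert np.allclose(np.linalg.matrix_power(A, n), 0)
```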
<h4>Another proof of $(\Leftarrow)$ using Cayley-Hamilton theorem</h4>
<p>Suppose that all the eigenvalues of the matrix $A$ are zero.<br />
Then the characteristic polynomial of the matrix $A$ is<br />
\[p(t)=\det(A-tI)=(-1)^n t^n.\]</p>
<p>Hence the Cayley-Hamilton theorem gives<br />
\[p(A)=(-1)^n A^n=O,\]
the zero matrix, so $A^n=O$.</p>
<p>Thus, $A$ is nilpotent.</p>
<p>Note that this method also proves part (b).</p>
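<p>As a numerical illustration of this argument (the matrix is a hypothetical example; <code>numpy.poly</code> returns the coefficients of the characteristic polynomial computed from the eigenvalues):</p>

```python
import numpy as np

# A hypothetical nilpotent matrix: all eigenvalues are zero.
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])

# numpy.poly(A) gives the coefficients of prod(t - lambda_i) = t^2 here.
coeffs = np.poly(A)
assert np.allclose(coeffs, [1.0, 0.0, 0.0])

# Cayley-Hamilton then forces p(A) = A^2 = O, so A is nilpotent.
assert np.allclose(np.linalg.matrix_power(A, 2), 0)
```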
<h2> Proof of (b). </h2>
<p>If $A^n=O$, then by definition the matrix $A$ is nilpotent.<br />
On the other hand, suppose $A$ is nilpotent. Then by part (a), the eigenvalues of $A$ are all zero. Hence, by the same argument as in the proof of part (a) $(\Leftarrow)$, we have $A^n=O$.</p>
<h2>Comment.</h2>
<p>Part (b) implies the following.</p>
<p>Suppose that you are given an $n \times n$ matrix $B$.<br />
If you calculate the power $B^n$ and it is not the zero matrix, then the power $B^k$ is never the zero matrix $O$, no matter how large the number $k$ is.</p>
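<p>Part (b) therefore gives a finite test for nilpotency: compute the single power $B^n$. A hedged NumPy sketch (the helper name and the matrices are hypothetical):</p>

```python
import numpy as np

def is_nilpotent(B: np.ndarray, tol: float = 1e-10) -> bool:
    """By part (b), B is nilpotent iff B^n = O for n = B.shape[0]."""
    n = B.shape[0]
    return bool(np.all(np.abs(np.linalg.matrix_power(B, n)) < tol))

B = np.array([[1.0, 1.0],
              [0.0, 1.0]])   # B^2 != O, so no power of B is ever O
assert not is_nilpotent(B)

N = np.array([[0.0, 5.0],
              [0.0, 0.0]])   # N^2 = O
assert is_nilpotent(N)
```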
<h2> Related Question. </h2>
<div style="padding: 16px; border: none 3px #4169e1; border-radius: 10px; background-color: #f0f8ff; margin-top: 30px; margin-bottom: 30px;">
<strong>Problem</strong>. Prove that every diagonalizable nilpotent matrix is the zero matrix.</div>
<p>See the post &#8628;<br />
<a href="//yutsumura.com/every-diagonalizable-nilpotent-matrix-is-the-zero-matrix/" target="_blank">Every Diagonalizable Nilpotent Matrix is the Zero Matrix</a><br />
for a proof of this problem.</p>
<button class="simplefavorite-button has-count" data-postid="98" data-siteid="1" data-groupid="1" data-favoritecount="49" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">49</span></button><p>The post <a href="https://yutsumura.com/nilpotent-matrix-and-eigenvalues-of-the-matrix/" target="_blank">Nilpotent Matrix and Eigenvalues of the Matrix</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/nilpotent-matrix-and-eigenvalues-of-the-matrix/feed/</wfw:commentRss>
		<slash:comments>3</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">98</post-id>	</item>
		<item>
		<title>Determinant/Trace and Eigenvalues of a Matrix</title>
		<link>https://yutsumura.com/determinant-trace-and-eigenvalues-of-a-matrix/</link>
				<comments>https://yutsumura.com/determinant-trace-and-eigenvalues-of-a-matrix/#comments</comments>
				<pubDate>Fri, 22 Jul 2016 06:15:00 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[characteristic polynomial]]></category>
		<category><![CDATA[determinant]]></category>
		<category><![CDATA[eigenvalue]]></category>
		<category><![CDATA[Jordan canonical form]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[matrix]]></category>
		<category><![CDATA[trace]]></category>
		<category><![CDATA[triangularizable matrix]]></category>
		<category><![CDATA[upper triangular matrix]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=81</guid>
				<description><![CDATA[<p>Let $A$ be an $n\times n$ matrix and let $\lambda_1, \dots, \lambda_n$ be its eigenvalues. Show that (1) $$\det(A)=\prod_{i=1}^n \lambda_i$$ (2) $$\tr(A)=\sum_{i=1}^n \lambda_i$$ Here $\det(A)$ is the determinant of the matrix $A$ and $\tr(A)$&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/determinant-trace-and-eigenvalues-of-a-matrix/" target="_blank">Determinant/Trace and Eigenvalues of a Matrix</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 9</h2>
<p>Let $A$ be an $n\times n$ matrix and let $\lambda_1, \dots, \lambda_n$ be its eigenvalues.<br />
Show that</p>
<p><strong>(1) </strong> $$\det(A)=\prod_{i=1}^n \lambda_i$$</p>
<p><strong>(2)</strong> $$\tr(A)=\sum_{i=1}^n \lambda_i$$</p>
<p>Here $\det(A)$ is the determinant of the matrix $A$ and $\tr(A)$ is the trace of the matrix $A$.</p>
<p>Namely, prove that (1) the determinant of $A$ is the product of its eigenvalues, and (2) the trace of $A$ is the sum of the eigenvalues.<br />
<span id="more-81"></span>We give two different proofs.</p>
<h2> Plan 1. </h2>
<ol>
<li>Use the definition of eigenvalues (the characteristic polynomial).</li>
<li>Compare coefficients.</li>
</ol>
<h2> Plan 2. </h2>
<ol>
<li>Transform $A$ into an upper triangular matrix or into Jordan normal/canonical form.</li>
<li>Use the property of determinants and traces.</li>
</ol>
<h2> Proof. [Method 1]</h2>
<p><strong>(1)</strong> Recall that eigenvalues are roots of the characteristic polynomial $p(\lambda)=\det(A-\lambda I_n)$.<br />
It follows that we have<br />
\begin{align*}<br />
&amp;\det(A-\lambda I_n) \\<br />
&amp;=\begin{vmatrix}<br />
a_{1 1}- \lambda &amp; a_{1 2} &amp; \cdots &amp; a_{1 n} \\<br />
a_{2 1} &amp; a_{2 2} -\lambda &amp; \cdots &amp; a_{2 n} \\<br />
\vdots &amp; \vdots &amp; \ddots &amp; \vdots \\<br />
a_{n 1} &amp; a_{n 2} &amp; \cdots &amp; a_{n n}-\lambda<br />
\end{vmatrix} =\prod_{i=1}^n (\lambda_i-\lambda). \tag{*}<br />
\end{align*}</p>
<p>Letting $\lambda=0$, we see that $\det(A)=\prod_{i=1}^n \lambda_i$, and this completes the proof of part (1).</p>
<hr />
<p><strong>(2)</strong> Compare the coefficients of $\lambda^{n-1}$ on both sides of (*).<br />
The coefficient of $\lambda^{n-1}$ of the determinant on the left side of (*) is</p>
<p>$$(-1)^{n-1}(a_{11}+a_{22}+\cdots + a_{n n})=(-1)^{n-1}\tr(A).$$<br />
The coefficient of $\lambda^{n-1}$ of the determinant on the right side of (*) is<br />
$$(-1)^{n-1}\sum_{i=1}^n \lambda_i.$$<br />
Thus we have $\tr(A)=\sum_{i=1}^n \lambda_i$.</p>
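<p>Both identities are easy to confirm numerically; a short NumPy sketch with a hypothetical $3\times 3$ matrix:</p>

```python
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 4.0]])

eigenvalues = np.linalg.eigvals(A)

# (1) det(A) equals the product of the eigenvalues.
assert np.isclose(np.linalg.det(A), np.prod(eigenvalues))
# (2) tr(A) equals the sum of the eigenvalues.
assert np.isclose(np.trace(A), np.sum(eigenvalues))
```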
<h2> Proof. [Method 2]</h2>
<p>Observe that there exists an $n \times n$ invertible matrix $P$ such that<br />
\[P^{-1} A P= \begin{bmatrix}<br />
\lambda_1 &amp; * &amp; \cdots &amp; * \\<br />
0 &amp; \lambda_2 &amp; \cdots &amp; * \\<br />
\vdots &amp; \vdots &amp; \ddots &amp; \vdots \\<br />
0 &amp; 0 &amp; \cdots &amp; \lambda_n<br />
\end{bmatrix}. \tag{**}<br />
\]
This is an upper triangular matrix whose diagonal entries are the eigenvalues of $A$.<br />
(If this is not familiar to you, then study a &#8220;triangularizable matrix&#8221; or &#8220;Jordan normal/canonical form&#8221;.)</p>
<p><strong>(1)</strong> Since the determinant of an upper triangular matrix is the product of diagonal entries, we have<br />
\begin{align*}<br />
\prod_{i=1}^n \lambda_i &amp; =\det(P^{-1} A P)=\det(P^{-1}) \det(A) \det(P) \\<br />
&amp;= \det(P)^{-1}\det(A) \det(P)=\det(A),<br />
\end{align*}<br />
where we used the multiplicative property of the determinant.</p>
<hr />
<p><strong>(2)</strong> We take the trace of both sides of (**) and get<br />
\begin{align*}<br />
\sum_{i=1}^n \lambda_i =\tr(P^{-1}AP) =\tr(A).<br />
\end{align*}<br />
(Here for the last equality we used the property of the trace that $\tr(AB)=\tr(BA)$ for any $n\times n$ matrices $A$ and $B$.)<br />
Thus we obtained the result $\tr(A)=\sum_{i=1}^n \lambda_i$.</p>
<h2>Comment.</h2>
<p>The proof of (1) in the first method is simple, but that of (2) requires a bit of observation, especially when we find the coefficient of $\lambda^{n-1}$ on the left-hand side.</p>
<p>The proof of (2) in the second method is simpler, although you need to know about the Jordan normal/canonical form.</p>
<p>These two formulas relate the determinant, the trace, and the eigenvalues of a matrix in a very simple way.</p>
<button class="simplefavorite-button has-count" data-postid="81" data-siteid="1" data-groupid="1" data-favoritecount="40" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">40</span></button><p>The post <a href="https://yutsumura.com/determinant-trace-and-eigenvalues-of-a-matrix/" target="_blank">Determinant/Trace and Eigenvalues of a Matrix</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/determinant-trace-and-eigenvalues-of-a-matrix/feed/</wfw:commentRss>
		<slash:comments>7</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">81</post-id>	</item>
	</channel>
</rss>
