<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	
	xmlns:georss="http://www.georss.org/georss"
	xmlns:geo="http://www.w3.org/2003/01/geo/wgs84_pos#"
	>

<channel>
	<title>diagonalizable matrix &#8211; Problems in Mathematics</title>
	<atom:link href="https://yutsumura.com/tag/diagonalizable-matrix/feed/" rel="self" type="application/rss+xml" />
	<link>https://yutsumura.com</link>
	<description></description>
	<lastBuildDate>Mon, 18 Dec 2017 22:45:38 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=5.3.6</generator>

<image>
	<url>https://i2.wp.com/yutsumura.com/wp-content/uploads/2016/12/cropped-question-logo.jpg?fit=32%2C32&#038;ssl=1</url>
	<title>diagonalizable matrix &#8211; Problems in Mathematics</title>
	<link>https://yutsumura.com</link>
	<width>32</width>
	<height>32</height>
</image> 
<site xmlns="com-wordpress:feed-additions:1">114989322</site>	<item>
		<title>A Diagonalizable Matrix which is Not Diagonalized by a Real Nonsingular Matrix</title>
		<link>https://yutsumura.com/a-diagonalizable-matrix-which-is-not-diagonalized-by-a-real-nonsingular-matrix/</link>
				<comments>https://yutsumura.com/a-diagonalizable-matrix-which-is-not-diagonalized-by-a-real-nonsingular-matrix/#respond</comments>
				<pubDate>Fri, 13 Oct 2017 03:25:20 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[characteristic polynomial]]></category>
		<category><![CDATA[complex eigenvalue]]></category>
		<category><![CDATA[diagonal matrix]]></category>
		<category><![CDATA[diagonalizable matrix]]></category>
		<category><![CDATA[diagonalization of a matrix]]></category>
		<category><![CDATA[eigenvalue]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[nonsingular matrix]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=5081</guid>
				<description><![CDATA[<p>Prove that the matrix \[A=\begin{bmatrix} 0 &#038; 1\\ -1&#038; 0 \end{bmatrix}\] is diagonalizable. Prove, however, that $A$ cannot be diagonalized by a real nonsingular matrix. That is, there is no real nonsingular matrix $S$&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/a-diagonalizable-matrix-which-is-not-diagonalized-by-a-real-nonsingular-matrix/" target="_blank">A Diagonalizable Matrix which is Not Diagonalized by a Real Nonsingular Matrix</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 584</h2>
<p>	Prove that the matrix<br />
	\[A=\begin{bmatrix}<br />
  0 &#038; 1\\<br />
  -1&#038; 0<br />
		\end{bmatrix}\]
		is diagonalizable.<br />
		Prove, however, that $A$ cannot be diagonalized by a real nonsingular matrix.<br />
		That is, there is no real nonsingular matrix $S$ such that $S^{-1}AS$ is a diagonal matrix.</p>
<p>&nbsp;<br />
<span id="more-5081"></span><br />

<h2> Proof. </h2>
<p>		We first find the eigenvalues of $A$ by computing its characteristic polynomial $p(t)$.<br />
			We have<br />
			\begin{align*}<br />
		p(t)=\det(A-tI)=\begin{vmatrix}<br />
		  -t &#038; 1\\<br />
		  -1&#038; -t<br />
		\end{vmatrix}=t^2+1.<br />
		\end{align*}<br />
		Solving $p(t)=t^2+1=0$, we obtain two distinct eigenvalues $\pm i$ of $A$.<br />
		Hence the matrix $A$ is diagonalizable.</p>
<hr />
<p>		To prove the second statement, assume, on the contrary, that $A$ is diagonalizable by a real nonsingular matrix $S$.<br />
		Then we have<br />
		\[S^{-1}AS=\begin{bmatrix}<br />
		  i &#038; 0\\<br />
		  0&#038; -i<br />
		\end{bmatrix}\]
		by <a href="//yutsumura.com/how-to-diagonalize-a-matrix-step-by-step-explanation/" rel="noopener" target="_blank">diagonalization</a>.<br />
		As the matrices $A, S$ are real, the left-hand side is a real matrix.<br />
		Taking the complex conjugate of both sides, we obtain<br />
		\[\begin{bmatrix}<br />
		  -i &#038; 0\\<br />
		  0&#038; i<br />
		\end{bmatrix}=\overline{\begin{bmatrix}<br />
		  i &#038; 0\\<br />
		  0&#038; -i<br />
		\end{bmatrix}}=\overline{S^{-1}AS}=S^{-1}AS=\begin{bmatrix}<br />
		  i &#038; 0\\<br />
		  0&#038; -i<br />
		\end{bmatrix}.\]
		This equality is impossible, as it would require $i=-i$.<br />
		Hence the matrix $A$ cannot be diagonalized by a real nonsingular matrix.</p>
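As a quick numerical sanity check (not part of the proof; it assumes NumPy is available), one can confirm that the eigenvalues of $A$ are $\pm i$ and that the eigenvectors are necessarily complex:

```python
import numpy as np

# The matrix A from Problem 584.
A = np.array([[0.0, 1.0],
              [-1.0, 0.0]])

# Its eigenvalues are +i and -i, so A is diagonalizable over the complex numbers.
eigvals, eigvecs = np.linalg.eig(A)
print(sorted(eigvals, key=lambda z: z.imag))  # approximately [-1j, 1j]

# Every eigenvector has a nonzero imaginary part, consistent with the fact
# that no real nonsingular S can diagonalize A.
print(np.iscomplexobj(eigvecs))  # True
```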
<p>The post <a href="https://yutsumura.com/a-diagonalizable-matrix-which-is-not-diagonalized-by-a-real-nonsingular-matrix/" target="_blank">A Diagonalizable Matrix which is Not Diagonalized by a Real Nonsingular Matrix</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/a-diagonalizable-matrix-which-is-not-diagonalized-by-a-real-nonsingular-matrix/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">5081</post-id>	</item>
		<item>
		<title>An Example of a Matrix that Cannot Be a Commutator</title>
		<link>https://yutsumura.com/an-example-of-a-matrix-that-cannot-be-a-commutator/</link>
				<comments>https://yutsumura.com/an-example-of-a-matrix-that-cannot-be-a-commutator/#respond</comments>
				<pubDate>Tue, 19 Sep 2017 03:32:46 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[characteristic polynomial]]></category>
		<category><![CDATA[commutator]]></category>
		<category><![CDATA[commutator subgroup]]></category>
		<category><![CDATA[determinant of a matrix]]></category>
		<category><![CDATA[diagonalizable matrix]]></category>
		<category><![CDATA[eigenvalues]]></category>
		<category><![CDATA[inverse matrix]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[nonsingular matrix]]></category>
		<category><![CDATA[trace of a matrix]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=4915</guid>
				<description><![CDATA[<p>Let $I$ be the $2\times 2$ identity matrix. Then prove that $-I$ cannot be a commutator $[A, B]:=ABA^{-1}B^{-1}$ for any $2\times 2$ matrices $A$ and $B$ with determinant $1$. &#160; Proof. Assume that $[A,&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/an-example-of-a-matrix-that-cannot-be-a-commutator/" target="_blank">An Example of a Matrix that Cannot Be a Commutator</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 565</h2>
<p>	Let $I$ be the $2\times 2$ identity matrix.<br />
	Then prove that $-I$ cannot be a commutator $[A, B]:=ABA^{-1}B^{-1}$ for any $2\times 2$ matrices $A$ and $B$ with determinant $1$.</p>
<p>&nbsp;<br />
<span id="more-4915"></span></p>
<h2> Proof. </h2>
<p>		Assume that $[A, B]=-I$. Then $ABA^{-1}B^{-1}=-I$ implies<br />
		\[ABA^{-1}=-B. \tag{*}\]
		Taking the trace, we have<br />
		\[-\tr(B)=\tr(-B)=\tr(ABA^{-1})=\tr(BA^{-1}A)=\tr(B),\]
		where the third equality uses the cyclic property of the trace. Hence $\tr(B)=0$.<br />
		Thus, the characteristic polynomial of $B$ is<br />
		\[x^2-\tr(B)x+\det(B)=x^2+1.\]
		Hence the eigenvalues of $B$ are $\pm i$.</p>
<hr />
<p>		Note that the matrix $\begin{bmatrix}<br />
		  0 &#038; -1\\<br />
		  1&#038; 0<br />
		\end{bmatrix}$ also has eigenvalues $\pm i$.<br />
		Thus this matrix is similar to the matrix $B$ as both matrices are similar to the diagonal matrix $\begin{bmatrix}<br />
		  i &#038; 0\\<br />
		  0&#038; -i<br />
		\end{bmatrix}$.<br />
		Let $P$ be a nonsingular matrix such that<br />
		\[B&#8217;:=P^{-1}BP=\begin{bmatrix}<br />
		  0 &#038; -1\\<br />
		  1&#038; 0<br />
		\end{bmatrix}.\]
		Let $A&#8217;=P^{-1}AP$. </p>
<hr />
<p>		The relation (*) is equivalent to $AB=-BA$.<br />
		Using this we have<br />
		\begin{align*}<br />
		A&#8217;B&#8217;&#038;=(P^{-1}AP)(P^{-1}BP)=P^{-1}(AB)P\\<br />
		&#038;=P^{-1}(-BA)P=-(P^{-1}BP)(P^{-1}AP)=-B&#8217;A&#8217;.<br />
		\end{align*}</p>
<p>		Let $A&#8217;=\begin{bmatrix}<br />
		  a &#038; b\\<br />
		  c&#038; d<br />
		\end{bmatrix}$.<br />
		Then $A&#8217;B&#8217;=-B&#8217;A&#8217;$ gives<br />
		\begin{align*}<br />
		\begin{bmatrix}<br />
		  a &#038; b\\<br />
		  c&#038; d<br />
		\end{bmatrix}<br />
		\begin{bmatrix}<br />
		  0 &#038; -1\\<br />
		  1&#038; 0<br />
		\end{bmatrix}<br />
		=-\begin{bmatrix}<br />
		  0 &#038; -1\\<br />
		  1&#038; 0<br />
		\end{bmatrix}\begin{bmatrix}<br />
		  a &#038; b\\<br />
		  c&#038; d<br />
		\end{bmatrix}\\[6pt]
		\Leftrightarrow<br />
		\begin{bmatrix}<br />
		  b &#038; -a\\<br />
		  d&#038; -c<br />
		\end{bmatrix}=\begin{bmatrix}<br />
		  c &#038; d\\<br />
		  -a&#038; -b<br />
		\end{bmatrix}.<br />
		\end{align*}<br />
		Hence we obtain $d=-a$ and $c=b$.</p>
<hr />
<p>		Then<br />
		\begin{align*}<br />
		1=\det(A)=\det(PA&#8217;P^{-1})=\det(A&#8217;)=\begin{vmatrix}<br />
		  a &#038; b\\<br />
		  b&#038; -a<br />
		\end{vmatrix}=-a^2-b^2,<br />
		\end{align*}<br />
		which is impossible.<br />
		Therefore, the matrix $-I$ cannot be written as a commutator $[A, B]$ for any $2\times 2$ matrices $A, B$ with determinant $1$.</p>
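As an illustrative numerical check (it assumes NumPy; the values of $a$ and $b$ are arbitrary), the form of $A'$ forced above indeed anticommutes with $B'$, and its determinant is $-a^2-b^2$, which can never equal $1$:

```python
import numpy as np

# Arbitrary sample values of a and b.
a, b = 2.0, -3.0

Ap = np.array([[a, b],
               [b, -a]])      # the form forced by A'B' = -B'A'
Bp = np.array([[0.0, -1.0],
               [1.0, 0.0]])

# The anticommutation relation A'B' = -B'A' holds for this form of A'.
assert np.allclose(Ap @ Bp, -Bp @ Ap)

# det(A') = -a^2 - b^2 <= 0, so it can never be 1.
print(np.linalg.det(Ap))  # approximately -13.0
```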
<p>The post <a href="https://yutsumura.com/an-example-of-a-matrix-that-cannot-be-a-commutator/" target="_blank">An Example of a Matrix that Cannot Be a Commutator</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/an-example-of-a-matrix-that-cannot-be-a-commutator/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">4915</post-id>	</item>
		<item>
		<title>Every Diagonalizable Nilpotent Matrix is the Zero Matrix</title>
		<link>https://yutsumura.com/every-diagonalizable-nilpotent-matrix-is-the-zero-matrix/</link>
				<comments>https://yutsumura.com/every-diagonalizable-nilpotent-matrix-is-the-zero-matrix/#comments</comments>
				<pubDate>Mon, 10 Jul 2017 07:16:07 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[diagonal matrix]]></category>
		<category><![CDATA[diagonalizable matrix]]></category>
		<category><![CDATA[eigenvalue]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[nilpotent]]></category>
		<category><![CDATA[nilpotent matrix]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=3547</guid>
				<description><![CDATA[<p>Prove that if $A$ is a diagonalizable nilpotent matrix, then $A$ is the zero matrix $O$. &#160; Definition (Nilpotent Matrix) A square matrix $A$ is called nilpotent if there exists a positive integer $k$&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/every-diagonalizable-nilpotent-matrix-is-the-zero-matrix/" target="_blank">Every Diagonalizable Nilpotent Matrix is the Zero Matrix</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 504</h2>
<p> Prove that if $A$ is a diagonalizable nilpotent matrix, then $A$ is the zero matrix $O$.</p>
<p>&nbsp;<br />
<span id="more-3547"></span><br />

<h3>Definition (Nilpotent Matrix)</h3>
<p>A square matrix $A$ is called <strong>nilpotent</strong> if there exists a positive integer $k$ such that $A^k=O$.</p>
<h2> Proof. </h2>
<h3>Main Part</h3>
<p>		Since $A$ is diagonalizable, there is a nonsingular matrix $S$ such that $S^{-1}AS$ is a diagonal matrix whose diagonal entries are eigenvalues of $A$.</p>
<p>		As we show below, the only eigenvalue of any nilpotent matrix is $0$.<br />
		Thus, $S^{-1}AS$ is the zero matrix.<br />
	Hence $A=SOS^{-1}=O$.</p>
<h3>The only eigenvalue of each nilpotent matrix is $0$</h3>
<p>	It remains to prove the fact we used above: the only eigenvalue of the nilpotent matrix $A$ is $0$.</p>
<p>	Let $\lambda$ be an eigenvalue of $A$ and let $\mathbf{v}$ be an eigenvector corresponding to $\lambda$.<br />
	That is,<br />
	\[A\mathbf{v}=\lambda \mathbf{v}. \tag{*}\]
<p>	Since $A$ is nilpotent, there exists a positive integer $k$ such that $A^k=O$.</p>
<p>	Then we use the relation (*) inductively and obtain<br />
	\begin{align*}<br />
	A^k\mathbf{v}&#038;=A^{k-1}A\mathbf{v}\\<br />
	&#038;=\lambda A^{k-1}\mathbf{v} &#038;&#038; \text{by (*)}\\<br />
	&#038;=\lambda A^{k-2}A\mathbf{v}\\<br />
	&#038;=\lambda^2 A^{k-2}\mathbf{v} &#038;&#038; \text{by (*)}\\<br />
	&#038;=\dots =\lambda^k \mathbf{v}.<br />
	\end{align*}</p>
<p>	Hence we have<br />
	\[\mathbf{0}=O\mathbf{v}=A^k\mathbf{v}=\lambda^k \mathbf{v}.\]
<p>	Note that the eigenvector $\mathbf{v}$ is a nonzero vector by definition.<br />
	Thus, we must have $\lambda^k=0$, hence $\lambda=0$.<br />
	This proves that the only eigenvalue of the nilpotent matrix $A$ is $0$, and this completes the proof.</p>
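A small numerical illustration (it assumes NumPy; the matrix below is a standard textbook example, not from the post): a nonzero nilpotent matrix has only the eigenvalue $0$, and it cannot be diagonalizable, consistent with the result above.

```python
import numpy as np

# A standard nonzero nilpotent matrix: A^2 = O.
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
assert np.allclose(np.linalg.matrix_power(A, 2), 0)

# Both eigenvalues are 0, as the argument above shows.
print(np.linalg.eigvals(A))  # [0. 0.]

# The eigenspace for 0 is only one-dimensional (A has rank 1), so this A is
# not diagonalizable -- as it must not be, since A != O.
rank = np.linalg.matrix_rank(A)
print(rank)  # 1
```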
<h3>Another Proof of the Fact</h3>
<p>	Even though the fact proved above is true regardless of the diagonalizability of $A$, we can make use of the fact that $A$ is diagonalizable to prove it as follows.</p>
<p>	Let $\lambda_1, \dots, \lambda_n$ be the eigenvalues of the $n\times n$ nilpotent matrix $A$.<br />
	Then we have<br />
	\[S^{-1}AS=D,\]
	where<br />
	\[D:=\begin{bmatrix}<br />
	  \lambda_1 &#038; 0 &#038; \dots &#038;   0 \\<br />
	  0 &#038;\lambda_2 &#038;  \dots &#038; 0  \\<br />
	  \vdots  &#038; \vdots &#038; \ddots &#038; \vdots \\<br />
	  0 &#038; 0 &#038; \dots &#038; \lambda_n<br />
	\end{bmatrix}.\]
<p>	Then we have<br />
\(\require{cancel}\)<br />
	\begin{align*}<br />
	D^k&#038;=(S^{-1}AS)^k\\<br />
	&#038;=(S^{-1}A\cancel{S})(\cancel{S}^{-1}A\cancel{S})\cdots (\cancel{S}^{-1}AS)\\<br />
	&#038;=S^{-1}A^kS\\<br />
	&#038;=S^{-1}OS=O.<br />
	\end{align*}</p>
<p>	Since<br />
	\[D^k=\begin{bmatrix}<br />
	  \lambda_1^k &#038; 0 &#038; \dots &#038;   0 \\<br />
	  0 &#038;\lambda_2^k &#038;  \dots &#038; 0  \\<br />
	  \vdots  &#038; \vdots &#038; \ddots &#038; \vdots \\<br />
	  0 &#038; 0 &#038; \dots &#038; \lambda_n^k<br />
	\end{bmatrix},\]
	it follows that $\lambda_i^k=0$, and hence $\lambda_i=0$ for $i=1, \dots, n$.</p>
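The key identity used here, $(S^{-1}AS)^k = S^{-1}A^kS$, is easy to confirm numerically (a sketch assuming NumPy; the matrices below are random and chosen only for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
S = rng.standard_normal((3, 3))  # a generic random matrix is invertible
k = 4

# The inner S S^{-1} factors cancel, exactly as in the telescoping product above.
lhs = np.linalg.matrix_power(np.linalg.inv(S) @ A @ S, k)
rhs = np.linalg.inv(S) @ np.linalg.matrix_power(A, k) @ S
assert np.allclose(lhs, rhs)
print("similarity power identity verified")
```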
<h2> Related Question. </h2>
<p>The converse of the above fact is also true: if the only eigenvalue of $A$ is $0$, then $A$ is a nilpotent matrix.</p>
<p>See the post &#8628;<br />
<a href="//yutsumura.com/nilpotent-matrix-and-eigenvalues-of-the-matrix/" target="_blank">Nilpotent Matrix and Eigenvalues of the Matrix</a><br />
for a proof of this fact.</p>
<p>The post <a href="https://yutsumura.com/every-diagonalizable-nilpotent-matrix-is-the-zero-matrix/" target="_blank">Every Diagonalizable Nilpotent Matrix is the Zero Matrix</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/every-diagonalizable-nilpotent-matrix-is-the-zero-matrix/feed/</wfw:commentRss>
		<slash:comments>1</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">3547</post-id>	</item>
		<item>
		<title>Diagonalize the 3 by 3 Matrix Whose Entries are All One</title>
		<link>https://yutsumura.com/diagonalize-the-3-by-3-matrix-whose-entries-are-all-one/</link>
				<comments>https://yutsumura.com/diagonalize-the-3-by-3-matrix-whose-entries-are-all-one/#comments</comments>
				<pubDate>Tue, 27 Jun 2017 04:17:54 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[diagonalizable matrix]]></category>
		<category><![CDATA[diagonalization]]></category>
		<category><![CDATA[eigenvalue]]></category>
		<category><![CDATA[eigenvector]]></category>
		<category><![CDATA[exam]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[Ohio State]]></category>
		<category><![CDATA[Ohio State.LA]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=3311</guid>
				<description><![CDATA[<p>Diagonalize the matrix \[A=\begin{bmatrix} 1 &#038; 1 &#038; 1 \\ 1 &#038;1 &#038;1 \\ 1 &#038; 1 &#038; 1 \end{bmatrix}.\] Namely, find a nonsingular matrix $S$ and a diagonal matrix $D$ such that $S^{-1}AS=D$.&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/diagonalize-the-3-by-3-matrix-whose-entries-are-all-one/" target="_blank">Diagonalize the 3 by 3 Matrix Whose Entries are All One</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 483</h2>
<p>	Diagonalize the matrix<br />
	\[A=\begin{bmatrix}<br />
	  1 &#038; 1 &#038; 1 \\<br />
	   1 &#038;1 &#038;1 \\<br />
	   1 &#038; 1 &#038; 1<br />
	\end{bmatrix}.\]
	Namely, find a nonsingular matrix $S$ and a diagonal matrix $D$ such that $S^{-1}AS=D$.</p>
<p>(<em>The Ohio State University, Linear Algebra Final Exam Problem</em>)</p>
<p>&nbsp;<br />
<span id="more-3311"></span><br />

<h2>Hint.</h2>
<p>To diagonalize the matrix $A$, we need to find the eigenvalues of $A$ and bases of the corresponding eigenspaces.</p>
<p>For a procedure of the diagonalization, see the post &#8220;<a href="//yutsumura.com/how-to-diagonalize-a-matrix-step-by-step-explanation/" target="_blank">How to Diagonalize a Matrix. Step by Step Explanation.</a>&#8220;.</p>
<p>Below, we will find the eigenvalues and eigenvectors without using the characteristic polynomial, although you may use it if you prefer.</p>
<h2>Solution.</h2>
<p>		We use an indirect way to find eigenvalues and eigenvectors.<br />
		(We will not use the characteristic polynomial.)</p>
<p>		Applying the elementary row operations, we have<br />
		\begin{align*}<br />
	A=\begin{bmatrix}<br />
	  1 &#038; 1 &#038; 1 \\<br />
	   1 &#038;1 &#038;1 \\<br />
	   1 &#038; 1 &#038; 1<br />
	\end{bmatrix}\xrightarrow{\substack{R_2-R_1\\ R_3-R_1}}<br />
	\begin{bmatrix}<br />
	  1 &#038; 1 &#038; 1 \\<br />
	   0 &#038;0 &#038;0 \\<br />
	   0 &#038; 0 &#038; 0<br />
	\end{bmatrix}.<br />
	\end{align*}<br />
	Hence the solutions $\mathbf{x}$ of $A\mathbf{x}=\mathbf{0}$ satisfy<br />
	\[x_1=-x_2-x_3.\]
	Thus, every vector in the null space is of the form<br />
	\[\mathbf{x}=\begin{bmatrix}<br />
	  -x_2-x_3 \\<br />
	   x_2 \\<br />
	    x_3<br />
	  \end{bmatrix}=x_2\begin{bmatrix}<br />
	  -1 \\<br />
	   1 \\<br />
	    0<br />
	  \end{bmatrix}+x_3\begin{bmatrix}<br />
	  -1 \\<br />
	   0 \\<br />
	    1<br />
	  \end{bmatrix}\]
	  for some scalars $x_2, x_3$.</p>
<p>	  It follows that<br />
	  \[\left\{\, \begin{bmatrix}<br />
	  -1 \\<br />
	   1 \\<br />
	    0<br />
	  \end{bmatrix}, \begin{bmatrix}<br />
	  -1 \\<br />
	   0 \\<br />
	    1<br />
	  \end{bmatrix}  \,\right\}\]
	  is a basis of the null space $\calN(A)$.</p>
<p>	  Hence $0$ is an eigenvalue and the geometric multiplicity corresponding to $0$, which is the nullity of $A$, is $2$.<br />
	  It follows that the algebraic multiplicity of the eigenvalue $0$ is either $2$ or $3$.<br />
	  We will see shortly that it is $2$.</p>
<hr />
<p>	  Note that by inspection we have<br />
	  \[A\begin{bmatrix}<br />
	  1 \\<br />
	   1 \\<br />
	    1<br />
	  \end{bmatrix}=3\begin{bmatrix}<br />
	  1 \\<br />
	   1 \\<br />
	    1<br />
	  \end{bmatrix}.\]
	  This yields that $3$ is an eigenvalue of $A$ and $\begin{bmatrix}<br />
	  1 \\<br />
	   1 \\<br />
	    1<br />
	  \end{bmatrix}$ is a corresponding eigenvector.</p>
<p>	  The sum of algebraic multiplicities of all eigenvalues of $A$ is $3$.<br />
	  Hence the algebraic multiplicity of $0$ must be $2$, and that of $3$ must be $1$.<br />
	  In particular, the vector $\begin{bmatrix}<br />
	  1 \\<br />
	   1 \\<br />
	    1<br />
	  \end{bmatrix}$ forms a basis of the eigenspace $E_3$.</p>
<hr />
<p>	  In summary so far, the eigenvalues are $0$ and $3$, and basis vectors of the corresponding eigenspaces are<br />
	  \[\begin{bmatrix}<br />
	  -1 \\<br />
	   1 \\<br />
	    0<br />
	  \end{bmatrix}, \begin{bmatrix}<br />
	  -1 \\<br />
	   0 \\<br />
	    1<br />
	  \end{bmatrix} \text{ and } \begin{bmatrix}<br />
	  1 \\<br />
	   1 \\<br />
	    1<br />
	  \end{bmatrix},\]
	  respectively.</p>
<p>	  Thus, we put<br />
	  \[S=\begin{bmatrix}<br />
	  -1 &#038; -1 &#038; 1 \\<br />
	   1 &#038;0 &#038;1 \\<br />
	   0 &#038; 1 &#038; 1<br />
	\end{bmatrix}\]
	and obtain<br />
	\[S^{-1}AS=D,\]
	where<br />
	\[D=\begin{bmatrix}<br />
	  0 &#038; 0 &#038; 0 \\<br />
	   0 &#038;0 &#038;0 \\<br />
	   0 &#038; 0 &#038; 3<br />
	\end{bmatrix}.\]
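One can verify this diagonalization numerically (a quick check assuming NumPy, not part of the exam solution):

```python
import numpy as np

A = np.ones((3, 3))   # the 3x3 matrix whose entries are all one
S = np.array([[-1.0, -1.0, 1.0],
              [ 1.0,  0.0, 1.0],
              [ 0.0,  1.0, 1.0]])
D = np.diag([0.0, 0.0, 3.0])

# The columns of S are the eigenvectors found above, so S^{-1} A S = D.
assert np.allclose(np.linalg.inv(S) @ A @ S, D)
print("S^{-1}AS = D verified")
```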
<h2>Final Exam Problems and Solution. (Linear Algebra Math 2568 at the Ohio State University) </h2>
<p>This problem is one of the final exam problems of the Linear Algebra course (Math 2568) at the Ohio State University.</p>
<p>The other problems can be found at the links below.</p>
<ol>
<li><a href="//yutsumura.com/find-all-the-eigenvalues-of-4-by-4-matrix/" target="_blank">Find All the Eigenvalues of 4 by 4 Matrix</a></li>
<li><a href="//yutsumura.com/find-a-basis-of-the-eigenspace-corresponding-to-a-given-eigenvalue/" target="_blank">Find a Basis of the Eigenspace Corresponding to a Given Eigenvalue</a></li>
<li><a href="//yutsumura.com/diagonalize-a-2-by-2-matrix-if-diagonalizable/" target="_blank">Diagonalize a 2 by 2 Matrix if Diagonalizable</a></li>
<li><a href="//yutsumura.com/find-an-orthonormal-basis-of-the-range-of-a-linear-transformation/" target="_blank">Find an Orthonormal Basis of the Range of a Linear Transformation</a></li>
<li><a href="//yutsumura.com/the-product-of-two-nonsingular-matrices-is-nonsingular/" target="_blank">The Product of Two Nonsingular Matrices is Nonsingular</a></li>
<li><a href="//yutsumura.com/determine-wether-given-subsets-in-r4-are-subspaces-or-not/" target="_blank">Determine Whether Given Subsets in $\R^4$ are Subspaces or Not</a></li>
<li><a href="//yutsumura.com/find-a-basis-of-the-vector-space-of-polynomials-of-degree-2-or-less-among-given-polynomials/" target="_blank">Find a Basis of the Vector Space of Polynomials of Degree 2 or Less Among Given Polynomials</a></li>
<li><a href="//yutsumura.com/find-values-of-a-b-c-such-that-the-given-matrix-is-diagonalizable/" target="_blank">Find Values of $a, b, c$ such that the Given Matrix is Diagonalizable</a></li>
<li><a href="//yutsumura.com/idempotent-matrix-and-its-eigenvalues/" target="_blank">Idempotent Matrix and its Eigenvalues</a></li>
<li>Diagonalize the 3 by 3 Matrix Whose Entries are All One (This page)</li>
<li><a href="//yutsumura.com/given-the-characteristic-polynomial-find-the-rank-of-the-matrix/" target="_blank">Given the Characteristic Polynomial, Find the Rank of the Matrix</a></li>
<li><a href="//yutsumura.com/compute-a10mathbfv-using-eigenvalues-and-eigenvectors-of-the-matrix-a/" target="_blank">Compute $A^{10}\mathbf{v}$  Using Eigenvalues and Eigenvectors of the Matrix $A$</a></li>
<li><a href="//yutsumura.com/determine-whether-there-exists-a-nonsingular-matrix-satisfying-a4aba22a3/" target="_blank">Determine Whether There Exists a Nonsingular Matrix Satisfying $A^4=ABA^2+2A^3$</a></li>
</ol>
<p>The post <a href="https://yutsumura.com/diagonalize-the-3-by-3-matrix-whose-entries-are-all-one/" target="_blank">Diagonalize the 3 by 3 Matrix Whose Entries are All One</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/diagonalize-the-3-by-3-matrix-whose-entries-are-all-one/feed/</wfw:commentRss>
		<slash:comments>6</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">3311</post-id>	</item>
		<item>
		<title>Find Values of $a, b, c$ such that the Given Matrix is Diagonalizable</title>
		<link>https://yutsumura.com/find-values-of-a-b-c-such-that-the-given-matrix-is-diagonalizable/</link>
				<comments>https://yutsumura.com/find-values-of-a-b-c-such-that-the-given-matrix-is-diagonalizable/#comments</comments>
				<pubDate>Tue, 27 Jun 2017 04:03:21 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[diagonalizable matrix]]></category>
		<category><![CDATA[eigenvalue]]></category>
		<category><![CDATA[exam]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[Ohio State]]></category>
		<category><![CDATA[Ohio State.LA]]></category>
		<category><![CDATA[upper triangular matrix]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=3309</guid>
				<description><![CDATA[<p>For which values of constants $a, b$ and $c$ is the matrix \[A=\begin{bmatrix} 7 &#038; a &#038; b \\ 0 &#038;2 &#038;c \\ 0 &#038; 0 &#038; 3 \end{bmatrix}\] diagonalizable? (The Ohio State University,&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/find-values-of-a-b-c-such-that-the-given-matrix-is-diagonalizable/" target="_blank">Find Values of $a, b, c$ such that the Given Matrix is Diagonalizable</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 482</h2>
<p>		For which values of constants $a, b$ and $c$ is the matrix<br />
		\[A=\begin{bmatrix}<br />
		  7 &#038; a &#038; b \\<br />
		   0 &#038;2 &#038;c \\<br />
		   0 &#038; 0 &#038; 3<br />
		\end{bmatrix}\]
		diagonalizable?</p>
<p>(<em>The Ohio State University, Linear Algebra Final Exam Problem</em>)</p>
<p>&nbsp;<br />
<span id="more-3309"></span><br />

<h2>Solution.</h2>
<p>			Note that the matrix $A$ is an upper triangular matrix.<br />
			Hence the eigenvalues of $A$ are the diagonal entries $7, 2, 3$.</p>
<p>			So the $3\times 3$ matrix $A$ has three distinct eigenvalues.<br />
			This implies that $A$ is diagonalizable.</p>
<p>			Hence, regardless of the values of $a, b, c$, the matrix $A$ is always diagonalizable.<br />
			Thus, $a, b, c$ can take arbitrary values.</p>
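A numerical spot check (it assumes NumPy; the chosen $a, b, c$ are arbitrary) confirms that the eigenvalues are always $7, 2, 3$ and that the matrix of eigenvectors diagonalizes $A$:

```python
import numpy as np

# Arbitrary values of a, b, c; the conclusion does not depend on them.
a, b, c = 10.0, -4.0, 0.5
A = np.array([[7.0, a, b],
              [0.0, 2.0, c],
              [0.0, 0.0, 3.0]])

w, V = np.linalg.eig(A)
assert np.allclose(np.sort(w), [2.0, 3.0, 7.0])

# Three distinct eigenvalues give linearly independent eigenvectors,
# so V is invertible and V^{-1} A V is diagonal.
assert np.allclose(np.linalg.inv(V) @ A @ V, np.diag(w))
print("diagonalizable for these a, b, c")
```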
<h2>Final Exam Problems and Solution. (Linear Algebra Math 2568 at the Ohio State University) </h2>
<p>This problem is one of the final exam problems of the Linear Algebra course (Math 2568) at the Ohio State University.</p>
<p>The other problems can be found at the links below.</p>
<ol>
<li><a href="//yutsumura.com/find-all-the-eigenvalues-of-4-by-4-matrix/" target="_blank">Find All the Eigenvalues of 4 by 4 Matrix</a></li>
<li><a href="//yutsumura.com/find-a-basis-of-the-eigenspace-corresponding-to-a-given-eigenvalue/" target="_blank">Find a Basis of the Eigenspace Corresponding to a Given Eigenvalue</a></li>
<li><a href="//yutsumura.com/diagonalize-a-2-by-2-matrix-if-diagonalizable/" target="_blank">Diagonalize a 2 by 2 Matrix if Diagonalizable</a></li>
<li><a href="//yutsumura.com/find-an-orthonormal-basis-of-the-range-of-a-linear-transformation/" target="_blank">Find an Orthonormal Basis of the Range of a Linear Transformation</a></li>
<li><a href="//yutsumura.com/the-product-of-two-nonsingular-matrices-is-nonsingular/" target="_blank">The Product of Two Nonsingular Matrices is Nonsingular</a></li>
<li><a href="//yutsumura.com/determine-wether-given-subsets-in-r4-are-subspaces-or-not/" target="_blank">Determine Whether Given Subsets in $\R^4$ are Subspaces or Not</a></li>
<li><a href="//yutsumura.com/find-a-basis-of-the-vector-space-of-polynomials-of-degree-2-or-less-among-given-polynomials/" target="_blank">Find a Basis of the Vector Space of Polynomials of Degree 2 or Less Among Given Polynomials</a></li>
<li>Find Values of $a, b, c$ such that the Given Matrix is Diagonalizable (This page)</li>
<li><a href="//yutsumura.com/idempotent-matrix-and-its-eigenvalues/" target="_blank">Idempotent Matrix and its Eigenvalues</a></li>
<li><a href="//yutsumura.com/diagonalize-the-3-by-3-matrix-whose-entries-are-all-one/" target="_blank">Diagonalize the 3 by 3 Matrix Whose Entries are All One</a></li>
<li><a href="//yutsumura.com/given-the-characteristic-polynomial-find-the-rank-of-the-matrix/" target="_blank">Given the Characteristic Polynomial, Find the Rank of the Matrix</a></li>
<li><a href="//yutsumura.com/compute-a10mathbfv-using-eigenvalues-and-eigenvectors-of-the-matrix-a/" target="_blank">Compute $A^{10}\mathbf{v}$  Using Eigenvalues and Eigenvectors of the Matrix $A$</a></li>
<li><a href="//yutsumura.com/determine-whether-there-exists-a-nonsingular-matrix-satisfying-a4aba22a3/" target="_blank">Determine Whether There Exists a Nonsingular Matrix Satisfying $A^4=ABA^2+2A^3$</a></li>
</ol>
<p>The post <a href="https://yutsumura.com/find-values-of-a-b-c-such-that-the-given-matrix-is-diagonalizable/" target="_blank">Find Values of $a, b, c$ such that the Given Matrix is Diagonalizable</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/find-values-of-a-b-c-such-that-the-given-matrix-is-diagonalizable/feed/</wfw:commentRss>
		<slash:comments>8</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">3309</post-id>	</item>
		<item>
		<title>A Matrix Equation of a Symmetric Matrix and the Limit of its Solution</title>
		<link>https://yutsumura.com/a-matrix-equation-of-a-symmetric-matrix-and-the-limit-of-its-solution/</link>
				<comments>https://yutsumura.com/a-matrix-equation-of-a-symmetric-matrix-and-the-limit-of-its-solution/#respond</comments>
				<pubDate>Thu, 15 Jun 2017 15:57:43 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[Berkeley]]></category>
		<category><![CDATA[Berkeley.LA]]></category>
		<category><![CDATA[diagonalizable matrix]]></category>
		<category><![CDATA[eigenvector]]></category>
		<category><![CDATA[exam]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[orthogonal matrix]]></category>
		<category><![CDATA[orthonormal basis]]></category>
		<category><![CDATA[qualifying exam]]></category>
		<category><![CDATA[symmetric matrix]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=3141</guid>
				<description><![CDATA[<p>Let $A$ be a real symmetric $n\times n$ matrix with $0$ as a simple eigenvalue (that is, the algebraic multiplicity of the eigenvalue $0$ is $1$), and let us fix a vector $\mathbf{v}\in \R^n$.&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/a-matrix-equation-of-a-symmetric-matrix-and-the-limit-of-its-solution/" target="_blank">A Matrix Equation of a Symmetric Matrix and the Limit of its Solution</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 457</h2>
<p>	Let $A$ be a real symmetric $n\times n$ matrix with $0$ as a simple eigenvalue (that is, the algebraic multiplicity of the eigenvalue $0$ is $1$), and let us fix a vector $\mathbf{v}\in \R^n$.</p>
<p><strong>(a)</strong> Prove that for sufficiently small positive real $\epsilon$, the equation<br />
	\[A\mathbf{x}+\epsilon\mathbf{x}=\mathbf{v}\]
	has a unique solution $\mathbf{x}=\mathbf{x}(\epsilon) \in \R^n$.</p>
<p><strong>(b)</strong> Evaluate<br />
	\[\lim_{\epsilon \to 0^+} \epsilon \mathbf{x}(\epsilon)\]
	 in terms of $\mathbf{v}$, the eigenvectors of $A$, and the inner product $\langle\, ,\,\rangle$ on $\R^n$.</p>
<p>&nbsp;<br />
(<em>University of California, Berkeley, Linear Algebra Qualifying Exam</em>)</p>
<p><span id="more-3141"></span><br />

<h2> Proof. </h2>
<h3>(a) Prove that $A\mathbf{x}+\epsilon\mathbf{x}=\mathbf{v}$ has a unique solution $\mathbf{x}=\mathbf{x}(\epsilon) \in \R^n$.</h3>
<p>Recall that the <a href="//yutsumura.com/eigenvalues-of-a-hermitian-matrix-are-real-numbers/" target="_blank">eigenvalues of a real symmetric matrix are all real numbers</a> and that such a matrix is diagonalizable by an orthogonal matrix.</p>
<p>		Note that the equation $A\mathbf{x}+\epsilon\mathbf{x}=\mathbf{v}$ can be written as<br />
		\[(A+\epsilon I)\mathbf{x}=\mathbf{v}, \tag{*}\]
		where $I$ is the $n\times n$ identity matrix. Thus to show that the equation (*) has a unique solution, it suffices to show that the matrix $A+\epsilon I$ is invertible.</p>
<p>		Since $A$ is diagonalizable, there exists an invertible matrix $S$ such that<br />
		\[S^{-1}AS=\begin{bmatrix}<br />
		 \lambda_1  &#038; 0 &#038; \cdots &#038; 0 \\<br />
		0 &#038; \lambda_2 &#038; \cdots &#038; 0\\<br />
		\vdots  &#038; \vdots  &#038; \ddots &#038; \vdots  \\<br />
		0 &#038; 0 &#038; \cdots &#038; \lambda_n<br />
		\end{bmatrix},\]
		where $\lambda_i$ are eigenvalues of $A$.<br />
		Since the algebraic multiplicity of $0$ is $1$, without loss of generality, we may assume that $\lambda_1=0$ and $\lambda_i, i > 1$ are nonzero.</p>
<p>		Then we have<br />
		\begin{align*}<br />
	S^{-1}(A+\epsilon I)S&#038;=S^{-1}AS+\epsilon I=\begin{bmatrix}<br />
			 \epsilon  &#038; 0 &#038; \cdots &#038; 0 \\<br />
			0 &#038; \epsilon+\lambda_2 &#038; \cdots &#038; 0\\<br />
			\vdots  &#038; \vdots  &#038; \ddots &#038; \vdots  \\<br />
			0 &#038; 0 &#038; \cdots &#038; \epsilon+\lambda_n<br />
			\end{bmatrix}.<br />
	\end{align*}</p>
<hr />
<p>			If $\epsilon > 0$ is smaller than all of the absolute values $|\lambda_i|, i > 1$, then none of the diagonal entries $\epsilon+ \lambda_i$ are zero.</p>
<p>			Hence we have<br />
			\begin{align*}<br />
	\det(A+\epsilon I)&#038;=\det(S)^{-1}\det(A+\epsilon I)\det(S)\\<br />
	&#038;=\det\left(\,  S^{-1}(A+\epsilon I) S \,\right)\\<br />
	&#038;=\epsilon(\epsilon+\lambda_2)\cdots (\epsilon+\lambda_n)\neq 0.<br />
	\end{align*}<br />
	Since $\det(A+\epsilon I)\neq 0$, the matrix $A+\epsilon I$ is invertible, hence the equation (*) has a unique solution<br />
	\[\mathbf{x}(\epsilon)=(A+\epsilon I)^{-1}\mathbf{v}.\]
<h4>Remark</h4>
<p> This result holds in general for any square matrix.<br />
	Instead of the diagonalization, we can use the triangularization of a matrix.</p>
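<p>As a quick numerical sanity check (an illustration, not part of the original solution), the following NumPy sketch builds a symmetric matrix with $0$ as a simple eigenvalue — the specific matrix is an assumption — and confirms that $A+\epsilon I$ is invertible for small $\epsilon>0$, so $\mathbf{x}(\epsilon)$ exists and is unique.</p>

```python
import numpy as np

# Illustrative 3x3 real symmetric matrix with 0 as a simple eigenvalue:
# A = Q diag(0, 1, 2) Q^T for a random orthogonal Q (an assumed example).
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
A = Q @ np.diag([0.0, 1.0, 2.0]) @ Q.T

v = np.ones(3)
for eps in [1e-1, 1e-3, 1e-6]:
    # det(A + eps I) = eps (eps + 1)(eps + 2) != 0, so x(eps) is unique.
    M = A + eps * np.eye(3)
    x = np.linalg.solve(M, v)   # the unique solution x(eps)
    assert np.allclose(M @ x, v)
```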
<h3>(b) Evaluate $\lim_{\epsilon \to 0^+} \epsilon \mathbf{x}(\epsilon)$</h3>
<p> As noted earlier, a real symmetric matrix can be diagonalized by an orthogonal matrix.<br />
			This means that there is an eigenvector $\mathbf{v}_i$ corresponding to the eigenvalue $\lambda_i$ for each $i$ such that the eigenvectors $\mathbf{v}_i$ form an orthonormal basis of $\R^n$.<br />
			That is,<br />
			\begin{align*}<br />
	A\mathbf{v}_i=\lambda_i \mathbf{v}_i \\<br />
	\langle \mathbf{v}_i,\mathbf{v}_j \rangle=\delta_{i,j},<br />
	\end{align*}<br />
	where $\delta_{i,j}$ is the Kronecker delta: $\delta_{i,i}=1$ and $\delta_{i,j}=0$ if $i\neq j$.<br />
	From this, we deduce that<br />
	\begin{align*}<br />
	(A+\epsilon I)\mathbf{v}_i=(\lambda_i+\epsilon)\mathbf{v}_i\\<br />
	(A+\epsilon I)^{-1}\mathbf{v}_i=\frac{1}{\lambda_i+\epsilon}\mathbf{v}_i. \tag{**}<br />
	\end{align*}<br />
	Using the basis $\{\mathbf{v}_i\}$, we write<br />
	\[\mathbf{v}=\sum_{i=1}^nc_i \mathbf{v}_i\]
	for some $c_i\in \R$.</p>
<hr />
<p>	Then we compute<br />
	\begin{align*}<br />
	A\mathbf{x}(\epsilon)&#038;=A(A+\epsilon I)^{-1}\mathbf{v} &#038;&#038; \text{by part (a)}\\<br />
	&#038;=A(A+\epsilon I)^{-1}\left(\, \sum_{i=1}^nc_i \mathbf{v}_i  \,\right)\\<br />
	&#038;=\sum_{i=1}^n c_iA(A+\epsilon I)^{-1}\mathbf{v}_i\\<br />
	&#038;=\sum _{i=1}^n c_iA\left(\,  \frac{1}{\lambda_i+\epsilon}\mathbf{v}_i \,\right) &#038;&#038; \text{by (**)}\\<br />
	&#038;=\sum_{i=1}^n c_i\frac{\lambda_i}{\lambda_i+\epsilon}\mathbf{v}_i &#038;&#038; \text{since $A\mathbf{v}_i=\lambda_i\mathbf{v}_i$}\\<br />
	&#038;=\sum_{i=2}^n c_i\frac{\lambda_i}{\lambda_i+\epsilon}\mathbf{v}_i &#038;&#038; \text{since $\lambda_1=0$}.<br />
	\end{align*}</p>
<hr />
<p>	Therefore we have<br />
	\begin{align*}<br />
	\lim_{\epsilon \to 0^+} \epsilon \mathbf{x}(\epsilon)&#038;=\lim_{\epsilon \to 0^+}\left(\,  \mathbf{v}-A\mathbf{x}(\epsilon) \,\right)\\<br />
	&#038;=\mathbf{v}-\lim_{\epsilon \to 0^+}\left(\,  A\mathbf{x}(\epsilon) \,\right)\\<br />
	&#038;= \sum_{i=1}^nc_i\mathbf{v}_i-\lim_{\epsilon \to 0^+}\left(\,  \sum_{i=2}^n c_i\frac{\lambda_i}{\lambda_i+\epsilon}\mathbf{v}_i \,\right)\\<br />
	&#038;=\sum_{i=1}^n c_i \mathbf{v}_i-\sum_{i=2}^n c_i \mathbf{v}_i\\<br />
	&#038;=c_1\mathbf{v}_1.<br />
	\end{align*}<br />
	Using the orthonormality of the basis $\{\mathbf{v}_i\}$, we have<br />
	\[\langle\mathbf{v}, \mathbf{v}_1 \rangle=\sum_{i=1}^n \langle c_i\mathbf{v}_i, \mathbf{v}_1 \rangle=c_1.\]
<p>	Hence the required expression is<br />
	\[\lim_{\epsilon \to 0^+} \epsilon \mathbf{x}(\epsilon)=\langle\mathbf{v}, \mathbf{v}_1 \rangle\mathbf{v}_1,\]
	where $\mathbf{v}_1$ is the unit eigenvector corresponding to the eigenvalue $0$.</p>
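<p>To make the limit concrete, here is a small NumPy check (with an assumed sample matrix, not the exam's abstract $A$): for a symmetric $A$ with simple eigenvalue $0$ and unit eigenvector $\mathbf{v}_1$, the quantity $\epsilon\mathbf{x}(\epsilon)$ indeed approaches $\langle\mathbf{v},\mathbf{v}_1\rangle\mathbf{v}_1$.</p>

```python
import numpy as np

# Assumed example: A = Q diag(0, 2, 5) Q^T, so v1 = Q[:, 0] is the unit
# eigenvector for the simple eigenvalue 0.
Q, _ = np.linalg.qr(np.random.default_rng(1).standard_normal((3, 3)))
A = Q @ np.diag([0.0, 2.0, 5.0]) @ Q.T
v1 = Q[:, 0]
v = np.array([1.0, -2.0, 3.0])

eps = 1e-7
x = np.linalg.solve(A + eps * np.eye(3), v)   # x(eps) from part (a)
limit = np.dot(v, v1) * v1                    # <v, v1> v1
assert np.allclose(eps * x, limit, atol=1e-4)
```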
<button class="simplefavorite-button has-count" data-postid="3141" data-siteid="1" data-groupid="1" data-favoritecount="13" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">13</span></button><p>The post <a href="https://yutsumura.com/a-matrix-equation-of-a-symmetric-matrix-and-the-limit-of-its-solution/" target="_blank">A Matrix Equation of a Symmetric Matrix and the Limit of its Solution</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/a-matrix-equation-of-a-symmetric-matrix-and-the-limit-of-its-solution/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">3141</post-id>	</item>
		<item>
		<title>Quiz 13 (Part 1) Diagonalize a Matrix</title>
		<link>https://yutsumura.com/quiz-13-part-1-diagonalize-a-matrix/</link>
				<comments>https://yutsumura.com/quiz-13-part-1-diagonalize-a-matrix/#comments</comments>
				<pubDate>Fri, 21 Apr 2017 20:16:42 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[algebraic multiplicity]]></category>
		<category><![CDATA[defective matrix]]></category>
		<category><![CDATA[diagonal matrix]]></category>
		<category><![CDATA[diagonalizable]]></category>
		<category><![CDATA[diagonalizable matrix]]></category>
		<category><![CDATA[diagonalization]]></category>
		<category><![CDATA[eigenspace]]></category>
		<category><![CDATA[eigenvalue]]></category>
		<category><![CDATA[eigenvector]]></category>
		<category><![CDATA[geometric multiplicity]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[Ohio State]]></category>
		<category><![CDATA[Ohio State.LA]]></category>
		<category><![CDATA[quiz]]></category>
		<category><![CDATA[symmetric matrix]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=2718</guid>
				<description><![CDATA[<p>Let \[A=\begin{bmatrix} 2 &#038; -1 &#038; -1 \\ -1 &#038;2 &#038;-1 \\ -1 &#038; -1 &#038; 2 \end{bmatrix}.\] Determine whether the matrix $A$ is diagonalizable. If it is diagonalizable, then diagonalize $A$. That is,&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/quiz-13-part-1-diagonalize-a-matrix/" target="_blank">Quiz 13 (Part 1) Diagonalize a Matrix</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 385</h2>
<p>	Let<br />
	\[A=\begin{bmatrix}<br />
	  2 &#038; -1 &#038; -1 \\<br />
	   -1 &#038;2 &#038;-1 \\<br />
	   -1 &#038; -1 &#038; 2<br />
	\end{bmatrix}.\]
	Determine whether the matrix $A$ is diagonalizable. If it is diagonalizable, then diagonalize $A$.<br />
	That is, find a nonsingular matrix $S$ and a diagonal matrix $D$ such that $S^{-1}AS=D$.</p>
<p>&nbsp;<br />
<span id="more-2718"></span><br />

We give two solutions.<br />
The first solution is a standard method of diagonalization.<br />
For a review of the process of diagonalization, see the post &#8220;<a href="//yutsumura.com/how-to-diagonalize-a-matrix-step-by-step-explanation/" target="_blank">How to diagonalize a matrix. Step by step explanation.</a>&#8221;</p>
<p>The second solution is a more indirect method to find eigenvalues and eigenvectors.</p>
<h2>Solution 1.</h2>
<p>	We claim that the matrix $A$ is diagonalizable.<br />
	One way to see this is to note that $A$ is a real symmetric matrix, and hence it is diagonalizable.</p>
<p>	Alternatively, we can compute eigenspaces and check whether $A$ is not defective (namely, the algebraic multiplicity and the geometric multiplicity of each eigenvalue of $A$ are the same).</p>
<hr />
<p>	To diagonalize the matrix $A$, we need to find eigenvalues and three linearly independent eigenvectors.</p>
<p>	We compute the characteristic polynomial of $A$ as follows:<br />
	\begin{align*}<br />
	p(t)&#038;=\det(A-tI)\\<br />
	&#038;=\begin{vmatrix}<br />
	  2-t &#038; -1 &#038; -1 \\<br />
	   -1 &#038;2-t &#038;-1 \\<br />
	   -1 &#038; -1 &#038; 2-t<br />
	\end{vmatrix}\\<br />
	&#038;=(2-t)\begin{vmatrix}<br />
	  2-t &#038; -1\\<br />
	  -1&#038; 2-t<br />
	\end{vmatrix}<br />
	-(-1)\begin{vmatrix}<br />
	  -1 &#038; -1\\<br />
	  -1&#038; 2-t<br />
	\end{vmatrix}+(-1)\begin{vmatrix}<br />
	  -1 &#038; 2-t\\<br />
	  -1&#038; -1<br />
	\end{vmatrix} \\<br />
	&#038;\text{(by the first row cofactor expansion)}\\<br />
	&#038;=-t(t-3)^2.<br />
	\end{align*}<br />
	Since eigenvalues are the roots of the characteristic polynomial, eigenvalues of $A$ are $0$ and $3$ with algebraic multiplicity $1$ and $2$, respectively.</p>
<p>	(If you did not confirm that $A$ is diagonalizable yet, then at this point we know that the geometric multiplicity of $\lambda=0$ is $1$ since the geometric multiplicity is always greater than $0$ and less than or equal to the algebraic multiplicity. However, the geometric multiplicity of $\lambda=3$ is either $1$ or $2$.)</p>
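<p>The characteristic polynomial can be double-checked numerically; the NumPy snippet below (a verification aid, not part of the quiz solution) confirms that the eigenvalues of $A$ are $0, 3, 3$, the roots of $-t(t-3)^2$.</p>

```python
import numpy as np

A = np.array([[ 2.0, -1.0, -1.0],
              [-1.0,  2.0, -1.0],
              [-1.0, -1.0,  2.0]])
# A is symmetric, so eigvalsh applies; the roots of -t(t-3)^2 are 0, 3, 3.
eigvals = np.sort(np.linalg.eigvalsh(A))
assert np.allclose(eigvals, [0.0, 3.0, 3.0])
```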
<hr />
<p>	Next, we determine the eigenspace $E_{\lambda}$ and its basis for each eigenvalue of $A$.</p>
<p>	For $\lambda=3$, we find solutions of $(A-3I)\mathbf{x}=\mathbf{0}$.<br />
	We have<br />
	\begin{align*}<br />
	A-3I&#038;=\begin{bmatrix}<br />
	-1 &#038; -1 &#038; -1 \\<br />
	-1 &#038;-1 &#038;-1 \\<br />
	-1 &#038; -1 &#038; -1<br />
	\end{bmatrix}<br />
	\xrightarrow{\substack{R_2-R_1\\R_3-R_1}}<br />
	\begin{bmatrix}<br />
	-1 &#038; -1 &#038; -1 \\<br />
	0 &#038;0 &#038;0\\<br />
	0&#038;0&#038;0<br />
	\end{bmatrix}<br />
	\xrightarrow{-R_1}<br />
	\begin{bmatrix}<br />
	1 &#038; 1 &#038; 1 \\<br />
	0 &#038;0 &#038;0\\<br />
	0&#038;0&#038;0<br />
	\end{bmatrix}.<br />
	\end{align*}<br />
	Hence a solution must satisfy $x_1=-x_2-x_3$, and thus the eigenspace is<br />
	\[ E_3=\left\{\, \mathbf{x} \in \R^3 \quad \middle| \quad \mathbf{x}=x_2\begin{bmatrix}<br />
		-1\\<br />
		1\\<br />
		0<br />
		\end{bmatrix}+x_3\begin{bmatrix}<br />
		-1\\<br />
		0\\<br />
		1<br />
		\end{bmatrix} \text{ for any } x_2, x_3 \in \R \,\right\}.\]
		From this expression, it is straightforward to check that the set<br />
		\[\left\{\begin{bmatrix}<br />
		-1\\<br />
		1\\<br />
		0<br />
		\end{bmatrix}, \begin{bmatrix}<br />
		-1\\<br />
		0\\<br />
		1<br />
		\end{bmatrix}  \,\right\}\]
		is a basis of $E_3$.<br />
		(Hence the geometric multiplicity of $\lambda=3$ is $2$, and $A$ is not defective and diagonalizable.)</p>
<hr />
<p>		For $\lambda=0$, we solve $(A-0I)\mathbf{x}=\mathbf{0}$, thus $A\mathbf{x}=\mathbf{0}$.<br />
		We have<br />
		\begin{align*}<br />
		A&#038;=\begin{bmatrix}<br />
		2 &#038; -1 &#038; -1 \\<br />
		-1 &#038;2 &#038;-1 \\<br />
		-1 &#038; -1 &#038; 2<br />
		\end{bmatrix}<br />
		\xrightarrow{R_1 \leftrightarrow R_2}<br />
		\begin{bmatrix}<br />
		-1 &#038;2 &#038;-1 \\<br />
		2 &#038; -1 &#038; -1 \\<br />
		-1 &#038; -1 &#038; 2<br />
		\end{bmatrix}<br />
		\xrightarrow{-R_1}<br />
		\begin{bmatrix}<br />
		1 &#038; -2 &#038; 1 \\<br />
		2 &#038; -1 &#038; -1 \\<br />
		-1 &#038; -1 &#038; 2<br />
		\end{bmatrix}\\<br />
		&#038;\xrightarrow{\substack{R_2-2R_1\\ R_3+R_1}}<br />
		\begin{bmatrix}<br />
		1 &#038; -2 &#038; 1 \\<br />
		0 &#038; 3 &#038; -3 \\<br />
		0 &#038; -3 &#038; 3<br />
		\end{bmatrix}<br />
		\xrightarrow{R_3+R_2}<br />
			\begin{bmatrix}<br />
			1 &#038; -2 &#038; 1 \\<br />
			0 &#038; 3 &#038; -3 \\<br />
			0 &#038; 0 &#038; 0<br />
			\end{bmatrix}<br />
			\xrightarrow{\frac{1}{3}R_2}<br />
				\begin{bmatrix}<br />
				1 &#038; -2 &#038; 1 \\<br />
				0 &#038; 1 &#038; -1 \\<br />
				0 &#038; 0 &#038; 0<br />
				\end{bmatrix}\\<br />
				&#038;\xrightarrow{R_1+2R_2}<br />
					\begin{bmatrix}<br />
					1 &#038; 0 &#038; -1 \\<br />
					0 &#038; 1 &#038; -1 \\<br />
					0 &#038; 0 &#038; 0<br />
					\end{bmatrix}.<br />
		\end{align*}<br />
		Hence any solution satisfies<br />
		\[x_1=x_3 \text{ and } x_2=x_3.\]
		Therefore, the eigenspace is<br />
		\[E_0=\left\{\, \mathbf{x} \in \R^3 \quad \middle| \quad \mathbf{x}=x_3\begin{bmatrix}<br />
		1\\<br />
		1\\<br />
		1<br />
		\end{bmatrix} \text{ for any } x_3 \in \R \,\right\},\]
		and a basis of $E_0$ is<br />
		\[\left\{\, \begin{bmatrix}<br />
		1\\<br />
		1\\<br />
		1<br />
		\end{bmatrix} \,\right\}.\]
<hr />
<p>		Let<br />
		\[\mathbf{u}_1=\begin{bmatrix}<br />
		-1\\<br />
		1\\<br />
		0<br />
		\end{bmatrix}, \mathbf{u}_2=\begin{bmatrix}<br />
		-1\\<br />
		0\\<br />
		1<br />
		\end{bmatrix}, \mathbf{u}_3=\begin{bmatrix}<br />
		1\\<br />
		1\\<br />
		1<br />
		\end{bmatrix}.\]
		Note that $\{\mathbf{u}_1, \mathbf{u}_2\}$ is a basis of $E_3$ and $\{\mathbf{u}_3\}$ is a basis of $E_0$.<br />
		Thus, it follows that the vectors $\mathbf{u}_1, \mathbf{u}_2, \mathbf{u}_3$ are linearly independent eigenvectors.<br />
		Put<br />
		\[S=[\mathbf{u}_1, \mathbf{u}_2, \mathbf{u}_3]= \begin{bmatrix}<br />
		-1 &#038; -1 &#038; 1\\<br />
		1&#038; 0&#038; 1\\<br />
		0&#038; 1 &#038; 1<br />
		\end{bmatrix}.\]
		Since the columns of $S$ are linearly independent, the matrix $S$ is nonsingular. Then the procedure of the diagonalization yields that<br />
		\[S^{-1}AS=\begin{bmatrix}<br />
		\mathbf{3} &#038; 0 &#038; 0\\<br />
		0&#038; \mathbf{3}&#038; 0\\<br />
		0 &#038; 0&#038; \mathbf{0}<br />
		\end{bmatrix},\]
		where the diagonal entries are the eigenvalues corresponding to the eigenvectors $\mathbf{u}_1, \mathbf{u}_2, \mathbf{u}_3$, in this order.</p>
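<p>The diagonalization can be verified directly with NumPy, using the matrix $S$ found above (a check on the computation, not part of the quiz solution):</p>

```python
import numpy as np

A = np.array([[ 2.0, -1.0, -1.0],
              [-1.0,  2.0, -1.0],
              [-1.0, -1.0,  2.0]])
S = np.array([[-1.0, -1.0, 1.0],
              [ 1.0,  0.0, 1.0],
              [ 0.0,  1.0, 1.0]])
D = np.linalg.inv(S) @ A @ S
assert np.allclose(D, np.diag([3.0, 3.0, 0.0]))   # S^{-1} A S = diag(3, 3, 0)
```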
<p>		&nbsp;&nbsp;</p>
<h2>Solution 2.</h2>
<p>		The second solution uses a different method to find eigenvalues and eigenvectors.<br />
		Let $B=A-3I$. Then every entry of $B$ is $-1$.<br />
		We find eigenvalues $\lambda$ and eigenvectors of $B$. Then the eigenvalues of $A$ are $\lambda+3$ and the eigenvectors are the same.</p>
<p>		First we reduce the matrix $B$ as follows:<br />
		\begin{align*}<br />
		B&#038;=\begin{bmatrix}<br />
	-1&#038;-1&#038;-1\\<br />
	-1&#038;-1&#038;-1\\<br />
	-1&#038;-1&#038;-1<br />
		\end{bmatrix}<br />
		\xrightarrow{\substack{R_2-R_1\\ R_3-R_1}}<br />
		\begin{bmatrix}<br />
		-1&#038;-1&#038;-1\\<br />
		0&#038;0&#038;0\\<br />
		0&#038;0&#038;0<br />
		\end{bmatrix}<br />
		\xrightarrow{-R_1}<br />
		\begin{bmatrix}<br />
	1&#038;1&#038;1\\<br />
		0&#038;0&#038;0\\<br />
		0&#038;0&#038;0<br />
		\end{bmatrix}.<br />
		\end{align*}<br />
		Hence the rank of $B$ is $1$, and the nullity of $B$ is $2$  by the rank-nullity theorem. It follows that $\lambda=0$ is an eigenvalue of $B$.<br />
		Note that the eigenspace $E_0$ corresponding to $\lambda=0$ is the null space of $B$. From the reduction above, we see that the null space consists of vectors<br />
		\[\mathbf{x}=x_2\begin{bmatrix}<br />
		-1\\1\\0<br />
		\end{bmatrix}<br />
		+x_3\begin{bmatrix}<br />
		-1\\0\\1<br />
		\end{bmatrix}\]
		for any real numbers $x_2, x_3$.<br />
		It follows that<br />
		\[\begin{bmatrix}<br />
		-1\\1\\0<br />
		\end{bmatrix},<br />
		\begin{bmatrix}<br />
		-1\\0\\1<br />
		\end{bmatrix}\]
		are basis vectors of eigenspace $E_0$. Hence the geometric multiplicity of $\lambda=0$ is $2$.<br />
	(The algebraic multiplicity is either $2$ or $3$. We will see it must be $2$.)</p>
<hr />
<p>		Since all the entries of $B$ are $-1$, by inspection, we find that the vector<br />
		\[\begin{bmatrix}<br />
		1\\1\\1<br />
		\end{bmatrix}\]
		is an eigenvector corresponding to the eigenvalue $-3$.<br />
		In fact, we have<br />
		\begin{align*}<br />
		B\begin{bmatrix}<br />
		1\\1\\1<br />
		\end{bmatrix}<br />
		&#038;=\begin{bmatrix}<br />
		-1&#038;-1&#038;-1\\<br />
		-1&#038;-1&#038;-1\\<br />
		-1&#038;-1&#038;-1<br />
		\end{bmatrix}<br />
		\begin{bmatrix}<br />
		1\\1\\1<br />
		\end{bmatrix}<br />
		=\begin{bmatrix}<br />
		-3\\-3\\-3<br />
		\end{bmatrix}<br />
		=-3\begin{bmatrix}<br />
		1\\1\\1<br />
		\end{bmatrix}.<br />
		\end{align*}</p>
<p>		Since the algebraic multiplicity of $\lambda=0$ is either $2$ or $3$, and the sum of all the algebraic multiplicities is equal to $3$, the algebraic multiplicity of $\lambda=-3$ must be $1$ and that of $\lambda=0$ is $2$.<br />
		Hence the geometric multiplicity of $\lambda=-3$ is $1$. Thus<br />
		\[\begin{bmatrix}<br />
		1\\1\\1<br />
		\end{bmatrix}\]
		is a basis vector of $E_{-3}$.</p>
<hr />
<p>		In a nutshell, we have obtained that eigenvalues of $B$ are $0$ and $-3$ and basis vectors of $E_0$ and $E_{-3}$ are<br />
		\[\begin{bmatrix}<br />
		-1\\1\\0<br />
		\end{bmatrix},<br />
		\begin{bmatrix}<br />
		-1\\0\\1<br />
		\end{bmatrix} \text{ and } \begin{bmatrix}<br />
		1\\1\\1<br />
		\end{bmatrix},\]
		respectively.</p>
<p>		Since $A=B+3I$, the eigenvalues of $A$ are $0+3=3$ and $-3+3=0$ and the corresponding eigenvectors are the same. Thus<br />
			\[\begin{bmatrix}<br />
			-1\\1\\0<br />
			\end{bmatrix},<br />
			\begin{bmatrix}<br />
			-1\\0\\1<br />
			\end{bmatrix} \text{ and } \begin{bmatrix}<br />
			1\\1\\1<br />
			\end{bmatrix}\]
			are the basis vectors of the eigenspace $E_3$ and $E_0$ of $A$, respectively.</p>
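<p>The shift used in Solution 2 is easy to check numerically: adding $3I$ moves every eigenvalue of $B$ up by $3$ while leaving the eigenvectors unchanged (a quick verification, not part of the solution).</p>

```python
import numpy as np

B = -np.ones((3, 3))                     # every entry of B = A - 3I is -1
A = B + 3.0 * np.eye(3)
lam_B = np.sort(np.linalg.eigvalsh(B))   # eigenvalues of B: -3, 0, 0
lam_A = np.sort(np.linalg.eigvalsh(A))   # eigenvalues of A: 0, 3, 3
assert np.allclose(lam_B + 3.0, lam_A)
```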
<hr />
<p>As in solution 1, we put<br />
			\[S=\begin{bmatrix}<br />
			-1 &#038; -1 &#038; 1\\<br />
			1&#038; 0&#038; 1\\<br />
			0&#038; 1 &#038; 1<br />
			\end{bmatrix}.\]
			Then we have<br />
			\[S^{-1}AS=\begin{bmatrix}<br />
			\mathbf{3} &#038; 0 &#038; 0\\<br />
			0&#038; \mathbf{3}&#038; 0\\<br />
			0 &#038; 0&#038; \mathbf{0}<br />
			\end{bmatrix},\]
			where the diagonal entries are the eigenvalues corresponding to the eigenvectors $\mathbf{u}_1, \mathbf{u}_2, \mathbf{u}_3$, in this order.</p>
<h2>Comment.</h2>
<p>This is the first problem of Quiz 13 (Take Home Quiz) for Math 2568 (Introduction to Linear Algebra) at OSU in Spring 2017.</p>
<h3>List of Quiz Problems of Linear Algebra (Math 2568) at OSU in Spring 2017</h3>
<p>There were 13 weekly quizzes. Here is the list of links to the quiz problems and solutions.</p>
<ul>
<li><a href="//yutsumura.com/quiz-1-gauss-jordan-elimination-homogeneous-system-math-2568-spring-2017/" target="_blank">Quiz 1. Gauss-Jordan elimination / homogeneous system. </a></li>
<li><a href="//yutsumura.com/quiz-2-the-vector-form-for-the-general-solution-transpose-matrices-math-2568-spring-2017/" target="_blank">Quiz 2. The vector form for the general solution / Transpose matrices. </a></li>
<li><a href="//yutsumura.com/quiz-3-condition-that-vectors-are-linearly-dependent-orthogonal-vectors-are-linearly-independent/" target="_blank">Quiz 3. Condition that vectors are linearly dependent/ orthogonal vectors are linearly independent</a></li>
<li><a href="//yutsumura.com/quiz-4-inverse-matrix-nonsingular-matrix-satisfying-a-relation/" target="_blank">Quiz 4. Inverse matrix/ Nonsingular matrix satisfying a relation</a></li>
<li><a href="//yutsumura.com/quiz-5-example-and-non-example-of-subspaces-in-3-dimensional-space/" target="_blank">Quiz 5. Example and non-example of subspaces in 3-dimensional space</a></li>
<li><a href="//yutsumura.com/quiz-6-determine-vectors-in-null-space-range-find-a-basis-of-null-space/" target="_blank">Quiz 6. Determine vectors in null space, range / Find a basis of null space</a></li>
<li><a href="//yutsumura.com/quiz-7-find-a-basis-of-the-range-rank-and-nullity-of-a-matrix/" target="_blank">Quiz 7. Find a basis of the range, rank, and nullity of a matrix</a></li>
<li><a href="//yutsumura.com/quiz-8-determine-subsets-are-subspaces-functions-taking-integer-values-set-of-skew-symmetric-matrices/" target="_blank">Quiz 8. Determine subsets are subspaces: functions taking integer values / set of skew-symmetric matrices</a></li>
<li><a href="//yutsumura.com/quiz-9-find-a-basis-of-the-subspace-spanned-by-four-matrices/" target="_blank">Quiz 9. Find a basis of the subspace spanned by four matrices</a></li>
<li><a href="//yutsumura.com/quiz-10-find-orthogonal-basis-find-value-of-linear-transformation/" target="_blank">Quiz 10. Find orthogonal basis / Find value of linear transformation</a></li>
<li><a href="//yutsumura.com/quiz-11-find-eigenvalues-and-eigenvectors-properties-of-determinants/" target="_blank">Quiz 11. Find eigenvalues and eigenvectors/ Properties of determinants</a></li>
<li><a href="//yutsumura.com/quiz-12-find-eigenvalues-and-their-algebraic-and-geometric-multiplicities/" target="_blank">Quiz 12. Find eigenvalues and their algebraic and geometric multiplicities</a></li>
<li><a href="//yutsumura.com/quiz-13-part-1-diagonalize-a-matrix/" target="_blank">Quiz 13 (Part 1). Diagonalize a matrix.</a></li>
<li><a href="//yutsumura.com/quiz-13-part-2-find-eigenvalues-and-eigenvectors-of-a-special-matrix/" target="_blank">Quiz 13 (Part 2). Find eigenvalues and eigenvectors of a special matrix</a></li>
</ul>
<button class="simplefavorite-button has-count" data-postid="2718" data-siteid="1" data-groupid="1" data-favoritecount="66" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">66</span></button><p>The post <a href="https://yutsumura.com/quiz-13-part-1-diagonalize-a-matrix/" target="_blank">Quiz 13 (Part 1) Diagonalize a Matrix</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/quiz-13-part-1-diagonalize-a-matrix/feed/</wfw:commentRss>
		<slash:comments>9</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">2718</post-id>	</item>
		<item>
		<title>Determine Dimensions of Eigenspaces From Characteristic Polynomial of Diagonalizable Matrix</title>
		<link>https://yutsumura.com/determine-dimensions-of-eigenspaces-from-characteristic-polynomial-of-diagonalizable-matrix/</link>
				<comments>https://yutsumura.com/determine-dimensions-of-eigenspaces-from-characteristic-polynomial-of-diagonalizable-matrix/#respond</comments>
				<pubDate>Fri, 21 Apr 2017 02:44:30 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[algebraic multiplicity]]></category>
		<category><![CDATA[characteristic polynomial]]></category>
		<category><![CDATA[diagonalizable]]></category>
		<category><![CDATA[diagonalizable matrix]]></category>
		<category><![CDATA[eigenvalue]]></category>
		<category><![CDATA[exam]]></category>
		<category><![CDATA[geometric multiplicity]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[null space]]></category>
		<category><![CDATA[nullity]]></category>
		<category><![CDATA[Ohio State]]></category>
		<category><![CDATA[Ohio State.LA]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=2710</guid>
				<description><![CDATA[<p>Let $A$ be an $n\times n$ matrix with the characteristic polynomial \[p(t)=t^3(t-1)^2(t-2)^5(t+2)^4.\] Assume that the matrix $A$ is diagonalizable. (a) Find the size of the matrix $A$. (b) Find the dimension of the eigenspace&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/determine-dimensions-of-eigenspaces-from-characteristic-polynomial-of-diagonalizable-matrix/" target="_blank">Determine Dimensions of Eigenspaces From Characteristic Polynomial of Diagonalizable Matrix</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 384</h2>
<p>		Let $A$ be an $n\times n$ matrix with the characteristic polynomial<br />
	\[p(t)=t^3(t-1)^2(t-2)^5(t+2)^4.\]
	Assume that the matrix $A$ is diagonalizable. </p>
<p><strong>(a)</strong> Find the size of the matrix $A$. </p>
<p><strong>(b)</strong> Find the dimension of the eigenspace $E_2$ corresponding to the eigenvalue $\lambda=2$.</p>
<p><strong>(c)</strong> Find the nullity of $A$.</p>
<p>(<em>The Ohio State University, Linear Algebra Final Exam Problem</em>)<br />
&nbsp;<br />
<span id="more-2710"></span><br />

<h2>Hint/Definition.</h2>
<ul>
<li>Recall that when a matrix is diagonalizable, the algebraic multiplicity of each eigenvalue is the same as the geometric multiplicity.</li>
<li>The geometric multiplicity of an eigenvalue $\lambda$ is the dimension of the eigenspace $E_{\lambda}=\calN(A-\lambda I)$ corresponding to $\lambda$.</li>
<li>The nullity of $A$ is the dimension of the null space $\calN(A)$ of $A$.</li>
</ul>
<h2>Solution.</h2>
<h3>(a) Find the size of the matrix $A$.</h3>
<p>In general, if $A$ is an $n\times n$ matrix, then its characteristic polynomial has degree $n$.<br />
		Since the degree of $p(t)$ is $14$, the size of $A$ is $14 \times 14$.</p>
<h3>(b) Find the dimension of the eigenspace $E_2$ corresponding to the eigenvalue $\lambda=2$.</h3>
<p> Note that the dimension of the eigenspace $E_2$ is the geometric multiplicity of the eigenvalue $\lambda=2$ by definition.</p>
<p>		From the characteristic polynomial $p(t)$, we see that $\lambda=2$ is an eigenvalue of $A$ with algebraic multiplicity $5$.<br />
		Since $A$ is diagonalizable, the algebraic multiplicity of each eigenvalue is the same as the geometric multiplicity.</p>
<p>		It follows that the geometric multiplicity of $\lambda=2$ is $5$, hence the dimension of the eigenspace $E_2$ is $5$.</p>
<h3>(c) Find the nullity of $A$.</h3>
<p> We first observe that $\lambda=0$ is an eigenvalue of $A$ with algebraic multiplicity $3$ from the characteristic polynomial.</p>
<p>	    By definition, the nullity of $A$ is the dimension of the null space $\calN(A)$, and furthermore the null space $\calN(A)$ is the eigenspace $E_0$.<br />
	    Thus, the nullity of $A$ is the same as the geometric multiplicity of the eigenvalue $\lambda=0$.</p>
<p>	    Since $A$ is diagonalizable, the algebraic and geometric multiplicities are the same. Hence the nullity of $A$ is $3$.</p>
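<p>Since only the multiplicities matter, a diagonal stand-in with the same characteristic polynomial (an illustrative choice, not the unknown $A$ of the problem) exhibits all three answers:</p>

```python
import numpy as np

# Diagonal stand-in with characteristic polynomial t^3 (t-1)^2 (t-2)^5 (t+2)^4.
eigs = [0.0] * 3 + [1.0] * 2 + [2.0] * 5 + [-2.0] * 4
A = np.diag(eigs)

size = A.shape[0]                                 # (a) 14
dim_E2 = sum(e == 2.0 for e in eigs)              # (b) 5
nullity = A.shape[0] - np.linalg.matrix_rank(A)   # (c) 3
assert (size, dim_E2, nullity) == (14, 5, 3)
```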
<button class="simplefavorite-button has-count" data-postid="2710" data-siteid="1" data-groupid="1" data-favoritecount="23" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">23</span></button><p>The post <a href="https://yutsumura.com/determine-dimensions-of-eigenspaces-from-characteristic-polynomial-of-diagonalizable-matrix/" target="_blank">Determine Dimensions of Eigenspaces From Characteristic Polynomial of Diagonalizable Matrix</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/determine-dimensions-of-eigenspaces-from-characteristic-polynomial-of-diagonalizable-matrix/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">2710</post-id>	</item>
		<item>
		<title>Normal Nilpotent Matrix is Zero Matrix</title>
		<link>https://yutsumura.com/normal-nilpotent-matrix-is-zero-matrix/</link>
				<comments>https://yutsumura.com/normal-nilpotent-matrix-is-zero-matrix/#respond</comments>
				<pubDate>Wed, 15 Mar 2017 06:11:34 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[conjugate transpose]]></category>
		<category><![CDATA[diagonal matrix]]></category>
		<category><![CDATA[diagonalizable matrix]]></category>
		<category><![CDATA[diagonalization]]></category>
		<category><![CDATA[eigenvalue]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[nilpotent matrix]]></category>
		<category><![CDATA[normal matrix]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=2435</guid>
				<description><![CDATA[<p>A complex square ($n\times n$) matrix $A$ is called normal if \[A^* A=A A^*,\] where $A^*$ denotes the conjugate transpose of $A$, that is $A^*=\bar{A}^{\trans}$. A matrix $A$ is said to be nilpotent if&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/normal-nilpotent-matrix-is-zero-matrix/" target="_blank">Normal Nilpotent Matrix is Zero Matrix</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 336</h2>
<p>A complex square ($n\times n$) matrix $A$ is called <strong>normal</strong> if<br />
		 \[A^* A=A A^*,\]
		where $A^*$ denotes the conjugate transpose of $A$, that is $A^*=\bar{A}^{\trans}$.<br />
		A matrix $A$ is said to be <strong><a href="//yutsumura.com/tag/nilpotent-matrix/" target="_blank">nilpotent</a></strong> if there exists a positive integer $k$ such that $A^k$ is the zero matrix.</p>
<p><strong>(a)</strong> Prove that if $A$ is both normal and nilpotent, then $A$ is the zero matrix.<br />
		You may use the fact that every normal matrix is diagonalizable.</p>
<p><strong>(b)</strong> Give a proof of (a) without referring to eigenvalues and diagonalization.</p>
<p><strong>(c)</strong> Let $A, B$ be $n\times n$ complex matrices. Prove that if $A$ is normal and $B$ is nilpotent such that $A+B=I$, then $A=I$, where $I$ is the $n\times n$ identity matrix.</p>
<p>&nbsp;<br />
<span id="more-2435"></span><br />

<h2> Proof. </h2>
<h3>(a) If $A$ is normal and nilpotent, then $A=O$</h3>
<p>Since $A$ is normal, it is diagonalizable. Thus there exists an invertible matrix $P$ such that $P^{-1}AP=D$, where $D$ is a diagonal matrix whose diagonal entries are eigenvalues of $A$. </p>
<p>Since $A$ is nilpotent, all the eigenvalues of $A$ are $0$. (See the post &#8220;<a href="//yutsumura.com/nilpotent-matrix-and-eigenvalues-of-the-matrix/" target="_blank">Nilpotent matrix and eigenvalues of the matrix</a>&#8221; for the proof.)<br />
		Hence the diagonal entries of $D$ are zero, and we have $D=O$, the zero matrix.</p>
<p>		It follows that we have<br />
		\begin{align*}<br />
	A=PDP^{-1}=POP^{-1}=O.<br />
	\end{align*}<br />
	Therefore, every normal nilpotent matrix must be the zero matrix.</p>
<h3>(b) Give a proof of (a) without referring to eigenvalues and diagonalization.</h3>
<p> Since $A$ is nilpotent, there exists a positive integer $k$ such that $A^k=O$.<br />
	We prove by induction on $k$ that $A=O$.<br />
	The base case $k=1$ is trivial.</p>
<hr />
<p>	Suppose $k>1$ and the case $k-1$ holds. Let $B=A^{k-1}$. Note that since $A$ is normal, the matrix $B$ is also normal.<br />
	For any vector $x \in \C^n$, we compute the squared length of the vector $B^*Bx$ as follows.<br />
	\begin{align*}<br />
	\|B^*Bx\|^2&#038;=(B^*Bx)^*(B^*Bx) &#038;&#038; \text{by definition of the length}\\<br />
	&#038;=x^*B^*(BB^*)Bx\\<br />
	&#038;=x^*B^*(B^*B)Bx &#038;&#038; \text{since $B$ is normal}\\<br />
	&#038;=x^* (B^*)^2B^2x,<br />
	\end{align*}<br />
	and the last expression is $0$ since $B^2=A^{2k-2}=O$ as $k \geq 2$ implies $2k-2 \geq k$.<br />
	Hence $\|B^*Bx\|^2=0$, so $B^*Bx=\mathbf{0}$ for every $x\in \C^n$.</p>
<hr />
<p>	This yields that<br />
	\begin{align*}<br />
	\|Bx\|^2&#038;=(Bx)^*(Bx)\\<br />
	&#038;=x^*B^*Bx=0,<br />
	\end{align*}<br />
	for every $x\in \C^n$, and hence $B=O$.</p>
<p>	By the induction hypothesis, $A^{k-1}=O$ implies $A=O$, and the induction is completed.<br />
	So the matrix $A$ must be the zero matrix.</p>
<h3>(c) If $A$ is normal and $B$ is nilpotent such that $A+B=I$, then $A=I$</h3>
<p> We claim that the matrix $B$ is normal as well. If this claim is proved, then part (a) yields that $B=O$ since $B$ is a nilpotent normal matrix, which implies the result $A=I$.</p>
<p>	To prove the claim, we compute<br />
	\begin{align*}<br />
	B^* B&#038;=(I-A)^* (I-A)\\<br />
	&#038;=(I-A^*)(I-A)\\<br />
	&#038;=I-A-A^*+A^*A,<br />
	\end{align*}<br />
	and<br />
	\begin{align*}<br />
	B B^*&#038;=(I-A) (I-A)^*\\<br />
	&#038;=(I-A)(I-A^*)\\<br />
	&#038;=I-A^*-A+AA^*\\<br />
	&#038;=I-A^*-A+A^*A \qquad \text{ since $A$ is normal.}<br />
	\end{align*}<br />
	It follows that we have $B^* B=BB^*$, and thus $B$ is normal.<br />
	Hence, the claim is proved.</p>
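A small pure-Python sketch (hypothetical matrices, not part of the original proof) illustrates part (c) in the contrapositive: if $B$ is nilpotent and nonzero with $A+B=I$, then $A=I-B$ cannot be normal.

```python
# If B is nilpotent and nonzero with A + B = I, then A = I - B fails to be
# normal -- consistent with (c), which forces B = O and A = I when A is normal.

def mat_mul(A, B):
    """Product of two square matrices given as nested lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def transpose(A):
    """Transpose of a real matrix (equals the conjugate transpose here)."""
    n = len(A)
    return [[A[j][i] for j in range(n)] for i in range(n)]

I2 = [[1, 0], [0, 1]]
B = [[0, 1], [0, 0]]             # nonzero nilpotent: B^2 = O
A = [[1, -1], [0, 1]]            # A = I - B, so A + B = I

print(mat_mul(transpose(A), A))  # [[1, -1], [-1, 2]]
print(mat_mul(A, transpose(A)))  # [[2, -1], [-1, 1]]
# The two products differ, so A is not normal.
```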
<button class="simplefavorite-button has-count" data-postid="2435" data-siteid="1" data-groupid="1" data-favoritecount="25" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">25</span></button><p>The post <a href="https://yutsumura.com/normal-nilpotent-matrix-is-zero-matrix/" target="_blank">Normal Nilpotent Matrix is Zero Matrix</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/normal-nilpotent-matrix-is-zero-matrix/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">2435</post-id>	</item>
		<item>
		<title>Given Graphs of Characteristic Polynomial of Diagonalizable Matrices, Determine the Rank of Matrices</title>
		<link>https://yutsumura.com/given-graphs-of-characteristic-polynomial-of-diagonal-matrices-determine-the-rank-of-matrices/</link>
				<comments>https://yutsumura.com/given-graphs-of-characteristic-polynomial-of-diagonal-matrices-determine-the-rank-of-matrices/#respond</comments>
				<pubDate>Tue, 13 Dec 2016 03:25:01 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[algebraic multiplicity]]></category>
		<category><![CDATA[characteristic polynomial]]></category>
		<category><![CDATA[diagonalizable]]></category>
		<category><![CDATA[diagonalizable matrix]]></category>
		<category><![CDATA[eigenspace]]></category>
		<category><![CDATA[eigenvalue]]></category>
		<category><![CDATA[figure]]></category>
		<category><![CDATA[geometric multiplicity]]></category>
		<category><![CDATA[graph]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[matrix]]></category>
		<category><![CDATA[null space]]></category>
		<category><![CDATA[nullity]]></category>
		<category><![CDATA[nullity of a matrix]]></category>
		<category><![CDATA[rank]]></category>
		<category><![CDATA[rank of a matrix]]></category>
		<category><![CDATA[rank-nullity theorem]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=1549</guid>
				<description><![CDATA[<p>Let $A, B, C$ be $2\times 2$ diagonalizable matrices. The graphs of characteristic polynomials of $A, B, C$ are shown below. The red graph is for $A$, the blue one for $B$, and the&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/given-graphs-of-characteristic-polynomial-of-diagonal-matrices-determine-the-rank-of-matrices/" target="_blank">Given Graphs of Characteristic Polynomial of Diagonalizable Matrices, Determine the Rank of Matrices</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 217</h2>
<p>Let $A, B, C$ be $2\times 2$ diagonalizable matrices. </p>
<p>The graphs of characteristic polynomials of $A, B, C$ are shown below. The red graph is for $A$, the blue one for $B$, and the green one for $C$.</p>
<p>From this information, determine the rank of the matrices $A, B,$ and $C$.</p>
<div id="attachment_1550" style="width: 743px" class="wp-caption alignnone"><img aria-describedby="caption-attachment-1550" src="https://i1.wp.com/yutsumura.com/wp-content/uploads/2016/12/graphs-of-characteristic-polynomials.jpg?resize=733%2C464" alt="Graphs of characteristic polynomials" width="733" height="464" class="size-full wp-image-1550" srcset="https://i1.wp.com/yutsumura.com/wp-content/uploads/2016/12/graphs-of-characteristic-polynomials.jpg?w=733&amp;ssl=1 733w, https://i1.wp.com/yutsumura.com/wp-content/uploads/2016/12/graphs-of-characteristic-polynomials.jpg?resize=300%2C190&amp;ssl=1 300w" sizes="(max-width: 733px) 100vw, 733px" data-recalc-dims="1" /><p id="caption-attachment-1550" class="wp-caption-text">Graphs of characteristic polynomials</p></div>
&nbsp;<br />
<span id="more-1549"></span><br />

<h2>Hint.</h2>
<p>Observe that the null space of a matrix $M$ is the same as the eigenspace $E_0$ corresponding to the eigenvalue $\lambda=0$ (if $0$ is an eigenvalue).<br />
So what you need to check is whether the graphs pass through the origin.</p>
<p>Also, since the matrices are diagonalizable, the algebraic multiplicities are the same as the geometric multiplicities (the dimensions of the eigenspaces).<br />
You can determine the algebraic multiplicities from the graphs by checking whether they are tangent to the $x$-axis.</p>
<h2>Solution.</h2>
<p>	We first determine the nullities of the matrices, and then using the rank-nullity theorem we obtain the ranks.</p>
<p>	In general the nullity of any matrix $M$ is the dimension of the null space<br />
\[\calN(M)=\{\mathbf{x}\in \R^2 \mid M\mathbf{x}=\mathbf{0}\}\]
and the null space is the same as the eigenspace<br />
\[E_{0}=\{\mathbf{x}\in \R^2\mid M\mathbf{x}=0\mathbf{x}=\mathbf{0}\}\]
 corresponding to the eigenvalue $\lambda=0$ (if any).</p>
<p>From this observation, we see that the nullity of a matrix $M$ is the geometric multiplicity of the eigenvalue $0$, that is, the dimension of the eigenspace $E_0$ (if $0$ is an eigenvalue).</p>
<h3>The Nullity of the Matrix $A$</h3>
<p>Now we determine the nullity of the matrix $A$.<br />
The graph of the characteristic polynomial $p_A(\lambda)$ of $A$ passes through the origin $(0,0)$.</p>
<p>Thus $\lambda=0$ is a root of $p_A(\lambda)$ and hence $\lambda=0$ is an eigenvalue.<br />
Since the $x$-axis is tangent to the graph of $p_A(\lambda)$, the algebraic multiplicity of $\lambda=0$ is $2$. Since the matrix $A$ is diagonalizable, the geometric multiplicity is the same as the algebraic multiplicity. Therefore the nullity of $A$ is $2$.</p>
<h3>The Nullity of the Matrix $B$</h3>
<p>Next, the graph of the characteristic polynomial $p_B(\lambda)$ of $B$ does not pass through the origin $(0,0)$. Thus $\lambda=0$ is not an eigenvalue of $B$. This yields that the matrix $B=B-0I$ is a nonsingular matrix, and hence the null space is $\calN(B)=\{\mathbf{0}\}$ and the nullity of $B$ is zero.</p>
<h3>The Nullity of the Matrix $C$</h3>
<p>Third, the origin $(0,0)$ is on the graph of the characteristic polynomial $p_C(\lambda)$ of $C$, but the $x$-axis is not tangent to the graph of $p_C(\lambda)$.<br />
Therefore the algebraic (hence geometric) multiplicity of $\lambda=0$ is $1$. Thus the nullity of $C$ is $1$. </p>
<h3>Ranks by the Rank-Nullity Theorem</h3>
<p>Finally, using the rank-nullity theorem<br />
\[\text{rank}+\text{nullity}=2,\]
we find that the ranks of $A, B, C$ are $0, 2, 1$, respectively.</p>
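As a sketch (hypothetical diagonal stand-ins for $A, B, C$, chosen only so that their characteristic polynomials behave like the three graphs described), one can confirm the ranks numerically in pure Python.

```python
# Hypothetical diagonal (hence diagonalizable) 2x2 matrices whose
# characteristic polynomials mimic the three graphs in the problem.

def rank_2x2(M):
    """Rank of a 2x2 matrix: 0 if zero matrix, 2 if det != 0, else 1."""
    (a, b), (c, d) = M
    if a == 0 and b == 0 and c == 0 and d == 0:
        return 0
    if a * d - b * c != 0:
        return 2
    return 1

A = [[0, 0], [0, 0]]   # p_A(t) = t^2: double root at 0 (graph tangent to x-axis)
B = [[1, 0], [0, 2]]   # p_B(t) = (t-1)(t-2): graph misses the origin
C = [[0, 0], [0, 3]]   # p_C(t) = t(t-3): simple root at 0

print([rank_2x2(M) for M in (A, B, C)])  # [0, 2, 1]
```

In each case rank + nullity = 2, matching the rank-nullity theorem.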
<button class="simplefavorite-button has-count" data-postid="1549" data-siteid="1" data-groupid="1" data-favoritecount="7" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">7</span></button><p>The post <a href="https://yutsumura.com/given-graphs-of-characteristic-polynomial-of-diagonal-matrices-determine-the-rank-of-matrices/" target="_blank">Given Graphs of Characteristic Polynomial of Diagonalizable Matrices, Determine the Rank of Matrices</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/given-graphs-of-characteristic-polynomial-of-diagonal-matrices-determine-the-rank-of-matrices/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">1549</post-id>	</item>
	</channel>
</rss>
