<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	
	xmlns:georss="http://www.georss.org/georss"
	xmlns:geo="http://www.w3.org/2003/01/geo/wgs84_pos#"
	>

<channel>
	<title>defective matrix &#8211; Problems in Mathematics</title>
	<atom:link href="https://yutsumura.com/tag/defective-matrix/feed/" rel="self" type="application/rss+xml" />
	<link>https://yutsumura.com</link>
	<description></description>
	<lastBuildDate>Mon, 18 Dec 2017 22:45:38 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=5.3.6</generator>

<image>
	<url>https://i2.wp.com/yutsumura.com/wp-content/uploads/2016/12/cropped-question-logo.jpg?fit=32%2C32&#038;ssl=1</url>
	<title>defective matrix &#8211; Problems in Mathematics</title>
	<link>https://yutsumura.com</link>
	<width>32</width>
	<height>32</height>
</image> 
<site xmlns="com-wordpress:feed-additions:1">114989322</site>	<item>
		<title>True or False. Every Diagonalizable Matrix is Invertible</title>
		<link>https://yutsumura.com/true-or-false-every-diagonalizable-matrix-is-invertible/</link>
				<comments>https://yutsumura.com/true-or-false-every-diagonalizable-matrix-is-invertible/#respond</comments>
				<pubDate>Mon, 05 Jun 2017 06:49:55 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[algebraic multiplicity]]></category>
		<category><![CDATA[counterexample]]></category>
		<category><![CDATA[defective matrix]]></category>
		<category><![CDATA[diagonal matrix]]></category>
		<category><![CDATA[diagonalizable]]></category>
		<category><![CDATA[diagonalization]]></category>
		<category><![CDATA[eigenvalue]]></category>
		<category><![CDATA[eigenvector]]></category>
		<category><![CDATA[geometric multiplicity]]></category>
		<category><![CDATA[invertible matrix]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[true or false]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=3010</guid>
				<description><![CDATA[<p>Is every diagonalizable matrix invertible? &#160; Solution. The answer is No. Counterexample We give a counterexample. Consider the $2\times 2$ zero matrix. The zero matrix is a diagonal matrix, and thus it is diagonalizable.&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/true-or-false-every-diagonalizable-matrix-is-invertible/" target="_blank">True or False. Every Diagonalizable Matrix is Invertible</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 439</h2>
<p> Is every diagonalizable matrix invertible?</p>
<p>&nbsp;<br />
<span id="more-3010"></span><br />

<h2> Solution. </h2>
<p>The answer is No.</p>
<h3>Counterexample</h3>
<p>We give a counterexample. Consider the $2\times 2$ zero matrix.<br />
		The zero matrix is a diagonal matrix, and thus it is diagonalizable.<br />
		However, the zero matrix is not invertible as its determinant is zero.</p>
<h3>More Theoretical Explanation</h3>
<p>Let us give a more theoretical explanation.<br />
		If an $n\times n$ matrix $A$ is diagonalizable, then there exists an invertible matrix $P$ such that<br />
		\[P^{-1}AP=\begin{bmatrix}<br />
				 \lambda_1  &#038; 0 &#038; \cdots &#038; 0 \\<br />
				0 &#038; \lambda_2 &#038; \cdots &#038; 0 \\<br />
				\vdots  &#038; \vdots  &#038; \ddots &#038; \vdots  \\<br />
				0 &#038; 0 &#038; \cdots &#038; \lambda_n<br />
				\end{bmatrix},\]
				where $\lambda_1, \dots, \lambda_n$ are eigenvalues of $A$.<br />
				Then we consider the determinants of the matrices of both sides.<br />
			The determinant of the left hand side is<br />
			\begin{align*}<br />
	\det(P^{-1}AP)=\det(P)^{-1}\det(A)\det(P)=\det(A).<br />
	\end{align*}<br />
	On the other hand, the determinant of the right hand side is the product<br />
	\[\lambda_1\lambda_2\cdots \lambda_n\]
	since the right matrix is diagonal.<br />
	Hence we obtain<br />
	\[\det(A)=\lambda_1\lambda_2\cdots \lambda_n.\]
	(Note that it is always true that the determinant of a matrix is the product of its eigenvalues, regardless of diagonalizability.<br />
 See the post &#8220;<a href="//yutsumura.com/determinant-trace-and-eigenvalues-of-a-matrix/" target="_blank">Determinant/trace and eigenvalues of a matrix</a>&#8220;.)</p>
<p>	Thus, if one of the eigenvalues of $A$ is zero, then the determinant of $A$ is zero, and hence $A$ is not invertible.</p>
<p>	The true statement is:</p>
<div style="padding: 16px; border: none 3px #4169e1; border-radius: 10px; background-color: #f0f8ff; margin-top: 30px; margin-bottom: 30px;">a diagonalizable matrix is invertible if and only if all of its eigenvalues are nonzero.</div>
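Both the counterexample and the determinant identity above can be checked numerically. The following is a minimal sketch, assuming NumPy is available; the $2\times 2$ example matrix used to illustrate the identity is our own choice, not from the post.

```python
import numpy as np

# The 2x2 zero matrix is diagonal, hence diagonalizable,
# yet its determinant is 0, so it is not invertible.
Z = np.zeros((2, 2))
det_Z = np.linalg.det(Z)

# det(A) equals the product of the eigenvalues, for any square matrix A.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])       # an arbitrary illustrative matrix (our assumption)
prod_eig = np.prod(np.linalg.eigvals(A))
det_A = np.linalg.det(A)
```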
<h3>Is Every Invertible Matrix Diagonalizable?</h3>
<p>	Note that it is not true that every invertible matrix is diagonalizable.</p>
<p>	For example, consider the matrix<br />
	\[A=\begin{bmatrix}<br />
	  1 &#038; 1\\<br />
	  0&#038; 1<br />
	\end{bmatrix}.\]
	The determinant of $A$ is $1$, hence $A$ is invertible.<br />
	The characteristic polynomial of $A$ is<br />
	\begin{align*}<br />
	p(t)=\det(A-tI)=\begin{vmatrix}<br />
	  1-t &#038; 1\\<br />
	  0&#038; 1-t<br />
	\end{vmatrix}=(1-t)^2.<br />
	\end{align*}<br />
	Thus, the eigenvalue of $A$ is $1$ with algebraic multiplicity $2$.<br />
	We have<br />
	\[A-I=\begin{bmatrix}<br />
	  0 &#038; 1\\<br />
	  0&#038; 0<br />
	\end{bmatrix}\]
	and thus eigenvectors corresponding to the eigenvalue $1$ are<br />
	\[a\begin{bmatrix}<br />
	  1 \\<br />
	  0<br />
	\end{bmatrix}\]
	for any nonzero scalar $a$.<br />
	Thus, the geometric multiplicity of the eigenvalue $1$ is $1$.<br />
	Since the geometric multiplicity is strictly less than the algebraic multiplicity, the matrix $A$ is defective and not diagonalizable.</p>
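The multiplicity comparison above can be checked numerically; a minimal sketch, assuming NumPy is available, using the fact that the geometric multiplicity of an eigenvalue $\lambda$ equals $n-\operatorname{rank}(A-\lambda I)$:

```python
import numpy as np

# A = [[1,1],[0,1]] is invertible (det = 1) but defective.
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])
det_A = np.linalg.det(A)   # 1, so A is invertible

# Geometric multiplicity of eigenvalue 1: dim null(A - I) = n - rank(A - I).
geo_mult = 2 - np.linalg.matrix_rank(A - np.eye(2))   # less than the algebraic multiplicity 2
```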
<h3>Is There a Matrix that is Not Diagonalizable and Not Invertible?</h3>
<p>Finally, note that there exist matrices that are neither diagonalizable nor invertible.<br />
	For example, the matrix $\begin{bmatrix}<br />
  0 &#038; 1\\<br />
  0&#038; 0<br />
\end{bmatrix}$ is such a matrix.</p>
<h2>Summary </h2>
<p>All four combinations of diagonalizability and invertibility occur.</p>
<ol>
<li>Diagonalizable, but not invertible.<br />
Example: \[\begin{bmatrix}<br />
  0 &#038; 0\\<br />
  0&#038; 0<br />
\end{bmatrix}.\]</li>
<li>Invertible, but not diagonalizable.<br />
Example: \[\begin{bmatrix}<br />
  1 &#038; 1\\<br />
  0&#038; 1<br />
\end{bmatrix}.\]</li>
<li>Not diagonalizable and not invertible.<br />
Example: \[\begin{bmatrix}<br />
  0 &#038; 1\\<br />
  0&#038; 0<br />
\end{bmatrix}.\]</li>
<li>Diagonalizable and invertible.<br />
Example: \[\begin{bmatrix}<br />
  1 &#038; 0\\<br />
  0&#038; 1<br />
\end{bmatrix}.\]</li>
</ol>
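All four cases in the list above can be verified with one short numerical sketch, assuming NumPy is available; the rank-based diagonalizability test and the case labels are our own framing of the summary.

```python
import numpy as np

def is_diagonalizable(A, eigenvalues):
    # Sum of geometric multiplicities dim null(A - lam I) must equal n.
    n = A.shape[0]
    total = sum(n - np.linalg.matrix_rank(A - lam * np.eye(n))
                for lam in set(eigenvalues))
    return total == n

cases = {
    "diag_not_inv": np.zeros((2, 2)),                      # case 1
    "inv_not_diag": np.array([[1.0, 1.0], [0.0, 1.0]]),    # case 2
    "neither":      np.array([[0.0, 1.0], [0.0, 0.0]]),    # case 3
    "both":         np.eye(2),                             # case 4
}
eigs = {"diag_not_inv": [0.0], "inv_not_diag": [1.0],
        "neither": [0.0], "both": [1.0]}
results = {k: (is_diagonalizable(M, eigs[k]),
               abs(np.linalg.det(M)) > 1e-12)   # (diagonalizable?, invertible?)
           for k, M in cases.items()}
```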
<button class="simplefavorite-button has-count" data-postid="3010" data-siteid="1" data-groupid="1" data-favoritecount="73" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">73</span></button><p>The post <a href="https://yutsumura.com/true-or-false-every-diagonalizable-matrix-is-invertible/" target="_blank">True or False. Every Diagonalizable Matrix is Invertible</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/true-or-false-every-diagonalizable-matrix-is-invertible/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">3010</post-id>	</item>
		<item>
		<title>Quiz 13 (Part 1) Diagonalize a Matrix</title>
		<link>https://yutsumura.com/quiz-13-part-1-diagonalize-a-matrix/</link>
				<comments>https://yutsumura.com/quiz-13-part-1-diagonalize-a-matrix/#comments</comments>
				<pubDate>Fri, 21 Apr 2017 20:16:42 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[algebraic multiplicity]]></category>
		<category><![CDATA[defective matrix]]></category>
		<category><![CDATA[diagonal matrix]]></category>
		<category><![CDATA[diagonalizable]]></category>
		<category><![CDATA[diagonalizable matrix]]></category>
		<category><![CDATA[diagonalization]]></category>
		<category><![CDATA[eigenspace]]></category>
		<category><![CDATA[eigenvalue]]></category>
		<category><![CDATA[eigenvector]]></category>
		<category><![CDATA[geometric multiplicity]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[Ohio State]]></category>
		<category><![CDATA[Ohio State.LA]]></category>
		<category><![CDATA[quiz]]></category>
		<category><![CDATA[symmetric matrix]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=2718</guid>
				<description><![CDATA[<p>Let \[A=\begin{bmatrix} 2 &#038; -1 &#038; -1 \\ -1 &#038;2 &#038;-1 \\ -1 &#038; -1 &#038; 2 \end{bmatrix}.\] Determine whether the matrix $A$ is diagonalizable. If it is diagonalizable, then diagonalize $A$. That is,&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/quiz-13-part-1-diagonalize-a-matrix/" target="_blank">Quiz 13 (Part 1) Diagonalize a Matrix</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 385</h2>
<p>	Let<br />
	\[A=\begin{bmatrix}<br />
	  2 &#038; -1 &#038; -1 \\<br />
	   -1 &#038;2 &#038;-1 \\<br />
	   -1 &#038; -1 &#038; 2<br />
	\end{bmatrix}.\]
	Determine whether the matrix $A$ is diagonalizable. If it is diagonalizable, then diagonalize $A$.<br />
	That is, find a nonsingular matrix $S$ and a diagonal matrix $D$ such that $S^{-1}AS=D$.</p>
<p>&nbsp;<br />
<span id="more-2718"></span><br />

We give two solutions.<br />
The first solution is a standard method of diagonalization.<br />
For a review of the process of diagonalization, see the post &#8220;<a href="//yutsumura.com/how-to-diagonalize-a-matrix-step-by-step-explanation/" target="_blank">How to diagonalize a matrix. Step by step explanation.</a>&#8221;</p>
<p>The second solution is a more indirect method to find eigenvalues and eigenvectors.</p>
<h2>Solution 1.</h2>
<p>	We claim that the matrix $A$ is diagonalizable.<br />
	One way to see this is to note that $A$ is a real symmetric matrix, and hence it is diagonalizable.</p>
<p>	Alternatively, we can compute the eigenspaces and check that $A$ is not defective (namely, that the algebraic multiplicity and the geometric multiplicity of each eigenvalue of $A$ are the same).</p>
<hr />
<p>	To diagonalize the matrix $A$, we need to find eigenvalues and three linearly independent eigenvectors.</p>
<p>	We compute the characteristic polynomial of $A$ as follows:<br />
	\begin{align*}<br />
	p(t)&#038;=\det(A-tI)\\<br />
	&#038;=\begin{vmatrix}<br />
	  2-t &#038; -1 &#038; -1 \\<br />
	   -1 &#038;2-t &#038;-1 \\<br />
	   -1 &#038; -1 &#038; 2-t<br />
	\end{vmatrix}\\<br />
	&#038;=(2-t)\begin{vmatrix}<br />
	  2-t &#038; -1\\<br />
	  -1&#038; 2-t<br />
	\end{vmatrix}<br />
	-(-1)\begin{vmatrix}<br />
	  -1 &#038; -1\\<br />
	  -1&#038; 2-t<br />
	\end{vmatrix}+(-1)\begin{vmatrix}<br />
	  -1 &#038; 2-t\\<br />
	  -1&#038; -1<br />
	\end{vmatrix} \\<br />
	&#038;\text{(by the first row cofactor expansion)}\\<br />
	&#038;=-t(t-3)^2.<br />
	\end{align*}<br />
	Since eigenvalues are the roots of the characteristic polynomial, eigenvalues of $A$ are $0$ and $3$ with algebraic multiplicity $1$ and $2$, respectively.</p>
<p>	(If you have not yet confirmed that $A$ is diagonalizable, then at this point we know that the geometric multiplicity of $\lambda=0$ is $1$, since the geometric multiplicity is always greater than $0$ and less than or equal to the algebraic multiplicity. However, the geometric multiplicity of $\lambda=3$ is either $1$ or $2$.)</p>
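The eigenvalues computed from the characteristic polynomial can be double-checked numerically; a minimal sketch, assuming NumPy is available:

```python
import numpy as np

# The roots of p(t) = -t(t-3)^2 are 0 (once) and 3 (twice).
A = np.array([[ 2.0, -1.0, -1.0],
              [-1.0,  2.0, -1.0],
              [-1.0, -1.0,  2.0]])
eigenvalues = np.sort(np.linalg.eigvalsh(A))   # A is symmetric, so eigvalsh applies
```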
<hr />
<p>	Next, we determine the eigenspace $E_{\lambda}$ and its basis for each eigenvalue of $A$.</p>
<p>	For $\lambda=3$, we find solutions of $(A-3I)\mathbf{x}=\mathbf{0}$.<br />
	We have<br />
	\begin{align*}<br />
	A-3I&#038;=\begin{bmatrix}<br />
	-1 &#038; -1 &#038; -1 \\<br />
	-1 &#038;-1 &#038;-1 \\<br />
	-1 &#038; -1 &#038; -1<br />
	\end{bmatrix}<br />
	\xrightarrow{\substack{R_2-R_1\\R_3-R_1}}<br />
	\begin{bmatrix}<br />
	-1 &#038; -1 &#038; -1 \\<br />
	0 &#038;0 &#038;0\\<br />
	0&#038;0&#038;0<br />
	\end{bmatrix}<br />
	\xrightarrow{-R_1}<br />
	\begin{bmatrix}<br />
	1 &#038; 1 &#038; 1 \\<br />
	0 &#038;0 &#038;0\\<br />
	0&#038;0&#038;0<br />
	\end{bmatrix}.<br />
	\end{align*}<br />
	Hence a solution must satisfy $x_1=-x_2-x_3$, and thus the eigenspace is<br />
	\[ E_3=\left\{\, \mathbf{x} \in \R^3 \quad \middle| \quad \mathbf{x}=x_2\begin{bmatrix}<br />
		-1\\<br />
		1\\<br />
		0<br />
		\end{bmatrix}+x_3\begin{bmatrix}<br />
		-1\\<br />
		0\\<br />
		1<br />
		\end{bmatrix} \text{ for any } x_2, x_3 \in \R \,\right\}.\]
		From this expression, it is straightforward to check that the set<br />
		\[\left\{\begin{bmatrix}<br />
		-1\\<br />
		1\\<br />
		0<br />
		\end{bmatrix}, \begin{bmatrix}<br />
		-1\\<br />
		0\\<br />
		1<br />
		\end{bmatrix}  \,\right\}\]
		is a basis of $E_3$.<br />
		(Hence the geometric multiplicity of $\lambda=3$ is $2$, so $A$ is not defective and is diagonalizable.)</p>
<hr />
<p>		For $\lambda=0$, we solve $(A-0I)\mathbf{x}=\mathbf{0}$, thus $A\mathbf{x}=\mathbf{0}$.<br />
		We have<br />
		\begin{align*}<br />
		A&#038;=\begin{bmatrix}<br />
		2 &#038; -1 &#038; -1 \\<br />
		-1 &#038;2 &#038;-1 \\<br />
		-1 &#038; -1 &#038; 2<br />
		\end{bmatrix}<br />
		\xrightarrow{R_1 \leftrightarrow R_2}<br />
		\begin{bmatrix}<br />
		-1 &#038;2 &#038;-1 \\<br />
		2 &#038; -1 &#038; -1 \\<br />
		-1 &#038; -1 &#038; 2<br />
		\end{bmatrix}<br />
		\xrightarrow{-R_1}<br />
		\begin{bmatrix}<br />
		1 &#038; -2 &#038; 1 \\<br />
		2 &#038; -1 &#038; -1 \\<br />
		-1 &#038; -1 &#038; 2<br />
		\end{bmatrix}\\<br />
		&#038;\xrightarrow{\substack{R_2-2R_1\\ R_3+R_1}}<br />
		\begin{bmatrix}<br />
		1 &#038; -2 &#038; 1 \\<br />
		0 &#038; 3 &#038; -3 \\<br />
		0 &#038; -3 &#038; 3<br />
		\end{bmatrix}<br />
		\xrightarrow{R_3+R_2}<br />
			\begin{bmatrix}<br />
			1 &#038; -2 &#038; 1 \\<br />
			0 &#038; 3 &#038; -3 \\<br />
			0 &#038; 0 &#038; 0<br />
			\end{bmatrix}<br />
			\xrightarrow{\frac{1}{3}R_2}<br />
				\begin{bmatrix}<br />
				1 &#038; -2 &#038; 1 \\<br />
				0 &#038; 1 &#038; -1 \\<br />
				0 &#038; 0 &#038; 0<br />
				\end{bmatrix}\\<br />
				&#038;\xrightarrow{R_1+2R_2}<br />
					\begin{bmatrix}<br />
					1 &#038; 0 &#038; -1 \\<br />
					0 &#038; 1 &#038; -1 \\<br />
					0 &#038; 0 &#038; 0<br />
					\end{bmatrix}.<br />
		\end{align*}<br />
		Hence any solution satisfies<br />
		\[x_1=x_3 \text{ and } x_2=x_3.\]
		Therefore, the eigenspace is<br />
		\[E_0=\left\{\, \mathbf{x} \in \R^3 \quad \middle| \quad \mathbf{x}=x_3\begin{bmatrix}<br />
		1\\<br />
		1\\<br />
		1<br />
		\end{bmatrix} \text{ for any } x_3 \in \R \,\right\},\]
		and a basis of $E_0$ is<br />
		\[\left\{\, \begin{bmatrix}<br />
		1\\<br />
		1\\<br />
		1<br />
		\end{bmatrix} \,\right\}.\]
<hr />
<p>		Let<br />
		\[\mathbf{u}_1=\begin{bmatrix}<br />
		-1\\<br />
		1\\<br />
		0<br />
		\end{bmatrix}, \mathbf{u}_2=\begin{bmatrix}<br />
		-1\\<br />
		0\\<br />
		1<br />
		\end{bmatrix}, \mathbf{u}_3=\begin{bmatrix}<br />
		1\\<br />
		1\\<br />
		1<br />
		\end{bmatrix}.\]
		Note that $\{\mathbf{u}_1, \mathbf{u}_2\}$ is a basis of $E_3$ and $\{\mathbf{u}_3\}$ is a basis of $E_0$.<br />
		Thus, it follows that the vectors $\mathbf{u}_1, \mathbf{u}_2, \mathbf{u}_3$ are linearly independent eigenvectors.<br />
		Put<br />
		\[S=[\mathbf{u}_1, \mathbf{u}_2, \mathbf{u}_3]= \begin{bmatrix}<br />
		-1 &#038; -1 &#038; 1\\<br />
		1&#038; 0&#038; 1\\<br />
		0&#038; 1 &#038; 1<br />
		\end{bmatrix}.\]
		Since the columns of $S$ are linearly independent, the matrix $S$ is nonsingular. Then the diagonalization procedure yields<br />
		\[S^{-1}AS=\begin{bmatrix}<br />
		\mathbf{3} &#038; 0 &#038; 0\\<br />
		0&#038; \mathbf{3}&#038; 0\\<br />
		0 &#038; 0&#038; \mathbf{0}<br />
		\end{bmatrix},\]
		where the diagonal entries are the eigenvalues corresponding to the eigenvectors $\mathbf{u}_1, \mathbf{u}_2, \mathbf{u}_3$, in this order.</p>
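The diagonalization obtained in Solution 1 can be verified numerically; a minimal sketch, assuming NumPy is available:

```python
import numpy as np

# Verify that S^{-1} A S equals the diagonal matrix diag(3, 3, 0).
A = np.array([[ 2.0, -1.0, -1.0],
              [-1.0,  2.0, -1.0],
              [-1.0, -1.0,  2.0]])
S = np.array([[-1.0, -1.0, 1.0],
              [ 1.0,  0.0, 1.0],
              [ 0.0,  1.0, 1.0]])
D = np.linalg.inv(S) @ A @ S
```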
<p>		&nbsp;&nbsp;</p>
<h2>Solution 2.</h2>
<p>		The second solution uses a different method to find eigenvalues and eigenvectors.<br />
		Let $B=A-3I$. Then every entry of $B$ is $-1$.<br />
		We find eigenvalues $\lambda$ and eigenvectors of $B$. Then the eigenvalues of $A$ are $\lambda+3$ and eigenvectors are the same.</p>
<p>		First we reduce the matrix $B$ as follows:<br />
		\begin{align*}<br />
		B&#038;=\begin{bmatrix}<br />
	-1&#038;-1&#038;-1\\<br />
	-1&#038;-1&#038;-1\\<br />
	-1&#038;-1&#038;-1<br />
		\end{bmatrix}<br />
		\xrightarrow{\substack{R_2-R_1\\ R_3-R_1}}<br />
		\begin{bmatrix}<br />
		-1&#038;-1&#038;-1\\<br />
		0&#038;0&#038;0\\<br />
		0&#038;0&#038;0<br />
		\end{bmatrix}<br />
		\xrightarrow{-R_1}<br />
		\begin{bmatrix}<br />
	1&#038;1&#038;1\\<br />
		0&#038;0&#038;0\\<br />
		0&#038;0&#038;0<br />
		\end{bmatrix}.<br />
		\end{align*}<br />
		Hence the rank of $B$ is $1$, and the nullity of $B$ is $2$  by the rank-nullity theorem. It follows that $\lambda=0$ is an eigenvalue of $B$.<br />
		Note that the eigenspace $E_0$ corresponding to $\lambda=0$ is the null space of $B$. From the reduction above, we see that the null space consists of the vectors<br />
		\[\mathbf{x}=x_2\begin{bmatrix}<br />
		-1\\1\\0<br />
		\end{bmatrix}<br />
		+x_3\begin{bmatrix}<br />
		-1\\0\\1<br />
		\end{bmatrix}\]
		for any complex numbers $x_2, x_3$.<br />
		It follows that<br />
		\[\begin{bmatrix}<br />
		-1\\1\\0<br />
		\end{bmatrix},<br />
		\begin{bmatrix}<br />
		-1\\0\\1<br />
		\end{bmatrix}\]
		are basis vectors of eigenspace $E_0$. Hence the geometric multiplicity of $\lambda=0$ is $2$.<br />
	(The algebraic multiplicity of $\lambda=0$ is either $2$ or $3$. We will see that it must be $2$.)</p>
<hr />
<p>		Since all the entries of $B$ are $-1$, by inspection, we find that the vector<br />
		\[\begin{bmatrix}<br />
		1\\1\\1<br />
		\end{bmatrix}\]
		is an eigenvector corresponding to the eigenvalue $-3$.<br />
		In fact, we have<br />
		\begin{align*}<br />
		B\begin{bmatrix}<br />
		1\\1\\1<br />
		\end{bmatrix}<br />
		&#038;=\begin{bmatrix}<br />
		-1&#038;-1&#038;-1\\<br />
		-1&#038;-1&#038;-1\\<br />
		-1&#038;-1&#038;-1<br />
		\end{bmatrix}<br />
		\begin{bmatrix}<br />
		1\\1\\1<br />
		\end{bmatrix}<br />
		=\begin{bmatrix}<br />
		-3\\-3\\-3<br />
		\end{bmatrix}<br />
		=-3\begin{bmatrix}<br />
		1\\1\\1<br />
		\end{bmatrix}.<br />
		\end{align*}</p>
<p>		Since the algebraic multiplicity of $\lambda=0$ is either $2$ or $3$, and the sum of all the algebraic multiplicities is equal to $3$, the algebraic multiplicity of $\lambda=-3$ must be $1$ and that of $\lambda=0$ is $2$.<br />
		Hence the geometric multiplicity of $\lambda=-3$ is $1$. Thus<br />
		\[\begin{bmatrix}<br />
		1\\1\\1<br />
		\end{bmatrix}\]
		is a basis vector of $E_{-3}$.</p>
<hr />
<p>		In a nutshell, we have obtained that eigenvalues of $B$ are $0$ and $-3$ and basis vectors of $E_0$ and $E_{-3}$ are<br />
		\[\begin{bmatrix}<br />
		-1\\1\\0<br />
		\end{bmatrix},<br />
		\begin{bmatrix}<br />
		-1\\0\\1<br />
		\end{bmatrix} \text{ and } \begin{bmatrix}<br />
		1\\1\\1<br />
		\end{bmatrix},\]
		respectively.</p>
<p>		Since $A=B+3I$, the eigenvalues of $A$ are $0+3=3$ and $-3+3=0$ and the corresponding eigenvectors are the same. Thus<br />
			\[\begin{bmatrix}<br />
			-1\\1\\0<br />
			\end{bmatrix},<br />
			\begin{bmatrix}<br />
			-1\\0\\1<br />
			\end{bmatrix} \text{ and } \begin{bmatrix}<br />
			1\\1\\1<br />
			\end{bmatrix}\]
			are basis vectors of the eigenspaces $E_3$ and $E_0$ of $A$, respectively.</p>
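The shift argument of Solution 2, that $A=B+3I$ has the eigenvalues of $B$ shifted by $3$ with the same eigenvectors, can be checked numerically; a minimal sketch, assuming NumPy is available:

```python
import numpy as np

# A = B + 3I, so eig(A) = eig(B) + 3, with identical eigenvectors.
A = np.array([[ 2.0, -1.0, -1.0],
              [-1.0,  2.0, -1.0],
              [-1.0, -1.0,  2.0]])
B = A - 3 * np.eye(3)                    # every entry of B is -1
eig_B = np.sort(np.linalg.eigvalsh(B))   # symmetric, so eigvalsh applies
eig_A = np.sort(np.linalg.eigvalsh(A))
```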
<hr />
<p>As in Solution 1, we put<br />
			\[S=\begin{bmatrix}<br />
			-1 &#038; -1 &#038; 1\\<br />
			1&#038; 0&#038; 1\\<br />
			0&#038; 1 &#038; 1<br />
			\end{bmatrix}.\]
			Then we have<br />
			\[S^{-1}AS=\begin{bmatrix}<br />
			\mathbf{3} &#038; 0 &#038; 0\\<br />
			0&#038; \mathbf{3}&#038; 0\\<br />
			0 &#038; 0&#038; \mathbf{0}<br />
			\end{bmatrix},\]
			where the diagonal entries are the eigenvalues corresponding to the eigenvectors $\mathbf{u}_1, \mathbf{u}_2, \mathbf{u}_3$, in this order.</p>
<h2>Comment.</h2>
<p>This is the first problem of Quiz 13 (Take Home Quiz) for Math 2568 (Introduction to Linear Algebra) at OSU in Spring 2017.</p>
<h3>List of Quiz Problems of Linear Algebra (Math 2568) at OSU in Spring 2017</h3>
<p>There were 13 weekly quizzes. Here is the list of links to the quiz problems and solutions.</p>
<ul>
<li><a href="//yutsumura.com/quiz-1-gauss-jordan-elimination-homogeneous-system-math-2568-spring-2017/" target="_blank">Quiz 1. Gauss-Jordan elimination / homogeneous system. </a></li>
<li><a href="//yutsumura.com/quiz-2-the-vector-form-for-the-general-solution-transpose-matrices-math-2568-spring-2017/" target="_blank">Quiz 2. The vector form for the general solution / Transpose matrices. </a></li>
<li><a href="//yutsumura.com/quiz-3-condition-that-vectors-are-linearly-dependent-orthogonal-vectors-are-linearly-independent/" target="_blank">Quiz 3. Condition that vectors are linearly dependent/ orthogonal vectors are linearly independent</a></li>
<li><a href="//yutsumura.com/quiz-4-inverse-matrix-nonsingular-matrix-satisfying-a-relation/" target="_blank">Quiz 4. Inverse matrix/ Nonsingular matrix satisfying a relation</a></li>
<li><a href="//yutsumura.com/quiz-5-example-and-non-example-of-subspaces-in-3-dimensional-space/" target="_blank">Quiz 5. Example and non-example of subspaces in 3-dimensional space</a></li>
<li><a href="//yutsumura.com/quiz-6-determine-vectors-in-null-space-range-find-a-basis-of-null-space/" target="_blank">Quiz 6. Determine vectors in null space, range / Find a basis of null space</a></li>
<li><a href="//yutsumura.com/quiz-7-find-a-basis-of-the-range-rank-and-nullity-of-a-matrix/" target="_blank">Quiz 7. Find a basis of the range, rank, and nullity of a matrix</a></li>
<li><a href="//yutsumura.com/quiz-8-determine-subsets-are-subspaces-functions-taking-integer-values-set-of-skew-symmetric-matrices/" target="_blank">Quiz 8. Determine subsets are subspaces: functions taking integer values / set of skew-symmetric matrices</a></li>
<li><a href="//yutsumura.com/quiz-9-find-a-basis-of-the-subspace-spanned-by-four-matrices/" target="_blank">Quiz 9. Find a basis of the subspace spanned by four matrices</a></li>
<li><a href="//yutsumura.com/quiz-10-find-orthogonal-basis-find-value-of-linear-transformation/" target="_blank">Quiz 10. Find orthogonal basis / Find value of linear transformation</a></li>
<li><a href="//yutsumura.com/quiz-11-find-eigenvalues-and-eigenvectors-properties-of-determinants/" target="_blank">Quiz 11. Find eigenvalues and eigenvectors/ Properties of determinants</a></li>
<li><a href="//yutsumura.com/quiz-12-find-eigenvalues-and-their-algebraic-and-geometric-multiplicities/" target="_blank">Quiz 12. Find eigenvalues and their algebraic and geometric multiplicities</a></li>
<li><a href="//yutsumura.com/quiz-13-part-1-diagonalize-a-matrix/" target="_blank">Quiz 13 (Part 1). Diagonalize a matrix.</a></li>
<li><a href="//yutsumura.com/quiz-13-part-2-find-eigenvalues-and-eigenvectors-of-a-special-matrix/" target="_blank">Quiz 13 (Part 2). Find eigenvalues and eigenvectors of a special matrix</a></li>
</ul>
<button class="simplefavorite-button has-count" data-postid="2718" data-siteid="1" data-groupid="1" data-favoritecount="66" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">66</span></button><p>The post <a href="https://yutsumura.com/quiz-13-part-1-diagonalize-a-matrix/" target="_blank">Quiz 13 (Part 1) Diagonalize a Matrix</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/quiz-13-part-1-diagonalize-a-matrix/feed/</wfw:commentRss>
		<slash:comments>9</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">2718</post-id>	</item>
		<item>
		<title>How to Diagonalize a Matrix. Step by Step Explanation.</title>
		<link>https://yutsumura.com/how-to-diagonalize-a-matrix-step-by-step-explanation/</link>
				<comments>https://yutsumura.com/how-to-diagonalize-a-matrix-step-by-step-explanation/#comments</comments>
				<pubDate>Tue, 06 Dec 2016 04:59:26 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[algebraic multiplicity]]></category>
		<category><![CDATA[characteristic polynomial]]></category>
		<category><![CDATA[defective matrix]]></category>
		<category><![CDATA[diagonal matrix]]></category>
		<category><![CDATA[diagonalizable]]></category>
		<category><![CDATA[diagonalizable matrix]]></category>
		<category><![CDATA[diagonalization]]></category>
		<category><![CDATA[eigenspace]]></category>
		<category><![CDATA[eigenvalue]]></category>
		<category><![CDATA[eigenvector]]></category>
		<category><![CDATA[geometric multiplicity]]></category>
		<category><![CDATA[inverse matrix]]></category>
		<category><![CDATA[kernel of a matrix]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[linearly independent]]></category>
		<category><![CDATA[nonsingular matrix]]></category>
		<category><![CDATA[null space]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=1515</guid>
				<description><![CDATA[<p>In this post, we explain how to diagonalize a matrix if it is diagonalizable. As an example, we solve the following problem. Diagonalize the matrix \[A=\begin{bmatrix} 4 &#38; -3 &#38; -3 \\ 3 &#38;-2&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/how-to-diagonalize-a-matrix-step-by-step-explanation/" target="_blank">How to Diagonalize a Matrix. Step by Step Explanation.</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2>Problem 211</h2>
<p>In this post, we explain how to diagonalize a matrix if it is diagonalizable.</p>
<p>As an example, we solve the following problem.</p>
<p>Diagonalize the matrix<br />
\[A=\begin{bmatrix}<br />
4 &amp; -3 &amp; -3 \\<br />
3 &amp;-2 &amp;-3 \\<br />
-1 &amp; 1 &amp; 2<br />
\end{bmatrix}\]
by finding a nonsingular matrix $S$ and a diagonal matrix $D$ such that $S^{-1}AS=D$.</p>
<p>(Update 10/15/2017. A new example problem was added.)<br />
<span id="more-1515"></span><br />

Here we explain how to diagonalize a matrix. We only describe the procedure of diagonalization, and no justification will be given.<br />
The process can be summarized as follows. A concrete example is provided below, and several exercise problems are presented at the end of the post. </p>
<h2>Diagonalization Procedure</h2>
<p>Let $A$ be the $n\times n$ matrix that you want to diagonalize (if possible).</p>
<ol>
<li>Find the characteristic polynomial $p(t)$ of $A$.</li>
<li>Find eigenvalues $\lambda$ of the matrix $A$ and their algebraic multiplicities from the characteristic polynomial $p(t)$.</li>
<li>For each eigenvalue $\lambda$ of $A$, find a basis of the eigenspace $E_{\lambda}$.<br />
If there is an eigenvalue $\lambda$ such that the geometric multiplicity of $\lambda$, $\dim(E_{\lambda})$, is less than the algebraic multiplicity of $\lambda$, then the matrix $A$ is not diagonalizable. Otherwise, $A$ is diagonalizable; proceed to the next step.</li>
<li>Combining the basis vectors of all the eigenspaces, we obtain $n$ linearly independent eigenvectors $\mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_n$.</li>
<li>Define the nonsingular matrix<br />
\[S=[\mathbf{v}_1 \mathbf{v}_2 \dots \mathbf{v}_n] .\]</li>
<li>Define the diagonal matrix $D$, whose $(i,i)$-entry is the eigenvalue $\lambda$ such that the $i$-th column vector $\mathbf{v}_i$ is in the eigenspace $E_{\lambda}$.</li>
<li>Then the matrix $A$ is diagonalized as \[ S^{-1}AS=D.\]</li>
</ol>
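The steps above can be sketched as a single function. This is a numerical sketch, assuming NumPy is available; it uses NumPy's `eig` in place of the hand computation of Steps 1 to 3, and a rank test on the eigenvector matrix as the diagonalizability criterion.

```python
import numpy as np

def diagonalize(A, tol=1e-9):
    # np.linalg.eig returns the eigenvalues and one eigenvector per column of S.
    eigenvalues, S = np.linalg.eig(A)
    n = A.shape[0]
    # Step 3's criterion: n linearly independent eigenvectors <=> S has full rank.
    if np.linalg.matrix_rank(S, tol=tol) < n:
        raise ValueError("matrix is defective (not diagonalizable)")
    D = np.diag(eigenvalues)   # Steps 5-6: S and the matching diagonal D
    return S, D

# The example matrix from the post.
A = np.array([[ 4.0, -3.0, -3.0],
              [ 3.0, -2.0, -3.0],
              [-1.0,  1.0,  2.0]])
S, D = diagonalize(A)
recon = S @ D @ np.linalg.inv(S)   # should reproduce A, since S^{-1} A S = D
```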
<h2>Example of a matrix diagonalization</h2>
<p>Now let us examine these steps with an example.<br />
Let us consider the following $3\times 3$ matrix.<br />
\[A=\begin{bmatrix}<br />
4 &amp; -3 &amp; -3 \\<br />
3 &amp;-2 &amp;-3 \\<br />
-1 &amp; 1 &amp; 2<br />
\end{bmatrix}.\]
We want to diagonalize the matrix if possible.</p>
<h3>Step 1: Find the characteristic polynomial</h3>
<p>The characteristic polynomial $p(t)$ of $A$ is<br />
\[p(t)=\det(A-tI)=\begin{vmatrix}<br />
4-t &amp; -3 &amp; -3 \\<br />
3 &amp;-2-t &amp;-3 \\<br />
-1 &amp; 1 &amp; 2-t<br />
\end{vmatrix}.\]
Using the cofactor expansion, we get<br />
\[p(t)=-(t-1)^2(t-2).\]
<h3>Step 2: Find the eigenvalues</h3>
<p>From the characteristic polynomial obtained in Step 1, we see that eigenvalues are<br />
\[\lambda=1 \text{ with algebraic multiplicity } 2\]
and<br />
\[\lambda=2 \text{ with algebraic multiplicity } 1.\]
<h3>Step 3: Find the eigenspaces</h3>
<p>Let us first find the eigenspace $E_1$ corresponding to the eigenvalue $\lambda=1$.<br />
By definition, $E_1$ is the null space of the matrix<br />
\[A-I=\begin{bmatrix}<br />
3 &amp; -3 &amp; -3 \\<br />
3 &amp;-3 &amp;-3 \\<br />
-1 &amp; 1 &amp; 1<br />
\end{bmatrix}<br />
\rightarrow<br />
\begin{bmatrix}<br />
1 &amp; -1 &amp; -1 \\<br />
0 &amp;0 &amp;0 \\<br />
0 &amp; 0 &amp; 0<br />
\end{bmatrix}\]
by elementary row operations.<br />
Hence if $(A-I)\mathbf{x}=\mathbf{0}$ for $\mathbf{x}\in \R^3$, we have<br />
\[x_1=x_2+x_3.\]
Therefore, we have<br />
\begin{align*}<br />
E_1=\calN(A-I)=\left \{\quad \mathbf{x}\in \R^3 \quad \middle| \quad \mathbf{x}=x_2\begin{bmatrix}<br />
1 \\<br />
1 \\<br />
0<br />
\end{bmatrix}+x_3\begin{bmatrix}<br />
1 \\<br />
0 \\<br />
1<br />
\end{bmatrix} \quad \right \}.<br />
\end{align*}<br />
From this, we see that the set<br />
\[\left\{\quad\begin{bmatrix}<br />
1 \\<br />
1 \\<br />
0<br />
\end{bmatrix},\quad \begin{bmatrix}<br />
1 \\<br />
0 \\<br />
1<br />
\end{bmatrix}\quad \right\}\]
is a basis for the eigenspace $E_1$.<br />
Thus, the dimension of $E_1$, which is the geometric multiplicity of $\lambda=1$, is $2$.</p>
<p>Similarly, we find a basis of the eigenspace $E_2=\calN(A-2I)$ for the eigenvalue $\lambda=2$.<br />
We have<br />
\begin{align*}<br />
A-2I=\begin{bmatrix}<br />
2 &amp; -3 &amp; -3 \\<br />
3 &amp;-4 &amp;-3 \\<br />
-1 &amp; 1 &amp; 0<br />
\end{bmatrix}<br />
\rightarrow \cdots \rightarrow \begin{bmatrix}<br />
1 &amp; 0 &amp; 3 \\<br />
0 &amp;1 &amp;3 \\<br />
0 &amp; 0 &amp; 0<br />
\end{bmatrix}<br />
\end{align*}<br />
by elementary row operations.<br />
If $(A-2I)\mathbf{x}=\mathbf{0}$ for $\mathbf{x}\in \R^3$, then we have<br />
\[x_1=-3x_3 \text{ and } x_2=-3x_3.\]
Therefore we obtain<br />
\begin{align*}<br />
E_2=\calN(A-2I)=\left \{\quad \mathbf{x}\in \R^3 \quad \middle| \quad \mathbf{x}=x_3\begin{bmatrix}<br />
-3 \\<br />
-3 \\<br />
1<br />
\end{bmatrix} \quad \right \}.<br />
\end{align*}<br />
From this we see that the set<br />
\[\left \{ \quad \begin{bmatrix}<br />
-3 \\<br />
-3 \\<br />
1<br />
\end{bmatrix} \quad \right \}\]
is a basis for the eigenspace $E_2$ and the geometric multiplicity is $1$.</p>
<p>Since for both eigenvalues, the geometric multiplicity is equal to the algebraic multiplicity, the matrix $A$ is not defective, and hence diagonalizable.</p>
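The multiplicity comparison can be checked numerically via $\dim E_{\lambda}=n-\operatorname{rank}(A-\lambda I)$; a minimal sketch, assuming NumPy is available:

```python
import numpy as np

# Geometric multiplicities of the eigenvalues 1 and 2.
A = np.array([[ 4.0, -3.0, -3.0],
              [ 3.0, -2.0, -3.0],
              [-1.0,  1.0,  2.0]])
geo_mult_1 = 3 - np.linalg.matrix_rank(A - 1 * np.eye(3))  # eigenvalue 1
geo_mult_2 = 3 - np.linalg.matrix_rank(A - 2 * np.eye(3))  # eigenvalue 2
```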
<h3>Step 4: Determine linearly independent eigenvectors</h3>
<p>From Step 3, the vectors<br />
\[\mathbf{v}_1=\begin{bmatrix}<br />
1 \\<br />
1 \\<br />
0<br />
\end{bmatrix}, \mathbf{v}_2=\begin{bmatrix}<br />
1 \\<br />
0 \\<br />
1<br />
\end{bmatrix}, \mathbf{v}_3=\begin{bmatrix}<br />
-3 \\<br />
-3 \\<br />
1<br />
\end{bmatrix} \]
are linearly independent eigenvectors of $A$. (Indeed, $\mathbf{v}_1$ and $\mathbf{v}_2$ are linearly independent since they form a basis of $E_1$, and eigenvectors corresponding to distinct eigenvalues are linearly independent.)</p>
<h3>Step 5: Define the invertible matrix $S$</h3>
<p>Define the matrix $S=[\mathbf{v}_1 \mathbf{v}_2 \mathbf{v}_3]$ whose columns are these eigenvectors. That is, we have<br />
\[S=\begin{bmatrix}<br />
1 &amp; 1 &amp; -3 \\<br />
1 &amp;0 &amp;-3 \\<br />
0 &amp; 1 &amp; 1<br />
\end{bmatrix}\]
and the matrix $S$ is nonsingular (since the column vectors are linearly independent).</p>
<h3>Step 6: Define the diagonal matrix $D$</h3>
<p>Define the diagonal matrix<br />
\[D=\begin{bmatrix}<br />
1 &amp; 0 &amp; 0 \\<br />
0 &amp;1 &amp;0 \\<br />
0 &amp; 0 &amp; 2<br />
\end{bmatrix}.\]
Note that the $(1,1)$-entry of $D$ is $1$ because the first column vector $\mathbf{v}_1=\begin{bmatrix}<br />
1 \\<br />
1 \\<br />
0<br />
\end{bmatrix}$ of $S$ is in the eigenspace $E_1$, that is, $\mathbf{v}_1$ is an eigenvector corresponding to eigenvalue $\lambda=1$.<br />
Similarly, the $(2,2)$-entry of $D$ is $1$ because the second column $\mathbf{v}_2=\begin{bmatrix}<br />
1 \\<br />
0 \\<br />
1<br />
\end{bmatrix}$ of $S$ is in $E_1$.<br />
The $(3,3)$-entry of $D$ is $2$ because the third column vector $\mathbf{v}_3=\begin{bmatrix}<br />
-3 \\<br />
-3 \\<br />
1<br />
\end{bmatrix}$ of $S$ is in $E_2$.</p>
<p>(The order in which you arrange the vectors $\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3$ to form $S$ does not matter, but once $S$ is fixed, the order of the diagonal entries of $D$ is determined by the order of the eigenvectors in $S$.)</p>
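<p>This ordering behavior can be checked directly. The following SymPy sketch (illustrative only, with $A$ recovered from the matrix $A-2I$ displayed in Step 3) shows that permuting the columns of $S$ permutes the diagonal entries of $D$ in the same way.</p>

```python
from sympy import Matrix, diag

# A recovered from the displayed matrix A - 2I (illustrative, not part of the solution).
A = Matrix([[4, -3, -3],
            [3, -2, -3],
            [-1, 1, 2]])

v1 = Matrix([1, 1, 0])    # eigenvector for lambda = 1
v2 = Matrix([1, 0, 1])    # eigenvector for lambda = 1
v3 = Matrix([-3, -3, 1])  # eigenvector for lambda = 2

# Columns in the order v1, v2, v3 pair with diagonal entries 1, 1, 2.
S = Matrix.hstack(v1, v2, v3)
print(S.inv() * A * S == diag(1, 1, 2))  # True

# Reordering the columns permutes the diagonal entries accordingly.
S2 = Matrix.hstack(v3, v1, v2)
print(S2.inv() * A * S2 == diag(2, 1, 1))  # True
```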
<h3>Step 7: Finish the diagonalization</h3>
<p>Finally, we can diagonalize the matrix $A$ as<br />
\[S^{-1}AS=D,\]
where<br />
\[S=\begin{bmatrix}<br />
1 &amp; 1 &amp; -3 \\<br />
1 &amp;0 &amp;-3 \\<br />
0 &amp; 1 &amp; 1<br />
\end{bmatrix} \text{ and } D=\begin{bmatrix}<br />
1 &amp; 0 &amp; 0 \\<br />
0 &amp;1 &amp;0 \\<br />
0 &amp; 0 &amp; 2<br />
\end{bmatrix}.\]
(Here you don&#8217;t have to find the inverse matrix $S^{-1}$ unless you are asked to do so.)</p>
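<p>As a final numerical sanity check (not part of the solution itself), the identity $S^{-1}AS=D$ can be verified with NumPy, reconstructing $A$ from the matrix $A-2I$ shown in Step 3.</p>

```python
import numpy as np

# A recovered from the displayed matrix A - 2I (illustrative check only).
A = np.array([[4., -3., -3.],
              [3., -2., -3.],
              [-1., 1., 2.]])
S = np.array([[1., 1., -3.],
              [1., 0., -3.],
              [0., 1., 1.]])
D = np.diag([1., 1., 2.])

# S is nonsingular, so S^{-1} A S should equal D (up to floating-point error).
result = np.linalg.inv(S) @ A @ S
print(np.allclose(result, D))  # True
```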
<h2>Diagonalization Problems and Examples</h2>
<p>Check out the following problems about the diagonalization of a matrix to see if you understand the procedure.</p>
<div style="padding: 16px; border: none 3px #4169e1; border-radius: 10px; background-color: #f0f8ff; margin-top: 30px; margin-bottom: 30px;">
<strong>Problem</strong>. Diagonalize \[A=\begin{bmatrix}<br />
	  1 &#038; 2\\<br />
	  4&#038; 3<br />
	\end{bmatrix}\]
and compute $A^{100}$.
</div>
<p>For a solution of this problem and related questions, see the post &#8220;<a href="//yutsumura.com/diagonalize-a-2-by-2-matrix-a-and-calculate-the-power-a100/" target="_blank">Diagonalize a 2 by 2 Matrix $A$ and Calculate the Power $A^{100}$</a>&#8220;.</p>
<div style="padding: 16px; border: none 3px #4169e1; border-radius: 10px; background-color: #f0f8ff; margin-top: 30px; margin-bottom: 30px;">
<strong>Problem</strong>. Determine whether the matrix<br />
	\[A=\begin{bmatrix}<br />
	  0 &#038; 1 &#038; 0 \\<br />
	   -1 &#038;0 &#038;0 \\<br />
	   0 &#038; 0 &#038; 2<br />
	\end{bmatrix}\]
	is diagonalizable. If it is diagonalizable, then find the invertible matrix $S$ and a diagonal matrix $D$ such that $S^{-1}AS=D$.</div>
<p>For a solution, check out the post &#8220;<a href="//yutsumura.com/diagonalize-the-3-by-3-matrix-if-it-is-diagonalizable/" target="_blank">Diagonalize the 3 by 3 Matrix if it is Diagonalizable</a>&#8220;.</p>
<div style="padding: 16px; border: none 3px #4169e1; border-radius: 10px; background-color: #f0f8ff; margin-top: 30px; margin-bottom: 30px;">
<strong>Problem</strong>. Let<br />
	\[A=\begin{bmatrix}<br />
	  2 &#038; -1 &#038; -1 \\<br />
	   -1 &#038;2 &#038;-1 \\<br />
	   -1 &#038; -1 &#038; 2<br />
	\end{bmatrix}.\]
	Determine whether the matrix $A$ is diagonalizable. If it is diagonalizable, then diagonalize $A$.
</div>
<p>For a solution, see the post &#8220;<a href="//yutsumura.com/quiz-13-part-1-diagonalize-a-matrix/" target="_blank">Quiz 13 (Part 1) Diagonalize a matrix.</a>&#8220;.</p>
<div style="padding: 16px; border: none 3px #4169e1; border-radius: 10px; background-color: #f0f8ff; margin-top: 30px; margin-bottom: 30px;">
<strong>Problem</strong>. Diagonalize the matrix<br />
	\[A=\begin{bmatrix}<br />
	  1 &#038; 1 &#038; 1 \\<br />
	   1 &#038;1 &#038;1 \\<br />
	   1 &#038; 1 &#038; 1<br />
	\end{bmatrix}.\]
</div>
<p>In the solution given in the post &#8220;<a href="//yutsumura.com/diagonalize-the-3-by-3-matrix-whose-entries-are-all-one/" target="_blank">Diagonalize the 3 by 3 Matrix Whose Entries are All One</a>&#8220;, we use an indirect method to find eigenvalues and eigenvectors.</p>
<p>The next problem asks you to diagonalize a matrix with variable entries.</p>
<div style="padding: 16px; border: none 3px #4169e1; border-radius: 10px; background-color: #f0f8ff; margin-top: 30px; margin-bottom: 30px;">
<strong>Problem</strong>.<br />
Diagonalize the complex matrix<br />
\[A=\begin{bmatrix}<br />
  a &#038; b-a\\<br />
  0&#038; b<br />
		\end{bmatrix}.\]
Using the result of the diagonalization, compute $A^k$ for each $k\in \N$.
</div>
<p>The solution is given in the post &#8628;<br />
<a href="//yutsumura.com/diagonalize-the-upper-triangular-matrix-and-find-the-power-of-the-matrix/" rel="noopener" target="_blank">Diagonalize the Upper Triangular Matrix and Find the Power of the Matrix</a></p>
<h3>A Hermitian matrix can be diagonalized by a unitary matrix</h3>
<div style="padding: 16px; border: none 3px #4169e1; border-radius: 10px; background-color: #f0f8ff; margin-top: 30px; margin-bottom: 30px;">
<strong>Theorem</strong>. If $A$ is a Hermitian matrix, then $A$ can be diagonalized by a unitary matrix $U$.
</div>
<p>This means that there exists a unitary matrix $U$ such that $U^{-1}AU$ is a diagonal matrix.</p>
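<p>A quick numerical illustration of the theorem (a sketch, not the posted solution): <code>numpy.linalg.eigh</code> returns orthonormal eigenvectors of a Hermitian matrix, so stacking them as columns yields a unitary $U$. Here we use the Hermitian matrix from the problem below.</p>

```python
import numpy as np

# The Hermitian matrix A from the problem below.
A = np.array([[1, 1j],
              [-1j, 1]])

# eigh is for Hermitian matrices; it returns real eigenvalues (ascending)
# and an orthonormal set of eigenvectors stacked as the columns of U.
w, U = np.linalg.eigh(A)

# U is unitary: U^H U = I.
print(np.allclose(U.conj().T @ U, np.eye(2)))       # True
# U^{-1} A U = U^H A U is the diagonal matrix of eigenvalues.
print(np.allclose(U.conj().T @ A @ U, np.diag(w)))  # True
```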
<div style="padding: 16px; border: none 3px #4169e1; border-radius: 10px; background-color: #f0f8ff; margin-top: 30px; margin-bottom: 30px;">
<strong>Problem</strong>.<br />
Diagonalize the Hermitian matrix<br />
\[A=\begin{bmatrix}<br />
  1 &#038; i\\<br />
  -i&#038; 1<br />
		\end{bmatrix}\]
by a unitary matrix.
</div>
<p>The solution is given in the post &#8628;<br />
<a href="//yutsumura.com/diagonalize-the-2times-2-hermitian-matrix-by-a-unitary-matrix/" rel="noopener" target="_blank">Diagonalize the $2\times 2$ Hermitian Matrix by a Unitary Matrix</a></p>
<h3> More diagonalization problems </h3>
<p>More problems related to the diagonalization of a matrix are gathered on the following page:</p>
<p><a href="//yutsumura.com/linear-algebra/diagonalization-of-matrices/" rel="noopener" target="_blank">Diagonalization of Matrices</a></p>
<p>The post <a href="https://yutsumura.com/how-to-diagonalize-a-matrix-step-by-step-explanation/" target="_blank">How to Diagonalize a Matrix. Step by Step Explanation.</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/how-to-diagonalize-a-matrix-step-by-step-explanation/feed/</wfw:commentRss>
		<slash:comments>12</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">1515</post-id>	</item>
	</channel>
</rss>
