<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	
	xmlns:georss="http://www.georss.org/georss"
	xmlns:geo="http://www.w3.org/2003/01/geo/wgs84_pos#"
	>

<channel>
	<title>orthogonal matrix &#8211; Problems in Mathematics</title>
	<atom:link href="https://yutsumura.com/tag/orthogonal-matrix/feed/" rel="self" type="application/rss+xml" />
	<link>https://yutsumura.com</link>
	<description></description>
	<lastBuildDate>Sun, 19 Nov 2017 16:29:17 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=5.3.4</generator>

<image>
	<url>https://i2.wp.com/yutsumura.com/wp-content/uploads/2016/12/cropped-question-logo.jpg?fit=32%2C32&#038;ssl=1</url>
	<title>orthogonal matrix &#8211; Problems in Mathematics</title>
	<link>https://yutsumura.com</link>
	<width>32</width>
	<height>32</height>
</image> 
<site xmlns="com-wordpress:feed-additions:1">114989322</site>	<item>
		<title>Is the Set of All Orthogonal Matrices a Vector Space?</title>
		<link>https://yutsumura.com/is-the-set-of-all-orthogonal-matrices-a-vector-space/</link>
				<comments>https://yutsumura.com/is-the-set-of-all-orthogonal-matrices-a-vector-space/#respond</comments>
				<pubDate>Fri, 17 Nov 2017 18:01:52 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[orthogonal matrix]]></category>
		<category><![CDATA[subspace]]></category>
		<category><![CDATA[transpose matrix]]></category>
		<category><![CDATA[vector space]]></category>
		<category><![CDATA[zero vector]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=5533</guid>
				<description><![CDATA[<p>An $n\times n$ matrix $A$ is called orthogonal if $A^{\trans}A=I$. Let $V$ be the vector space of all real $2\times 2$ matrices. Consider the subset \[W:=\{A\in V \mid \text{$A$ is an orthogonal matrix}\}.\] Prove&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/is-the-set-of-all-orthogonal-matrices-a-vector-space/" target="_blank">Is the Set of All Orthogonal Matrices a Vector Space?</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 611</h2>
<p>An $n\times n$ matrix $A$ is called <strong>orthogonal</strong> if $A^{\trans}A=I$.<br />
Let $V$ be the vector space of all real $2\times 2$ matrices.</p>
<p>Consider the subset<br />
\[W:=\{A\in V \mid \text{$A$ is an orthogonal matrix}\}.\]
Prove or disprove that $W$ is a subspace of $V$.</p>
<p>&nbsp;<br />
<span id="more-5533"></span><br />

<h2>Solution.</h2>
<p>	We claim that $W$ is not a subspace of $V$.</p>
<p>	One way to see that $W$ is not a subspace of $V$ is to note that the zero vector of $V$, namely the $2\times 2$ zero matrix $O$, is not in $W$, as we have $O^{\trans}O=O\neq I$.<br />
	Thus, $W$ is not a subspace of $V$.</p>
<h3>Another approach</h3>
<p>	You may also show that $W$ is not closed under scalar multiplication (or addition).</p>
<p>	For example, the identity matrix $I$ is orthogonal as $I^{\trans}I=I$, and thus $I$ is an element in $W$.<br />
	However, the scalar product $2I$ is not orthogonal since<br />
	\[(2I)^{\trans}(2I)=4I\neq I.\]
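<p>As a quick numerical sanity check (illustrative only, not part of the solution), one can verify with NumPy that the identity matrix is orthogonal while its scalar multiple $2I$ is not:</p>

```python
import numpy as np

# Check that I is orthogonal but 2I is not, so the set of orthogonal
# matrices is not closed under scalar multiplication.
I = np.eye(2)
print(np.allclose(I.T @ I, I))        # I satisfies A^T A = I  -> True
twoI = 2 * I
print(np.allclose(twoI.T @ twoI, I))  # 2I fails A^T A = I     -> False
print(np.allclose(twoI.T @ twoI, 4 * I))  # (2I)^T (2I) = 4I   -> True
```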
<button class="simplefavorite-button has-count" data-postid="5533" data-siteid="1" data-groupid="1" data-favoritecount="62" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">62</span></button><p>The post <a href="https://yutsumura.com/is-the-set-of-all-orthogonal-matrices-a-vector-space/" target="_blank">Is the Set of All Orthogonal Matrices a Vector Space?</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/is-the-set-of-all-orthogonal-matrices-a-vector-space/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">5533</post-id>	</item>
		<item>
		<title>Use the Cayley-Hamilton Theorem to Compute the Power $A^{100}$</title>
		<link>https://yutsumura.com/use-the-cayley-hamilton-theorem-to-compute-the-power-a100/</link>
				<comments>https://yutsumura.com/use-the-cayley-hamilton-theorem-to-compute-the-power-a100/#respond</comments>
				<pubDate>Fri, 23 Jun 2017 17:51:41 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[Cayley-Hamilton theorem]]></category>
		<category><![CDATA[characteristic polynomial]]></category>
		<category><![CDATA[complex eigenvalue]]></category>
		<category><![CDATA[eigenvalue]]></category>
		<category><![CDATA[exam]]></category>
		<category><![CDATA[Kyushu]]></category>
		<category><![CDATA[Kyushu.LA]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[orthogonal matrix]]></category>
		<category><![CDATA[power of a matrix]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=3260</guid>
				<description><![CDATA[<p>Let $A$ be a $3\times 3$ real orthogonal matrix with $\det(A)=1$. (a) If $\frac{-1+\sqrt{3}i}{2}$ is one of the eigenvalues of $A$, then find all the eigenvalues of $A$. (b) Let \[A^{100}=aA^2+bA+cI,\] where $I$&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/use-the-cayley-hamilton-theorem-to-compute-the-power-a100/" target="_blank">Use the Cayley-Hamilton Theorem to Compute the Power $A^{100}$</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 471</h2>
<p>	Let $A$ be a $3\times 3$ real orthogonal matrix with $\det(A)=1$.</p>
<p><strong>(a)</strong> If $\frac{-1+\sqrt{3}i}{2}$ is one of the eigenvalues of $A$, then find all the eigenvalues of $A$.</p>
<p><strong>(b)</strong> Let<br />
	\[A^{100}=aA^2+bA+cI,\]
	where $I$ is the $3\times 3$ identity matrix.<br />
	Using the Cayley-Hamilton theorem, determine $a, b, c$.</p>
<p>(<em>Kyushu University, Linear Algebra Exam Problem</em>)<br />
&nbsp;<br />
<span id="more-3260"></span><br />

<h2>Solution.</h2>
<h3>(a) Find all the eigenvalues of $A$.</h3>
<p>Since $A$ is a real matrix and $\frac{-1+\sqrt{3}i}{2}$ is a complex eigenvalue, its conjugate $\frac{-1-\sqrt{3}i}{2}$ is also an eigenvalue of $A$.<br />
	As $A$ is a $3\times 3$ matrix, it has one more eigenvalue $\lambda$.</p>
<p>	Note that <a href="//yutsumura.com/determinant-trace-and-eigenvalues-of-a-matrix/" target="_blank">the product of all eigenvalues of $A$ is the determinant of $A$</a>.<br />
	Thus, we have<br />
	\[\frac{-1+\sqrt{3}i}{2} \cdot \frac{-1-\sqrt{3}i}{2}\cdot \lambda =\det(A)=1.\]
	Solving this, we obtain $\lambda=1$.<br />
	Therefore, the eigenvalues of $A$ are<br />
	\[\frac{-1+\sqrt{3}i}{2}, \frac{-1-\sqrt{3}i}{2}, 1.\]
<h3>(b) Using the Cayley-Hamilton theorem, determine $a, b, c$.</h3>
<p> To use the Cayley-Hamilton theorem, we first need to determine the characteristic polynomial $p(t)=\det(A-tI)$ of $A$.<br />
	Since we found all the eigenvalues of $A$ in part (a) and the roots of the characteristic polynomial are the eigenvalues, we know that<br />
	\begin{align*}<br />
	p(t)&#038;=-\left(\,  t-\frac{-1+\sqrt{3}i}{2} \,\right)\left(\,  t-\frac{-1-\sqrt{3}i}{2} \,\right)(t-1) \tag{*}\\<br />
	&#038;=-(t^2+t+1)(t-1)\\<br />
	&#038;=-t^3+1.<br />
	\end{align*}<br />
	(Remark that if your definition of the characteristic polynomial is $\det(tI-A)$, then the first negative sign in (*) should be omitted.)</p>
<p>	Then the Cayley-Hamilton theorem yields that<br />
	\[p(A)=-A^3+I=O,\]
	where $O$ is the $3\times 3$ zero matrix.</p>
<p>	Hence we have $A^3=I$.<br />
	We compute<br />
	\begin{align*}<br />
	A^{100}=(A^3)^{33}A=I^{33}A=IA=A.<br />
	\end{align*}</p>
<p>	Thus, we conclude that $a=0, b=1, c=0$.</p>
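<p>One can sanity-check this conclusion numerically. A rotation by $120^\circ$ about the $z$-axis is a real orthogonal $3\times 3$ matrix with determinant $1$ and eigenvalue $\frac{-1+\sqrt{3}i}{2}$; the NumPy sketch below (illustrative only, not part of the solution) confirms $A^3=I$ and $A^{100}=A$.</p>

```python
import numpy as np

# Rotation by 2*pi/3 about the z-axis: orthogonal, det = 1,
# eigenvalues exp(+-2*pi*i/3) = (-1 +- sqrt(3) i)/2 and 1.
c, s = np.cos(2 * np.pi / 3), np.sin(2 * np.pi / 3)
A = np.array([[c, -s, 0.0],
              [s,  c, 0.0],
              [0.0, 0.0, 1.0]])

print(np.allclose(A.T @ A, np.eye(3)))                       # orthogonal -> True
print(np.isclose(np.linalg.det(A), 1.0))                     # det(A) = 1 -> True
print(np.allclose(np.linalg.matrix_power(A, 3), np.eye(3)))  # A^3 = I    -> True
print(np.allclose(np.linalg.matrix_power(A, 100), A))        # A^100 = A, i.e. a=0, b=1, c=0 -> True
```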
<h2>Comment.</h2>
<p>	Observe that we did not use the assumption that $A$ is orthogonal.</p>
<button class="simplefavorite-button has-count" data-postid="3260" data-siteid="1" data-groupid="1" data-favoritecount="80" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">80</span></button><p>The post <a href="https://yutsumura.com/use-the-cayley-hamilton-theorem-to-compute-the-power-a100/" target="_blank">Use the Cayley-Hamilton Theorem to Compute the Power $A^{100}$</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/use-the-cayley-hamilton-theorem-to-compute-the-power-a100/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">3260</post-id>	</item>
		<item>
		<title>If $A$ is a Skew-Symmetric Matrix, then $I+A$ is Nonsingular and $(I-A)(I+A)^{-1}$ is Orthogonal</title>
		<link>https://yutsumura.com/if-a-is-a-skew-symmetric-matrix-then-ia-is-nonsingular-and-i-aia-1-is-orthogonal/</link>
				<comments>https://yutsumura.com/if-a-is-a-skew-symmetric-matrix-then-ia-is-nonsingular-and-i-aia-1-is-orthogonal/#comments</comments>
				<pubDate>Wed, 21 Jun 2017 21:21:15 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[determinant]]></category>
		<category><![CDATA[eigenvalue]]></category>
		<category><![CDATA[invertible matrix]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[nonsingular matrix]]></category>
		<category><![CDATA[orthogonal matrix]]></category>
		<category><![CDATA[skew-symmetric matrix]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=3237</guid>
				<description><![CDATA[<p>Let $A$ be an $n\times n$ real skew-symmetric matrix. (a) Prove that the matrices $I-A$ and $I+A$ are nonsingular. (b) Prove that \[B=(I-A)(I+A)^{-1}\] is an orthogonal matrix. &#160; Proof. (a) Prove that the matrices&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/if-a-is-a-skew-symmetric-matrix-then-ia-is-nonsingular-and-i-aia-1-is-orthogonal/" target="_blank">If $A$ is a Skew-Symmetric Matrix, then $I+A$ is Nonsingular and $(I-A)(I+A)^{-1}$ is Orthogonal</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 468</h2>
<p> Let $A$ be an $n\times n$ real skew-symmetric matrix.</p>
<p><strong>(a)</strong> Prove that the matrices $I-A$ and $I+A$ are nonsingular.</p>
<p><strong>(b)</strong> Prove that<br />
	\[B=(I-A)(I+A)^{-1}\]
	is an orthogonal matrix.</p>
<p>&nbsp;<br />
<span id="more-3237"></span><br />

<h2> Proof. </h2>
<h3>(a) Prove that the matrices $I-A$ and $I+A$ are nonsingular.</h3>
<p>The eigenvalues of a skew-symmetric matrix are either $0$ or purely imaginary numbers.<br />
(See the post &#8220;<a href="//yutsumura.com/eigenvalues-of-real-skew-symmetric-matrix-are-zero-or-purely-imaginary-and-the-rank-is-even/" target="_blank">Eigenvalues of Real Skew-Symmetric Matrix are Zero or Purely Imaginary and the Rank is Even</a>&#8221; for a proof of this fact.)</p>
<p>		Namely, the eigenvalues of $A$ are of the form $ib$, where $i=\sqrt{-1}$ and $b$ is a real number.</p>
<p>		The eigenvalues of the matrix $I\pm A$ are of the form $1\pm \lambda$, where $\lambda$ is an eigenvalue of $A$.<br />
		Since $\lambda=ib$ is purely imaginary or zero, the real part of $1\pm \lambda$ is $1$, and hence $1\pm \lambda \neq 0$.<br />
		Thus, $I\pm A$ do not have $0$ as an eigenvalue.</p>
<p>		Since <a href="//yutsumura.com/determinant-trace-and-eigenvalues-of-a-matrix/" target="_blank">the determinant is the product of all eigenvalues</a>, it follows that the determinants of the matrices $I\pm A$ are not zero, hence they are nonsingular.</p>
<h3>(b) Prove that $B=(I-A)(I+A)^{-1}$ is an orthogonal matrix.</h3>
<p> Note that by part (a), the matrix $I+A$ is nonsingular, hence it is invertible.<br />
		Thus the expression $B=(I-A)(I+A)^{-1}$ is well-defined.</p>
<p>		Our goal is to show that $B^{\trans}B=I$.<br />
		Recall the following basic properties of transpose and inverse matrices.</p>
<ol>
<li>$(AB)^{\trans}=B^{\trans} A^{\trans}$</li>
<li>$(A^{-1})^{\trans}=(A^{\trans})^{-1}$ if $A$ is invertible.</li>
<li>$(A+B)^{\trans}=A^{\trans}+B^{\trans}$.</li>
</ol>
<p>		We have<br />
		\begin{align*}<br />
	B^{\trans}&#038;=\left(\,  (I-A)(I+A)^{-1} \,\right)^{\trans}\\<br />
	&#038;=\left(\, (I+A)^{-1}  \,\right)^{\trans}(I-A)^{\trans} &#038;&#038; \text{by property 1}\\<br />
	&#038;=\left(\, (I+A)^{\trans}  \,\right)^{-1}(I-A)^{\trans} &#038;&#038; \text{by property 2}\\<br />
	&#038;=(I^{\trans}+A^{\trans})^{-1}(I^{\trans}-A^{\trans}) &#038;&#038; \text{by property 3}\\<br />
	&#038;=(I+A^{\trans})^{-1}(I-A^{\trans}) &#038;&#038; \text{since } I^{\trans}=I\\<br />
	&#038;=(I-A)^{-1}(I+A),<br />
	\end{align*}<br />
	where the last step follows since $A$ is skew-symmetric: $A^{\trans}=-A$.</p>
<p>	Hence we have<br />
	\begin{align*}<br />
	B^{\trans} B&#038;=(I-A)^{-1}(I+A)(I-A)(I+A)^{-1}.<br />
	\end{align*}</p>
<p>	We note that the middle two matrices $I+A$ and $I-A$ commute.<br />
	In fact, we have<br />
	\begin{align*}<br />
	(I+A)(I-A)&#038;=(I+A)I-(I+A)A=I+A-A-A^2=I-A^2 \text{ and }\\<br />
	(I-A)(I+A)&#038;=(I-A)I+(I-A)A=I-A+A-A^2=I-A^2.<br />
	\end{align*}<br />
	It yields that<br />
	\[(I+A)(I-A)=(I-A)(I+A) \tag{*}.\]
<p>	Hence we have<br />
	\begin{align*}<br />
	&#038;B^{\trans} B\\<br />
	&#038;=(I-A)^{-1}(I+A)(I-A)(I+A)^{-1}\\<br />
	&#038;=(I-A)^{-1}(I-A)(I+A)(I+A)^{-1} &#038;&#038; \text{ by (*)}\\<br />
	&#038;=I\cdot I=I,<br />
	\end{align*}<br />
	and we have obtained $B^{\trans}B=I$.<br />
	Therefore, the matrix $B$ is an orthogonal matrix.</p>
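<p>The matrix $B=(I-A)(I+A)^{-1}$ is the classical Cayley transform of $A$. As a numerical illustration (a sketch, not part of the proof), one can generate a random skew-symmetric matrix with NumPy and verify that $B^{\trans}B=I$:</p>

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
A = M - M.T                          # skew-symmetric: A^T = -A
I = np.eye(4)

# I + A is invertible by part (a), so B is well-defined.
B = (I - A) @ np.linalg.inv(I + A)
print(np.allclose(B.T @ B, I))       # B is orthogonal -> True
```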
<button class="simplefavorite-button has-count" data-postid="3237" data-siteid="1" data-groupid="1" data-favoritecount="68" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">68</span></button><p>The post <a href="https://yutsumura.com/if-a-is-a-skew-symmetric-matrix-then-ia-is-nonsingular-and-i-aia-1-is-orthogonal/" target="_blank">If $A$ is a Skew-Symmetric Matrix, then $I+A$ is Nonsingular and $(I-A)(I+A)^{-1}$ is Orthogonal</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/if-a-is-a-skew-symmetric-matrix-then-ia-is-nonsingular-and-i-aia-1-is-orthogonal/feed/</wfw:commentRss>
		<slash:comments>1</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">3237</post-id>	</item>
		<item>
		<title>A Matrix Equation of a Symmetric Matrix and the Limit of its Solution</title>
		<link>https://yutsumura.com/a-matrix-equation-of-a-symmetric-matrix-and-the-limit-of-its-solution/</link>
				<comments>https://yutsumura.com/a-matrix-equation-of-a-symmetric-matrix-and-the-limit-of-its-solution/#respond</comments>
				<pubDate>Thu, 15 Jun 2017 15:57:43 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[Berkeley]]></category>
		<category><![CDATA[Berkeley.LA]]></category>
		<category><![CDATA[diagonalizable matrix]]></category>
		<category><![CDATA[eigenvector]]></category>
		<category><![CDATA[exam]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[orthogonal matrix]]></category>
		<category><![CDATA[orthonormal basis]]></category>
		<category><![CDATA[qualifying exam]]></category>
		<category><![CDATA[symmetric matrix]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=3141</guid>
				<description><![CDATA[<p>Let $A$ be a real symmetric $n\times n$ matrix with $0$ as a simple eigenvalue (that is, the algebraic multiplicity of the eigenvalue $0$ is $1$), and let us fix a vector $\mathbf{v}\in \R^n$.&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/a-matrix-equation-of-a-symmetric-matrix-and-the-limit-of-its-solution/" target="_blank">A Matrix Equation of a Symmetric Matrix and the Limit of its Solution</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 457</h2>
<p>	Let $A$ be a real symmetric $n\times n$ matrix with $0$ as a simple eigenvalue (that is, the algebraic multiplicity of the eigenvalue $0$ is $1$), and let us fix a vector $\mathbf{v}\in \R^n$.</p>
<p><strong>(a)</strong> Prove that for sufficiently small positive real $\epsilon$, the equation<br />
	\[A\mathbf{x}+\epsilon\mathbf{x}=\mathbf{v}\]
	has a unique solution $\mathbf{x}=\mathbf{x}(\epsilon) \in \R^n$.</p>
<p><strong>(b)</strong> Evaluate<br />
	\[\lim_{\epsilon \to 0^+} \epsilon \mathbf{x}(\epsilon)\]
	 in terms of $\mathbf{v}$, the eigenvectors of $A$, and the inner product $\langle\, ,\,\rangle$ on $\R^n$.</p>
<p>&nbsp;<br />
(<em>University of California, Berkeley, Linear Algebra Qualifying Exam</em>)</p>
<p><span id="more-3141"></span><br />

<h2> Proof. </h2>
<h3>(a) Prove that $A\mathbf{x}+\epsilon\mathbf{x}=\mathbf{v}$ has a unique solution $\mathbf{x}=\mathbf{x}(\epsilon) \in \R^n$.</h3>
<p>Recall that the <a href="//yutsumura.com/eigenvalues-of-a-hermitian-matrix-are-real-numbers/" target="_blank">eigenvalues of a real symmetric matrix are all real numbers</a> and such a matrix is diagonalizable by an orthogonal matrix.</p>
<p>		Note that the equation $A\mathbf{x}+\epsilon\mathbf{x}=\mathbf{v}$ can be written as<br />
		\[(A+\epsilon I)\mathbf{x}=\mathbf{v}, \tag{*}\]
		where $I$ is the $n\times n$ identity matrix. Thus to show that the equation (*) has a unique solution, it suffices to show that the matrix $A+\epsilon I$ is invertible.</p>
<p>		Since $A$ is diagonalizable, there exists an invertible matrix $S$ such that<br />
		\[S^{-1}AS=\begin{bmatrix}<br />
		 \lambda_1  &#038; 0 &#038; \cdots &#038; 0 \\<br />
		0 &#038; \lambda_2 &#038; \cdots &#038; 0\\<br />
		\vdots  &#038; \vdots  &#038; \ddots &#038; \vdots  \\<br />
		0 &#038; 0 &#038; \cdots &#038; \lambda_n<br />
		\end{bmatrix},\]
		where $\lambda_i$ are eigenvalues of $A$.<br />
		Since the algebraic multiplicity of $0$ is $1$, without loss of generality, we may assume that $\lambda_1=0$ and $\lambda_i, i > 1$ are nonzero.</p>
<p>		Then we have<br />
		\begin{align*}<br />
	S^{-1}(A+\epsilon I)S&#038;=S^{-1}AS+\epsilon I=\begin{bmatrix}<br />
			 \epsilon  &#038; 0 &#038; \cdots &#038; 0 \\<br />
			0 &#038; \epsilon+\lambda_2 &#038; \cdots &#038; 0\\<br />
			\vdots  &#038; \vdots  &#038; \ddots &#038; \vdots  \\<br />
			0 &#038; 0 &#038; \cdots &#038; \epsilon+\lambda_n<br />
			\end{bmatrix}.<br />
	\end{align*}</p>
<hr />
<p>			If $\epsilon > 0$ is smaller than $|\lambda_i|$ for every $i > 1$, then none of the diagonal entries $\epsilon+ \lambda_i$ are zero.</p>
<p>			Hence we have<br />
			\begin{align*}<br />
	\det(A+\epsilon I)&#038;=\det(S)^{-1}\det(A+\epsilon I)\det(S)\\<br />
	&#038;=\det\left(\,  S^{-1}(A+\epsilon I) S \,\right)\\<br />
	&#038;=\epsilon(\epsilon+\lambda_2)\cdots (\epsilon+\lambda_n)\neq 0.<br />
	\end{align*}<br />
	Since $\det(A+\epsilon I)\neq 0$, it follows that $A+\epsilon I$ is invertible, hence the equation (*) has a unique solution<br />
	\[\mathbf{x}(\epsilon)=(A+\epsilon I)^{-1}\mathbf{v}.\]
<h4>Remark</h4>
<p> This result is true in general for any square matrix.<br />
	Instead of diagonalization, we can use a triangularization of the matrix.</p>
<h3>(b) Evaluate $\lim_{\epsilon \to 0^+} \epsilon \mathbf{x}(\epsilon)$</h3>
<p> As noted earlier, a real symmetric matrix can be diagonalized by an orthogonal matrix.<br />
			This means that there is an eigenvector $\mathbf{v}_i$ corresponding to the eigenvalue $\lambda_i$ for each $i$ such that the eigenvectors $\mathbf{v}_i$ form an orthonormal basis of $\R^n$.<br />
			That is,<br />
			\begin{align*}<br />
	A\mathbf{v}_i=\lambda_i \mathbf{v}_i \\<br />
	\langle \mathbf{v}_i,\mathbf{v}_j \rangle=\delta_{i,j},<br />
	\end{align*}<br />
	where $\delta_{i,j}$ is the Kronecker delta: $\delta_{i,i}=1$ and $\delta_{i,j}=0$ if $i\neq j$.<br />
	From this, we deduce that<br />
	\begin{align*}<br />
	(A+\epsilon I)\mathbf{v}_i=(\lambda_i+\epsilon)\mathbf{v}_i\\<br />
	(A+\epsilon I)^{-1}\mathbf{v}_i=\frac{1}{\lambda_i+\epsilon}\mathbf{v}_i. \tag{**}<br />
	\end{align*}<br />
	Using the basis $\{\mathbf{v}_i\}$, we write<br />
	\[\mathbf{v}=\sum_{i=1}^nc_i \mathbf{v}_i\]
	for some $c_i\in \R$.</p>
<hr />
<p>	Then we compute<br />
	\begin{align*}<br />
	A\mathbf{x}(\epsilon)&#038;=A(A+\epsilon I)^{-1}\mathbf{v} &#038;&#038; \text{by part (a)}\\<br />
	&#038;=A(A+\epsilon I)^{-1}\left(\, \sum_{i=1}^nc_i \mathbf{v}_i  \,\right)\\<br />
	&#038;=\sum_{i=1}^n c_iA(A+\epsilon I)^{-1}\mathbf{v}_i\\<br />
	&#038;=\sum _{i=1}^n c_iA\left(\,  \frac{1}{\lambda_i+\epsilon}\mathbf{v}_i \,\right) &#038;&#038; \text{by (**)}\\<br />
	&#038;=\sum_{i=1}^n c_i\frac{\lambda_i}{\lambda_i+\epsilon}\mathbf{v}_i &#038;&#038; \text{since $A\mathbf{v}_i=\lambda_i\mathbf{v}_i$}\\<br />
	&#038;=\sum_{i=2}^n c_i\frac{\lambda_i}{\lambda_i+\epsilon}\mathbf{v}_i &#038;&#038; \text{since $\lambda_1=0$}.<br />
	\end{align*}</p>
<hr />
<p>	Therefore we have<br />
	\begin{align*}<br />
	\lim_{\epsilon \to 0^+} \epsilon \mathbf{x}(\epsilon)&#038;=\lim_{\epsilon \to 0^+}\left(\,  \mathbf{v}-A\mathbf{x}(\epsilon) \,\right)\\<br />
	&#038;=\mathbf{v}-\lim_{\epsilon \to 0^+}\left(\,  A\mathbf{x}(\epsilon) \,\right)\\<br />
	&#038;= \sum_{i=1}^nc_i\mathbf{v}_i-\lim_{\epsilon \to 0^+}\left(\,  \sum_{i=2}^n c_i\frac{\lambda_i}{\lambda_i+\epsilon}\mathbf{v}_i \,\right)\\<br />
	&#038;=\sum_{i=1}^n c_i \mathbf{v}_i-\sum_{i=2}^n c_i \mathbf{v}_i\\<br />
	&#038;=c_1\mathbf{v}_1.<br />
	\end{align*}<br />
	Using the orthonormality of the basis $\{\mathbf{v}_i\}$, we have<br />
	\[\langle\mathbf{v}, \mathbf{v}_1 \rangle=\sum_{i=1}^n \langle c_i\mathbf{v}_i, \mathbf{v}_1 \rangle=c_1.\]
<p>	Hence the required expression is<br />
	\[\lim_{\epsilon \to 0^+} \epsilon \mathbf{x}(\epsilon)=\langle\mathbf{v}, \mathbf{v}_1 \rangle\mathbf{v}_1,\]
	where $\mathbf{v}_1$ is the unit eigenvector corresponding to the eigenvalue $0$.</p>
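<p>The limit formula can also be checked numerically. In the NumPy sketch below (illustrative only; the construction of $A$ from an orthonormal basis $Q$ is our own choice for the test), we build a symmetric matrix with $0$ as a simple eigenvalue and compare $\epsilon\,\mathbf{x}(\epsilon)$ for small $\epsilon$ against $\langle\mathbf{v}, \mathbf{v}_1 \rangle\mathbf{v}_1$:</p>

```python
import numpy as np

rng = np.random.default_rng(1)
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))   # orthonormal columns
A = Q @ np.diag([0.0, 2.0, 3.0, 5.0]) @ Q.T        # symmetric, 0 is a simple eigenvalue
v1 = Q[:, 0]                                       # unit eigenvector for eigenvalue 0
v = rng.standard_normal(4)

eps = 1e-8
x = np.linalg.solve(A + eps * np.eye(4), v)        # x(eps) = (A + eps I)^{-1} v
print(np.allclose(eps * x, (v @ v1) * v1, atol=1e-6))  # matches <v, v1> v1 -> True
```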
<button class="simplefavorite-button has-count" data-postid="3141" data-siteid="1" data-groupid="1" data-favoritecount="13" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">13</span></button><p>The post <a href="https://yutsumura.com/a-matrix-equation-of-a-symmetric-matrix-and-the-limit-of-its-solution/" target="_blank">A Matrix Equation of a Symmetric Matrix and the Limit of its Solution</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/a-matrix-equation-of-a-symmetric-matrix-and-the-limit-of-its-solution/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">3141</post-id>	</item>
		<item>
		<title>Inequality about Eigenvalue of a Real Symmetric Matrix</title>
		<link>https://yutsumura.com/inequality-about-eigenvalue-of-a-real-symmetric-matrix/</link>
				<comments>https://yutsumura.com/inequality-about-eigenvalue-of-a-real-symmetric-matrix/#comments</comments>
				<pubDate>Tue, 13 Jun 2017 02:18:00 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[diagonalizable]]></category>
		<category><![CDATA[eigenvalue]]></category>
		<category><![CDATA[eigenvector]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[orthogonal matrix]]></category>
		<category><![CDATA[orthonormal basis]]></category>
		<category><![CDATA[symmetric matrix]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=3090</guid>
				<description><![CDATA[<p>Let $A$ be an $n\times n$ real symmetric matrix. Prove that there exists an eigenvalue $\lambda$ of $A$ such that for any vector $\mathbf{v}\in \R^n$, we have the inequality \[\mathbf{v}\cdot A\mathbf{v} \leq \lambda \&#124;\mathbf{v}\&#124;^2.\]&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/inequality-about-eigenvalue-of-a-real-symmetric-matrix/" target="_blank">Inequality about Eigenvalue of a Real Symmetric Matrix</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 451</h2>
<p> Let $A$ be an $n\times n$ real symmetric matrix.<br />
	Prove that there exists an eigenvalue $\lambda$ of $A$ such that for any vector $\mathbf{v}\in \R^n$, we have the inequality<br />
	\[\mathbf{v}\cdot A\mathbf{v} \leq \lambda \|\mathbf{v}\|^2.\]
<p>&nbsp;<br />
<span id="more-3090"></span><br />
&nbsp;</p>
<h2> Proof. </h2>
<p>		Recall that <a href="//yutsumura.com/eigenvalues-of-a-hermitian-matrix-are-real-numbers/" target="_blank">all the eigenvalues of a real symmetric matrix are real numbers</a>.<br />
		Let $\lambda_1, \dots, \lambda_n$ be eigenvalues of $A$.</p>
<p>		Since these eigenvalues are real numbers, there is a largest one.<br />
		Let $\lambda$ be the largest eigenvalue of $A$.<br />
		With this choice of $\lambda$ we show that the inequality<br />
		\[\mathbf{v}\cdot A\mathbf{v} \leq \lambda \|\mathbf{v}\|^2\]
		holds for any $\mathbf{v}\in \R^n$.</p>
<hr />
<p>		Also recall that for a real symmetric matrix, there are eigenvectors $\mathbf{v}_1, \dots, \mathbf{v}_n$ corresponding to the eigenvalues $\lambda_1, \dots, \lambda_n$ such that<br />
		\[B=\{\mathbf{v}_1, \dots, \mathbf{v}_n\}\]
		forms an orthonormal basis of $\R^n$.<br />
		(This statement is equivalent to saying that every real symmetric matrix is diagonalizable by an orthogonal matrix.)</p>
<hr />
<p>		Let $\mathbf{v}$ be an arbitrary vector in $\R^n$.<br />
		Then since $B$ is a basis of $\R^n$, we can write<br />
		\[\mathbf{v}=c_1\mathbf{v}_1+\dots+c_n\mathbf{v}_n\]
		for some $c_1, \dots, c_n\in \R$.</p>
<p>		Then we calculate<br />
		\begin{align*}<br />
	 A\mathbf{v}&#038;=A(c_1\mathbf{v}_1+\dots+c_n\mathbf{v}_n)\\<br />
	 &#038;=c_1A\mathbf{v}_1+\dots+c_nA\mathbf{v}_n\\<br />
	 &#038;=c_1\lambda_1\mathbf{v}_1+\dots+c_n\lambda_n\mathbf{v}_n<br />
	\end{align*}<br />
	since $A\mathbf{v}_i=\lambda_i\mathbf{v}_i$ for $i=1, \dots, n$.</p>
<hr />
<p>	Using this, we have<br />
	\begin{align*}<br />
	\mathbf{v}\cdot A\mathbf{v}&#038;=(c_1\mathbf{v}_1+\dots+c_n\mathbf{v}_n)\cdot (c_1\lambda_1\mathbf{v}_1+\dots+c_n\lambda_n\mathbf{v}_n)\\<br />
	&#038;=c_1^2\lambda_1+\cdots+c_n^2\lambda_n.<br />
	\end{align*}</p>
<p>	Here, we used that $B=\{\mathbf{v}_1, \dots, \mathbf{v}_n\}$ is an orthonormal basis of $\R^n$.<br />
	That is, we used the properties<br />
	\begin{align*}<br />
	\mathbf{v}_i\cdot \mathbf{v}_j=\begin{cases}<br />
		1 &#038; \text{if } i=j\\<br />
		0 &#038; \text{if } i\neq j.<br />
	\end{cases}<br />
	\end{align*}</p>
<hr />
<p>	Since $\lambda$ is the largest eigenvalue of $A$, we have<br />
	\begin{align*}<br />
	\mathbf{v}\cdot A\mathbf{v}&#038;=c_1^2\lambda_1+\cdots+c_n^2\lambda_n\\<br />
	&#038; \leq c_1^2\lambda+\cdots+c_n^2\lambda\\<br />
	&#038;=\lambda(c_1^2+\cdots+c_n^2)\\<br />
	&#038;=\lambda \|\mathbf{v}\|^2.<br />
	\end{align*}<br />
	Hence the required inequality holds.</p>
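<p>This is the standard bound of the quadratic form $\mathbf{v}\cdot A\mathbf{v}$ by the largest eigenvalue (the Rayleigh quotient bound). A quick numerical check with NumPy (a sketch, not part of the proof):</p>

```python
import numpy as np

rng = np.random.default_rng(2)
M = rng.standard_normal((5, 5))
A = (M + M.T) / 2                        # a real symmetric matrix
lam = np.linalg.eigvalsh(A).max()        # its largest eigenvalue

# v . Av <= lam * ||v||^2 for random test vectors (tiny slack for rounding)
ok = all(v @ A @ v <= lam * (v @ v) + 1e-12
         for v in rng.standard_normal((100, 5)))
print(ok)  # -> True
```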
<button class="simplefavorite-button has-count" data-postid="3090" data-siteid="1" data-groupid="1" data-favoritecount="18" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">18</span></button><p>The post <a href="https://yutsumura.com/inequality-about-eigenvalue-of-a-real-symmetric-matrix/" target="_blank">Inequality about Eigenvalue of a Real Symmetric Matrix</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/inequality-about-eigenvalue-of-a-real-symmetric-matrix/feed/</wfw:commentRss>
		<slash:comments>1</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">3090</post-id>	</item>
		<item>
		<title>Eigenvalues of Orthogonal Matrices Have Length 1. Every $3\times 3$ Orthogonal Matrix Has 1 as an Eigenvalue</title>
		<link>https://yutsumura.com/eigenvalues-of-orthogonal-matrices-have-length-1-every-3times-3-orthogonal-matrix-has-1-as-an-eigenvalue/</link>
				<comments>https://yutsumura.com/eigenvalues-of-orthogonal-matrices-have-length-1-every-3times-3-orthogonal-matrix-has-1-as-an-eigenvalue/#respond</comments>
				<pubDate>Thu, 18 May 2017 01:16:37 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[characteristic polynomial]]></category>
		<category><![CDATA[determinant]]></category>
		<category><![CDATA[determinant of a matrix]]></category>
		<category><![CDATA[eigenvalue]]></category>
		<category><![CDATA[eigenvector]]></category>
		<category><![CDATA[length]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[magnitude]]></category>
		<category><![CDATA[norm]]></category>
		<category><![CDATA[orthogonal matrix]]></category>
		<category><![CDATA[vector]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=2915</guid>
				<description><![CDATA[<p>(a) Let $A$ be a real orthogonal $n\times n$ matrix. Prove that the length (magnitude) of each eigenvalue of $A$ is $1$. (b) Let $A$ be a real orthogonal $3\times 3$ matrix and suppose&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/eigenvalues-of-orthogonal-matrices-have-length-1-every-3times-3-orthogonal-matrix-has-1-as-an-eigenvalue/" target="_blank">Eigenvalues of Orthogonal Matrices Have Length 1. Every $3\times 3$ Orthogonal Matrix Has 1 as an Eigenvalue</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 419</h2>
<p><strong>(a)</strong> Let $A$ be a real orthogonal $n\times n$ matrix. Prove that the length (magnitude) of each eigenvalue of $A$ is $1$.</p>
<hr />
<p><strong>(b)</strong> Let $A$ be a real orthogonal $3\times 3$ matrix and suppose that the determinant of $A$ is $1$. Then prove that $A$ has $1$ as an eigenvalue.</p>
<p>&nbsp;<br />
<span id="more-2915"></span><br />

<h2> Proof. </h2>
<h3>(a) Prove that the length (magnitude) of each eigenvalue of $A$ is $1$</h3>
<p> Let $A$ be a real orthogonal $n\times n$ matrix.<br />
					Let $\lambda$ be an eigenvalue of $A$ and let $\mathbf{v}$ be a corresponding eigenvector.<br />
					Then we have<br />
					\[A\mathbf{v}=\lambda \mathbf{v}.\]
					From this, we obtain<br />
					\[\|A\mathbf{v}\|^2=\|\lambda \mathbf{v}\|^2=|\lambda|^2\|\mathbf{v}\|^2.\]
					The left hand side becomes<br />
					\begin{align*}<br />
		&#038;\|A\mathbf{v}\|^2\\<br />
		&#038;=\overline{(A\mathbf{v})}^{\trans}(A\mathbf{v}) &#038;&#038; \text{by definition of the length}\\<br />
		&#038;=\bar{\mathbf{v}}^{\trans}A^{\trans}A\mathbf{v} &#038;&#038; \text{because $A$ is real}\\<br />
		&#038;=\bar{\mathbf{v}}^{\trans}\mathbf{v} &#038;&#038; \text{because $A^{\trans}A=I$ as $A$ is orthogonal}\\<br />
		&#038;=\|\mathbf{v}\|^2 &#038;&#038; \text{by definition of the length.}<br />
		\end{align*}</p>
<p>		It follows that we obtain<br />
		\[\|\mathbf{v}\|^2=|\lambda|^2\|\mathbf{v}\|^2.\]
		Since $\mathbf{v}$ is an eigenvector, it is non-zero, and hence $\|\mathbf{v}\|\neq 0$.<br />
		Canceling $\|\mathbf{v}\|$, we have<br />
		\[|\lambda|^2=1.\]
		Since the length is non-negative, we obtain<br />
		\[|\lambda|=1,\]
		as required.</p>
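<p>As a numerical sanity check of part (a) (not part of the proof), we can compute the eigenvalues of a concrete orthogonal matrix, assuming NumPy is available; the rotation matrix below is an arbitrary sample choice.</p>

```python
import numpy as np

# Sample orthogonal matrix: a rotation about the z-axis.
# The angle 0.7 is an arbitrary choice.
theta = 0.7
A = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])

# A is orthogonal: A^T A = I.
assert np.allclose(A.T @ A, np.eye(3))

# Part (a) predicts every eigenvalue has magnitude 1.
magnitudes = np.abs(np.linalg.eigvals(A))
```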
<h3>(b) Prove that $A$ has $1$ as an eigenvalue.</h3>
<p> Let $A$ be a real orthogonal $3\times 3$ matrix with $\det(A)=1$.<br />
		Let us consider the characteristic polynomial $p(t)=\det(A-tI)$ of $A$.<br />
		The roots of $p(t)$ are the eigenvalues of $A$.</p>
<p>		Since $A$ is a real $3\times 3$ matrix, the degree of the polynomial $p(t)$ is $3$ and the coefficients are real.<br />
		Thus, there are two cases to consider:</p>
<ol>
<li>there are three real eigenvalues $\alpha, \beta, \gamma$, and</li>
<li>there is one real eigenvalue $\alpha$ and a complex conjugate pair $\beta, \bar{\beta}$ of eigenvalues.</li>
</ol>
<p>		Let us first deal with case 1.<br />
		By part (a), the lengths of eigenvalues $\alpha, \beta, \gamma$ are $1$. Since they are real numbers, we have<br />
		\[\alpha=\pm 1, \beta=\pm 1, \gamma=\pm 1.\]
		Recall that the product of all eigenvalues of $A$ is the determinant of $A$.<br />
(For a proof, see the post &#8220;<a href="//yutsumura.com/determinant-trace-and-eigenvalues-of-a-matrix/" target="_blank">Determinant/trace and eigenvalues of a matrix</a>&#8220;.)<br />
		Thus we have<br />
		\[\alpha \beta \gamma=\det(A)=1.\]
		Since each eigenvalue is $\pm 1$ and their product is $1$, an even number of them must equal $-1$; hence at least one of $\alpha, \beta, \gamma$ is $1$.</p>
<p>		Next, we consider case 2. Again the lengths of eigenvalues $\alpha, \beta, \bar{\beta}$ are $1$.<br />
		Then we have<br />
		\begin{align*}<br />
		1&#038;=\det(A)=\alpha \beta \bar{\beta}\\<br />
		&#038;=\alpha |\beta|^2=\alpha.<br />
		\end{align*}</p>
<p>		Therefore, in either case, we see that $A$ has $1$ as an eigenvalue.</p>
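<p>Part (b) can also be checked numerically, assuming NumPy is available. Below we build a sample real orthogonal $3\times 3$ matrix with determinant $1$ from a QR factorization (the starting matrix is an arbitrary choice) and confirm that $1$ appears among its eigenvalues.</p>

```python
import numpy as np

# Build a real orthogonal 3x3 matrix with det(Q) = 1: QR-factor a
# fixed (arbitrary, nonsingular) matrix and, if necessary, flip one
# column so det(Q) = +1. Flipping a column negates the determinant
# but keeps orthogonality.
M = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 4.0],
              [3.0, 0.0, 1.0]])
Q, _ = np.linalg.qr(M)
if np.linalg.det(Q) < 0:
    Q[:, 0] = -Q[:, 0]

assert np.allclose(Q.T @ Q, np.eye(3))
assert np.isclose(np.linalg.det(Q), 1.0)

# Part (b) predicts that 1 is an eigenvalue of Q.
has_one = bool(np.any(np.isclose(np.linalg.eigvals(Q), 1.0)))
```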
<p>The post <a href="https://yutsumura.com/eigenvalues-of-orthogonal-matrices-have-length-1-every-3times-3-orthogonal-matrix-has-1-as-an-eigenvalue/" target="_blank">Eigenvalues of Orthogonal Matrices Have Length 1. Every $3\times 3$ Orthogonal Matrix Has 1 as an Eigenvalue</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/eigenvalues-of-orthogonal-matrices-have-length-1-every-3times-3-orthogonal-matrix-has-1-as-an-eigenvalue/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">2915</post-id>	</item>
		<item>
		<title>If Column Vectors Form an Orthonormal Set, Do Row Vectors Form an Orthonormal Set?</title>
		<link>https://yutsumura.com/if-column-vectors-from-orthonormal-set-is-row-vectors-form-orthonormal-set/</link>
				<comments>https://yutsumura.com/if-column-vectors-from-orthonormal-set-is-row-vectors-form-orthonormal-set/#respond</comments>
				<pubDate>Mon, 27 Feb 2017 04:39:22 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[Berkeley]]></category>
		<category><![CDATA[Berkeley.LA]]></category>
		<category><![CDATA[exam]]></category>
		<category><![CDATA[inverse matrix]]></category>
		<category><![CDATA[invertible]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[orthogonal matrix]]></category>
		<category><![CDATA[orthonormal set]]></category>
		<category><![CDATA[qualifying exam]]></category>
		<category><![CDATA[transpose]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=2308</guid>
				<description><![CDATA[<p>Suppose that $A$ is a real $n\times n$ matrix. (a) Is it true that $A$ must commute with its transpose? (b) Suppose that the columns of $A$ (considered as vectors) form an orthonormal set.&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/if-column-vectors-from-orthonormal-set-is-row-vectors-form-orthonormal-set/" target="_blank">If Column Vectors Form an Orthonormal Set, Do Row Vectors Form an Orthonormal Set?</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 317</h2>
<p>Suppose that $A$ is a real $n\times n$ matrix.</p>
<p><strong>(a)</strong> Is it true that $A$ must commute with its transpose?</p>
<p><strong>(b)</strong> Suppose that the columns of $A$ (considered as vectors) form an orthonormal set.<br />
Is it true that the rows of $A$ must also form an orthonormal set?</p>
<p>(<em>University of California, Berkeley, Linear Algebra Qualifying Exam</em>)</p>
<p>&nbsp;<br />
<span id="more-2308"></span></p>

<h2>Solution.</h2>
<h3>(a) Is it true that $A$ must commute with its transpose?</h3>
<p>The answer is no. </p>
<p>We give a counterexample. Let<br />
\[A=\begin{bmatrix}<br />
1 &#038; -1\\<br />
0&#038; 2<br />
\end{bmatrix}.\]
Then the transpose of $A$ is<br />
\[A^{\trans}=\begin{bmatrix}<br />
1 &#038; 0\\<br />
-1&#038; 2<br />
\end{bmatrix}.\]
We compute<br />
\[AA^{\trans}=\begin{bmatrix}<br />
1 &#038; -1\\<br />
0&#038; 2<br />
\end{bmatrix}<br />
\begin{bmatrix}<br />
1 &#038; 0\\<br />
-1&#038; 2<br />
\end{bmatrix}<br />
=<br />
\begin{bmatrix}<br />
2 &#038; -2\\<br />
-2&#038; 4<br />
\end{bmatrix},\]
and<br />
\[A^{\trans}A=<br />
\begin{bmatrix}<br />
1 &#038; 0\\<br />
-1&#038; 2<br />
\end{bmatrix}<br />
\begin{bmatrix}<br />
1 &#038; -1\\<br />
0&#038; 2<br />
\end{bmatrix}<br />
=<br />
\begin{bmatrix}<br />
1 &#038; -1\\<br />
-1&#038; 5<br />
\end{bmatrix}.<br />
\]
Therefore, we see that<br />
\[AA^{\trans}\neq A^{\trans} A,\]
that is, $A$ does not commute with its transpose $A^{\trans}$.</p>
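<p>This counterexample is easy to verify by direct computation, for instance with NumPy (assumed available):</p>

```python
import numpy as np

# The counterexample from part (a): A does not commute with A^T.
A = np.array([[1, -1],
              [0,  2]])

AAt = A @ A.T   # expected [[2, -2], [-2, 4]]
AtA = A.T @ A   # expected [[1, -1], [-1, 5]]
commutes = np.array_equal(AAt, AtA)
```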
<h3>(b) Is it true that the rows of $A$ must also form an orthonormal set?</h3>
<p>The answer is yes. </p>
<p>Note that in general the column vectors of a matrix $M$ form an orthonormal set if and only if $M^{\trans}M=I$, where $I$ is the identity matrix. (Such a matrix is called an orthogonal matrix.)</p>
<p>Thus, by assumption we have $A^{\trans} A=I$. Let $B=A^{\trans}$.<br />
Then the column vectors of $B$ are the row vectors of $A$. Hence it suffices to show that $B^{\trans}B=I$.</p>
<p>Since $A^{\trans} A=I$, we know that $A$ is invertible and its inverse is $A^{-1}=A^{\trans}$.<br />
In particular, we have $A^{\trans} A=A A^{\trans}=I$. </p>
<p>We have<br />
\begin{align*}<br />
B^{\trans}B=(A^{\trans})^{\trans}A^{\trans}=(AA^{\trans})^{\trans}=I^{\trans}=I.<br />
\end{align*}<br />
Thus, we obtain $B^{\trans}B=I$ and by the general fact stated above, the column vectors of $B$ form an orthonormal set.<br />
 Hence the row vectors of $A$ form an orthonormal set.</p>
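<p>As a numerical sanity check of part (b), assuming NumPy is available, we can take a sample matrix with orthonormal columns (the Q factor of a QR factorization of an arbitrary nonsingular matrix) and confirm its rows are orthonormal too:</p>

```python
import numpy as np

# A sample matrix whose columns form an orthonormal set: the Q factor
# of a QR factorization of an (arbitrary) nonsingular matrix.
M = np.array([[2.0, 0.0, 1.0],
              [1.0, 1.0, 0.0],
              [0.0, 1.0, 3.0]])
A, _ = np.linalg.qr(M)

cols_orthonormal = bool(np.allclose(A.T @ A, np.eye(3)))  # A^T A = I
rows_orthonormal = bool(np.allclose(A @ A.T, np.eye(3)))  # A A^T = I
```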
<p>The post <a href="https://yutsumura.com/if-column-vectors-from-orthonormal-set-is-row-vectors-form-orthonormal-set/" target="_blank">If Column Vectors Form an Orthonormal Set, Do Row Vectors Form an Orthonormal Set?</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/if-column-vectors-from-orthonormal-set-is-row-vectors-form-orthonormal-set/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">2308</post-id>	</item>
		<item>
		<title>Rotation Matrix in Space and its Determinant and Eigenvalues</title>
		<link>https://yutsumura.com/rotation-matrix-in-space-and-its-determinant-and-eigenvalues/</link>
				<comments>https://yutsumura.com/rotation-matrix-in-space-and-its-determinant-and-eigenvalues/#comments</comments>
				<pubDate>Wed, 14 Dec 2016 04:27:28 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[characteristic polynomial]]></category>
		<category><![CDATA[cofactor expansion]]></category>
		<category><![CDATA[determinant]]></category>
		<category><![CDATA[dot product]]></category>
		<category><![CDATA[eigenvalue]]></category>
		<category><![CDATA[inner product]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[matrix]]></category>
		<category><![CDATA[orthogonal matrix]]></category>
		<category><![CDATA[orthonormal]]></category>
		<category><![CDATA[orthonormal vector]]></category>
		<category><![CDATA[rotation matrix]]></category>
		<category><![CDATA[transpose]]></category>
		<category><![CDATA[transpose matrix]]></category>
		<category><![CDATA[trigonometry]]></category>
		<category><![CDATA[vector]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=1559</guid>
				<description><![CDATA[<p>For a real number $0\leq \theta \leq \pi$, we define the real $3\times 3$ matrix $A$ by \[A=\begin{bmatrix} \cos\theta &#038; -\sin\theta &#038; 0 \\ \sin\theta &#038;\cos\theta &#038;0 \\ 0 &#038; 0 &#038; 1 \end{bmatrix}.\]&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/rotation-matrix-in-space-and-its-determinant-and-eigenvalues/" target="_blank">Rotation Matrix in Space and its Determinant and Eigenvalues</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 218</h2>
<p> For a real number $0\leq \theta \leq \pi$, we define the real $3\times 3$ matrix $A$ by<br />
\[A=\begin{bmatrix}<br />
  \cos\theta &#038; -\sin\theta &#038; 0 \\<br />
   \sin\theta &#038;\cos\theta &#038;0 \\<br />
   0 &#038; 0 &#038; 1<br />
\end{bmatrix}.\]
<p><strong>(a)</strong> Find the determinant of the matrix $A$.</p>
<p><strong>(b)</strong> Show that $A$ is an orthogonal matrix.</p>
<p><strong>(c)</strong> Find the eigenvalues of $A$.</p>
<p>&nbsp;<br />
<span id="more-1559"></span></p>

<h2>Solution.</h2>
<h3> (a) The determinant of the matrix $A$</h3>
<p> By the cofactor expansion corresponding to the third row, we compute<br />
	\begin{align*}<br />
\det(A)&#038;=\begin{vmatrix}<br />
  \cos\theta &#038; -\sin\theta &#038; 0 \\<br />
   \sin\theta &#038;\cos\theta &#038;0 \\<br />
   0 &#038; 0 &#038; 1<br />
\end{vmatrix}\\<br />
&#038;=0\cdot \begin{vmatrix}<br />
  -\sin \theta &#038; 0\\<br />
  \cos \theta&#038; 0<br />
\end{vmatrix}-0\cdot \begin{vmatrix}<br />
  \cos \theta &#038; 0\\<br />
  \sin \theta&#038; 0<br />
\end{vmatrix}+1\cdot \begin{vmatrix}<br />
  \cos \theta &#038; -\sin \theta\\<br />
  \sin \theta&#038; \cos \theta<br />
\end{vmatrix}\\<br />
&#038;=\cos^2 \theta +\sin^2 \theta\\<br />
&#038;=1.<br />
\end{align*}<br />
The last step follows from the trigonometric identity<br />
\[\cos^2 \theta +\sin^2 \theta=1.\]
Thus we have<br />
\[\det(A)=1.\]
<h3>(b) The matrix $A$ is an orthogonal matrix </h3>
<p>We give two solutions for part (b).</p>
<h4> The first solution of (b)</h4>
<p>The first solution computes $A^{\trans}A$ and shows that it is the identity matrix $I$.<br />
We have<br />
\begin{align*}<br />
A^{\trans}A&#038;=\begin{bmatrix}<br />
  \cos\theta &#038; \sin\theta &#038; 0 \\<br />
   -\sin\theta &#038;\cos\theta &#038;0 \\<br />
   0 &#038; 0 &#038; 1<br />
\end{bmatrix}\begin{bmatrix}<br />
  \cos\theta &#038; -\sin\theta &#038; 0 \\<br />
   \sin\theta &#038;\cos\theta &#038;0 \\<br />
   0 &#038; 0 &#038; 1<br />
\end{bmatrix}\\<br />
&#038;=\begin{bmatrix}<br />
  \cos^2 \theta +\sin^2\theta &#038; 0 &#038; 0 \\<br />
   0 &#038;\cos^2 \theta+\sin^2 \theta &#038;0 \\<br />
   0 &#038; 0 &#038; 1<br />
\end{bmatrix}\\<br />
&#038;=\begin{bmatrix}<br />
  1 &#038; 0 &#038; 0 \\<br />
   0 &#038;1 &#038;0 \\<br />
   0 &#038; 0 &#038; 1<br />
\end{bmatrix}=I.<br />
\end{align*}<br />
Similarly, you can check that $AA^{\trans}=I$. Thus $A$ is an orthogonal matrix.</p>
<h4> The second solution of (b)</h4>
<p>The second proof uses the following fact: a matrix is orthogonal if and only if its column vectors form an orthonormal set.<br />
Let<br />
\[A_1=\begin{bmatrix}<br />
  \cos \theta \\<br />
   \sin \theta \\<br />
    0<br />
  \end{bmatrix}, A_2=\begin{bmatrix}<br />
  -\sin\theta \\<br />
   \cos \theta \\<br />
    0<br />
  \end{bmatrix}, A_3=\begin{bmatrix}<br />
  0 \\<br />
   0 \\<br />
    1<br />
  \end{bmatrix}\]
  be the column vectors of the matrix $A$. The lengths of these vectors are all $1$. For example, we have<br />
 \begin{align*}<br />
||A_1||=\sqrt{(\cos\theta)^2+(\sin \theta)^2+0^2}=\sqrt{1}=1.<br />
\end{align*}<br />
Similarly, we have $||A_2||=||A_3||=1$.<br />
The dot (inner) product of $A_1$ and $A_2$ is<br />
\begin{align*}<br />
A_1\cdot A_2=\cos \theta \cdot (-\sin \theta)+\sin \theta \cdot \cos \theta +0\cdot 0=0.<br />
\end{align*}<br />
Similarly, we have $A_1\cdot A_3=A_2\cdot A_3=0$.<br />
Therefore, the column vectors $A_1, A_2, A_3$ are orthonormal vectors. Hence by the above fact, the matrix $A$ is orthogonal.</p>
<h3>(c) The eigenvalues of $A$</h3>
<p>We compute the characteristic polynomial $p(t)=\det(A-tI)$ as follows.<br />
\begin{align*}<br />
p(t)&#038;=\det(A-tI)=\begin{vmatrix}<br />
  \cos\theta-t &#038; -\sin\theta &#038; 0 \\<br />
   \sin\theta &#038;\cos\theta -t&#038;0 \\<br />
   0 &#038; 0 &#038; 1-t<br />
\end{vmatrix}\\<br />
&#038;=(1-t)\begin{vmatrix}<br />
  \cos \theta -t &#038; -\sin \theta\\<br />
  \sin \theta&#038; \cos \theta-t<br />
\end{vmatrix} \text{ by the third row cofactor expansion}\\<br />
&#038;=(1-t)(\cos^2 \theta -2t \cos \theta +t^2 +\sin^2 \theta)\\<br />
&#038;=(1-t)(t^2-(2\cos \theta)t+1).<br />
\end{align*}</p>
<p>The eigenvalues are roots of the characteristic polynomial $p(t)$, hence we solve<br />
\[p(t)=(1-t)(t^2-(2\cos \theta)t+1)=0.\]
One solution is $t=1$. The other solutions come from the quadratic polynomial in $p(t)$.<br />
By the quadratic formula, those solutions are<br />
\begin{align*}<br />
t&#038;=\cos\theta \pm\sqrt{\cos^2 \theta -1}\\<br />
&#038;=\cos\theta \pm \sqrt{-\sin^2 \theta}\\<br />
&#038;=\cos \theta \pm i \sin \theta<br />
\end{align*}<br />
since $\sin \theta\geq 0$ for $0 \leq \theta \leq \pi$.<br />
Therefore the eigenvalues of the matrix $A$ are<br />
\[1, \cos \theta \pm i \sin \theta.\]
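<p>As a numerical sanity check (assuming NumPy is available), we can confirm the determinant from part (a) and the eigenvalues from part (c) for an arbitrary sample angle:</p>

```python
import numpy as np

# The rotation matrix A for an arbitrary angle theta in [0, pi].
theta = 0.6
A = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])

det = np.linalg.det(A)          # part (a) predicts det(A) = 1
eigvals = np.linalg.eigvals(A)

# Part (c) predicts the eigenvalues 1 and cos(theta) +/- i sin(theta).
expected = [1.0,
            complex(np.cos(theta),  np.sin(theta)),
            complex(np.cos(theta), -np.sin(theta))]
match = all(any(np.isclose(ev, ex) for ex in expected) for ev in eigvals)
```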
<h2> Related Question. </h2>
<p>The following problem treats the rotation matrix in the plane.</p>
<div style="padding: 16px; border: none 3px #4169e1; border-radius: 10px; background-color: #f0f8ff; margin-top: 30px; margin-bottom: 30px;">
<strong>Problem</strong>.<br />
Consider the $2\times 2$ matrix<br />
\[A=\begin{bmatrix}<br />
\cos \theta &amp; -\sin \theta\\<br />
\sin \theta&amp; \cos \theta \end{bmatrix},\]
where $\theta$ is a real number $0\leq \theta &lt; 2\pi$.</p>
<p>&nbsp;</p>
<p><strong>(a)</strong> Find the characteristic polynomial of the matrix $A$.</p>
<p><strong>(b)</strong> Find the eigenvalues of the matrix $A$.</p>
<p><strong>(c)</strong> Determine the eigenvectors corresponding to each of the eigenvalues of $A$.
</div>
<p>The solution is given in the post &#8628;<br />
<a href="//yutsumura.com/rotation-matrix-in-the-plane-and-its-eigenvalues-and-eigenvectors/" target="_blank">Rotation Matrix in the Plane and its Eigenvalues and Eigenvectors</a></p>
<p>The post <a href="https://yutsumura.com/rotation-matrix-in-space-and-its-determinant-and-eigenvalues/" target="_blank">Rotation Matrix in Space and its Determinant and Eigenvalues</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/rotation-matrix-in-space-and-its-determinant-and-eigenvalues/feed/</wfw:commentRss>
		<slash:comments>1</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">1559</post-id>	</item>
		<item>
		<title>Find the Inverse Matrix of a Matrix With Fractions</title>
		<link>https://yutsumura.com/find-the-inverse-matrix-of-a-matrix-with-fractions/</link>
				<comments>https://yutsumura.com/find-the-inverse-matrix-of-a-matrix-with-fractions/#respond</comments>
				<pubDate>Sat, 10 Dec 2016 04:44:13 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[augmented matrix]]></category>
		<category><![CDATA[dot product]]></category>
		<category><![CDATA[inner product]]></category>
		<category><![CDATA[inverse matrix]]></category>
		<category><![CDATA[length]]></category>
		<category><![CDATA[length of a vector]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[matrix]]></category>
		<category><![CDATA[orthogonal matrix]]></category>
		<category><![CDATA[orthonormal vector]]></category>
		<category><![CDATA[transpose]]></category>
		<category><![CDATA[transpose matrix]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=1535</guid>
				<description><![CDATA[<p>Find the inverse matrix of the matrix \[A=\begin{bmatrix} \frac{2}{7} &#038; \frac{3}{7} &#038; \frac{6}{7} \\[6 pt] \frac{6}{7} &#038;\frac{2}{7} &#038;-\frac{3}{7} \\[6pt] -\frac{3}{7} &#038; \frac{6}{7} &#038; -\frac{2}{7} \end{bmatrix}.\] &#160; Hint. You may use the augmented matrix method&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/find-the-inverse-matrix-of-a-matrix-with-fractions/" target="_blank">Find the Inverse Matrix of a Matrix With Fractions</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 214</h2>
<p>Find the inverse matrix of the matrix<br />
\[A=\begin{bmatrix}<br />
  \frac{2}{7} &#038; \frac{3}{7} &#038; \frac{6}{7} \\[6 pt]
   \frac{6}{7} &#038;\frac{2}{7} &#038;-\frac{3}{7} \\[6pt]
   -\frac{3}{7} &#038; \frac{6}{7} &#038; -\frac{2}{7}<br />
\end{bmatrix}.\]
<p>&nbsp;<br />
<span id="more-1535"></span></p>

<h2>Hint.</h2>
<p>You may use the augmented matrix method to find the inverse matrix.<br />
Here we give an alternative way to find the inverse matrix by noting that $A$ is an orthogonal matrix.</p>
<p>Recall that a matrix $B$ is orthogonal if $B^{\trans}B=BB^{\trans}=I$.<br />
Thus, once we know $B$ is an orthogonal matrix, then the inverse matrix $B^{-1}$ is just the transpose matrix $B^{\trans}$.</p>
<p>Also, recall that a matrix $B$ is orthogonal if and only if the column vectors of $B$ form an orthonormal set.</p>
<h2>Solution.</h2>
<p>	We first show that $A$ is an orthogonal matrix.<br />
	To do this, it suffices to show that the column vectors form an orthonormal set.</p>
<p>	Let<br />
	\[ \mathbf{v}_1= \begin{bmatrix}<br />
  \frac{2}{7} \\[6 pt]
   \frac{6}{7}  \\[6pt]
   -\frac{3}{7}<br />
\end{bmatrix},<br />
\mathbf{v}_2=\begin{bmatrix}<br />
  \frac{3}{7}  \\[6 pt]
  \frac{2}{7}  \\[6pt]
    \frac{6}{7} \end{bmatrix},<br />
 \mathbf{v}_3=\begin{bmatrix}<br />
  \frac{6}{7} \\[6 pt]
 -\frac{3}{7} \\[6pt]
 -\frac{2}{7}<br />
\end{bmatrix}\]
be the column vectors of $A$.</p>
<p>Then the length of the vector $\mathbf{v}_1$ is<br />
\[||\mathbf{v}_1||=\sqrt{(2/7)^2+(6/7)^2+(-3/7)^2}=1.\]
Similarly, we have $||\mathbf{v}_2||=||\mathbf{v}_3||=1$.<br />
Thus, the column vectors are unit vectors.</p>
<p>The dot (inner) product of the vectors $\mathbf{v}_1$ and $\mathbf{v}_2$ is<br />
\[\mathbf{v}_1\cdot \mathbf{v}_2=\frac{2}{7}\cdot \frac{3}{7}+\frac{6}{7}\cdot \frac{2}{7}+\left( -\frac{3}{7}\right) \cdot \frac{6}{7}=0.\]
Similarly, we have<br />
\[\mathbf{v}_1\cdot \mathbf{v}_3=0, \quad \mathbf{v}_2\cdot \mathbf{v}_3=0.\]
<p>Therefore, the column vectors are orthogonal.<br />
Hence the column vectors of $A$ are orthonormal, and this implies that $A$ is an orthogonal matrix. Namely, $A^{\trans}=A^{-1}$.<br />
Thus the inverse matrix of $A$ is<br />
\[A^{-1}=\begin{bmatrix}<br />
  \frac{2}{7} &#038; \frac{6}{7} &#038; -\frac{3}{7} \\[6 pt]
   \frac{3}{7} &#038;\frac{2}{7} &#038;\frac{6}{7} \\[6pt]
   \frac{6}{7} &#038; -\frac{3}{7} &#038; -\frac{2}{7}<br />
\end{bmatrix}.\]
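<p>This answer is easy to verify numerically, assuming NumPy is available: $A$ is orthogonal, so its inverse is its transpose.</p>

```python
import numpy as np

# The matrix from the problem (all entries are sevenths).
A = np.array([[ 2, 3,  6],
              [ 6, 2, -3],
              [-3, 6, -2]]) / 7.0

is_orthogonal = bool(np.allclose(A.T @ A, np.eye(3)))        # A^T A = I
inv_equals_transpose = bool(np.allclose(np.linalg.inv(A), A.T))
```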
<p>The post <a href="https://yutsumura.com/find-the-inverse-matrix-of-a-matrix-with-fractions/" target="_blank">Find the Inverse Matrix of a Matrix With Fractions</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/find-the-inverse-matrix-of-a-matrix-with-fractions/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">1535</post-id>	</item>
		<item>
		<title>Diagonalizable by an Orthogonal Matrix Implies a Symmetric Matrix</title>
		<link>https://yutsumura.com/diagonalizable-by-an-orthogonal-matrix-implies-a-symmetric-matrix/</link>
				<comments>https://yutsumura.com/diagonalizable-by-an-orthogonal-matrix-implies-a-symmetric-matrix/#respond</comments>
				<pubDate>Mon, 05 Dec 2016 05:56:33 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[diagonal matrix]]></category>
		<category><![CDATA[diagonalizable]]></category>
		<category><![CDATA[diagonalizable matrix]]></category>
		<category><![CDATA[diagonalization]]></category>
		<category><![CDATA[inverse matrix]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[matrix]]></category>
		<category><![CDATA[orthogonal matrix]]></category>
		<category><![CDATA[symmetric matrix]]></category>
		<category><![CDATA[transpose]]></category>
		<category><![CDATA[transpose matrix]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=1508</guid>
				<description><![CDATA[<p>Let $A$ be an $n\times n$ matrix with real number entries. Show that if $A$ is diagonalizable by an orthogonal matrix, then $A$ is a symmetric matrix. &#160; Proof. Suppose that the matrix $A$&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/diagonalizable-by-an-orthogonal-matrix-implies-a-symmetric-matrix/" target="_blank">Diagonalizable by an Orthogonal Matrix Implies a Symmetric Matrix</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 210</h2>
<p>Let $A$ be an $n\times n$ matrix with real number entries. </p>
<p>Show that if $A$ is diagonalizable by an orthogonal matrix, then $A$ is a symmetric matrix.</p>
<p>&nbsp;<br />
<span id="more-1508"></span></p>
<h2> Proof. </h2>
<p>	Suppose that the matrix $A$ is diagonalizable by an orthogonal matrix $Q$.<br />
	The orthogonality of the matrix $Q$ means that we have<br />
	\[Q^{\trans}Q=QQ^{\trans}=I, \tag{*}\]
	where $Q^{\trans}$ is the transpose matrix of $Q$ and $I$ is the $n\times n$ identity matrix.</p>
<hr />
<p>	Since $Q$ diagonalizes the matrix $A$, we have<br />
	\[Q^{-1}AQ=D,\]
	where $D$ is a diagonal matrix.<br />
	Equivalently, we have<br />
	\[A=QDQ^{-1} \tag{**}.\]
	Taking transpose of both sides, we obtain<br />
	\begin{align*}<br />
A^{\trans}&#038;=(QDQ^{-1})^{\trans}\\<br />
&#038;=(Q^{-1})^{\trans}D^{\trans} Q^{-1}\\<br />
&#038;=(Q^{-1})^{\trans}D Q^{-1} \text{ since } D \text{ is diagonal.}\tag{***}<br />
\end{align*}</p>
<hr />
<p>By (*), we observe that the inverse matrix of $Q$ is the transpose $Q^{\trans}$, that is, $Q^{-1}=Q^{\trans}$.<br />
It follows from this observation and (***) that we have<br />
\[A^{\trans}=QDQ^{-1}.\]
(Note that $(Q^{-1})^{\trans}=Q^{\trans \trans}=Q$.)</p>
<p>Comparing this with (**), we obtain<br />
\[A^{\trans}=A,\]
and hence $A$ is a symmetric matrix.</p>
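<p>As a numerical illustration (assuming NumPy is available), we can build $A=QDQ^{\trans}$ from an arbitrary sample orthogonal $Q$ and diagonal $D$ and confirm that $A$ is symmetric:</p>

```python
import numpy as np

# Build A = Q D Q^T from an (arbitrary) orthogonal Q and diagonal D.
M = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 3.0]])
Q, _ = np.linalg.qr(M)         # Q is orthogonal, so Q^{-1} = Q^T
D = np.diag([2.0, -1.0, 5.0])  # a diagonal matrix

A = Q @ D @ Q.T                # A is diagonalized by the orthogonal Q
is_symmetric = bool(np.allclose(A, A.T))
```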
<p>The post <a href="https://yutsumura.com/diagonalizable-by-an-orthogonal-matrix-implies-a-symmetric-matrix/" target="_blank">Diagonalizable by an Orthogonal Matrix Implies a Symmetric Matrix</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/diagonalizable-by-an-orthogonal-matrix-implies-a-symmetric-matrix/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">1508</post-id>	</item>
	</channel>
</rss>
