<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	
	xmlns:georss="http://www.georss.org/georss"
	xmlns:geo="http://www.w3.org/2003/01/geo/wgs84_pos#"
	>

<channel>
	<title>length of a vector &#8211; Problems in Mathematics</title>
	<atom:link href="https://yutsumura.com/tag/length-of-a-vector/feed/" rel="self" type="application/rss+xml" />
	<link>https://yutsumura.com</link>
	<description></description>
	<lastBuildDate>Thu, 08 Feb 2018 05:52:55 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=5.3.6</generator>

<image>
	<url>https://i2.wp.com/yutsumura.com/wp-content/uploads/2016/12/cropped-question-logo.jpg?fit=32%2C32&#038;ssl=1</url>
	<title>length of a vector &#8211; Problems in Mathematics</title>
	<link>https://yutsumura.com</link>
	<width>32</width>
	<height>32</height>
</image> 
<site xmlns="com-wordpress:feed-additions:1">114989322</site>	<item>
		<title>Dot Product, Lengths, and Distances of Complex Vectors</title>
		<link>https://yutsumura.com/dot-product-lengths-and-distances-of-complex-vectors/</link>
				<comments>https://yutsumura.com/dot-product-lengths-and-distances-of-complex-vectors/#respond</comments>
				<pubDate>Thu, 08 Feb 2018 05:52:55 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[complex vector]]></category>
		<category><![CDATA[distance]]></category>
		<category><![CDATA[dot product]]></category>
		<category><![CDATA[imaginary number]]></category>
		<category><![CDATA[length of a vector]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[orthogonal vectors]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=6836</guid>
				<description><![CDATA[<p>For this problem, use the complex vectors \[ \mathbf{w}_1 = \begin{bmatrix} 1 + i \\ 1 &#8211; i \\ 0 \end{bmatrix} , \, \mathbf{w}_2 = \begin{bmatrix} -i \\ 0 \\ 2 &#8211; i \end{bmatrix}&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/dot-product-lengths-and-distances-of-complex-vectors/" target="_blank">Dot Product, Lengths, and Distances of Complex Vectors</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 689</h2>
<p>For this problem, use the complex vectors<br />
\[ \mathbf{w}_1 = \begin{bmatrix} 1 + i \\ 1 &#8211; i \\ 0 \end{bmatrix} , \, \mathbf{w}_2 = \begin{bmatrix} -i \\ 0 \\ 2 &#8211; i \end{bmatrix} , \, \mathbf{w}_3 = \begin{bmatrix} 2+i \\ 1 &#8211; 3i \\ 2i \end{bmatrix} . \]
<p>Suppose $\mathbf{w}_4$ is another complex vector which is orthogonal to both $\mathbf{w}_2$ and $\mathbf{w}_3$, and satisfies $\mathbf{w}_1 \cdot \mathbf{w}_4 = 2i$ and $\| \mathbf{w}_4 \| = 3$.</p>
<p>Calculate the following expressions:</p>
<p><strong>(a)</strong> $ \mathbf{w}_1 \cdot \mathbf{w}_2 $. </p>
<p><strong>(b)</strong> $ \mathbf{w}_1 \cdot \mathbf{w}_3 $. </p>
<p><strong>(c)</strong> $((2+i)\mathbf{w}_1 &#8211; (1+i)\mathbf{w}_2 ) \cdot \mathbf{w}_4$.</p>
<p><strong>(d)</strong> $\| \mathbf{w}_1 \| , \| \mathbf{w}_2 \|$, and $\| \mathbf{w}_3 \|$.</p>
<p><strong>(e)</strong> $\| 3 \mathbf{w}_4 \|$.</p>
<p><strong>(f)</strong> What is the distance between $\mathbf{w}_2$ and $\mathbf{w}_3$?</p>
<p>&nbsp;<br />
<span id="more-6836"></span><br />

<h2>Solution.</h2>
<h3>(a) $ \mathbf{w}_1 \cdot \mathbf{w}_2 $. </h3>
<p>\[ \mathbf{w}_1 \cdot \mathbf{w}_2 = \begin{bmatrix} 1+i &#038; 1-i &#038; 0 \end{bmatrix} \begin{bmatrix} -i \\ 0 \\ 2-i \end{bmatrix} = (1+i)(-i) + 0 + 0 = 1 &#8211; i . \]
<h3>(b) $ \mathbf{w}_1 \cdot \mathbf{w}_3 $. </h3>
<p>\begin{align*} \mathbf{w}_1 \cdot \mathbf{w}_3 &#038;= \begin{bmatrix} 1+i &#038; 1-i &#038; 0 \end{bmatrix} \begin{bmatrix} 2+i \\ 1-3i \\ 2i \end{bmatrix} \\ &#038;= (1+i)(2+i) + (1-i)(1-3i) + 0 \\ &#038;= (1 + 3i) + (-2 &#8211; 4i) \\ &#038;= -1 &#8211; i . \end{align*}</p>
<h3>(c) $((2+i)\mathbf{w}_1 &#8211; (1+i)\mathbf{w}_2 ) \cdot \mathbf{w}_4$.</h3>
<p>\begin{align*} ((2+i)\mathbf{w}_1 &#8211; (1+i)\mathbf{w}_2 ) \cdot \mathbf{w}_4 &#038;= (2+i)( \mathbf{w}_1 \cdot \mathbf{w}_4) &#8211; (1+i) ( \mathbf{w}_2 \cdot \mathbf{w}_4 ) \\<br />
&#038;= (2+i) ( 2i ) &#8211; (1+i)(0) \\<br />
&#038;= -2 + 4i \end{align*}</p>
<p>Note that $\mathbf{w}_2 \cdot \mathbf{w}_4=0$ because these vectors are orthogonal.</p>
<h3>(d) $\| \mathbf{w}_1 \| , \| \mathbf{w}_2 \|$, and $\| \mathbf{w}_3 \|$.</h3>
<p>For an arbitrary complex vector $\mathbf{v}$, its length is defined to be<br />
\[ \| \mathbf{v} \| = \sqrt{ \overline{\mathbf{v}}^\trans \mathbf{v} } . \]
<p>Thus,<br />
\[ \| \mathbf{w}_1 \| \, = \, \sqrt{ (1-i)(1+i) + (1+i)(1-i) + 0 } = \sqrt{ 2 + 2} = \sqrt{4} = 2 , \]
\[ \| \mathbf{w}_2 \| \, = \, \sqrt{ (i)(-i) + 0 + (2+i)(2-i) } = \sqrt{1 + 5} = \sqrt{6} , \]
\[ \| \mathbf{w}_3 \| \, = \, \sqrt{ (2-i)(2+i) + (1+3i)(1-3i) + (-2i)(2i) } = \sqrt{ 5 + 10 + 4} = \sqrt{19} . \]
<h3>(e) $\| 3 \mathbf{w}_4 \|$.</h3>
<p>$ \| 3 \mathbf{w}_4 \| = 3 \| \mathbf{w}_4 \|  = 3\cdot 3=9 $ .</p>
<h3>(f) What is the distance between $\mathbf{w}_2$ and $\mathbf{w}_3$?</h3>
<p>The distance between these vectors is given by $\| \mathbf{w}_2 &#8211; \mathbf{w}_3 \|$.  First we calculate this difference:<br />
\[ \mathbf{w}_2 &#8211; \mathbf{w}_3 \, = \, \begin{bmatrix} -i \\ 0 \\ 2 &#8211; i \end{bmatrix} &#8211; \begin{bmatrix} 2+i \\ 1 &#8211; 3i \\ 2i \end{bmatrix} \, = \, \begin{bmatrix} -2 &#8211; 2i \\ -1 + 3i \\ 2 &#8211; 3i \end{bmatrix} . \]
<p>Now the length of the complex vector is defined to be<br />
	\begin{align*}<br />
	\| \mathbf{w}_2 &#8211; \mathbf{w}_3 \| &#038;= \sqrt{ \left( \overline{ \mathbf{w}_2 &#8211; \mathbf{w}_3 } \right)^{\trans} \left(  \mathbf{w}_2 &#8211; \mathbf{w}_3 \right) } \\[6pt]
	&#038;= \sqrt{ \begin{bmatrix} -2 + 2i &#038; -1 &#8211; 3i &#038; 2 + 3i \end{bmatrix} \begin{bmatrix} -2 &#8211; 2i \\ -1 + 3i \\ 2 &#8211; 3i \end{bmatrix} } \\[6pt]
	&#038;= \sqrt{ (-2+2i)(-2-2i) + (-1-3i)(-1+3i) + (2+3i)(2-3i) } \\[6pt]
	&#038;= \sqrt{ 8 + 10 + 13 } \\[6pt]
	&#038;= \sqrt{ 31} \end{align*}</p>
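<p>The computations above can be checked numerically; the following is a minimal NumPy sketch (not part of the original solution), where the unconjugated products in parts (a) and (b) mirror the row-times-column computation used in the solution.</p>

```python
import numpy as np

# Complex vectors from the problem.
w1 = np.array([1 + 1j, 1 - 1j, 0])
w2 = np.array([-1j, 0, 2 - 1j])
w3 = np.array([2 + 1j, 1 - 3j, 2j])

# (a), (b): entrywise products without conjugation, as in the solution.
assert np.isclose(w1 @ w2, 1 - 1j)
assert np.isclose(w1 @ w3, -1 - 1j)

# (d): lengths conjugate one factor; np.linalg.norm does this for complex input.
assert np.isclose(np.linalg.norm(w1), 2)
assert np.isclose(np.linalg.norm(w2), np.sqrt(6))
assert np.isclose(np.linalg.norm(w3), np.sqrt(19))

# (f): distance between w2 and w3.
assert np.isclose(np.linalg.norm(w2 - w3), np.sqrt(31))
```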
<button class="simplefavorite-button has-count" data-postid="6836" data-siteid="1" data-groupid="1" data-favoritecount="10" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">10</span></button><p>The post <a href="https://yutsumura.com/dot-product-lengths-and-distances-of-complex-vectors/" target="_blank">Dot Product, Lengths, and Distances of Complex Vectors</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/dot-product-lengths-and-distances-of-complex-vectors/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">6836</post-id>	</item>
		<item>
		<title>Inner Products, Lengths, and Distances of 3-Dimensional Real Vectors</title>
		<link>https://yutsumura.com/inner-products-lengths-and-distances-of-3-dimensional-real-vectors/</link>
				<comments>https://yutsumura.com/inner-products-lengths-and-distances-of-3-dimensional-real-vectors/#respond</comments>
				<pubDate>Tue, 06 Feb 2018 04:39:24 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[distance]]></category>
		<category><![CDATA[dot product]]></category>
		<category><![CDATA[inner product]]></category>
		<category><![CDATA[length]]></category>
		<category><![CDATA[length of a vector]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[magnitude of a vector]]></category>
		<category><![CDATA[norm of a vector]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=6823</guid>
				<description><![CDATA[<p>For this problem, use the real vectors \[ \mathbf{v}_1 = \begin{bmatrix} -1 \\ 0 \\ 2 \end{bmatrix} , \mathbf{v}_2 = \begin{bmatrix} 0 \\ 2 \\ -3 \end{bmatrix} , \mathbf{v}_3 = \begin{bmatrix} 2 \\ 2&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/inner-products-lengths-and-distances-of-3-dimensional-real-vectors/" target="_blank">Inner Products, Lengths, and Distances of 3-Dimensional Real Vectors</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 687</h2>
<p>For this problem, use the real vectors<br />
\[ \mathbf{v}_1 = \begin{bmatrix} -1 \\ 0 \\ 2 \end{bmatrix} , \mathbf{v}_2 = \begin{bmatrix} 0 \\ 2 \\ -3 \end{bmatrix} , \mathbf{v}_3 = \begin{bmatrix} 2 \\ 2 \\ 3 \end{bmatrix} . \]
Suppose that $\mathbf{v}_4$ is another vector which is orthogonal to $\mathbf{v}_1$ and $\mathbf{v}_3$, and satisfies<br />
\[ \mathbf{v}_2 \cdot \mathbf{v}_4 = -3 . \]
<p>Calculate the following expressions:</p>
<p><strong>(a)</strong> $\mathbf{v}_1 \cdot \mathbf{v}_2 $. </p>
<p><strong>(b)</strong> $\mathbf{v}_3 \cdot \mathbf{v}_4$. </p>
<p><strong>(c)</strong> $( 2 \mathbf{v}_1 + 3 \mathbf{v}_2 &#8211; \mathbf{v}_3 ) \cdot \mathbf{v}_4 $.</p>
<p><strong>(d)</strong> $\| \mathbf{v}_1 \| , \, \| \mathbf{v}_2 \| , \, \| \mathbf{v}_3 \| $.</p>
<p><strong>(e)</strong> What is the distance between $\mathbf{v}_1$ and $\mathbf{v}_2$?</p>
<p>&nbsp;<br />
<span id="more-6823"></span><br />

<h2>Solution.</h2>
<h3>(a) $\mathbf{v}_1 \cdot \mathbf{v}_2 $. </h3>
<p>\[ \mathbf{v}_1 \cdot \mathbf{v}_2 = \begin{bmatrix} -1 &#038; 0 &#038; 2 \end{bmatrix} \begin{bmatrix} 0 \\ 2 \\ -3 \end{bmatrix} = -6 . \]
<h3>(b) $\mathbf{v}_3 \cdot \mathbf{v}_4$.</h3>
<p>We are given that $\mathbf{v}_3$ and $\mathbf{v}_4$ are orthogonal vectors, thus<br />
\[ \mathbf{v}_3 \cdot \mathbf{v}_4 = 0 . \]
<h3>(c) $( 2 \mathbf{v}_1 + 3 \mathbf{v}_2 &#8211; \mathbf{v}_3 ) \cdot \mathbf{v}_4 $.</h3>
<p>First, distribute the dot product over the sum:<br />
\[ ( 2 \mathbf{v}_1 + 3 \mathbf{v}_2 &#8211; \mathbf{v}_3 ) \cdot \mathbf{v}_4  = 2 \mathbf{v}_1 \cdot \mathbf{v}_4  + 3 \mathbf{v}_2 \cdot \mathbf{v}_4  &#8211; \mathbf{v}_3 \cdot \mathbf{v}_4  . \]
<p>Next we use the given value for $\mathbf{v}_2 \cdot \mathbf{v}_4$, along with the given facts that $\mathbf{v}_4$ is orthogonal to both $\mathbf{v}_1$ and $\mathbf{v}_3$:<br />
\begin{align*}<br />
&#038;2 \mathbf{v}_1 \cdot \mathbf{v}_4  + 3 \mathbf{v}_2 \cdot \mathbf{v}_4  &#8211; \mathbf{v}_3 \cdot \mathbf{v}_4  \\<br />
&#038;=2\cdot 0 +3 \cdot (-3)-0 =-9.<br />
\end{align*}</p>
<h3>(d) $\| \mathbf{v}_1 \| , \, \| \mathbf{v}_2 \| , \, \| \mathbf{v}_3 \| $.</h3>
<p>The length of a general vector $\mathbf{w}$ is $\|\mathbf{w}\|:=\sqrt{ \mathbf{w}^{\trans} \mathbf{w} }$.  Thus,<br />
\[ \| \mathbf{v}_1 \| \, = \, \sqrt{ (-1)^2+0^2+2^2} \, = \, \sqrt{5} , \]
\[ \| \mathbf{v}_2 \| \, = \, \sqrt{ 0^2+2^2+(-3)^2} \, = \, \sqrt{13} , \]
\[ \| \mathbf{v}_3 \| \, = \, \sqrt{ 2^2+2^2+3^2 } \, = \, \sqrt{17} . \]
<h3>(e) What is the distance between $\mathbf{v}_1$ and $\mathbf{v}_2$?</h3>
<p>The distance between the two vectors is defined to be $ \| \mathbf{v}_1 &#8211; \mathbf{v}_2 \| $.  First we calculate<br />
\[ \mathbf{v}_1 &#8211; \mathbf{v}_2 \, = \, \begin{bmatrix} -1 \\ 0 \\ 2 \end{bmatrix} &#8211; \begin{bmatrix} 0 \\ 2 \\ -3 \end{bmatrix} \, = \, \begin{bmatrix} -1 \\ -2 \\ 5 \end{bmatrix} . \]
<p>Thus,<br />
\[ \| \mathbf{v}_1 &#8211; \mathbf{v}_2 \| = \sqrt{ 1 + 4 + 25 } = \sqrt{30} . \]
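<p>As a quick numerical check (a sketch, not part of the original solution), the dot products, lengths, and distance above can be reproduced with NumPy:</p>

```python
import numpy as np

# Real vectors from the problem.
v1 = np.array([-1, 0, 2])
v2 = np.array([0, 2, -3])
v3 = np.array([2, 2, 3])

assert v1 @ v2 == -6                                  # part (a)
assert np.isclose(np.linalg.norm(v1), np.sqrt(5))     # part (d)
assert np.isclose(np.linalg.norm(v2), np.sqrt(13))
assert np.isclose(np.linalg.norm(v3), np.sqrt(17))
assert np.isclose(np.linalg.norm(v1 - v2), np.sqrt(30))  # part (e)
```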
<button class="simplefavorite-button has-count" data-postid="6823" data-siteid="1" data-groupid="1" data-favoritecount="18" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">18</span></button><p>The post <a href="https://yutsumura.com/inner-products-lengths-and-distances-of-3-dimensional-real-vectors/" target="_blank">Inner Products, Lengths, and Distances of 3-Dimensional Real Vectors</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/inner-products-lengths-and-distances-of-3-dimensional-real-vectors/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">6823</post-id>	</item>
		<item>
		<title>The Length of a Vector is Zero if and only if the Vector is the Zero Vector</title>
		<link>https://yutsumura.com/the-length-of-a-vector-is-zero-if-and-only-if-the-vector-is-the-zero-vector/</link>
				<comments>https://yutsumura.com/the-length-of-a-vector-is-zero-if-and-only-if-the-vector-is-the-zero-vector/#respond</comments>
				<pubDate>Mon, 25 Dec 2017 01:56:44 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[length of a vector]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[transpose]]></category>
		<category><![CDATA[transpose of a vector]]></category>
		<category><![CDATA[zero vector]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=6297</guid>
				<description><![CDATA[<p>Let $\mathbf{v}$ be an $n \times 1$ column vector. Prove that $\mathbf{v}^\trans \mathbf{v} = 0$ if and only if $\mathbf{v}$ is the zero vector $\mathbf{0}$. &#160; Proof. Let $\mathbf{v} = \begin{bmatrix} v_1 \\ v_2&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/the-length-of-a-vector-is-zero-if-and-only-if-the-vector-is-the-zero-vector/" target="_blank">The Length of a Vector is Zero if and only if the Vector is the Zero Vector</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 639</h2>
<p>Let $\mathbf{v}$ be an $n \times 1$ column vector.  </p>
<p>Prove that $\mathbf{v}^\trans \mathbf{v} = 0$ if and only if $\mathbf{v}$ is the zero vector $\mathbf{0}$.</p>
<p>&nbsp;<br />
<span id="more-6297"></span></p>
<h2> Proof. </h2>
<p>	Let $\mathbf{v} = \begin{bmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{bmatrix} $. </p>
<p>Then we have<br />
	\[\mathbf{v}^\trans \mathbf{v} = \begin{bmatrix} v_1 &#038; v_2 &#038; \cdots &#038; v_n \end{bmatrix} \begin{bmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{bmatrix} = \sum_{i=1}^n v_i^2 . \]
	Because each $v_i^2$ is non-negative, this sum is $0$ if and only if $v_i = 0$ for each $i$.  In this case, $\mathbf{v}$ is the zero vector.</p>
<h2>Comment.</h2>
<p>Recall that the <strong>length</strong> of the vector $\mathbf{v}\in \R^n$ is defined to be<br />
\[\|\mathbf{v}\| :=\sqrt{\mathbf{v}^{\trans} \mathbf{v}}.\]
<p>The problem implies that the length of a vector is $0$ if and only if the vector is the zero vector.</p>
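<p>The equivalence can be illustrated numerically; here is a minimal NumPy sketch (not part of the original post) contrasting a nonzero vector with the zero vector:</p>

```python
import numpy as np

# v^T v is the sum of squared entries, so it is positive for any
# nonzero vector and zero only for the zero vector.
v = np.array([3.0, -1.0, 2.0])
z = np.zeros(3)

assert v @ v > 0    # nonzero vector: positive squared length
assert z @ z == 0   # zero vector: v^T v = 0
```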
<button class="simplefavorite-button has-count" data-postid="6297" data-siteid="1" data-groupid="1" data-favoritecount="14" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">14</span></button><p>The post <a href="https://yutsumura.com/the-length-of-a-vector-is-zero-if-and-only-if-the-vector-is-the-zero-vector/" target="_blank">The Length of a Vector is Zero if and only if the Vector is the Zero Vector</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/the-length-of-a-vector-is-zero-if-and-only-if-the-vector-is-the-zero-vector/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">6297</post-id>	</item>
		<item>
		<title>7 Problems on Skew-Symmetric Matrices</title>
		<link>https://yutsumura.com/7-problems-on-skew-symmetric-matrices/</link>
				<comments>https://yutsumura.com/7-problems-on-skew-symmetric-matrices/#respond</comments>
				<pubDate>Fri, 15 Sep 2017 04:21:05 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[hermitian matrix]]></category>
		<category><![CDATA[length of a vector]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[skew-symmetric matrix]]></category>
		<category><![CDATA[symmetric matrix]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=4904</guid>
				<description><![CDATA[<p>Let $A$ and $B$ be $n\times n$ skew-symmetric matrices. Namely $A^{\trans}=-A$ and $B^{\trans}=-B$. (a) Prove that $A+B$ is skew-symmetric. (b) Prove that $cA$ is skew-symmetric for any scalar $c$. (c) Let $P$ be an&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/7-problems-on-skew-symmetric-matrices/" target="_blank">7 Problems on Skew-Symmetric Matrices</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 564</h2>
<p>		Let $A$ and $B$ be $n\times n$ skew-symmetric matrices. Namely $A^{\trans}=-A$ and $B^{\trans}=-B$.</p>
<p><strong>(a)</strong> Prove that $A+B$ is skew-symmetric.</p>
<p><strong>(b)</strong> Prove that $cA$ is skew-symmetric for any scalar $c$.</p>
<p><strong>(c)</strong> Let $P$ be an $n\times m$ matrix. Prove that $P^{\trans}AP$ is skew-symmetric.</p>
<p><strong>(d)</strong> Suppose that $A$ is real skew-symmetric. Prove that $iA$ is an Hermitian matrix.</p>
<p><strong>(e)</strong> Prove that if $AB=-BA$, then $AB$ is a skew-symmetric matrix.</p>
<p><strong>(f)</strong> Let $\mathbf{v}$ be an $n$-dimensional column vector. Prove that $\mathbf{v}^{\trans}A\mathbf{v}=0$.</p>
<p><strong>(g)</strong> Suppose that $A$ is a real skew-symmetric matrix and $A^2\mathbf{v}=\mathbf{0}$ for some vector $\mathbf{v}\in \R^n$. Then prove that $A\mathbf{v}=\mathbf{0}$.</p>
<p>&nbsp;<br />
<span id="more-4904"></span><br />

<h2> Proof. </h2>
<h3>(a) Prove that $A+B$ is skew-symmetric.</h3>
<p> We have<br />
		\begin{align*}<br />
		(A+B)^{\trans}=A^{\trans}+B^{\trans}=(-A)+(-B)=-(A+B).<br />
		\end{align*}<br />
		Hence $A+B$ is skew-symmetric.</p>
<h3>(b) Prove that $cA$ is skew-symmetric for any scalar $c$.</h3>
<p> We compute<br />
		\begin{align*}<br />
		(cA)^{\trans}=cA^{\trans}=c(-A)=-cA.<br />
		\end{align*}<br />
		Thus, $cA$ is skew-symmetric.</p>
<h3>(c) Let $P$ be an $n\times m$ matrix. Prove that $P^{\trans}AP$ is skew-symmetric.</h3>
<p> Using the properties of transpose, we have<br />
		\begin{align*}<br />
		(P^{\trans}AP)^{\trans}&#038;=P^{\trans}A^{\trans}(P^{\trans})^{\trans}=P^{\trans}A^{\trans}P\\<br />
		&#038;=P^{\trans}(-A)P=-(P^{\trans}AP).<br />
		\end{align*}<br />
		This implies that $P^{\trans}AP$ is skew-symmetric.</p>
<h3>(d) Suppose that $A$ is real skew-symmetric. Prove that $iA$ is an Hermitian matrix.</h3>
<p>Note that since $A$ is real, we have $\bar{A}=A$.<br />
		Then we have<br />
		\begin{align*}<br />
		(\overline{iA})^{\trans}=(\bar{i}\bar{A})^{\trans}=(-iA)^{\trans}=(-i)A^{\trans}=(-i)(-A)=iA.<br />
		\end{align*}<br />
		It follows that $iA$ is Hermitian.</p>
<h3>(e) Prove that if $AB=-BA$, then $AB$ is a skew-symmetric matrix.</h3>
<p>We calculate<br />
		\begin{align*}<br />
		(AB)^{\trans}&#038;=B^{\trans}A^{\trans}=(-B)(-A)\\<br />
		&#038;=BA=-AB,<br />
		\end{align*}<br />
		where the last step follows from the assumption $AB=-BA$.<br />
		This proves that $AB$ is skew-symmetric.</p>
<h3>(f) Let $\mathbf{v}$ be an $n$-dimensional column vector. Prove that $\mathbf{v}^{\trans}A\mathbf{v}=0$.</h3>
<p> Observe that $\mathbf{v}^{\trans}A\mathbf{v}$ is a $1\times 1$ matrix, or just a number.<br />
		So we have<br />
		\begin{align*}<br />
		\mathbf{v}^{\trans}A\mathbf{v}&#038;=(\mathbf{v}^{\trans}A\mathbf{v})^{\trans}=\mathbf{v}^{\trans}A^{\trans}(\mathbf{v}^{\trans})^{\trans}\\<br />
		&#038;=\mathbf{v}^{\trans}A^{\trans}\mathbf{v}=\mathbf{v}^{\trans}(-A)\mathbf{v}=-(\mathbf{v}^{\trans}A\mathbf{v}).<br />
		\end{align*}<br />
		This yields that $2\mathbf{v}^{\trans}A\mathbf{v}=0$, and hence $\mathbf{v}^{\trans}A\mathbf{v}=0$.</p>
<h3>(g) Suppose that $A$ is a real skew-symmetric matrix and $A^2\mathbf{v}=\mathbf{0}$ for some vector $\mathbf{v}\in \R^n$. Then prove that $A\mathbf{v}=\mathbf{0}$.</h3>
<p>Let us compute the squared length of the vector $A\mathbf{v}$.<br />
		We have<br />
		\begin{align*}<br />
		\|A\mathbf{v}\|^2&#038;=(A\mathbf{v})^{\trans}(A\mathbf{v})=\mathbf{v}^{\trans}A^{\trans}A\mathbf{v}\\<br />
		&#038;=\mathbf{v}^{\trans}(-A)A\mathbf{v}=-\mathbf{v}^{\trans}A^2\mathbf{v}\\<br />
		&#038;=-\mathbf{v}^{\trans}\mathbf{0} &#038;&#038;\text{by assumption}\\<br />
		&#038;=0.<br />
		\end{align*}<br />
		Since $\|A\mathbf{v}\|^2=0$, the length $\|A\mathbf{v}\|$ is also $0$, and we conclude that $A\mathbf{v}=\mathbf{0}$.</p>
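<p>The skew-symmetry identities proved above can be spot-checked numerically. The sketch below (not part of the original post) builds a random real skew-symmetric matrix as $M-M^{\trans}$ and verifies parts (a), (b), (d), and (f):</p>

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
A = M - M.T                       # real skew-symmetric: A^T = -A
N = rng.standard_normal((4, 4))
B = N - N.T

assert np.allclose(A.T, -A)
assert np.allclose((A + B).T, -(A + B))          # part (a): sum is skew-symmetric
assert np.allclose((3 * A).T, -(3 * A))          # part (b): scalar multiple is too

# part (d): iA is Hermitian, i.e. equal to its conjugate transpose.
assert np.allclose((1j * A).conj().T, 1j * A)

# part (f): v^T A v = 0 for every vector v.
v = rng.standard_normal(4)
assert np.isclose(v @ A @ v, 0)
```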
<button class="simplefavorite-button has-count" data-postid="4904" data-siteid="1" data-groupid="1" data-favoritecount="25" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">25</span></button><p>The post <a href="https://yutsumura.com/7-problems-on-skew-symmetric-matrices/" target="_blank">7 Problems on Skew-Symmetric Matrices</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/7-problems-on-skew-symmetric-matrices/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">4904</post-id>	</item>
		<item>
		<title>The Sum of Cosine Squared in an Inner Product Space</title>
		<link>https://yutsumura.com/the-sum-of-cosine-squared-in-an-inner-product-space/</link>
				<comments>https://yutsumura.com/the-sum-of-cosine-squared-in-an-inner-product-space/#respond</comments>
				<pubDate>Wed, 30 Aug 2017 03:50:11 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[cosine]]></category>
		<category><![CDATA[inner product]]></category>
		<category><![CDATA[inner product space]]></category>
		<category><![CDATA[length of a vector]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[vector space]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=4770</guid>
				<description><![CDATA[<p>Let $\mathbf{v}$ be a vector in an inner product space $V$ over $\R$. Suppose that $\{\mathbf{u}_1, \dots, \mathbf{u}_n\}$ is an orthonormal basis of $V$. Let $\theta_i$ be the angle between $\mathbf{v}$ and $\mathbf{u}_i$ for&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/the-sum-of-cosine-squared-in-an-inner-product-space/" target="_blank">The Sum of Cosine Squared in an Inner Product Space</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 551</h2>
<p>	Let $\mathbf{v}$ be a vector in an inner product space $V$ over $\R$.<br />
	Suppose that $\{\mathbf{u}_1, \dots, \mathbf{u}_n\}$ is an orthonormal basis of $V$.<br />
	Let $\theta_i$ be the angle between $\mathbf{v}$ and $\mathbf{u}_i$ for $i=1,\dots, n$.</p>
<p>	Prove that<br />
	\[\cos ^2\theta_1+\cdots+\cos^2 \theta_n=1.\]
<p>&nbsp;<br />
<span id="more-4770"></span><br />

<h2>Definition (Angle between Vectors).</h2>
<p>Let $\langle\mathbf{a}, \mathbf{b}\rangle$ denote the inner product of vectors $\mathbf{a}$ and $\mathbf{b}$ in $V$.</p>
<p>		Recall that the angle $\theta$ between $\mathbf{a}$ and $\mathbf{b}$ is defined as the unique number $\theta$ between $0$ and $\pi$ satisfying<br />
		\[\cos \theta=\frac{\langle\mathbf{a}, \mathbf{b}\rangle}{\|\mathbf{a}\| \|\mathbf{b}\|}.\]
<h2> Proof. </h2>
<p>		Express the vector $\mathbf{v}$ as a linear combination of the basis vectors as<br />
		\[\mathbf{v}=a_1\mathbf{u}_1+\dots+a_n\mathbf{u}_n\]
		for some real numbers $a_1, \dots, a_n$.</p>
<p>		Since the basis $\{\mathbf{u}_1, \dots, \mathbf{u}_n\}$ is orthonormal, the length of the vector $\mathbf{v}$ is given by<br />
		\[\|\mathbf{v}\|=\sqrt{a_1^2+\cdots+a_n^2}. \tag{*}\]
<hr />
<p>		For each $i$, we have using the properties of the inner product<br />
		\begin{align*}<br />
		\langle \mathbf{v}, \mathbf{u}_i\rangle&#038;=\langle a_1\mathbf{u}_1+\dots+a_n\mathbf{u}_n, \mathbf{u}_i\rangle\\<br />
		&#038;=a_1\langle\mathbf{u}_1, \mathbf{u}_i\rangle+\cdots +a_n \langle\mathbf{u}_n, \mathbf{u}_i \rangle\\<br />
		&#038;=a_i \tag{**}<br />
		\end{align*}<br />
		since $\langle\mathbf{u}_i, \mathbf{u}_i\rangle=1$ and $\langle\mathbf{u}_j, \mathbf{u}_i\rangle=0$ if $j\neq i$ as $\{\mathbf{u}_1, \dots, \mathbf{u}_n\}$ is orthonormal.</p>
<hr />
<p>		By definition of the angle, we have<br />
		\begin{align*}<br />
		\cos \theta_i&#038;=\frac{\langle\mathbf{v}, \mathbf{u}_i\rangle}{\|\mathbf{v}\| \|\mathbf{u}_i\|}=\frac{\langle\mathbf{v}, \mathbf{u}_i\rangle}{\|\mathbf{v}\| } &#038;&#038; \text{since $\|\mathbf{u}_i\|=1$.}<br />
		\end{align*}<br />
		It follows that<br />
		\begin{align*}<br />
		\cos ^2\theta_1+\cdots+\cos^2 \theta_n &#038;=\frac{\langle\mathbf{v}, \mathbf{u}_1\rangle^2}{\|\mathbf{v}\|^2 }+\cdots+\frac{\langle\mathbf{v}, \mathbf{u}_n\rangle^2}{\|\mathbf{v}\|^2 }\\[6pt]
		&#038;=\frac{1}{\|\mathbf{v}\|^2}(a_1^2+\cdots + a_n^2) &#038;&#038;\text{by (**)}\\[6pt]
		&#038;=\frac{1}{\|\mathbf{v}\|^2}\cdot \|\mathbf{v}\|^2 &#038;&#038;\text{by (*)}\\[6pt]
		&#038;=1.<br />
		\end{align*}</p>
<p>		Thus we obtain<br />
		\[\cos ^2\theta_1+\cdots+\cos^2 \theta_n=1\]
		as required.</p>
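<p>The identity can be confirmed numerically. In the sketch below (an illustration, not part of the original post), an orthonormal basis of $\R^4$ is obtained from the QR factorization of a random matrix, and the cosines are computed exactly as in the definition above:</p>

```python
import numpy as np

rng = np.random.default_rng(1)
# Columns of U form an orthonormal basis of R^4.
U, _ = np.linalg.qr(rng.standard_normal((4, 4)))

v = rng.standard_normal(4)
# cos(theta_i) = <v, u_i> / (||v|| ||u_i||), and each ||u_i|| = 1.
cosines = (U.T @ v) / np.linalg.norm(v)

# The squared cosines sum to 1.
assert np.isclose(np.sum(cosines ** 2), 1)
```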
<button class="simplefavorite-button has-count" data-postid="4770" data-siteid="1" data-groupid="1" data-favoritecount="26" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">26</span></button><p>The post <a href="https://yutsumura.com/the-sum-of-cosine-squared-in-an-inner-product-space/" target="_blank">The Sum of Cosine Squared in an Inner Product Space</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/the-sum-of-cosine-squared-in-an-inner-product-space/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">4770</post-id>	</item>
		<item>
		<title>Unit Vectors and Idempotent Matrices</title>
		<link>https://yutsumura.com/unit-vectors-and-idempotent-matrices/</link>
				<comments>https://yutsumura.com/unit-vectors-and-idempotent-matrices/#comments</comments>
				<pubDate>Wed, 02 Aug 2017 15:41:12 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[dot product]]></category>
		<category><![CDATA[eigenspace]]></category>
		<category><![CDATA[eigenvalue]]></category>
		<category><![CDATA[eigenvector]]></category>
		<category><![CDATA[idempotent]]></category>
		<category><![CDATA[idempotent matrix]]></category>
		<category><![CDATA[inner product]]></category>
		<category><![CDATA[length of a vector]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[orthogonal vector]]></category>
		<category><![CDATA[unit vector]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=4295</guid>
				<description><![CDATA[<p>A square matrix $A$ is called idempotent if $A^2=A$. (a) Let $\mathbf{u}$ be a vector in $\R^n$ with length $1$. Define the matrix $P$ to be $P=\mathbf{u}\mathbf{u}^{\trans}$. Prove that $P$ is an idempotent matrix.&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/unit-vectors-and-idempotent-matrices/" target="_blank">Unit Vectors and Idempotent Matrices</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 527</h2>
<p>	A square matrix $A$ is called <strong>idempotent</strong> if $A^2=A$.</p>
<hr />
<p><strong>(a)</strong> Let $\mathbf{u}$ be a vector in $\R^n$ with length $1$.<br />
	Define the matrix $P$ to be $P=\mathbf{u}\mathbf{u}^{\trans}$.</p>
<p>	Prove that $P$ is an idempotent matrix.</p>
<hr />
<p><strong>(b)</strong> Suppose that $\mathbf{u}$ and $\mathbf{v}$ are unit vectors in $\R^n$ such that $\mathbf{u}$ and $\mathbf{v}$ are orthogonal.<br />
	Let $Q=\mathbf{u}\mathbf{u}^{\trans}+\mathbf{v}\mathbf{v}^{\trans}$.</p>
<p>	Prove that $Q$ is an idempotent matrix.</p>
<hr />
<p><strong>(c)</strong> Prove that each nonzero vector of the form $a\mathbf{u}+b\mathbf{v}$ for some $a, b\in \R$ is an eigenvector corresponding to the eigenvalue $1$ for the matrix $Q$ in part (b).</p>
<p>&nbsp;<br />
<span id="more-4295"></span><br />

<h2> Proof. </h2>
<h3>(a) Prove that $P=\mathbf{u}\mathbf{u}^{\trans}$ is an idempotent matrix.</h3>
<p>The length of the vector $\mathbf{u}$ is given by<br />
			\[\|\mathbf{u}\|=\sqrt{\mathbf{u}^{\trans}\mathbf{u}}.\]
			Since $\|\mathbf{u}\|=1$ by assumption, it yields that<br />
			\[\mathbf{u}^{\trans}\mathbf{u}=1.\]
<p>			Let us compute $P^2$ using the associative properties of matrix multiplication.<br />
			We have<br />
			\begin{align*}<br />
		P^2&#038;=(\mathbf{u}\mathbf{u}^{\trans})(\mathbf{u}\mathbf{u}^{\trans})\\<br />
		&#038;=\mathbf{u}(\mathbf{u}^{\trans}\mathbf{u})\mathbf{u}^{\trans}\\<br />
		&#038;=\mathbf{u}( 1 ) \mathbf{u}^{\trans}=\mathbf{u}\mathbf{u}^{\trans}=P.<br />
		\end{align*}</p>
<p>		Thus, we have obtained $P^2=P$, and hence $P$ is an idempotent matrix.</p>
<h3>(b) Prove that $Q=\mathbf{u}\mathbf{u}^{\trans}+\mathbf{v}\mathbf{v}^{\trans}$ is an idempotent matrix</h3>
<p>Since $\mathbf{u}$ and $\mathbf{v}$ are unit vectors, we have as in part (a)<br />
			\[\mathbf{u}^{\trans}\mathbf{u}=1 \text{ and  } \mathbf{v}^{\trans}\mathbf{v}=1.\]
			Since $\mathbf{u}$ and $\mathbf{v}$ are orthogonal, their dot (inner) product is $0$.<br />
			Thus we have<br />
			\[\mathbf{u}^{\trans}\mathbf{v}=\mathbf{v}^{\trans}\mathbf{u}=0.\]
<p>			Using these identities, we compute $Q^2$ as follows.<br />
			We have<br />
			\begin{align*}<br />
		Q^2&#038;=(\mathbf{u}\mathbf{u}^{\trans}+\mathbf{v}\mathbf{v}^{\trans})(\mathbf{u}\mathbf{u}^{\trans}+\mathbf{v}\mathbf{v}^{\trans})\\<br />
		&#038;=\mathbf{u}\mathbf{u}^{\trans}(\mathbf{u}\mathbf{u}^{\trans}+\mathbf{v}\mathbf{v}^{\trans})<br />
		+\mathbf{v}\mathbf{v}^{\trans}(\mathbf{u}\mathbf{u}^{\trans}+\mathbf{v}\mathbf{v}^{\trans})\\<br />
		&#038;=\mathbf{u}\underbrace{\mathbf{u}^{\trans}\mathbf{u}}_{1}\mathbf{u}^{\trans}+\mathbf{u}\underbrace{\mathbf{u}^{\trans}\mathbf{v}}_{0}\mathbf{v}^{\trans}<br />
		+\mathbf{v}\underbrace{\mathbf{v}^{\trans}\mathbf{u}}_{0}\mathbf{u}^{\trans}+\mathbf{v}\underbrace{\mathbf{v}^{\trans}\mathbf{v}}_{1}\mathbf{v}^{\trans}\\[6pt]
		&#038;=\mathbf{u}\mathbf{u}^{\trans}+\mathbf{v}\mathbf{v}^{\trans}=Q.<br />
		\end{align*}<br />
		It follows that $Q$ is an idempotent matrix.</p>
<h3>(c) Prove that $a\mathbf{u}+b\mathbf{v}$ is an eigenvector</h3>
<p> Let us first compute $Q\mathbf{u}$. We have<br />
		\begin{align*}<br />
		Q\mathbf{u}&#038;=(\mathbf{u}\mathbf{u}^{\trans}+\mathbf{v}\mathbf{v}^{\trans})\mathbf{u}\\<br />
		&#038;=\mathbf{u}\underbrace{\mathbf{u}^{\trans}\mathbf{u}}_{1}+\mathbf{v}\underbrace{\mathbf{v}^{\trans}\mathbf{u}}_{0}=\mathbf{u}.<br />
		\end{align*}</p>
<p>		Note that $\mathbf{u}$ is a nonzero vector because it is a unit vector.<br />
		Thus, the equality $Q\mathbf{u}=\mathbf{u}$ implies that $1$ is an eigenvalue of $Q$ and $\mathbf{u}$ is a corresponding eigenvector.</p>
<p>		Similarly, we can check that $\mathbf{v}$ is an eigenvector corresponding to the eigenvalue $1$.</p>
<p>		Now let $a\mathbf{u}+b\mathbf{v}$ be a nonzero vector.<br />
		Then we have<br />
		\begin{align*}<br />
		Q(a\mathbf{u}+b\mathbf{v})&#038;=aQ\mathbf{u}+bQ\mathbf{v}=a\mathbf{u}+b\mathbf{v}.<br />
		\end{align*}<br />
		It follows that $a\mathbf{u}+b\mathbf{v}$ is an eigenvector corresponding to the eigenvalue $1$.</p>
<h4>Another way to prove (c)</h4>
<p>		Another way to see this is as follows.<br />
		As we saw above, the vectors $\mathbf{u}$ and $\mathbf{v}$ are eigenvectors corresponding to the eigenvalue $1$.<br />
		Hence $\mathbf{u}, \mathbf{v} \in E_{1}$, where $E_1$ is the eigenspace of the eigenvalue $1$.</p>
<p>		Since $E_1$ is a vector space, it is closed under linear combinations, hence the nonzero vector $a\mathbf{u}+b\mathbf{v}$ lies in $E_{1}$.<br />
		Thus, $a\mathbf{u}+b\mathbf{v}$ is an eigenvector corresponding to the eigenvalue $1$ as well.</p>
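<p>This conclusion is also easy to check numerically (a NumPy sketch; the pair $\mathbf{u}, \mathbf{v}$ and the scalars $a, b$ below are arbitrary choices for illustration):</p>

```python
import numpy as np

rng = np.random.default_rng(1)
# A random orthonormal pair u, v in R^5.
U, _ = np.linalg.qr(rng.standard_normal((5, 2)))
u, v = U[:, 0], U[:, 1]
Q = np.outer(u, u) + np.outer(v, v)

a, b = 2.0, -3.0      # any scalars with a u + b v nonzero
w = a * u + b * v
# Q w = w, so w is an eigenvector of Q for the eigenvalue 1.
assert np.allclose(Q @ w, w)
```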
<h2> Related Question. </h2>
<div style="padding: 16px; border: none 3px #4169e1; border-radius: 10px; background-color: #f0f8ff; margin-top: 30px; margin-bottom: 30px;">
<strong>Problem</strong>.<br />
<strong>(a)</strong> Find a nonzero, nonidentity idempotent matrix.</p>
<p><strong>(b)</strong> Show that each eigenvalue of an idempotent matrix $A$ is either $0$ or $1$.
</div>
<p>See the post &#8628;<br />
<a href="//yutsumura.com/idempotent-matrix-and-its-eigenvalues/" target="_blank">Idempotent Matrix and its Eigenvalues</a><br />
for solutions of this problem.</p>
<button class="simplefavorite-button has-count" data-postid="4295" data-siteid="1" data-groupid="1" data-favoritecount="29" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">29</span></button><p>The post <a href="https://yutsumura.com/unit-vectors-and-idempotent-matrices/" target="_blank">Unit Vectors and Idempotent Matrices</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/unit-vectors-and-idempotent-matrices/feed/</wfw:commentRss>
		<slash:comments>2</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">4295</post-id>	</item>
		<item>
		<title>Prove that the Length $\&#124;A^n\mathbf{v}\&#124;$ is As Small As We Like.</title>
		<link>https://yutsumura.com/prove-that-the-length-anmathbfv-is-as-small-as-we-like/</link>
				<comments>https://yutsumura.com/prove-that-the-length-anmathbfv-is-as-small-as-we-like/#respond</comments>
				<pubDate>Tue, 18 Apr 2017 02:49:32 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[Berkeley]]></category>
		<category><![CDATA[Berkeley.LA]]></category>
		<category><![CDATA[characteristic polynomial]]></category>
		<category><![CDATA[eigenvalue]]></category>
		<category><![CDATA[eigenvector]]></category>
		<category><![CDATA[exam]]></category>
		<category><![CDATA[length of a vector]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[magnitude of a vector]]></category>
		<category><![CDATA[norm]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=2696</guid>
				<description><![CDATA[<p>Consider the matrix \[A=\begin{bmatrix} 3/2 &#038; 2\\ -1&#038; -3/2 \end{bmatrix} \in M_{2\times 2}(\R).\] (a) Find the eigenvalues and corresponding eigenvectors of $A$. (b) Show that for $\mathbf{v}=\begin{bmatrix} 1 \\ 0 \end{bmatrix}\in \R^2$, we can&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/prove-that-the-length-anmathbfv-is-as-small-as-we-like/" target="_blank">Prove that the Length $\|A^n\mathbf{v}\|$ is As Small As We Like.</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 381</h2>
<p>	Consider the matrix<br />
	\[A=\begin{bmatrix}<br />
	  3/2 &#038; 2\\<br />
	  -1&#038; -3/2<br />
	\end{bmatrix} \in M_{2\times 2}(\R).\]
<p><strong>(a)</strong> Find the eigenvalues and corresponding eigenvectors of $A$.</p>
<p><strong>(b)</strong> Show that for $\mathbf{v}=\begin{bmatrix}<br />
	  1 \\<br />
	  0<br />
	\end{bmatrix}\in \R^2$, we can choose $n$ large enough so that the length $\|A^n\mathbf{v}\|$ is as small as we like.</p>
<p>(<em>University of California, Berkeley, Linear Algebra Final Exam Problem</em>)<br />
&nbsp;<br />
<span id="more-2696"></span><br />

<h2> Proof. </h2>
<h3>(a) Find the eigenvalues and corresponding eigenvectors of $A$.</h3>
<p>	  	 To find the eigenvalues of $A$, we compute the characteristic polynomial $p(t)$ of $A$.<br />
	  	We have<br />
	  	\begin{align*}<br />
	p(t)&#038;=\det(A-tI)\\<br />
	&#038;=\begin{vmatrix}<br />
	  3/2-t &#038; 2\\<br />
	  -1&#038; -3/2-t<br />
	\end{vmatrix}\\<br />
	&#038;=t^2-1/4.<br />
	\end{align*}<br />
	Since the eigenvalues are the roots of the characteristic polynomial, the eigenvalues of $A$ are $\pm 1/2$.</p>
<p>	Next we find the eigenvectors corresponding to eigenvalue $1/2$.<br />
	These are the solutions of $(A-\frac{1}{2}I)\mathbf{x}=\mathbf{0}$.<br />
	We have<br />
	\begin{align*}<br />
	A-\frac{1}{2}I=\begin{bmatrix}<br />
	  1 &#038; 2\\<br />
	  -1&#038; -2<br />
	\end{bmatrix}<br />
	\xrightarrow{R_2+R_1}<br />
	\begin{bmatrix}<br />
	  1 &#038; 2\\<br />
	  0&#038; 0<br />
	\end{bmatrix}.<br />
	\end{align*}<br />
	Thus, the solution $\mathbf{x}$ satisfies $x_1=-2x_2$, and the eigenvectors are<br />
	\[\mathbf{x}=x_2\begin{bmatrix}<br />
	  -2 \\<br />
	  1<br />
	\end{bmatrix},\]
	where $x_2$ is a nonzero scalar.</p>
<p>	Similarly, we find the eigenvectors corresponding to the eigenvalue $-1/2$.<br />
	We solve $(A+\frac{1}{2}I)\mathbf{x}=\mathbf{0}$.<br />
	We have<br />
	\begin{align*}<br />
	A+\frac{1}{2}I=\begin{bmatrix}<br />
	  2 &#038; 2\\<br />
	  -1&#038; -1<br />
	\end{bmatrix}<br />
	\xrightarrow[\text{then } R_2+R_1]{\frac{1}{2}R_1}<br />
	\begin{bmatrix}<br />
	  1 &#038; 1\\<br />
	  0&#038; 0<br />
	\end{bmatrix}.<br />
	\end{align*}<br />
	Thus, we have $x_1=-x_2$, and the eigenvectors are<br />
	\[\mathbf{x}=x_2\begin{bmatrix}<br />
	  -1 \\<br />
	  1<br />
	\end{bmatrix},\]
	where $x_2$ is a nonzero scalar.</p>
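<p>The eigenvalues and eigenvectors found above can be double-checked numerically (a NumPy sketch, not part of the exam solution):</p>

```python
import numpy as np

A = np.array([[1.5, 2.0],
              [-1.0, -1.5]])

# The eigenvalues are +1/2 and -1/2.
assert np.allclose(np.sort(np.linalg.eigvals(A)), [-0.5, 0.5])

# (-2, 1) is an eigenvector for 1/2, and (-1, 1) for -1/2.
assert np.allclose(A @ np.array([-2.0, 1.0]), 0.5 * np.array([-2.0, 1.0]))
assert np.allclose(A @ np.array([-1.0, 1.0]), -0.5 * np.array([-1.0, 1.0]))
```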
<h3>(b) We can choose $n$ large enough so that the length $\|A^n\mathbf{v}\|$ is as small as we like.</h3>
<p> We express the vector $\mathbf{v}=\begin{bmatrix}<br />
	  1 \\<br />
	  0<br />
	\end{bmatrix}$ as a linear combination of eigenvectors $\begin{bmatrix}<br />
	  -2 \\<br />
	  1<br />
	\end{bmatrix}$ and $\begin{bmatrix}<br />
	  -1 \\<br />
	  1<br />
	\end{bmatrix}$ corresponding to eigenvalues $1/2$ and $-1/2$, respectively.<br />
	Let<br />
	\[\begin{bmatrix}<br />
	  1 \\<br />
	  0<br />
	\end{bmatrix}=c_1\begin{bmatrix}<br />
	  -2 \\<br />
	  1<br />
	\end{bmatrix}+c_2\begin{bmatrix}<br />
	  -1 \\<br />
	  1<br />
	\end{bmatrix}\]
	for some scalars $c_1, c_2$.<br />
	Solving this for $c_1, c_2$, we find that $c_1=-1$ and $c_2=1$, and thus we have<br />
	\[\begin{bmatrix}<br />
	  1 \\<br />
	  0<br />
	\end{bmatrix}=-\begin{bmatrix}<br />
	  -2 \\<br />
	  1<br />
	\end{bmatrix}+\begin{bmatrix}<br />
	  -1 \\<br />
	  1<br />
	\end{bmatrix}.\]
	Then for any positive integer $n$, we have<br />
	\begin{align*}<br />
	A^n\begin{bmatrix}<br />
	  1 \\<br />
	  0<br />
	\end{bmatrix}&#038;=-A^n\begin{bmatrix}<br />
	  -2 \\<br />
	  1<br />
	\end{bmatrix}+A^n\begin{bmatrix}<br />
	  -1 \\<br />
	  1<br />
	\end{bmatrix}\\<br />
	&#038;=-\left(\,  \frac{1}{2} \,\right)^n\begin{bmatrix}<br />
	  -2 \\<br />
	  1<br />
	\end{bmatrix}+\left(\,  -\frac{1}{2} \,\right)^n\begin{bmatrix}<br />
	  -1 \\<br />
	  1<br />
	\end{bmatrix}\\<br />
	&#038;=\left(\,  \frac{1}{2} \,\right)^n\begin{bmatrix}<br />
	  2-(-1)^n \\<br />
	  -1+(-1)^n<br />
	\end{bmatrix}.<br />
	\end{align*}<br />
	Note that in the second equality we used the following fact: If $A\mathbf{x}=\lambda \mathbf{x}$, then $A^n\mathbf{x}=\lambda^n \mathbf{x}$.</p>
<p>	Then the length is<br />
	\begin{align*}<br />
	\left \| A^n\begin{bmatrix}<br />
	  1 \\<br />
	  0<br />
	\end{bmatrix}\right \|&#038;=\left(\,  \frac{1}{2} \,\right)^n \sqrt{\left(\,  2-(-1)^n \,\right)^2+\left(\,  -1+(-1)^n \,\right)^2}\\<br />
	&#038; \leq \left(\,  \frac{1}{2} \,\right)^n \sqrt{3^2+2^2}\\<br />
	&#038;= \sqrt{13}\left(\,  \frac{1}{2} \,\right)^n \to 0 \text{ as $n$ tends to infinity}.<br />
	\end{align*}<br />
	Therefore, we can choose $n$ large enough so that the length $\|A^n\mathbf{v}\|$ is as small as we wish.</p>
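<p>The decay of the lengths is easy to observe numerically. The sketch below (NumPy, for illustration only) checks the bound $\|A^n\mathbf{v}\| \leq \sqrt{13}\,(1/2)^n$ for the first twenty powers:</p>

```python
import numpy as np

A = np.array([[1.5, 2.0],
              [-1.0, -1.5]])
v = np.array([1.0, 0.0])

lengths = [np.linalg.norm(np.linalg.matrix_power(A, n) @ v)
           for n in range(1, 21)]

# Each length obeys the bound derived above, and the sequence shrinks fast.
for n, ell in enumerate(lengths, start=1):
    assert ell <= np.sqrt(13) * 0.5 ** n + 1e-12
assert lengths[-1] < 1e-4
```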
<button class="simplefavorite-button has-count" data-postid="2696" data-siteid="1" data-groupid="1" data-favoritecount="5" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">5</span></button><p>The post <a href="https://yutsumura.com/prove-that-the-length-anmathbfv-is-as-small-as-we-like/" target="_blank">Prove that the Length $\|A^n\mathbf{v}\|$ is As Small As We Like.</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/prove-that-the-length-anmathbfv-is-as-small-as-we-like/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">2696</post-id>	</item>
		<item>
		<title>Orthonormal Basis of Null Space and Row Space</title>
		<link>https://yutsumura.com/orthonormal-basis-of-null-space-and-row-space/</link>
				<comments>https://yutsumura.com/orthonormal-basis-of-null-space-and-row-space/#comments</comments>
				<pubDate>Fri, 07 Apr 2017 01:06:05 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[basis]]></category>
		<category><![CDATA[dot product]]></category>
		<category><![CDATA[exam]]></category>
		<category><![CDATA[inner product]]></category>
		<category><![CDATA[leading 1 method]]></category>
		<category><![CDATA[length of a vector]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[null space]]></category>
		<category><![CDATA[nullity]]></category>
		<category><![CDATA[Ohio State]]></category>
		<category><![CDATA[Ohio State.LA]]></category>
		<category><![CDATA[orthonormal basis]]></category>
		<category><![CDATA[rank]]></category>
		<category><![CDATA[rank-nullity theorem]]></category>
		<category><![CDATA[reduced row echelon form]]></category>
		<category><![CDATA[row space]]></category>
		<category><![CDATA[subspace]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=2595</guid>
				<description><![CDATA[<p>Let $A=\begin{bmatrix} 1 &#038; 0 &#038; 1 \\ 0 &#038;1 &#038;0 \end{bmatrix}$. (a) Find an orthonormal basis of the null space of $A$. (b) Find the rank of $A$. (c) Find an orthonormal basis&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/orthonormal-basis-of-null-space-and-row-space/" target="_blank">Orthonormal Basis of Null Space and Row Space</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 366</h2>
<p>	Let $A=\begin{bmatrix}<br />
	  1 &#038; 0 &#038; 1 \\<br />
	   0 &#038;1 &#038;0<br />
	\end{bmatrix}$.</p>
<p>	<strong>(a)</strong> Find an orthonormal basis of the null space of $A$.</p>
<p>	<strong>(b)</strong> Find the rank of $A$.</p>
<p>	<strong>(c)</strong> Find an orthonormal basis of the row space of $A$.</p>
<p>	(<em>The Ohio State University, Linear Algebra Exam Problem</em>)<br />
&nbsp;<br />
<span id="more-2595"></span></p>

<h2>Solution.</h2>
<p>	First of all, note that $A$ is already in reduced row echelon form.</p>
<h3>(a) Find an orthonormal basis of the null space of $A$.</h3>
<p> Let us find a basis of the null space of $A$.<br />
		The null space consists of the solutions of $A\mathbf{x}=\mathbf{0}$.<br />
		Since $A$ is in reduced row echelon form, the solutions $\mathbf{x}=\begin{bmatrix}<br />
	  x_1 \\<br />
	   x_2 \\<br />
	    x_3<br />
	  \end{bmatrix}$ satisfy<br />
	  \[x_1=-x_3 \text{ and } x_2=0,\]
	  hence the general solution is<br />
	  \[\mathbf{x}=\begin{bmatrix}<br />
	  -x_3 \\<br />
	   0 \\<br />
	    x_3<br />
	  \end{bmatrix}=x_3\begin{bmatrix}<br />
	  -1 \\<br />
	   0 \\<br />
	    1<br />
	  \end{bmatrix}.\]
	  Therefore, the set<br />
	  \[\left\{\,  \begin{bmatrix}<br />
	  -1 \\<br />
	   0 \\<br />
	    1<br />
	  \end{bmatrix} \,\right\}\]
	   is a basis of the null space of $A$.<br />
	  Since the length of the basis vector is $\sqrt{(-1)^2+0^2+1^2}=\sqrt{2}$, it is not an orthonormal basis.<br />
	  Thus, we divide the vector by its length and obtain an orthonormal basis<br />
	  \[\left\{\, \frac{1}{\sqrt{2}}\begin{bmatrix}<br />
	  -1 \\<br />
	   0 \\<br />
	    1<br />
	  \end{bmatrix} \,\right\}.\]
&nbsp;</p>
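<p>The same computation can be mirrored in NumPy (a sketch for verification; not part of the exam solution):</p>

```python
import numpy as np

A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 0.0]])

x = np.array([-1.0, 0.0, 1.0])   # basis vector of the null space found above
assert np.allclose(A @ x, 0)     # x is indeed in the null space

q = x / np.linalg.norm(x)        # divide by the length sqrt(2)
assert np.isclose(np.linalg.norm(q), 1.0)   # q is a unit vector
```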
<h3>(b) Find the rank of $A$.</h3>
<p>From part (a), we see that the nullity of $A$ is $1$. The rank-nullity theorem says that<br />
	  \[\text{rank of $A$} + \text{ nullity of $A$}=3.\]
	  Hence the rank of $A$ is $2$.</p>
<p>	  The second way to find the rank of $A$ is to use the definition: the rank of a matrix $B$ is the number of nonzero rows in the reduced row echelon form of $B$.<br />
	  Since $A$ is already in reduced row echelon form and has two nonzero rows, the rank is $2$.</p>
<p>	  The third way to find the rank is to use the leading 1 method. By the leading 1 method, we see that the first two columns form a basis of the range, hence the rank of $A$ is $2$.<br />
	  &nbsp;</p>
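<p>All three arguments give rank $2$, which agrees with a direct numerical computation (a short NumPy sketch):</p>

```python
import numpy as np

A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 0.0]])

# The rank computed numerically matches the three arguments above.
assert np.linalg.matrix_rank(A) == 2
```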
<h3>(c) Find an orthonormal basis of the row space of $A$.</h3>
<p> By the row space method, the nonzero rows of the reduced row echelon form of $A$ form a basis of the row space of $A$. Thus<br />
	  \[ \left\{\,  \begin{bmatrix}<br />
	  1 \\<br />
	   0 \\<br />
	    1<br />
	  \end{bmatrix}, \begin{bmatrix}<br />
	  0 \\<br />
	   1 \\<br />
	    0<br />
	  \end{bmatrix} \,\right\}\]
	  is a basis of the row space of $A$.<br />
	  Since the dot (inner) product of these two vectors is $0$, they are orthogonal.<br />
	  The lengths of the vectors are $\sqrt{2}$ and $1$, respectively.<br />
	  Hence an orthonormal basis of the row space of $A$ is<br />
	   \[ \left\{\, \frac{1}{\sqrt{2}} \begin{bmatrix}<br />
	  1 \\<br />
	   0 \\<br />
	    1<br />
	  \end{bmatrix}, \begin{bmatrix}<br />
	  0 \\<br />
	   1 \\<br />
	    0<br />
	  \end{bmatrix} \,\right\}.\]
		&nbsp;</p>
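<p>One can confirm that this set is orthonormal with a short computation (NumPy sketch, for illustration):</p>

```python
import numpy as np

# The two row-space basis vectors after normalization.
r1 = np.array([1.0, 0.0, 1.0]) / np.sqrt(2)
r2 = np.array([0.0, 1.0, 0.0])

assert np.isclose(r1 @ r2, 0.0)            # orthogonal
assert np.isclose(np.linalg.norm(r1), 1.0) # unit length
assert np.isclose(np.linalg.norm(r2), 1.0)
```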
<h2>Linear Algebra Midterm Exam 2 Problems and Solutions </h2>
<ul>
<li><a href="//yutsumura.com/true-or-false-problems-of-vector-spaces-and-linear-transformations/" target="_blank">True or False Problems and Solutions</a>: True or False problems of vector spaces and linear transformations</li>
<li><a href="//yutsumura.com/10-examples-of-subsets-that-are-not-subspaces-of-vector-spaces/" target="_blank">Problem 1 and its solution</a>: See (7) in the post &#8220;10 examples of subsets that are not subspaces of vector spaces&#8221;</li>
<li><a href="//yutsumura.com/determine-whether-trigonometry-functions-sin2x-cos2x-1-are-linearly-independent-or-dependent/" target="_blank">Problem 2 and its solution</a>: Determine whether trigonometry functions $\sin^2(x), \cos^2(x), 1$ are linearly independent or dependent</li>
<li>Problem 3 and its solution (current problem): Orthonormal basis of null space and row space</li>
<li><a href="//yutsumura.com/basis-of-span-in-vector-space-of-polynomials-of-degree-2-or-less/" target="_blank">Problem 4 and its solution</a>: Basis of span in vector space of polynomials of degree 2 or less</li>
<li><a href="//yutsumura.com/determine-value-of-linear-transformation-from-r3-to-r2/" target="_blank">Problem 5 and its solution</a>: Determine value of linear transformation from $R^3$ to $R^2$</li>
<li><a href="//yutsumura.com/rank-and-nullity-of-linear-transformation-from-r3-to-r2/" target="_blank">Problem 6 and its solution</a>: Rank and nullity of linear transformation from $R^3$ to $R^2$</li>
<li><a href="//yutsumura.com/find-matrix-representation-of-linear-transformation-from-r2-to-r2/" target="_blank">Problem 7 and its solution</a>: Find matrix representation of linear transformation from $R^2$ to $R^2$</li>
<li><a href="//yutsumura.com/hyperplane-through-origin-is-subspace-of-4-dimensional-vector-space/" target="_blank">Problem 8 and its solution</a>: Hyperplane through origin is subspace of 4-dimensional vector space</li>
</ul>
<button class="simplefavorite-button has-count" data-postid="2595" data-siteid="1" data-groupid="1" data-favoritecount="36" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">36</span></button><p>The post <a href="https://yutsumura.com/orthonormal-basis-of-null-space-and-row-space/" target="_blank">Orthonormal Basis of Null Space and Row Space</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/orthonormal-basis-of-null-space-and-row-space/feed/</wfw:commentRss>
		<slash:comments>6</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">2595</post-id>	</item>
		<item>
		<title>Find the Distance Between Two Vectors if the Lengths and the Dot Product are Given</title>
		<link>https://yutsumura.com/find-the-distance-between-two-vectors-if-the-lengths-and-the-dot-product-are-given/</link>
				<comments>https://yutsumura.com/find-the-distance-between-two-vectors-if-the-lengths-and-the-dot-product-are-given/#respond</comments>
				<pubDate>Fri, 13 Jan 2017 00:45:11 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[distance]]></category>
		<category><![CDATA[dot product]]></category>
		<category><![CDATA[inner product]]></category>
		<category><![CDATA[length]]></category>
		<category><![CDATA[length of a vector]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[transpose]]></category>
		<category><![CDATA[transpose matrix]]></category>
		<category><![CDATA[vector]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=1920</guid>
				<description><![CDATA[<p>Let $\mathbf{a}$ and $\mathbf{b}$ be vectors in $\R^n$ such that their length are \[\&#124;\mathbf{a}\&#124;=\&#124;\mathbf{b}\&#124;=1\] and the inner product \[\mathbf{a}\cdot \mathbf{b}=\mathbf{a}^{\trans}\mathbf{b}=-\frac{1}{2}.\] Then determine the length $\&#124;\mathbf{a}-\mathbf{b}\&#124;$. (Note that this length is the distance between $\mathbf{a}$&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/find-the-distance-between-two-vectors-if-the-lengths-and-the-dot-product-are-given/" target="_blank">Find the Distance Between Two Vectors if the Lengths and the Dot Product are Given</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 254</h2>
<p>Let $\mathbf{a}$ and $\mathbf{b}$ be vectors in $\R^n$ such that their lengths are<br />
\[\|\mathbf{a}\|=\|\mathbf{b}\|=1\]
and the inner product<br />
\[\mathbf{a}\cdot \mathbf{b}=\mathbf{a}^{\trans}\mathbf{b}=-\frac{1}{2}.\]
<p>Then determine the length $\|\mathbf{a}-\mathbf{b}\|$.<br />
(Note that this length is the distance between $\mathbf{a}$ and $\mathbf{b}$.)</p>
<p>&nbsp;<br />
<span id="more-1920"></span></p>
<h2>Solution.</h2>
<p>	Recall that the length of a vector $\mathbf{x}$ is defined to be<br />
	\[\|\mathbf{x}\|=\sqrt{\mathbf{x}^{\trans}\mathbf{x}},\]
	where $\mathbf{x}^{\trans}$ is the transpose of $\mathbf{x}$.</p>
<p>	Also, recall that the inner product of two vectors $\mathbf{x}, \mathbf{y}$ is commutative.<br />
	Namely we have<br />
	\[\mathbf{x}\cdot \mathbf{y}=\mathbf{x}^{\trans}\mathbf{y}=\mathbf{y}^{\trans}\mathbf{x}=\mathbf{y} \cdot \mathbf{x}.\]
<hr />
<p>	Applying the second fact with given vectors $\mathbf{a}, \mathbf{b}$, we obtain<br />
	\[\mathbf{a}^{\trans}\mathbf{b}=\mathbf{b}^{\trans}\mathbf{a}= -\frac{1}{2}.\]
<hr />
<p>	Now we compute $\|\mathbf{a}-\mathbf{b}\|^2$ as follows.<br />
	We have<br />
	\begin{align*}<br />
\|\mathbf{a}-\mathbf{b}\|^2&#038;=(\mathbf{a}-\mathbf{b})^{\trans}(\mathbf{a}-\mathbf{b}) \qquad \text{ (by definition of the length)}\\<br />
&#038;=(\mathbf{a}^{\trans}-\mathbf{b}^{\trans})(\mathbf{a}-\mathbf{b})\\<br />
&#038;=\mathbf{a}^{\trans}\mathbf{a}-\mathbf{a}^{\trans}\mathbf{b}-\mathbf{b}^{\trans}\mathbf{a}+\mathbf{b}^{\trans}\mathbf{b}\\<br />
&#038;=\|\mathbf{a}\|^2-\mathbf{a}^{\trans}\mathbf{b}-\mathbf{b}^{\trans}\mathbf{a}+\|\mathbf{b}\|^2\\<br />
&#038;=1-\left(-\frac{1}{2} \right)-\left(-\frac{1}{2} \right)+1\\<br />
&#038;=3.<br />
\end{align*}</p>
<p>Since the length is nonnegative, we take the square root of the above equality and obtain<br />
\[\|\mathbf{a}-\mathbf{b}\|=\sqrt{3}.\]
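<p>As a concrete check, any pair of unit vectors with inner product $-1/2$ works; the sketch below uses one hypothetical choice in $\R^2$ (unit vectors at a $120^{\circ}$ angle, verified with NumPy):</p>

```python
import numpy as np

# One concrete pair satisfying the hypotheses: unit vectors at 120 degrees.
a = np.array([1.0, 0.0])
b = np.array([-0.5, np.sqrt(3) / 2])

assert np.isclose(np.linalg.norm(a), 1.0)
assert np.isclose(np.linalg.norm(b), 1.0)
assert np.isclose(a @ b, -0.5)

# The distance agrees with the value computed above.
assert np.isclose(np.linalg.norm(a - b), np.sqrt(3))
```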
<button class="simplefavorite-button has-count" data-postid="1920" data-siteid="1" data-groupid="1" data-favoritecount="17" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">17</span></button><p>The post <a href="https://yutsumura.com/find-the-distance-between-two-vectors-if-the-lengths-and-the-dot-product-are-given/" target="_blank">Find the Distance Between Two Vectors if the Lengths and the Dot Product are Given</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/find-the-distance-between-two-vectors-if-the-lengths-and-the-dot-product-are-given/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">1920</post-id>	</item>
		<item>
		<title>Find the Inverse Matrix of a Matrix With Fractions</title>
		<link>https://yutsumura.com/find-the-inverse-matrix-of-a-matrix-with-fractions/</link>
				<comments>https://yutsumura.com/find-the-inverse-matrix-of-a-matrix-with-fractions/#respond</comments>
				<pubDate>Sat, 10 Dec 2016 04:44:13 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[augmented matrix]]></category>
		<category><![CDATA[dot product]]></category>
		<category><![CDATA[inner product]]></category>
		<category><![CDATA[inverse matrix]]></category>
		<category><![CDATA[length]]></category>
		<category><![CDATA[length of a vector]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[matrix]]></category>
		<category><![CDATA[orthogonal matrix]]></category>
		<category><![CDATA[orthonormal vector]]></category>
		<category><![CDATA[transpose]]></category>
		<category><![CDATA[transpose matrix]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=1535</guid>
				<description><![CDATA[<p>Find the inverse matrix of the matrix \[A=\begin{bmatrix} \frac{2}{7} &#038; \frac{3}{7} &#038; \frac{6}{7} \\[6 pt] \frac{6}{7} &#038;\frac{2}{7} &#038;-\frac{3}{7} \\[6pt] -\frac{3}{7} &#038; \frac{6}{7} &#038; -\frac{2}{7} \end{bmatrix}.\] &#160; Hint. You may use the augmented matrix method&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/find-the-inverse-matrix-of-a-matrix-with-fractions/" target="_blank">Find the Inverse Matrix of a Matrix With Fractions</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 214</h2>
<p>Find the inverse matrix of the matrix<br />
\[A=\begin{bmatrix}<br />
  \frac{2}{7} &#038; \frac{3}{7} &#038; \frac{6}{7} \\[6 pt]
   \frac{6}{7} &#038;\frac{2}{7} &#038;-\frac{3}{7} \\[6pt]
   -\frac{3}{7} &#038; \frac{6}{7} &#038; -\frac{2}{7}<br />
\end{bmatrix}.\]
<p>&nbsp;<br />
<span id="more-1535"></span><br />

<h2>Hint.</h2>
<p>You may use the augmented matrix method to find the inverse matrix.<br />
Here we give an alternative way to find the inverse matrix by noting that $A$ is an orthogonal matrix.</p>
<p>Recall that a matrix $B$ is orthogonal if $B^{\trans}B=BB^{\trans}=I$.<br />
Thus, once we know $B$ is an orthogonal matrix, then the inverse matrix $B^{-1}$ is just the transpose matrix $B^{\trans}$.</p>
<p>Also, recall that a matrix $B$ is orthogonal if and only if the column vectors of $B$ form an orthonormal set.</p>
<h2>Solution.</h2>
<p>	We first show that $A$ is an orthogonal matrix.<br />
	To do this, it suffices to show that the column vectors form an orthonormal set.</p>
<p>	Let<br />
	\[ \mathbf{v}_1= \begin{bmatrix}<br />
  \frac{2}{7} \\[6 pt]
   \frac{6}{7}  \\[6pt]
   -\frac{3}{7}<br />
\end{bmatrix},<br />
\mathbf{v}_2=\begin{bmatrix}<br />
  \frac{3}{7}  \\[6 pt]
  \frac{2}{7}  \\[6pt]
    \frac{6}{7} \end{bmatrix},<br />
 \mathbf{v}_3=\begin{bmatrix}<br />
  \frac{6}{7} \\[6 pt]
 -\frac{3}{7} \\[6pt]
 -\frac{2}{7}<br />
\end{bmatrix}\]
be the column vectors of $A$.</p>
<p>Then the length of the vector $\mathbf{v}_1$ is<br />
\[||\mathbf{v}_1||=\sqrt{(2/7)^2+(6/7)^2+(-3/7)^2}=1.\]
Similarly, we have $||\mathbf{v}_2||=||\mathbf{v}_3||=1$.<br />
Thus, the column vectors are unit vectors.</p>
<p>The dot (inner) product of the vectors $\mathbf{v}_1$ and $\mathbf{v}_2$ is<br />
\[\mathbf{v}_1\cdot \mathbf{v}_2=\frac{2}{7}\cdot \frac{3}{7}+\frac{6}{7}\cdot \frac{2}{7}+\left( -\frac{3}{7}\right) \cdot \frac{6}{7}=0.\]
Similarly, we have<br />
\[\mathbf{v}_1\cdot \mathbf{v}_3=0, \quad \mathbf{v}_2\cdot \mathbf{v}_3=0.\]
<p>Therefore, the column vectors are orthogonal.<br />
Hence the column vectors of $A$ are orthonormal, and this implies that $A$ is an orthogonal matrix. Namely, $A^{\trans}=A^{-1}$.<br />
Thus the inverse matrix of $A$ is<br />
\[A^{-1}=\begin{bmatrix}<br />
  \frac{2}{7} &#038; \frac{6}{7} &#038; -\frac{3}{7} \\[6 pt]
   \frac{3}{7} &#038;\frac{2}{7} &#038;\frac{6}{7} \\[6pt]
   \frac{6}{7} &#038; -\frac{3}{7} &#038; -\frac{2}{7}<br />
\end{bmatrix}.\]
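<p>The orthogonality of $A$ and the resulting inverse can be verified numerically (a NumPy sketch, for illustration):</p>

```python
import numpy as np

A = np.array([[2.0, 3.0, 6.0],
              [6.0, 2.0, -3.0],
              [-3.0, 6.0, -2.0]]) / 7.0

assert np.allclose(A.T @ A, np.eye(3))     # A is orthogonal: A^T A = I
assert np.allclose(np.linalg.inv(A), A.T)  # hence A^{-1} = A^T
```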
<button class="simplefavorite-button has-count" data-postid="1535" data-siteid="1" data-groupid="1" data-favoritecount="48" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">48</span></button><p>The post <a href="https://yutsumura.com/find-the-inverse-matrix-of-a-matrix-with-fractions/" target="_blank">Find the Inverse Matrix of a Matrix With Fractions</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/find-the-inverse-matrix-of-a-matrix-with-fractions/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">1535</post-id>	</item>
	</channel>
</rss>
