<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	
	xmlns:georss="http://www.georss.org/georss"
	xmlns:geo="http://www.w3.org/2003/01/geo/wgs84_pos#"
	>

<channel>
	<title>orthogonal basis &#8211; Problems in Mathematics</title>
	<atom:link href="https://yutsumura.com/tag/orthogonal-basis/feed/" rel="self" type="application/rss+xml" />
	<link>https://yutsumura.com</link>
	<description></description>
	<lastBuildDate>Mon, 26 Mar 2018 03:05:37 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=5.3.4</generator>

<image>
	<url>https://i2.wp.com/yutsumura.com/wp-content/uploads/2016/12/cropped-question-logo.jpg?fit=32%2C32&#038;ssl=1</url>
	<title>orthogonal basis &#8211; Problems in Mathematics</title>
	<link>https://yutsumura.com</link>
	<width>32</width>
	<height>32</height>
</image> 
<site xmlns="com-wordpress:feed-additions:1">114989322</site>	<item>
		<title>Using Gram-Schmidt Orthogonalization, Find an Orthogonal Basis for the Span</title>
		<link>https://yutsumura.com/using-gram-schmidt-orthogonalization-find-an-orthogonal-basis-for-the-span/</link>
				<comments>https://yutsumura.com/using-gram-schmidt-orthogonalization-find-an-orthogonal-basis-for-the-span/#respond</comments>
				<pubDate>Mon, 26 Mar 2018 03:05:37 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[basis for a vector space]]></category>
		<category><![CDATA[Gram-Schmidt orthogonalization process]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[orthogonal basis]]></category>
		<category><![CDATA[span]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=6977</guid>
				<description><![CDATA[<p>Using Gram-Schmidt orthogonalization, find an orthogonal basis for the span of the vectors $\mathbf{w}_{1},\mathbf{w}_{2}\in\R^{3}$ if \[ \mathbf{w}_{1} = \begin{bmatrix} 1 \\ 0 \\ 3 \end{bmatrix} ,\quad \mathbf{w}_{2} = \begin{bmatrix} 2 \\ -1 \\ 0&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/using-gram-schmidt-orthogonalization-find-an-orthogonal-basis-for-the-span/" target="_blank">Using Gram-Schmidt Orthogonalization, Find an Orthogonal Basis for the Span</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 716</h2>
<p>	Using Gram-Schmidt orthogonalization, find an orthogonal basis for the span of the vectors $\mathbf{w}_{1},\mathbf{w}_{2}\in\R^{3}$ if<br />
	\[<br />
	\mathbf{w}_{1}<br />
	=<br />
	\begin{bmatrix}<br />
	1 \\ 0 \\ 3<br />
	\end{bmatrix}<br />
	,\quad<br />
	\mathbf{w}_{2}<br />
	=<br />
	\begin{bmatrix}<br />
	2 \\ -1 \\ 0<br />
	\end{bmatrix}<br />
	.<br />
	\]
<p>&nbsp;<br />
<span id="more-6977"></span><br />

<h2>Solution.</h2>
<p>	We apply Gram-Schmidt orthogonalization as follows. The first step is to define $\mathbf{u}_{1}=\mathbf{w}_{1}$. Before defining $\mathbf{u}_{2}$, we must compute<br />
	\begin{align*}<br />
	\mathbf{u}_{1}^{T}\mathbf{w}_{2}<br />
	&#038;=<br />
	\mathbf{w}_{1}^{T}\mathbf{w}_{2}<br />
	=<br />
	\begin{bmatrix}<br />
	1 &#038; 0 &#038; 3<br />
	\end{bmatrix}<br />
	\begin{bmatrix}<br />
	2 \\ -1 \\ 0<br />
	\end{bmatrix}<br />
	=2+0+0=2,<br />
	\\<br />
	\mathbf{u}_{1}^{T}\mathbf{u}_{1}<br />
	&#038;=<br />
	\mathbf{w}_{1}^{T}\mathbf{w}_{1}<br />
	=<br />
	\begin{bmatrix}<br />
	1 &#038; 0 &#038; 3<br />
	\end{bmatrix}<br />
	\begin{bmatrix}<br />
	1 \\ 0 \\ 3<br />
	\end{bmatrix}<br />
	=1+0+9=10.<br />
	\end{align*}</p>
<hr />
<p>	Next, we define<br />
	\[<br />
	\mathbf{u}_{2}<br />
	=<br />
	\mathbf{w}_{2}<br />
	-\dfrac{\mathbf{u}_{1}^{T}\mathbf{w}_{2}}<br />
	{\mathbf{u}_{1}^{T}\mathbf{u}_{1}}<br />
	\mathbf{u}_{1}<br />
	=<br />
	\begin{bmatrix}<br />
	2 \\ -1 \\ 0<br />
	\end{bmatrix}<br />
	-\dfrac{2}{10}<br />
	\begin{bmatrix}<br />
	1 \\ 0 \\ 3<br />
	\end{bmatrix}<br />
	=<br />
	\begin{bmatrix}<br />
	10/5 \\ -1 \\ 0<br />
	\end{bmatrix}<br />
	-<br />
	\begin{bmatrix}<br />
	1/5 \\ 0 \\ 3/5<br />
	\end{bmatrix}<br />
	=<br />
	\begin{bmatrix}<br />
	9/5 \\ -1 \\ -3/5<br />
	\end{bmatrix}<br />
	.<br />
	\]
	By Gram-Schmidt orthogonalization, $\{\mathbf{u}_{1},\mathbf{u}_{2}\}$ is an orthogonal basis for the span of the vectors $\mathbf{w}_{1}$ and $\mathbf{w}_{2}$.</p>
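<p>(The following quick numerical check is our addition, not part of the original post; it is a minimal NumPy sketch of the step above.)</p>

```python
import numpy as np

# The given vectors.
w1 = np.array([1.0, 0.0, 3.0])
w2 = np.array([2.0, -1.0, 0.0])

# Gram-Schmidt: keep u1 = w1, then subtract from w2 its projection onto u1.
u1 = w1
u2 = w2 - (u1 @ w2) / (u1 @ u1) * u1  # = [9/5, -1, -3/5]

# u1 and u2 are orthogonal: their dot product vanishes.
print(u2, u1 @ u2)
```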
<h2>Remark </h2>
<p>		Note that since scalar multiplication by a nonzero number preserves both orthogonality and the property of being a basis, we could have used $5\mathbf{u}_2$ instead of $\mathbf{u}_2$ to avoid fractions in our computation.<br />
		We have<br />
		\[5\mathbf{u}_2=\begin{bmatrix}<br />
	  10 \\<br />
	   -5 \\<br />
	    0<br />
	  \end{bmatrix}-\begin{bmatrix}<br />
	  1 \\<br />
	   0 \\<br />
	    3<br />
	  \end{bmatrix}=\begin{bmatrix}<br />
	  9 \\<br />
	   -5 \\<br />
	    -3<br />
	  \end{bmatrix},\]
	  and $\{\mathbf{u}_1, 5\mathbf{u}_2\}$ is an orthogonal basis for the span.</p>
<button class="simplefavorite-button has-count" data-postid="6977" data-siteid="1" data-groupid="1" data-favoritecount="84" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">84</span></button><p>The post <a href="https://yutsumura.com/using-gram-schmidt-orthogonalization-find-an-orthogonal-basis-for-the-span/" target="_blank">Using Gram-Schmidt Orthogonalization, Find an Orthogonal Basis for the Span</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/using-gram-schmidt-orthogonalization-find-an-orthogonal-basis-for-the-span/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">6977</post-id>	</item>
		<item>
		<title>Normalize Lengths to Obtain an Orthonormal Basis</title>
		<link>https://yutsumura.com/normalize-lengths-to-obtain-an-orthonormal-basis/</link>
				<comments>https://yutsumura.com/normalize-lengths-to-obtain-an-orthonormal-basis/#respond</comments>
				<pubDate>Thu, 22 Mar 2018 03:57:05 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[basis for a vector space]]></category>
		<category><![CDATA[dot product]]></category>
		<category><![CDATA[inner product]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[orthogonal basis]]></category>
		<category><![CDATA[orthogonal vector]]></category>
		<category><![CDATA[orthonormal basis]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=6973</guid>
				<description><![CDATA[<p>Let \[ \mathbf{v}_{1} = \begin{bmatrix} 1 \\ 1 \end{bmatrix} ,\; \mathbf{v}_{2} = \begin{bmatrix} 1 \\ -1 \end{bmatrix} . \] Let $V=\Span(\mathbf{v}_{1},\mathbf{v}_{2})$. Do $\mathbf{v}_{1}$ and $\mathbf{v}_{2}$ form an orthonormal basis for $V$? If not, then&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/normalize-lengths-to-obtain-an-orthonormal-basis/" target="_blank">Normalize Lengths to Obtain an Orthonormal Basis</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 715</h2>
<p>	Let<br />
	\[<br />
	\mathbf{v}_{1}<br />
	=<br />
	\begin{bmatrix}<br />
	1 \\ 1<br />
	\end{bmatrix}<br />
	,\;<br />
	\mathbf{v}_{2}<br />
	=<br />
	\begin{bmatrix}<br />
	1 \\ -1<br />
	\end{bmatrix}<br />
	.<br />
	\]
	Let $V=\Span(\mathbf{v}_{1},\mathbf{v}_{2})$. Do $\mathbf{v}_{1}$ and $\mathbf{v}_{2}$ form an orthonormal basis for $V$? </p>
<p>If not, then find an orthonormal basis for $V$.</p>
<p>&nbsp;<br />
<span id="more-6973"></span></p>
<h2>Solution.</h2>
<p>	We begin by computing<br />
	\begin{align*}<br />
	\mathbf{v}_{1}\cdot\mathbf{v}_{2}<br />
	&#038;=<br />
	\mathbf{v}_{1}^{T}\mathbf{v}_{2}<br />
	=<br />
	\begin{bmatrix}<br />
	1 &#038; 1<br />
	\end{bmatrix}<br />
	\begin{bmatrix}<br />
	1 \\ -1<br />
	\end{bmatrix}<br />
	=<br />
	1\cdot 1+1\cdot (-1)<br />
	=<br />
	1-1<br />
	=<br />
	0,<br />
	\\<br />
	\mathbf{v}_{1}\cdot\mathbf{v}_{1}<br />
	&#038;=<br />
	\begin{bmatrix}<br />
	1 &#038; 1<br />
	\end{bmatrix}<br />
	\begin{bmatrix}<br />
	1 \\ 1<br />
	\end{bmatrix}<br />
	=<br />
	1+1<br />
	=<br />
	2,<br />
	\\<br />
	\mathbf{v}_{2}\cdot\mathbf{v}_{2}<br />
	&#038;=<br />
	\begin{bmatrix}<br />
	1 &#038; -1<br />
	\end{bmatrix}<br />
	\begin{bmatrix}<br />
	1 \\ -1<br />
	\end{bmatrix}<br />
	=<br />
	1+1<br />
	=<br />
	2.<br />
	\end{align*}<br />
	Since $\mathbf{v}_{1}\cdot\mathbf{v}_{2}=0$, the vectors $\mathbf{v}_1$ and $\mathbf{v}_2$ are orthogonal. Since  $\mathbf{v}_{1}$ and $\mathbf{v}_{2}$ are nonzero orthogonal vectors, they are linearly independent, and it follows that $\mathbf{v}_{1}$ and $\mathbf{v}_{2}$ form an orthogonal basis for $V$. However, since $\mathbf{v}_{i}\cdot\mathbf{v}_{i}=2\neq 1$ for $i=1,2$, we know that $\mathbf{v}_{1}$ and $\mathbf{v}_{2}$ do not form an orthonormal basis for $V$.</p>
<hr />
<p>	To find an orthonormal basis for $V$, note that for any scalars $a$ and $b$, $(a\mathbf{v}_{1})\cdot(b\mathbf{v}_{2})=ab(\mathbf{v}_{1}\cdot\mathbf{v}_{2})=ab\cdot 0=0$. Therefore, $a\mathbf{v}_{1}$ and $b\mathbf{v}_{2}$ form an orthogonal basis for $V$ whenever $a$ and $b$ are nonzero. All we need to do is choose $a$ and $b$ so that $a\mathbf{v}_{1}$ and $b\mathbf{v}_{2}$ form an orthonormal set. Taking $a>0$, we require<br />
	\[<br />
	1<br />
	=\|a\mathbf{v}_{1}\|=a\|\mathbf{v}_{1}\|,<br />
	\]
	and so<br />
	\[<br />
	a=\frac{1}{\|\mathbf{v}_{1}\|}=\frac{1}{\sqrt{2}}.<br />
	\]
	 (Note that $\|\mathbf{v}_{1}\| \neq 0$ as $\mathbf{v}_{1}\neq \mathbf{0}$. Also, note that to obtain a vector of length $1$, we just need to divide the vector by its length.)<br />
	  Similarly, $b=1/\sqrt{2}$. Therefore, if we define<br />
	\begin{align*}<br />
	\mathbf{w}_{1}<br />
	&#038;=<br />
	\dfrac{1}{\sqrt{2}}<br />
	\mathbf{v}_{1}<br />
	=<br />
	\dfrac{1}{\sqrt{2}}<br />
	\begin{bmatrix}<br />
	1 \\ 1<br />
	\end{bmatrix}<br />
	,<br />
	\\<br />
	\mathbf{w}_{2}<br />
	&#038;=<br />
	\dfrac{1}{\sqrt{2}}<br />
	\mathbf{v}_{2}<br />
	=<br />
	\dfrac{1}{\sqrt{2}}<br />
	\begin{bmatrix}<br />
	1 \\ -1<br />
	\end{bmatrix}<br />
	,<br />
	\end{align*}<br />
	then $\mathbf{w}_{1}$ and $\mathbf{w}_{2}$ form an orthonormal basis for $V$.</p>
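<p>(As a sanity check, added by us: the orthonormality of $\mathbf{w}_{1}$ and $\mathbf{w}_{2}$ can also be verified numerically.)</p>

```python
import numpy as np

v1 = np.array([1.0, 1.0])
v2 = np.array([1.0, -1.0])

# Divide each vector by its length, as in the solution.
w1 = v1 / np.linalg.norm(v1)
w2 = v2 / np.linalg.norm(v2)

# Orthonormal: pairwise dot product 0 and each vector has length 1.
print(w1 @ w2, np.linalg.norm(w1), np.linalg.norm(w2))
```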
<button class="simplefavorite-button has-count" data-postid="6973" data-siteid="1" data-groupid="1" data-favoritecount="71" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">71</span></button><p>The post <a href="https://yutsumura.com/normalize-lengths-to-obtain-an-orthonormal-basis/" target="_blank">Normalize Lengths to Obtain an Orthonormal Basis</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/normalize-lengths-to-obtain-an-orthonormal-basis/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">6973</post-id>	</item>
		<item>
		<title>Find an Orthonormal Basis of the Given Two Dimensional Vector Space</title>
		<link>https://yutsumura.com/find-an-orthonormal-basis-of-the-given-two-dimensional-vector-space/</link>
				<comments>https://yutsumura.com/find-an-orthonormal-basis-of-the-given-two-dimensional-vector-space/#respond</comments>
				<pubDate>Wed, 08 Nov 2017 05:49:58 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[exam]]></category>
		<category><![CDATA[Gram-Schmidt orthogonalization process]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[Ohio State]]></category>
		<category><![CDATA[Ohio State.LA]]></category>
		<category><![CDATA[orthogonal basis]]></category>
		<category><![CDATA[orthonormal basis]]></category>
		<category><![CDATA[subspace]]></category>
		<category><![CDATA[vector space]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=5251</guid>
				<description><![CDATA[<p>Let $W$ be a subspace of $\R^4$ with a basis \[\left\{\, \begin{bmatrix} 1 \\ 0 \\ 1 \\ 1 \end{bmatrix}, \begin{bmatrix} 0 \\ 1 \\ 1 \\ 1 \end{bmatrix} \,\right\}.\] Find an orthonormal basis&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/find-an-orthonormal-basis-of-the-given-two-dimensional-vector-space/" target="_blank">Find an Orthonormal Basis of the Given Two Dimensional Vector Space</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 602</h2>
<p> Let $W$ be a subspace of $\R^4$ with a basis<br />
	\[\left\{\,  \begin{bmatrix}<br />
	  1 \\<br />
	   0 \\<br />
	    1 \\<br />
	   1<br />
	   \end{bmatrix}, \begin{bmatrix}<br />
	  0 \\<br />
	   1 \\<br />
	    1 \\<br />
	   1<br />
	   \end{bmatrix} \,\right\}.\]
<p>	   Find an ortho<strong>normal</strong> basis of $W$.</p>
<p><em>(The Ohio State University, Linear Algebra Midterm)</em><br />
&nbsp;<br />
<span id="more-5251"></span><br />

<h2>Solution.</h2>
<p>	Let<br />
	\[\mathbf{v}_1= \begin{bmatrix}<br />
	1 \\<br />
	0 \\<br />
	1 \\<br />
	1<br />
	\end{bmatrix}, \mathbf{v}_2=\begin{bmatrix}<br />
	0 \\<br />
	1 \\<br />
	1 \\<br />
	1<br />
	\end{bmatrix} .\]
	Note that they are not orthogonal, since their dot product is<br />
	\[\mathbf{v}_1\cdot \mathbf{v}_2= \begin{bmatrix}<br />
	1 \\<br />
	0 \\<br />
	1 \\<br />
	1<br />
	\end{bmatrix}\cdot \begin{bmatrix}<br />
	0 \\<br />
	1 \\<br />
	1 \\<br />
	1<br />
	\end{bmatrix}=1\cdot0+ 0\cdot 1+ 1\cdot 1+1\cdot 1=2\neq 0.\]
<hr />
<p>	Let us first find an orthogonal basis for $W$ by the Gram-Schmidt orthogonalization process.</p>
<p>	Let $\mathbf{w}_1:=\mathbf{v}_1$.<br />
	Next, let $\mathbf{w}_2:=\mathbf{v}_2+a\mathbf{v}_1$, where $a$ is a scalar to be determined so that $\mathbf{w}_1\cdot \mathbf{w}_2=0$.<br />
(You may also use the formula of the Gram-Schmidt orthogonalization.)</p>
<p>	As $\mathbf{w}_1$ and $\mathbf{w}_2$ are required to be orthogonal, we have<br />
	\begin{align*}<br />
	0&#038;=\mathbf{w}_1\cdot \mathbf{w}_2=\mathbf{v}_1\cdot(\mathbf{v}_2+a\mathbf{v}_1)\\<br />
	&#038;=\mathbf{v}_1\cdot\mathbf{v}_2+a\mathbf{v}_1\cdot\mathbf{v}_1\\<br />
	&#038;=2+3a.<br />
	\end{align*}<br />
	It follows that $a=-2/3$ and<br />
	\[\mathbf{w}_2=\mathbf{v}_2-\frac{2}{3}\mathbf{v}_1=\begin{bmatrix}<br />
	0 \\<br />
	1 \\<br />
	1 \\<br />
	1<br />
	\end{bmatrix}-\frac{2}{3}\begin{bmatrix}<br />
	1 \\<br />
	0 \\<br />
	1 \\<br />
	1<br />
	\end{bmatrix}.\]
<hr />
<p>	Now, to avoid fractions in our computation, let us consider $3\mathbf{w}_2$, instead of $\mathbf{w}_2$. Note that the scaling does not change the orthogonality.<br />
	We have<br />
	\[3\mathbf{w}_2=3\begin{bmatrix}<br />
	0 \\<br />
	1 \\<br />
	1 \\<br />
	1<br />
	\end{bmatrix}-2\begin{bmatrix}<br />
	1 \\<br />
	0 \\<br />
	1 \\<br />
	1<br />
	\end{bmatrix}=\begin{bmatrix}<br />
	  -2\\ 3\\ 1 \\1<br />
	\end{bmatrix}.\]
<p>	Thus the set $\{\mathbf{w}_1, 3\mathbf{w}_2\}$ is an orthogonal basis for $W$.<br />
	However, the lengths of these vectors are not $1$, as we see<br />
	\begin{align*}<br />
	\|\mathbf{w}_1\|&#038;=\sqrt{1^2+0^2+1^2+1^2}=\sqrt{3}\\<br />
	\|3\mathbf{w}_2\|&#038;=\sqrt{(-2)^2+3^2+1^2+1^2}=\sqrt{15}.<br />
	\end{align*}</p>
<hr />
<p>	Now it suffices to normalize the vectors $\mathbf{w}_1, 3\mathbf{w}_2$ to obtain an orthonormal basis.<br />
	Therefore, the set<br />
	\[\left\{\, \frac{1}{\sqrt{3}} \begin{bmatrix}<br />
		1 \\<br />
		0 \\<br />
		1 \\<br />
		1<br />
		\end{bmatrix}, \, \frac{1}{\sqrt{15}}\begin{bmatrix}<br />
		-2\\ 3\\ 1 \\1<br />
		\end{bmatrix}\,\right\}.\]
		is an orthonormal basis for $W$.</p>
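<p>(A numerical sketch of the whole computation, added by us. Stacking an orthonormal set as the columns of a matrix $Q$ gives $Q^T Q = I$.)</p>

```python
import numpy as np

v1 = np.array([1.0, 0.0, 1.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0, 1.0])

# Gram-Schmidt: w2 = v2 - (v1.v2 / v1.v1) v1, then normalize both vectors.
w1 = v1
w2 = v2 - (w1 @ v2) / (w1 @ w1) * w1
q1 = w1 / np.linalg.norm(w1)  # = v1 / sqrt(3)
q2 = w2 / np.linalg.norm(w2)  # = [-2, 3, 1, 1] / sqrt(15)

# The columns of Q form an orthonormal set exactly when Q^T Q is the identity.
Q = np.column_stack([q1, q2])
print(Q.T @ Q)
```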
<h2>Comment.</h2>
<p>This is one of the midterm 2 exam problems for Linear Algebra (Math 2568) in Autumn 2017.</p>
<p>One common mistake is just to normalize the vectors by dividing them by their length $\sqrt{3}$.<br />
The resulting vectors have length $1$, but they are not orthogonal.</p>
<p>Another mistake is that you just changed the numbers in the vectors so that they are orthogonal.<br />
The issue here is that if you change the numbers randomly, then the new vectors might no longer belong to the subspace $W$.</p>
<p>The point of the Gram-Schmidt orthogonalization is that the process converts any basis for $W$ to an orthogonal basis for $W$.<br />
The above solution didn&#8217;t use the full formula of the Gram-Schmidt orthogonalization. Of course, you may use the formula in the exam but you must remember it correctly.</p>
<h2>List of Midterm 2 Problems for Linear Algebra (Math 2568) in Autumn 2017</h2>
<ol>
<li><a href="//yutsumura.com/vector-space-of-2-by-2-traceless-matrices/" rel="noopener" target="_blank">Vector Space of 2 by 2 Traceless Matrices</a></li>
<li>Find an Orthonormal Basis of the Given Two Dimensional Vector Space ←The current problem</li>
<li><a href="//yutsumura.com/are-the-trigonometric-functions-sin2x-and-cos2x-linearly-independent/" rel="noopener" target="_blank">Are the Trigonometric Functions $\sin^2(x)$ and $\cos^2(x)$ Linearly Independent?</a></li>
<li><a href="//yutsumura.com/find-bases-for-the-null-space-range-and-the-row-space-of-a-5times-4-matrix/" rel="noopener" target="_blank">Find Bases for the Null Space, Range, and the Row Space of a $5\times 4$ Matrix</a></li>
<li><a href="//yutsumura.com/matrix-representation-rank-and-nullity-of-a-linear-transformation-tr2to-r3/" rel="noopener" target="_blank">Matrix Representation, Rank, and Nullity of a Linear Transformation $T:\R^2\to \R^3$</a></li>
<li><a href="//yutsumura.com/determine-the-dimension-of-a-mysterious-vector-space-from-coordinate-vectors/" rel="noopener" target="_blank">Determine the Dimension of a Mysterious Vector Space From Coordinate Vectors</a></li>
<li><a href="//yutsumura.com/find-a-basis-of-the-subspace-spanned-by-four-polynomials-of-degree-3-or-less/" rel="noopener" target="_blank">Find a Basis of the Subspace Spanned by Four Polynomials of Degree 3 or Less</a></li>
</ol>
<button class="simplefavorite-button has-count" data-postid="5251" data-siteid="1" data-groupid="1" data-favoritecount="48" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">48</span></button><p>The post <a href="https://yutsumura.com/find-an-orthonormal-basis-of-the-given-two-dimensional-vector-space/" target="_blank">Find an Orthonormal Basis of the Given Two Dimensional Vector Space</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/find-an-orthonormal-basis-of-the-given-two-dimensional-vector-space/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">5251</post-id>	</item>
		<item>
		<title>Find an Orthonormal Basis of $\R^3$ Containing a Given Vector</title>
		<link>https://yutsumura.com/find-an-orthonormal-basis-of-r3-containing-a-given-vector/</link>
				<comments>https://yutsumura.com/find-an-orthonormal-basis-of-r3-containing-a-given-vector/#respond</comments>
				<pubDate>Mon, 06 Nov 2017 16:25:00 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[basis for a vector space]]></category>
		<category><![CDATA[cross product]]></category>
		<category><![CDATA[Gram-Schmidt orthogonalization process]]></category>
		<category><![CDATA[Gram-Schmidt process]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[orthogonal basis]]></category>
		<category><![CDATA[orthonormal basis]]></category>
		<category><![CDATA[perpendicular vector]]></category>
		<category><![CDATA[subspace]]></category>
		<category><![CDATA[vector space]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=5232</guid>
				<description><![CDATA[<p>Let $\mathbf{v}_1=\begin{bmatrix} 2/3 \\ 2/3 \\ 1/3 \end{bmatrix}$ be a vector in $\R^3$. Find an orthonormal basis for $\R^3$ containing the vector $\mathbf{v}_1$. &#160; The first solution uses the Gram-Schmidt orthogonalization process. On the&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/find-an-orthonormal-basis-of-r3-containing-a-given-vector/" target="_blank">Find an Orthonormal Basis of $\R^3$ Containing a Given Vector</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 600</h2>
<p>Let $\mathbf{v}_1=\begin{bmatrix}<br />
2/3 \\ 2/3 \\ 1/3<br />
\end{bmatrix}$ be a vector in $\R^3$.</p>
<p>Find an orthonormal basis for $\R^3$ containing the vector $\mathbf{v}_1$.</p>
<p>&nbsp;<br />
<span id="more-5232"></span><br />

The first solution uses the Gram-Schmidt orthogonalization process.<br />
On the other hand, the second solution uses the cross product.</p>
<h2> Solution 1 (The Gram-Schmidt Orthogonalization) </h2>
<p>	First of all, note that the length of the vector $\mathbf{v}_1$ is $1$ as<br />
	\[\|\mathbf{v}_1\|=\sqrt{\left(\frac{2}{3}\right)^2+\left(\frac{2}{3}\right)^2+\left(\frac{1}{3}\right)^2}=1.\]
<hr />
<p>	We want to find two vectors $\mathbf{v}_2, \mathbf{v}_3$ such that $\{\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3\}$ is an orthonormal basis for $\R^3$.<br />
	The vectors $\mathbf{v}_2, \mathbf{v}_3$ must lie on the plane that is perpendicular to the vector $\mathbf{v}_1$.<br />
	So consider the subspace<br />
	\[W=\left \{\,\begin{bmatrix} x\\y\\z \end{bmatrix} \in \R^3 \quad \middle | \quad  \begin{bmatrix} x\\y\\z \end{bmatrix}  \cdot \begin{bmatrix}<br />
	2/3 \\ 2/3 \\ 1/3<br />
	\end{bmatrix}=0 \,\right\}\]
<p>	Note that $W$ consists of all vectors that are perpendicular to $\mathbf{v}_1$, hence $W$ is a plane that is perpendicular to $\mathbf{v}_1$.</p>
<hr />
<p>	The relation<br />
	\[ \begin{bmatrix} x\\y\\z \end{bmatrix}  \cdot \begin{bmatrix}<br />
	2/3 \\ 2/3 \\ 1/3<br />
	\end{bmatrix}=0\]
	can be written as<br />
	\[\frac{2}{3}x+\frac{2}{3}y+\frac{1}{3}z=0,\]
	or equivalently<br />
	\[z=-2x-2y.\]
	Hence the vectors in $W$ can be written as<br />
	\[\begin{bmatrix} x\\y\\z \end{bmatrix}=\begin{bmatrix} x\\y\\-2x-2y \end{bmatrix}=x\begin{bmatrix}1\\0\\-2\end{bmatrix}+y\begin{bmatrix}0\\1\\-2\end{bmatrix}.\]
<p>	It follows that<br />
	\[\left\{\,\begin{bmatrix}1\\0\\-2\end{bmatrix}, \begin{bmatrix}0\\1\\-2\end{bmatrix} \,\right\}\]
	is a basis for the subspace $W$. Let us call these vectors $\mathbf{u}_1, \mathbf{u}_2$, respectively.</p>
<hr />
<p>	We apply the Gram-Schmidt orthogonalization to this basis $\{\mathbf{u}_1, \mathbf{u}_2\}$ and obtain an orthogonal basis as follows.<br />
	We do not change the first vector: let $\mathbf{w}_1=\mathbf{u}_1$.</p>
<p>Next, we set<br />
	\[\mathbf{w}_2=\mathbf{u}_2+a\mathbf{u}_1\]
	for some scalar $a$.<br />
	To determine $a$, we compute<br />
	\begin{align*}<br />
		0&#038;=\mathbf{w}_1\cdot \mathbf{w}_2=\mathbf{u}_1\cdot(\mathbf{u}_2+a\mathbf{u}_1) \\<br />
		&#038;=\mathbf{u}_1\cdot\mathbf{u}_2+a\mathbf{u}_1\cdot\mathbf{u}_1\\<br />
		&#038;=(1\cdot 0+0\cdot 1+(-2)\cdot(-2))+a(1\cdot 1+ 0\cdot 0 +(-2)\cdot (-2))\\<br />
		&#038;=4+5a.<br />
		\end{align*}<br />
		Hence, $a=-4/5$ and we obtain<br />
	\begin{align*}<br />
		\mathbf{w}_2&#038;=\mathbf{u}_2-\frac{4}{5}\mathbf{u}_1\\[6pt]
		&#038;=\begin{bmatrix}0\\1\\-2\end{bmatrix} -\frac{4}{5}\begin{bmatrix}1\\0\\-2\end{bmatrix}.<br />
		\end{align*}<br />
	 (Note that you may use the Gram-Schmidt orthogonalization formula instead of the above method.)</p>
<hr />
<p>	 As scaling does not change the orthogonality, consider $5\mathbf{w}_2$ instead of $\mathbf{w}_2$ (to avoid fractions).<br />
	 We have<br />
	 \[5\mathbf{w}_2=\begin{bmatrix}0\\5\\-10\end{bmatrix} -4\begin{bmatrix}1\\0\\-2\end{bmatrix}=\begin{bmatrix}<br />
	 -4\\ 5\\-2<br />
	 \end{bmatrix}.\]
<p>	 Therefore, $\{\mathbf{w}_1, 5\mathbf{w}_2\}$ is an orthogonal basis of $W$.</p>
<hr />
<p>	 We obtain an orthonormal basis of $W$ by normalizing the length of these basis vectors.<br />
	 As<br />
	 \[\|\mathbf{w}_1\|=\sqrt{1^2+0^2+(-2)^2}=\sqrt{5}\]
	 and<br />
	\[\|5\mathbf{w}_2\|=\sqrt{(-4)^2+5^2+(-2)^2}=\sqrt{45}=3\sqrt{5},\]
	the vectors<br />
	\[\mathbf{v}_2:=\frac{\mathbf{w}_1}{\|\mathbf{w}_1\|}=\frac{1}{\sqrt{5}}\begin{bmatrix}1\\0\\-2\end{bmatrix}\]
	and<br />
	\[\mathbf{v}_3:=\frac{5\mathbf{w}_2}{\|5\mathbf{w}_2\|}=\frac{1}{3\sqrt{5}}\begin{bmatrix}-4\\5\\-2\end{bmatrix}\]
	form an orthonormal basis of $W$.<br />
	Note that as the vectors $\mathbf{v}_2, \mathbf{v}_3$ lie in $W$, they are still perpendicular to the vector $\mathbf{v}_1$.</p>
<hr />
<p>	It follows that $\{\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3\}$ is an orthonormal set in $\R^3$, thus it is an orthonormal basis for $\R^3$.</p>
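<p>(Added check, not in the original solution: the three vectors obtained above can be verified numerically to be orthonormal.)</p>

```python
import numpy as np

v1 = np.array([2.0, 2.0, 1.0]) / 3
v2 = np.array([1.0, 0.0, -2.0]) / np.sqrt(5)
v3 = np.array([-4.0, 5.0, -2.0]) / (3 * np.sqrt(5))

# An orthonormal basis stacked as columns gives Q with Q^T Q = I.
Q = np.column_stack([v1, v2, v3])
print(Q.T @ Q)
```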
<h2> Solution 2 (Cross Product) </h2>
<p>				Next, we solve the problem using the cross product.</p>
<p>			Let $\mathbf{v}_1=\frac{1}{3}\mathbf{u}_1$, where $\mathbf{u}_1=\begin{bmatrix}<br />
	  2 \\<br />
	   2 \\<br />
	    1<br />
	  \end{bmatrix}$.<br />
			Our first goal is to find the vectors $\mathbf{u}_2$ and $\mathbf{u}_3$ such that $\{\mathbf{u}_1,\mathbf{u}_2, \mathbf{u}_3\}$ is an orthogonal basis for $\R^3$.</p>
<hr />
<p>			Let $\mathbf{x}=\begin{bmatrix}<br />
	  x \\<br />
	   y \\<br />
	    z<br />
	  \end{bmatrix}$ be a vector that is perpendicular to $\mathbf{u}_1$.<br />
	  Then we have $\mathbf{x}\cdot \mathbf{u}_1=0$, and hence we have the relation<br />
	  \[2x+2y+z=0.\]
	  For example, the vector $\mathbf{u}_2:=\begin{bmatrix}<br />
	  1 \\<br />
	   0 \\<br />
	    -2<br />
	  \end{bmatrix}$ satisfies the relation, and hence $\mathbf{u}_2\cdot \mathbf{u}_1=0$.<br />
	  (So far, it is not so different from Solution 1.)</p>
<hr />
<p>	Now, let us define the third vector $\mathbf{u}_3$ to be the cross product of $\mathbf{u}_1$ and $\mathbf{u}_2$:<br />
	\[\mathbf{u}_3:=\mathbf{u}_1\times \mathbf{u}_2=\begin{bmatrix}<br />
	  2 \\<br />
	   2 \\<br />
	    1<br />
	  \end{bmatrix}\times \begin{bmatrix}<br />
	  1 \\<br />
	   0 \\<br />
	    -2<br />
	  \end{bmatrix}=\begin{bmatrix}<br />
	  -4 \\<br />
	   5 \\<br />
	    -2<br />
	  \end{bmatrix}.\]
	By the property of the cross product, the vector $\mathbf{u}_3$ is perpendicular to both $\mathbf{u}_1, \mathbf{u}_2$.</p>
<p>	Therefore, the set<br />
	\[\{\mathbf{u}_1, \mathbf{u}_2, \mathbf{u}_3\}=\left\{\,\begin{bmatrix}<br />
	  2 \\<br />
	   2 \\<br />
	    1<br />
	  \end{bmatrix}, \begin{bmatrix}<br />
	  1 \\<br />
	   0 \\<br />
	    -2<br />
	  \end{bmatrix}, \begin{bmatrix}<br />
	  -4 \\<br />
	   5 \\<br />
	    -2<br />
	  \end{bmatrix} \,\right\}\]
	  is an orthogonal basis for $\R^3$ as it consists of three nonzero orthogonal vectors.</p>
<hr />
<p>	  Finally, to obtain an orthonormal basis for $\R^3$, we just need to normalize the lengths of these vectors.<br />
	  The lengths are<br />
	  \begin{align*}<br />
	\|\mathbf{u}_1\|&#038;=\sqrt{2^2+2^2+1^2}=\sqrt{9}=3  	\\<br />
	\|\mathbf{u}_2\|&#038;=\sqrt{1^2+0^2+(-2)^2}=\sqrt{5}\\<br />
	\|\mathbf{u}_3\|&#038;=\sqrt{(-4)^2+5^2+(-2)^2}=\sqrt{45}=3\sqrt{5}.<br />
	  \end{align*}<br />
	  Then we have $\mathbf{v}_1=\frac{\mathbf{u}_1}{\|\mathbf{u}_1\|}$.<br />
	  Let $\mathbf{v}_2:=\frac{\mathbf{u}_2}{\|\mathbf{u}_2\|}$ and $\mathbf{v}_3:=\frac{\mathbf{u}_3}{\|\mathbf{u}_3\|}$.</p>
<p>	It follows that the set<br />
	\[\{\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3\}=\left\{\, \frac{1}{3}\begin{bmatrix}<br />
	  2 \\<br />
	   2 \\<br />
	    1<br />
	  \end{bmatrix}, \, \frac{1}{\sqrt{5}}\begin{bmatrix}1\\0\\-2\end{bmatrix}, \, \frac{1}{3\sqrt{5}}\begin{bmatrix}-4\\5\\-2\end{bmatrix}  \,\right\}\]
	  is an orthonormal basis for $\R^3$.</p>
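<p>(The cross-product step can be checked with NumPy; this sketch is our addition.)</p>

```python
import numpy as np

u1 = np.array([2.0, 2.0, 1.0])
u2 = np.array([1.0, 0.0, -2.0])

# The cross product u1 x u2 is perpendicular to both of its factors.
u3 = np.cross(u1, u2)  # = [-4, 5, -2]
print(u3, u1 @ u3, u2 @ u3)
```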
<button class="simplefavorite-button has-count" data-postid="5232" data-siteid="1" data-groupid="1" data-favoritecount="37" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">37</span></button><p>The post <a href="https://yutsumura.com/find-an-orthonormal-basis-of-r3-containing-a-given-vector/" target="_blank">Find an Orthonormal Basis of $\R^3$ Containing a Given Vector</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/find-an-orthonormal-basis-of-r3-containing-a-given-vector/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">5232</post-id>	</item>
		<item>
		<title>The Inner Product on $\R^2$ induced by a Positive Definite Matrix and Gram-Schmidt Orthogonalization</title>
		<link>https://yutsumura.com/the-inner-product-on-r2-induced-by-a-positive-definite-matrix-and-gram-schmidt-orthogonalization/</link>
				<comments>https://yutsumura.com/the-inner-product-on-r2-induced-by-a-positive-definite-matrix-and-gram-schmidt-orthogonalization/#comments</comments>
				<pubDate>Wed, 16 Aug 2017 21:55:59 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[basis of a vector space]]></category>
		<category><![CDATA[Gram-Schmidt orthogonalization process]]></category>
		<category><![CDATA[Gram-Schmidt process]]></category>
		<category><![CDATA[inner product]]></category>
		<category><![CDATA[inner product space]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[orthogonal basis]]></category>
		<category><![CDATA[positive definite matrix]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=4653</guid>
				<description><![CDATA[<p>Consider the $2\times 2$ real matrix \[A=\begin{bmatrix} 1 &#038; 1\\ 1&#038; 3 \end{bmatrix}.\] (a) Prove that the matrix $A$ is positive definite. (b) Since $A$ is positive definite by part (a), the formula \[\langle&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/the-inner-product-on-r2-induced-by-a-positive-definite-matrix-and-gram-schmidt-orthogonalization/" target="_blank">The Inner Product on $\R^2$ induced by a Positive Definite Matrix and Gram-Schmidt Orthogonalization</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 539</h2>
<p>		   Consider the $2\times 2$ real matrix<br />
		   \[A=\begin{bmatrix}<br />
		  1 &#038; 1\\<br />
		  1&#038; 3<br />
		\end{bmatrix}.\]
<p><strong>(a)</strong> Prove that the matrix $A$ is positive definite.</p>
<p><strong>(b)</strong> Since $A$ is positive definite by part (a), the formula<br />
		\[\langle \mathbf{x}, \mathbf{y}\rangle:=\mathbf{x}^{\trans} A \mathbf{y}\]
		for $\mathbf{x}, \mathbf{y} \in \R^2$ defines an inner product on $\R^2$.<br />
		Consider $\R^2$ as an inner product space with this inner product.</p>
<p>		Prove that the unit vectors<br />
		\[\mathbf{e}_1=\begin{bmatrix}<br />
		  1 \\<br />
		  0<br />
		\end{bmatrix} \text{ and } \mathbf{e}_2=\begin{bmatrix}<br />
		  0 \\<br />
		  1<br />
		\end{bmatrix}\]
		are not orthogonal in the inner product space $\R^2$.</p>
<p><strong>(c)</strong> Find an orthogonal basis $\{\mathbf{v}_1, \mathbf{v}_2\}$ of $\R^2$ from the basis $\{\mathbf{e}_1, \mathbf{e}_2\}$ using the Gram-Schmidt orthogonalization process.</p>
<p>&nbsp;<br />
<span id="more-4653"></span><br />

<h2> Proof. </h2>
<h3>(a) Prove that the matrix $A$ is positive definite.</h3>
<p> We prove that for every nonzero vector $\mathbf{x}=\begin{bmatrix}<br />
		  x \\<br />
		  y<br />
		\end{bmatrix}\in \R^2$, we have $\mathbf{x}^{\trans} A \mathbf{x} > 0$.<br />
		We have<br />
		\begin{align*}<br />
		\mathbf{x}^{\trans} A \mathbf{x}&#038;=\begin{bmatrix}<br />
		  x &#038; y<br />
		\end{bmatrix} \begin{bmatrix}<br />
		  1 &#038; 1\\<br />
		  1&#038; 3<br />
		\end{bmatrix}\begin{bmatrix}<br />
		  x \\<br />
		  y<br />
		\end{bmatrix}=\begin{bmatrix}<br />
		  x &#038; y<br />
		\end{bmatrix}\begin{bmatrix}<br />
		  x+y \\<br />
		  x+3y<br />
		\end{bmatrix}\\[6pt]
		&#038;=x(x+y)+y(x+3y)=x^2+2xy+3y^2\\<br />
		&#038;=x^2+2xy+y^2+2y^2=(x+y)^2+2y^2.<br />
		\end{align*}</p>
<p>		Since $\mathbf{x}\neq \mathbf{0}$, at least one of $x, y$ is nonzero.<br />
		Thus the last expression is always positive.<br />
		Hence $A$ is a positive definite matrix.</p>
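<p>(Not part of the original post: the positive-definiteness of $A$ can also be checked numerically. The sketch below, assuming NumPy is available, uses Sylvester's criterion and the eigenvalue test rather than the completing-the-square argument above.)</p>

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 3.0]])

# Sylvester's criterion: a symmetric matrix is positive definite
# exactly when all of its leading principal minors are positive.
minors = [np.linalg.det(A[:k, :k]) for k in range(1, 3)]
print(minors)  # both minors (1 and 2) are positive

# Equivalent test: every eigenvalue of the symmetric matrix is positive.
print(np.all(np.linalg.eigvalsh(A) > 0))  # True
```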
<h3>(b) Prove that $\mathbf{e}_1, \mathbf{e}_2$ are not orthogonal in the inner product space $\R^2$.</h3>
<p> Note that by post &#8220;<a href="//yutsumura.com/a-symmetric-positive-definite-matrix-and-an-inner-product-on-a-vector-space/" target="_blank">A Symmetric Positive Definite Matrix and An Inner Product on a Vector Space</a>&#8220;, the formula $\langle \mathbf{x}, \mathbf{y}\rangle$ defines an inner product on $\R^2$.</p>
<p>		Two vectors $\mathbf{x}$ and $\mathbf{y}$ are said to be <strong>orthogonal</strong> if $\langle \mathbf{x}, \mathbf{y}\rangle=0$.</p>
<p>		The vectors $\mathbf{e}_1, \mathbf{e}_2$ are not orthogonal with this inner product since<br />
		\begin{align*}<br />
		\langle \mathbf{e}_1, \mathbf{e}_2\rangle=\begin{bmatrix}<br />
		  1 &#038; 0<br />
		\end{bmatrix}\begin{bmatrix}<br />
		  1 &#038; 1\\<br />
		  1&#038; 3<br />
		\end{bmatrix}\begin{bmatrix}<br />
		  0 \\<br />
		  1<br />
		\end{bmatrix}=\begin{bmatrix}<br />
		  1 &#038; 0<br />
		\end{bmatrix}\begin{bmatrix}<br />
		  1 \\<br />
		  3<br />
		\end{bmatrix}=1\neq 0.<br />
		\end{align*}</p>
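<p>(As a numerical sanity check, not in the original post: the inner product $\langle \mathbf{e}_1, \mathbf{e}_2\rangle=\mathbf{e}_1^{\trans} A \mathbf{e}_2$ can be evaluated directly with NumPy.)</p>

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 3.0]])
e1 = np.array([1.0, 0.0])
e2 = np.array([0.0, 1.0])

# The inner product induced by the positive definite matrix A:
# <x, y> := x^T A y
inner = lambda x, y: x @ A @ y
print(inner(e1, e2))  # 1.0, which is nonzero, so e1 and e2 are not orthogonal
```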
<h3>(c) Find an orthogonal basis using the Gram-Schmidt orthogonalization process.</h3>
<p>By the Gram-Schmidt orthogonalization process, we have<br />
		\begin{align*}<br />
		\mathbf{v}_1&#038;=\mathbf{e}_1\\<br />
		\mathbf{v}_2&#038;=\mathbf{e}_2-\frac{\langle \mathbf{v}_1, \mathbf{e}_2 \rangle}{\langle \mathbf{v}_1, \mathbf{v}_1 \rangle}\mathbf{v}_1<br />
		=\mathbf{e}_2-\frac{\langle \mathbf{e}_1, \mathbf{e}_2 \rangle}{\langle \mathbf{e}_1, \mathbf{e}_1 \rangle}\mathbf{e}_1.<br />
		\end{align*}</p>
<p>		We compute<br />
		\begin{align*}<br />
		\langle \mathbf{e}_1, \mathbf{e}_1 \rangle=\begin{bmatrix}<br />
		  1 &#038; 0<br />
		\end{bmatrix}\begin{bmatrix}<br />
		  1 &#038; 1\\<br />
		  1&#038; 3<br />
		\end{bmatrix}\begin{bmatrix}<br />
		  1 \\<br />
		  0<br />
		\end{bmatrix}=\begin{bmatrix}<br />
		  1 &#038; 0<br />
		\end{bmatrix}\begin{bmatrix}<br />
		  1 \\<br />
		  1<br />
		\end{bmatrix}=1.<br />
		\end{align*}<br />
		We also have $\langle \mathbf{e}_1, \mathbf{e}_2\rangle=1$ from part (b).<br />
		Thus, we have<br />
		\begin{align*}<br />
		\mathbf{v}_2=\mathbf{e}_2-\mathbf{e}_1=\begin{bmatrix}<br />
		  -1 \\<br />
		  1<br />
		\end{bmatrix}.<br />
		\end{align*}<br />
		Therefore, the Gram-Schmidt orthogonalization process yields the orthogonal basis<br />
		\[\mathbf{v}_1=\begin{bmatrix}<br />
		  1 \\<br />
		  0<br />
		\end{bmatrix}, \mathbf{v}_2=\begin{bmatrix}<br />
		  -1 \\<br />
		  1<br />
		\end{bmatrix}.\]
<h4>Double Check</h4>
<p>		Let us verify that $\mathbf{v}_1, \mathbf{v}_2$ are orthogonal by computing their inner product directly as follows.<br />
		We have<br />
		\begin{align*}<br />
		\langle \mathbf{v}_1, \mathbf{v}_2\rangle=\mathbf{v}_1^{\trans} A\mathbf{v}_2=\begin{bmatrix}<br />
		  1 &#038; 0<br />
		\end{bmatrix}\begin{bmatrix}<br />
		  1 &#038; 1\\<br />
		  1&#038; 3<br />
		\end{bmatrix}\begin{bmatrix}<br />
		  -1 \\<br />
		  1<br />
		\end{bmatrix}=\begin{bmatrix}<br />
		  1 &#038; 0<br />
		\end{bmatrix}\begin{bmatrix}<br />
		  0 \\<br />
		  2<br />
		\end{bmatrix}=0.<br />
		\end{align*}</p>
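<p>(A quick NumPy sketch, not part of the original post, reproducing the Gram-Schmidt step with respect to the inner product $\langle \mathbf{x}, \mathbf{y}\rangle=\mathbf{x}^{\trans} A \mathbf{y}$ and verifying the orthogonality of the result:)</p>

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 3.0]])
inner = lambda x, y: x @ A @ y  # the inner product induced by A

e1 = np.array([1.0, 0.0])
e2 = np.array([0.0, 1.0])

# Gram-Schmidt with respect to <x, y> = x^T A y
v1 = e1
v2 = e2 - (inner(v1, e2) / inner(v1, v1)) * v1
print(v2)             # [-1.  1.]
print(inner(v1, v2))  # 0.0, so {v1, v2} is orthogonal in this inner product
```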
<p>The post <a href="https://yutsumura.com/the-inner-product-on-r2-induced-by-a-positive-definite-matrix-and-gram-schmidt-orthogonalization/" target="_blank">The Inner Product on $\R^2$ induced by a Positive Definite Matrix and Gram-Schmidt Orthogonalization</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/the-inner-product-on-r2-induced-by-a-positive-definite-matrix-and-gram-schmidt-orthogonalization/feed/</wfw:commentRss>
		<slash:comments>1</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">4653</post-id>	</item>
		<item>
		<title>Quiz 10. Find Orthogonal Basis / Find Value of Linear Transformation</title>
		<link>https://yutsumura.com/quiz-10-find-orthogonal-basis-find-value-of-linear-transformation/</link>
				<comments>https://yutsumura.com/quiz-10-find-orthogonal-basis-find-value-of-linear-transformation/#comments</comments>
				<pubDate>Wed, 29 Mar 2017 20:51:26 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[basis]]></category>
		<category><![CDATA[Gram-Schmidt process]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[linear transformation]]></category>
		<category><![CDATA[Ohio State]]></category>
		<category><![CDATA[Ohio State.LA]]></category>
		<category><![CDATA[orthogonal basis]]></category>
		<category><![CDATA[span]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=2546</guid>
				<description><![CDATA[<p>(a) Let $S=\{\mathbf{v}_1, \mathbf{v}_2\}$ be the set of the following vectors in $\R^4$. \[\mathbf{v}_1=\begin{bmatrix} 1 \\ 0 \\ 1 \\ 0 \end{bmatrix} \text{ and } \mathbf{v}_2=\begin{bmatrix} 0 \\ 1 \\ 1 \\ 0 \end{bmatrix}.\]&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/quiz-10-find-orthogonal-basis-find-value-of-linear-transformation/" target="_blank">Quiz 10. Find Orthogonal Basis / Find Value of Linear Transformation</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 356</h2>
<p>	<strong>(a)</strong> Let $S=\{\mathbf{v}_1, \mathbf{v}_2\}$ be the set of the following vectors in $\R^4$.<br />
	\[\mathbf{v}_1=\begin{bmatrix}<br />
	  1 \\<br />
	   0 \\<br />
	    1 \\<br />
	   0<br />
	   \end{bmatrix} \text{ and } \mathbf{v}_2=\begin{bmatrix}<br />
	  0 \\<br />
	   1 \\<br />
	    1 \\<br />
	   0<br />
	   \end{bmatrix}.\]
	   Find an orthogonal basis of the subspace $\Span(S)$ of $\R^4$.</p>
<p>	&nbsp;<br />
	<strong>(b)</strong> Let $T:\R^2 \to \R^3$ be a linear transformation such that<br />
	\[T(\mathbf{e}_1)=\mathbf{u}_1 \text{ and } T(\mathbf{e}_2)=\mathbf{u}_2,\]
	where $\{\mathbf{e}_1, \mathbf{e}_2\}$ is the set of standard unit vectors of $\R^2$ and<br />
	\[\mathbf{u}_1=\begin{bmatrix}<br />
	  5 \\<br />
	   1 \\<br />
	    2<br />
	  \end{bmatrix} \text{ and } \mathbf{u}_2=\begin{bmatrix}<br />
	  8 \\<br />
	   2 \\<br />
	    6<br />
	  \end{bmatrix}.\]
	  Then find<br />
	  \[T\left(\,  \begin{bmatrix}<br />
	  3 \\<br />
	  -2<br />
	\end{bmatrix} \,\right).\]
<p>&nbsp;<br />
<span id="more-2546"></span><br />

<h2>(a) Solution 1. (Using the Gram-Schmidt process)</h2>
<p>	   It is straightforward to check that the vectors $\mathbf{v}_1, \mathbf{v}_2$ are linearly independent, and hence the set $S$ is a basis of $\Span(S)$.<br />
	   Since the dot (inner) product of $\mathbf{v}_1$ and $\mathbf{v}_2$ is<br />
	   \[\mathbf{v_1}\cdot \mathbf{v}_2=1\neq 0,\]
	   $S$ is not an orthogonal basis. We apply the Gram-Schmidt process to generate an orthogonal basis from the basis $S$.</p>
<p>	   The Gram-Schmidt process for two vectors is as follows. We define vectors $\mathbf{u}_1, \mathbf{u}_2$ by the following formula. Then $B=\{\mathbf{u}_1, \mathbf{u_2} \}$ is an orthogonal basis of $\Span(S)$.<br />
	   \begin{align*}<br />
	\mathbf{u}_1&#038;:=\mathbf{v}_1\\[6pt]
	\mathbf{u}_2&#038;:=\mathbf{v}_2-\frac{\mathbf{u}_1\cdot \mathbf{v}_2}{\mathbf{u}_1\cdot \mathbf{u}_1} \mathbf{u}_1. \tag{*}<br />
	\end{align*}<br />
	Since we have<br />
	\begin{align*}<br />
	\mathbf{u}_1 \cdot \mathbf{v}_2=\mathbf{v}_1\cdot \mathbf{v}_2=1 \text{ and }<br />
	\mathbf{u}_1\cdot \mathbf{u}_1=\mathbf{v}_1\cdot \mathbf{v}_1=2,<br />
	\end{align*}<br />
	we compute<br />
	\begin{align*}<br />
	\mathbf{u}_2&#038;=\mathbf{v}_2-\frac{1}{2}\mathbf{u}_1\\[6pt]
	&#038;=\begin{bmatrix}<br />
	  0 \\<br />
	   1 \\<br />
	    1 \\<br />
	   0<br />
	   \end{bmatrix}-\frac{1}{2}<br />
	   \begin{bmatrix}<br />
	  1 \\<br />
	   0 \\<br />
	    1 \\<br />
	   0<br />
	   \end{bmatrix}\\[6pt]
	   &#038;=\begin{bmatrix}<br />
	  -1/2\\<br />
	   1 \\<br />
	    1/2 \\<br />
	   0<br />
	   \end{bmatrix}=\frac{1}{2}\begin{bmatrix}<br />
	  -1 \\<br />
	   2 \\<br />
	    1 \\<br />
	   0<br />
	   \end{bmatrix}.<br />
	\end{align*}<br />
	Therefore the set<br />
	\[\left\{\,  \begin{bmatrix}<br />
	  1 \\<br />
	   0 \\<br />
	    1 \\<br />
	   0<br />
	   \end{bmatrix},  \begin{bmatrix}<br />
	  -1/2\\<br />
	   1 \\<br />
	    1/2 \\<br />
	   0<br />
	   \end{bmatrix}\,\right\}\]
	   is an orthogonal basis of $\Span(S)$.<br />
	   Since scaling by a nonzero scalar does not change orthogonality, the set<br />
	   \[\left\{\,  \begin{bmatrix}<br />
	  1 \\<br />
	   0 \\<br />
	    1 \\<br />
	   0<br />
	   \end{bmatrix},  \begin{bmatrix}<br />
	  -1\\<br />
	   2 \\<br />
	    1 \\<br />
	   0<br />
	   \end{bmatrix}\,\right\}\]
	   is also an orthogonal basis of $\Span(S)$, in case you prefer to avoid fractions.</p>
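<p>(The computation in Solution 1 can be checked numerically; the following NumPy sketch, not part of the original quiz solution, applies the same Gram-Schmidt step with the standard dot product.)</p>

```python
import numpy as np

v1 = np.array([1.0, 0.0, 1.0, 0.0])
v2 = np.array([0.0, 1.0, 1.0, 0.0])

# Standard Gram-Schmidt step (*) with the dot product
u1 = v1
u2 = v2 - (u1 @ v2) / (u1 @ u1) * u1
print(u2)       # [-0.5  1.   0.5  0. ]
print(u1 @ u2)  # 0.0, confirming {u1, u2} is orthogonal
```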
<h2>(a) Solution 2. (Using a pattern of the Gram-Schmidt process)</h2>
<p>		Here is another solution that uses only partial information about the Gram-Schmidt process.<br />
		As in Solution 1, the set $S$ is a (non-orthogonal) basis of $\Span(S)$.<br />
		We want to apply the Gram-Schmidt process, but suppose we only remember its general pattern. Namely, we want to define orthogonal vectors $\mathbf{u}_1, \mathbf{u}_2$ by<br />
		\begin{align*}<br />
	\mathbf{u}_1&#038;:=\mathbf{v}_1\\[6pt]
	\mathbf{u}_2&#038;:=\mathbf{v}_2+a \mathbf{u}_1.<br />
	\end{align*}<br />
	Here $a$ is some number, which is given by the Gram-Schmidt formula (*), but suppose we don&#8217;t remember it.<br />
	We can still determine the number $a$ as follows. Since $\mathbf{u}_1$ and $\mathbf{u_2}$ will be orthogonal, we have<br />
	\begin{align*}<br />
	0&#038;=\mathbf{u}_1\cdot \mathbf{u}_2=\mathbf{u}_1 \cdot (\mathbf{v}_2+a\mathbf{u}_1)\\<br />
	&#038;=\mathbf{u}_1\cdot \mathbf{v}_2+a\mathbf{u}_1 \cdot \mathbf{u}_1\\<br />
	&#038;=1+2a.<br />
	\end{align*}<br />
		Hence, we obtain $a=-1/2$.<br />
		Then we determine<br />
		\begin{align*}<br />
	\mathbf{u}_2&#038;=\mathbf{v}_2-\frac{1}{2}\mathbf{u}_1\\[6pt]
	&#038;=\begin{bmatrix}<br />
	  0 \\<br />
	   1 \\<br />
	    1 \\<br />
	   0<br />
	   \end{bmatrix}-\frac{1}{2}<br />
	   \begin{bmatrix}<br />
	  1 \\<br />
	   0 \\<br />
	    1 \\<br />
	   0<br />
	   \end{bmatrix}\\[6pt]
	   &#038;=\begin{bmatrix}<br />
	  -1/2\\<br />
	   1 \\<br />
	    1/2 \\<br />
	   0<br />
	   \end{bmatrix}=\frac{1}{2}\begin{bmatrix}<br />
	  -1 \\<br />
	   2 \\<br />
	    1 \\<br />
	   0<br />
	   \end{bmatrix}.<br />
	\end{align*}<br />
	(So we could complete the Gram-Schmidt process even though we didn&#8217;t remember the details.)<br />
	Hence the set<br />
	\[\left\{\,  \begin{bmatrix}<br />
	  1 \\<br />
	   0 \\<br />
	    1 \\<br />
	   0<br />
	   \end{bmatrix},  \begin{bmatrix}<br />
	  -1/2\\<br />
	   1 \\<br />
	    1/2 \\<br />
	   0<br />
	   \end{bmatrix}\,\right\}\]
	   is an orthogonal basis of $\Span(S)$.</p>
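<p>(The coefficient-solving idea of Solution 2 translates directly to code; the NumPy sketch below, not part of the original solution, determines $a$ from the orthogonality condition alone.)</p>

```python
import numpy as np

v1 = np.array([1.0, 0.0, 1.0, 0.0])
v2 = np.array([0.0, 1.0, 1.0, 0.0])

# Solve 0 = u1 . (v2 + a*u1) = u1 . v2 + a*(u1 . u1) for the unknown a
u1 = v1
a = -(u1 @ v2) / (u1 @ u1)
u2 = v2 + a * u1
print(a)        # -0.5, matching the value found in Solution 2
print(u1 @ u2)  # 0.0
```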
<h2>(b) Solution.</h2>
<p>		We first express the vector $\begin{bmatrix}<br />
	  3 \\<br />
	  -2<br />
	\end{bmatrix}$ as the linear combination<br />
	\[\begin{bmatrix}<br />
	  3 \\<br />
	  -2<br />
	\end{bmatrix}=3\begin{bmatrix}<br />
	  1 \\<br />
	  0<br />
	\end{bmatrix}-2\begin{bmatrix}<br />
	  0 \\<br />
	  1<br />
	\end{bmatrix}=3\mathbf{e}_1-2\mathbf{e}_2.\]
	Then we compute<br />
	\begin{align*}<br />
	T\left(\,  \begin{bmatrix}<br />
	  3 \\<br />
	  -2<br />
	\end{bmatrix} \,\right)&#038;=T(3\mathbf{e}_1-2\mathbf{e}_2)\\<br />
	&#038;=3T(\mathbf{e}_1)-2T(\mathbf{e}_2) &#038;&#038; \text{ by linearity of $T$}\\[6pt]
	&#038;=3\begin{bmatrix}<br />
	  5 \\<br />
	   1 \\<br />
	    2<br />
	  \end{bmatrix}-2\begin{bmatrix}<br />
	  8 \\<br />
	   2 \\<br />
	    6<br />
	  \end{bmatrix}\\[6pt]
	  &#038;=\begin{bmatrix}<br />
	  15 \\<br />
	   3 \\<br />
	    6<br />
	  \end{bmatrix}-\begin{bmatrix}<br />
	  16 \\<br />
	   4 \\<br />
	    12<br />
	  \end{bmatrix}=\begin{bmatrix}<br />
	  -1 \\<br />
	   -1 \\<br />
	    -6<br />
	  \end{bmatrix}.<br />
	  \end{align*}</p>
<p>	  Therefore we have found<br />
	  \[T\left(\,  \begin{bmatrix}<br />
	  3 \\<br />
	  -2<br />
	\end{bmatrix} \,\right)<br />
	=\begin{bmatrix}<br />
	  -1 \\<br />
	   -1 \\<br />
	    -6<br />
	  \end{bmatrix}.\]
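<p>(Numerically, $T$ is represented in the standard basis by the matrix whose columns are $T(\mathbf{e}_1)$ and $T(\mathbf{e}_2)$; the NumPy sketch below, not part of the original solution, checks the result.)</p>

```python
import numpy as np

# Columns of the matrix of T are T(e1) = u1 and T(e2) = u2
u1 = np.array([5.0, 1.0, 2.0])
u2 = np.array([8.0, 2.0, 6.0])
M = np.column_stack([u1, u2])

x = np.array([3.0, -2.0])
print(M @ x)  # [-1. -1. -6.], agreeing with the computation above
```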
<h2>Comment.</h2>
<p>These are Quiz 10 problems for Math 2568 (Introduction to Linear Algebra) at OSU in Spring 2017.</p>
<h3>List of Quiz Problems of Linear Algebra (Math 2568) at OSU in Spring 2017</h3>
<p>There were 13 weekly quizzes. Here is the list of links to the quiz problems and solutions.</p>
<ul>
<li><a href="//yutsumura.com/quiz-1-gauss-jordan-elimination-homogeneous-system-math-2568-spring-2017/" target="_blank">Quiz 1. Gauss-Jordan elimination / homogeneous system. </a></li>
<li><a href="//yutsumura.com/quiz-2-the-vector-form-for-the-general-solution-transpose-matrices-math-2568-spring-2017/" target="_blank">Quiz 2. The vector form for the general solution / Transpose matrices. </a></li>
<li><a href="//yutsumura.com/quiz-3-condition-that-vectors-are-linearly-dependent-orthogonal-vectors-are-linearly-independent/" target="_blank">Quiz 3. Condition that vectors are linearly dependent/ orthogonal vectors are linearly independent</a></li>
<li><a href="//yutsumura.com/quiz-4-inverse-matrix-nonsingular-matrix-satisfying-a-relation/" target="_blank">Quiz 4. Inverse matrix/ Nonsingular matrix satisfying a relation</a></li>
<li><a href="//yutsumura.com/quiz-5-example-and-non-example-of-subspaces-in-3-dimensional-space/" target="_blank">Quiz 5. Example and non-example of subspaces in 3-dimensional space</a></li>
<li><a href="//yutsumura.com/quiz-6-determine-vectors-in-null-space-range-find-a-basis-of-null-space/" target="_blank">Quiz 6. Determine vectors in null space, range / Find a basis of null space</a></li>
<li><a href="//yutsumura.com/quiz-7-find-a-basis-of-the-range-rank-and-nullity-of-a-matrix/" target="_blank">Quiz 7. Find a basis of the range, rank, and nullity of a matrix</a></li>
<li><a href="//yutsumura.com/quiz-8-determine-subsets-are-subspaces-functions-taking-integer-values-set-of-skew-symmetric-matrices/" target="_blank">Quiz 8. Determine subsets are subspaces: functions taking integer values / set of skew-symmetric matrices</a></li>
<li><a href="//yutsumura.com/quiz-9-find-a-basis-of-the-subspace-spanned-by-four-matrices/" target="_blank">Quiz 9. Find a basis of the subspace spanned by four matrices</a></li>
<li><a href="//yutsumura.com/quiz-10-find-orthogonal-basis-find-value-of-linear-transformation/" target="_blank">Quiz 10. Find orthogonal basis / Find value of linear transformation</a></li>
<li><a href="//yutsumura.com/quiz-11-find-eigenvalues-and-eigenvectors-properties-of-determinants/" target="_blank">Quiz 11. Find eigenvalues and eigenvectors/ Properties of determinants</a></li>
<li><a href="//yutsumura.com/quiz-12-find-eigenvalues-and-their-algebraic-and-geometric-multiplicities/" target="_blank">Quiz 12. Find eigenvalues and their algebraic and geometric multiplicities</a></li>
<li><a href="//yutsumura.com/quiz-13-part-1-diagonalize-a-matrix/" target="_blank">Quiz 13 (Part 1). Diagonalize a matrix.</a></li>
<li><a href="//yutsumura.com/quiz-13-part-2-find-eigenvalues-and-eigenvectors-of-a-special-matrix/" target="_blank">Quiz 13 (Part 2). Find eigenvalues and eigenvectors of a special matrix</a></li>
</ul>
<p>The post <a href="https://yutsumura.com/quiz-10-find-orthogonal-basis-find-value-of-linear-transformation/" target="_blank">Quiz 10. Find Orthogonal Basis / Find Value of Linear Transformation</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/quiz-10-find-orthogonal-basis-find-value-of-linear-transformation/feed/</wfw:commentRss>
		<slash:comments>7</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">2546</post-id>	</item>
	</channel>
</rss>
