<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	
	xmlns:georss="http://www.georss.org/georss"
	xmlns:geo="http://www.w3.org/2003/01/geo/wgs84_pos#"
	>

<channel>
	<title>inner product space &#8211; Problems in Mathematics</title>
	<atom:link href="https://yutsumura.com/tag/inner-product-space/feed/" rel="self" type="application/rss+xml" />
	<link>https://yutsumura.com</link>
	<description></description>
	<lastBuildDate>Thu, 26 Oct 2017 21:38:28 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=5.3.4</generator>

<image>
	<url>https://i2.wp.com/yutsumura.com/wp-content/uploads/2016/12/cropped-question-logo.jpg?fit=32%2C32&#038;ssl=1</url>
	<title>inner product space &#8211; Problems in Mathematics</title>
	<link>https://yutsumura.com</link>
	<width>32</width>
	<height>32</height>
</image> 
<site xmlns="com-wordpress:feed-additions:1">114989322</site>	<item>
		<title>An Orthogonal Transformation from $\R^n$ to $\R^n$ is an Isomorphism</title>
		<link>https://yutsumura.com/an-orthogonal-transformation-from-rn-to-rn-is-an-isomorphism/</link>
				<comments>https://yutsumura.com/an-orthogonal-transformation-from-rn-to-rn-is-an-isomorphism/#respond</comments>
				<pubDate>Wed, 25 Oct 2017 05:47:24 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[dot product]]></category>
		<category><![CDATA[Euclidean space]]></category>
		<category><![CDATA[injective linear transformation]]></category>
		<category><![CDATA[inner product space]]></category>
		<category><![CDATA[isomorphism]]></category>
		<category><![CDATA[isomorphism of vector spaces]]></category>
		<category><![CDATA[kernel of]]></category>
		<category><![CDATA[linear transformation]]></category>
		<category><![CDATA[null space of a linear transformation]]></category>
		<category><![CDATA[orthogonal transformation]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=5163</guid>
				<description><![CDATA[<p>Let $\R^n$ be an inner product space with inner product $\langle \mathbf{x}, \mathbf{y}\rangle=\mathbf{x}^{\trans}\mathbf{y}$ for $\mathbf{x}, \mathbf{y}\in \R^n$. A linear transformation $T:\R^n \to \R^n$ is called an orthogonal transformation if for all $\mathbf{x}, \mathbf{y}\in \R^n$, it&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/an-orthogonal-transformation-from-rn-to-rn-is-an-isomorphism/" target="_blank">An Orthogonal Transformation from $\R^n$ to $\R^n$ is an Isomorphism</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 592</h2>
<p>Let $\R^n$ be an inner product space with inner product $\langle \mathbf{x}, \mathbf{y}\rangle=\mathbf{x}^{\trans}\mathbf{y}$ for $\mathbf{x}, \mathbf{y}\in \R^n$.</p>
<p>	 A linear transformation $T:\R^n \to \R^n$ is called an <strong>orthogonal transformation</strong> if for all $\mathbf{x}, \mathbf{y}\in \R^n$, it satisfies<br />
	 \[\langle T(\mathbf{x}), T(\mathbf{y})\rangle=\langle\mathbf{x}, \mathbf{y} \rangle.\]
<p>	 Prove that if $T:\R^n\to \R^n$ is an orthogonal transformation, then $T$ is an isomorphism.</p>
<p>We give two proofs.<br />
The second one uses a fact about the injectivity of linear transformations.</p>
<h2>Proof 1.</h2>
<p>	 	As $T$ is a linear transformation from $\R^n$ to itself, it suffices to show that $T$ is an injective linear transformation.</p>
<p>	 	Suppose that $T(\mathbf{x})=T(\mathbf{y})$ for $\mathbf{x}, \mathbf{y}\in \R^n$.<br />
	 	We show that $\mathbf{x}=\mathbf{y}$.</p>
<p>	 	We have<br />
	 	\begin{align*}<br />
		&#038;\|\mathbf{x}-\mathbf{y}\|^2\\<br />
		&#038;=(\mathbf{x}-\mathbf{y})^{\trans}(\mathbf{x}-\mathbf{y})\\<br />
		&#038;=(\mathbf{x}^{\trans}-\mathbf{y}^{\trans})(\mathbf{x}-\mathbf{y})\\<br />
		&#038;=\mathbf{x}^{\trans}\mathbf{x}-\mathbf{x}^{\trans}\mathbf{y}-\mathbf{y}^{\trans}\mathbf{x}+\mathbf{y}^{\trans}\mathbf{y}\\<br />
		&#038;=\langle\mathbf{x}, \mathbf{x} \rangle-\langle\mathbf{x}, \mathbf{y} \rangle-\langle\mathbf{y}, \mathbf{x} \rangle+\langle\mathbf{y}, \mathbf{y} \rangle\\<br />
		&#038;=\langle T(\mathbf{x}), T(\mathbf{x}) \rangle-\langle T(\mathbf{x}), T(\mathbf{y}) \rangle-\langle T(\mathbf{y}), T(\mathbf{x}) \rangle+\langle T(\mathbf{y}), T(\mathbf{y}) \rangle\\<br />
		&#038;\text{(since $T$ is an orthogonal transformation)}\\<br />
		&#038;=\langle T(\mathbf{x}), T(\mathbf{x}) \rangle-\langle T(\mathbf{x}), T(\mathbf{x}) \rangle-\langle T(\mathbf{x}), T(\mathbf{x}) \rangle+\langle T(\mathbf{x}), T(\mathbf{x}) \rangle\\<br />
		&#038;\text{(since $T(\mathbf{x})=T(\mathbf{y})$)}\\<br />
		&#038;=0.<br />
		\end{align*}<br />
		It follows that $\|\mathbf{x}-\mathbf{y}\|=0$ and hence $\mathbf{x}=\mathbf{y}$.<br />
		This proves that $T:\R^n\to \R^n$ is injective.</p>
<p>		As $T$ is an injective linear transformation from the $n$-dimensional vector space $\R^n$ to itself, it is also surjective, and thus $T$ is an isomorphism.</p>
<h2> Proof 2. </h2>
<p> Recall that the linear transformation  $T$ is injective if and only if the null space $\calN(T)=\{\mathbf{0}\}$, that is, $T(\mathbf{x})=\mathbf{0}$ implies that $\mathbf{x}=\mathbf{0}$.<br />
(See the post &#8220;<a href="//yutsumura.com/a-linear-transformation-is-injective-one-to-one-if-and-only-if-the-nullity-is-zero/" rel="noopener" target="_blank">A Linear Transformation is Injective (One-To-One) if and only if the Nullity is Zero</a>&#8221; for the proof of this fact.)</p>
<p>			 	We use this fact to show that $T$ is injective.<br />
			 	Suppose that $T(\mathbf{x})=\mathbf{0}$.<br />
			 	Then we have<br />
			 	\begin{align*}<br />
		\|\mathbf{x}\|^2&#038;=\langle \mathbf{x}, \mathbf{x}\rangle\\<br />
		&#038;=\langle T(\mathbf{x}), T(\mathbf{x})\rangle &#038;&#038;\text{as $T$ is orthogonal}\\<br />
		&#038;=\langle \mathbf{0}, \mathbf{0}\rangle=0 &#038;&#038;\text{as $T(\mathbf{x})=\mathbf{0}$}.<br />
		\end{align*}</p>
<p>			 It follows that the length $\|\mathbf{x}\|=0$, and hence $\mathbf{x}=\mathbf{0}$.<br />
			 This proves that the null space $\calN(T)=\{\mathbf{0}\}$ and $T$ is injective.</p>
<p>			 As $T$ is an injective linear transformation from $\R^n$ to itself, it is also surjective, and hence $T$ is an isomorphism.</p>
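<p>			 As a quick numerical illustration, here is a minimal sketch (assuming NumPy) using a rotation matrix, one concrete orthogonal transformation of $\R^2$: it preserves the dot product $\langle \mathbf{x}, \mathbf{y}\rangle=\mathbf{x}^{\trans}\mathbf{y}$ and its matrix has nonzero determinant, so it is an isomorphism.</p>
<pre><code>import numpy as np

# A rotation of R^2 is an orthogonal transformation: T^T T = I,
# so it preserves the dot product x^T y.
theta = 0.7
T = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

rng = np.random.default_rng(0)
x, y = rng.standard_normal(2), rng.standard_normal(2)

# T(x) dot T(y) equals x dot y.
print(np.isclose((T @ x) @ (T @ y), x @ y))   # True

# T is an isomorphism: its matrix is invertible (nonzero determinant).
print(not np.isclose(np.linalg.det(T), 0.0))  # True
</code></pre>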
<button class="simplefavorite-button has-count" data-postid="5163" data-siteid="1" data-groupid="1" data-favoritecount="32" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">32</span></button><p>The post <a href="https://yutsumura.com/an-orthogonal-transformation-from-rn-to-rn-is-an-isomorphism/" target="_blank">An Orthogonal Transformation from $\R^n$ to $\R^n$ is an Isomorphism</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/an-orthogonal-transformation-from-rn-to-rn-is-an-isomorphism/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">5163</post-id>	</item>
		<item>
		<title>The Sum of Cosine Squared in an Inner Product Space</title>
		<link>https://yutsumura.com/the-sum-of-cosine-squared-in-an-inner-product-space/</link>
				<comments>https://yutsumura.com/the-sum-of-cosine-squared-in-an-inner-product-space/#respond</comments>
				<pubDate>Wed, 30 Aug 2017 03:50:11 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[cosine]]></category>
		<category><![CDATA[inner product]]></category>
		<category><![CDATA[inner product space]]></category>
		<category><![CDATA[length of a vector]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[vector space]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=4770</guid>
				<description><![CDATA[<p>Let $\mathbf{v}$ be a vector in an inner product space $V$ over $\R$. Suppose that $\{\mathbf{u}_1, \dots, \mathbf{u}_n\}$ is an orthonormal basis of $V$. Let $\theta_i$ be the angle between $\mathbf{v}$ and $\mathbf{u}_i$ for&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/the-sum-of-cosine-squared-in-an-inner-product-space/" target="_blank">The Sum of Cosine Squared in an Inner Product Space</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 551</h2>
<p>	Let $\mathbf{v}$ be a nonzero vector in an inner product space $V$ over $\R$.<br />
	Suppose that $\{\mathbf{u}_1, \dots, \mathbf{u}_n\}$ is an orthonormal basis of $V$.<br />
	Let $\theta_i$ be the angle between $\mathbf{v}$ and $\mathbf{u}_i$ for $i=1,\dots, n$.</p>
<p>	Prove that<br />
	\[\cos ^2\theta_1+\cdots+\cos^2 \theta_n=1.\]
<h2>Definition (Angle between Vectors).</h2>
<p>Let $\langle\mathbf{a}, \mathbf{b}\rangle$ denote the inner product of vectors $\mathbf{a}$ and $\mathbf{b}$ in $V$.</p>
<p>		Recall that the angle $\theta$ between $\mathbf{a}$ and $\mathbf{b}$ is defined as the unique number $\theta$ between $0$ and $\pi$ satisfying<br />
		\[\cos \theta=\frac{\langle\mathbf{a}, \mathbf{b}\rangle}{\|\mathbf{a}\| \|\mathbf{b}\|}.\]
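<p>		For example, with the standard dot product on $\R^2$, the angle between $(1, 0)^{\trans}$ and $(1, 1)^{\trans}$ is $\pi/4$. A minimal numerical sketch of this definition (assuming NumPy):</p>
<pre><code>import numpy as np

def angle(a, b):
    """Angle between nonzero vectors a and b for the standard dot product."""
    cos_theta = (a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))
    # Clip to guard against tiny floating-point overshoots outside [-1, 1].
    return np.arccos(np.clip(cos_theta, -1.0, 1.0))

a = np.array([1.0, 0.0])
b = np.array([1.0, 1.0])
print(np.isclose(angle(a, b), np.pi / 4))  # True
</code></pre>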
<h2> Proof. </h2>
<p>		Express the vector $\mathbf{v}$ as a linear combination of the basis vectors as<br />
		\[\mathbf{v}=a_1\mathbf{u}_1+\dots+a_n\mathbf{u}_n\]
		for some real numbers $a_1, \dots, a_n$.</p>
<p>		Since $\{\mathbf{u}_1, \dots, \mathbf{u}_n\}$ is an orthonormal basis, the length of the vector $\mathbf{v}$ is given by<br />
		\[\|\mathbf{v}\|=\sqrt{a_1^2+\cdots+a_n^2}. \tag{*}\]
<hr />
<p>		For each $i$, using the properties of the inner product, we have<br />
		\begin{align*}<br />
		\langle \mathbf{v}, \mathbf{u}_i\rangle&#038;=\langle a_1\mathbf{u}_1+\dots+a_n\mathbf{u}_n, \mathbf{u}_i\rangle\\<br />
		&#038;=a_1\langle\mathbf{u}_1, \mathbf{u}_i\rangle+\cdots +a_n \langle\mathbf{u}_n, \mathbf{u}_i \rangle\\<br />
		&#038;=a_i \tag{**}<br />
		\end{align*}<br />
		since $\langle\mathbf{u}_i, \mathbf{u}_i\rangle=1$ and $\langle\mathbf{u}_j, \mathbf{u}_i\rangle=0$ if $j\neq i$ as $\{\mathbf{u}_1, \dots, \mathbf{u}_n\}$ is orthonormal.</p>
<hr />
<p>		By definition of the angle, we have<br />
		\begin{align*}<br />
		\cos \theta_i&#038;=\frac{\langle\mathbf{v}, \mathbf{u}_i\rangle}{\|\mathbf{v}\| \|\mathbf{u}_i\|}=\frac{\langle\mathbf{v}, \mathbf{u}_i\rangle}{\|\mathbf{v}\| } &#038;&#038; \text{since $\|\mathbf{u}_i\|=1$.}<br />
		\end{align*}<br />
		It follows that<br />
		\begin{align*}<br />
		\cos ^2\theta_1+\cdots+\cos^2 \theta_n &#038;=\frac{\langle\mathbf{v}, \mathbf{u}_1\rangle^2}{\|\mathbf{v}\|^2 }+\cdots+\frac{\langle\mathbf{v}, \mathbf{u}_n\rangle^2}{\|\mathbf{v}\|^2 }\\[6pt]
		&#038;=\frac{1}{\|\mathbf{v}\|^2}(a_1^2+\cdots+a_n^2) &#038;&#038;\text{by (**)}\\[6pt]
		&#038;=\frac{1}{\|\mathbf{v}\|^2}\cdot \|\mathbf{v}\|^2 &#038;&#038;\text{by (*)}\\[6pt]
		&#038;=1.<br />
		\end{align*}</p>
<p>		Thus we obtain<br />
		\[\cos ^2\theta_1+\cdots+\cos^2 \theta_n=1\]
		as required.</p>
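<p>		A quick numerical check of this identity is sketched below (assuming NumPy; the orthonormal basis comes from a QR factorization of a random matrix, which is just one convenient way to produce one).</p>
<pre><code>import numpy as np

rng = np.random.default_rng(1)
n = 4
v = rng.standard_normal(n)

# Columns of Q from a QR factorization form an orthonormal basis of R^n.
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
basis = Q.T  # rows are the orthonormal basis vectors u_1, ..., u_n

cos_sq = [((v @ u) / (np.linalg.norm(v) * np.linalg.norm(u)))**2 for u in basis]
print(np.isclose(sum(cos_sq), 1.0))  # True
</code></pre>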
<button class="simplefavorite-button has-count" data-postid="4770" data-siteid="1" data-groupid="1" data-favoritecount="26" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">26</span></button><p>The post <a href="https://yutsumura.com/the-sum-of-cosine-squared-in-an-inner-product-space/" target="_blank">The Sum of Cosine Squared in an Inner Product Space</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/the-sum-of-cosine-squared-in-an-inner-product-space/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">4770</post-id>	</item>
		<item>
		<title>The Inner Product on $\R^2$ induced by a Positive Definite Matrix and Gram-Schmidt Orthogonalization</title>
		<link>https://yutsumura.com/the-inner-product-on-r2-induced-by-a-positive-definite-matrix-and-gram-schmidt-orthogonalization/</link>
				<comments>https://yutsumura.com/the-inner-product-on-r2-induced-by-a-positive-definite-matrix-and-gram-schmidt-orthogonalization/#comments</comments>
				<pubDate>Wed, 16 Aug 2017 21:55:59 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[basis of a vector space]]></category>
		<category><![CDATA[Gram-Schmidt orthogonalization process]]></category>
		<category><![CDATA[Gram-Schmidt process]]></category>
		<category><![CDATA[inner product]]></category>
		<category><![CDATA[inner product space]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[orthogonal basis]]></category>
		<category><![CDATA[positive definite matrix]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=4653</guid>
				<description><![CDATA[<p>Consider the $2\times 2$ real matrix \[A=\begin{bmatrix} 1 &#038; 1\\ 1&#038; 3 \end{bmatrix}.\] (a) Prove that the matrix $A$ is positive definite. (b) Since $A$ is positive definite by part (a), the formula \[\langle&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/the-inner-product-on-r2-induced-by-a-positive-definite-matrix-and-gram-schmidt-orthogonalization/" target="_blank">The Inner Product on $\R^2$ induced by a Positive Definite Matrix and Gram-Schmidt Orthogonalization</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 539</h2>
<p>		   Consider the $2\times 2$ real matrix<br />
		   \[A=\begin{bmatrix}<br />
		  1 &#038; 1\\<br />
		  1&#038; 3<br />
		\end{bmatrix}.\]
<p><strong>(a)</strong> Prove that the matrix $A$ is positive definite.</p>
<p><strong>(b)</strong> Since $A$ is positive definite by part (a), the formula<br />
		\[\langle \mathbf{x}, \mathbf{y}\rangle:=\mathbf{x}^{\trans} A \mathbf{y}\]
		for $\mathbf{x}, \mathbf{y} \in \R^2$ defines an inner product on $\R^2$.<br />
		Consider $\R^2$ as an inner product space with this inner product.</p>
<p>		Prove that the unit vectors<br />
		\[\mathbf{e}_1=\begin{bmatrix}<br />
		  1 \\<br />
		  0<br />
		\end{bmatrix} \text{ and } \mathbf{e}_2=\begin{bmatrix}<br />
		  0 \\<br />
		  1<br />
		\end{bmatrix}\]
		are not orthogonal in the inner product space $\R^2$.</p>
<p><strong>(c)</strong> Find an orthogonal basis $\{\mathbf{v}_1, \mathbf{v}_2\}$ of $\R^2$ from the basis $\{\mathbf{e}_1, \mathbf{e}_2\}$ using the Gram-Schmidt orthogonalization process.</p>
<h2> Proof. </h2>
<h3>(a) Prove that the matrix $A$ is positive definite.</h3>
<p> We prove that for every nonzero vector $\mathbf{x}=\begin{bmatrix}<br />
		  x \\<br />
		  y<br />
		\end{bmatrix}\in \R^2$, we have $\mathbf{x}^{\trans} A \mathbf{x} > 0$.<br />
		We have<br />
		\begin{align*}<br />
		\mathbf{x}^{\trans} A \mathbf{x}&#038;=\begin{bmatrix}<br />
		  x &#038; y<br />
		\end{bmatrix} \begin{bmatrix}<br />
		  1 &#038; 1\\<br />
		  1&#038; 3<br />
		\end{bmatrix}\begin{bmatrix}<br />
		  x \\<br />
		  y<br />
		\end{bmatrix}=\begin{bmatrix}<br />
		  x &#038; y<br />
		\end{bmatrix}\begin{bmatrix}<br />
		  x+y \\<br />
		  x+3y<br />
		\end{bmatrix}\\[6pt]
		&#038;=x(x+y)+y(x+3y)=x^2+2xy+3y^2\\<br />
		&#038;=x^2+2xy+y^2+2y^2=(x+y)^2+2y^2.<br />
		\end{align*}</p>
<p>		Since $\mathbf{x}\neq \mathbf{0}$, at least one of $x, y$ is nonzero.<br />
		If $y\neq 0$, then $2y^2 > 0$, and if $y=0$, then $x\neq 0$ and $(x+y)^2=x^2 > 0$.<br />
		Thus the last expression $(x+y)^2+2y^2$ is always positive.<br />
		Hence $A$ is a positive definite matrix.</p>
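<p>		A quick numerical check of part (a), as a minimal sketch assuming NumPy: a symmetric matrix is positive definite exactly when all of its eigenvalues are positive, and the identity $\mathbf{x}^{\trans} A \mathbf{x}=(x+y)^2+2y^2$ can be spot-checked on random vectors.</p>
<pre><code>import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 3.0]])

# A symmetric matrix is positive definite iff all eigenvalues are positive.
print(np.all(np.linalg.eigvalsh(A) > 0))  # True

# Spot-check x^T A x = (x + y)^2 + 2 y^2 on random vectors.
rng = np.random.default_rng(2)
for _ in range(5):
    x, y = rng.standard_normal(2)
    v = np.array([x, y])
    print(np.isclose(v @ A @ v, (x + y)**2 + 2 * y**2))  # True
</code></pre>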
<h3>(b) Prove that $\mathbf{e}_1, \mathbf{e}_2$ are not orthogonal in the inner product space $\R^2$.</h3>
<p> Note that by the post &#8220;<a href="//yutsumura.com/a-symmetric-positive-definite-matrix-and-an-inner-product-on-a-vector-space/" target="_blank">A Symmetric Positive Definite Matrix and An Inner Product on a Vector Space</a>&#8221;, the formula $\langle \mathbf{x}, \mathbf{y}\rangle=\mathbf{x}^{\trans} A \mathbf{y}$ defines an inner product on $\R^2$ since $A$ is positive definite by part (a).</p>
<p>		Two vectors $\mathbf{x}$ and $\mathbf{y}$ are said to be <strong>orthogonal</strong> if $\langle \mathbf{x}, \mathbf{y}\rangle=0$.</p>
<p>		The vectors $\mathbf{e}_1, \mathbf{e}_2$ are not orthogonal with this inner product since<br />
		\begin{align*}<br />
		\langle \mathbf{e}_1, \mathbf{e}_2\rangle=\begin{bmatrix}<br />
		  1 &#038; 0<br />
		\end{bmatrix}\begin{bmatrix}<br />
		  1 &#038; 1\\<br />
		  1&#038; 3<br />
		\end{bmatrix}\begin{bmatrix}<br />
		  0 \\<br />
		  1<br />
		\end{bmatrix}=\begin{bmatrix}<br />
		  1 &#038; 0<br />
		\end{bmatrix}\begin{bmatrix}<br />
		  1 \\<br />
		  3<br />
		\end{bmatrix}=1\neq 0.<br />
		\end{align*}</p>
<h3>(c) Find an orthogonal basis using the Gram-Schmidt orthogonalization process.</h3>
<p>By the Gram-Schmidt orthogonalization process, we have<br />
		\begin{align*}<br />
		\mathbf{v}_1&#038;=\mathbf{e}_1\\<br />
		\mathbf{v}_2&#038;=\mathbf{e}_2-\frac{\langle \mathbf{v}_1, \mathbf{e}_2 \rangle}{\langle \mathbf{v}_1, \mathbf{v}_1 \rangle}\mathbf{v}_1<br />
		=\mathbf{e}_2-\frac{\langle \mathbf{e}_1, \mathbf{e}_2 \rangle}{\langle \mathbf{e}_1, \mathbf{e}_1 \rangle}\mathbf{e}_1.<br />
		\end{align*}</p>
<p>		We compute<br />
		\begin{align*}<br />
		\langle \mathbf{e}_1, \mathbf{e}_1 \rangle=\begin{bmatrix}<br />
		  1 &#038; 0<br />
		\end{bmatrix}\begin{bmatrix}<br />
		  1 &#038; 1\\<br />
		  1&#038; 3<br />
		\end{bmatrix}\begin{bmatrix}<br />
		  1 \\<br />
		  0<br />
		\end{bmatrix}=\begin{bmatrix}<br />
		  1 &#038; 0<br />
		\end{bmatrix}\begin{bmatrix}<br />
		  1 \\<br />
		  1<br />
		\end{bmatrix}=1.<br />
		\end{align*}<br />
		We also have $\langle \mathbf{e}_1, \mathbf{e}_2\rangle=1$ from part (b).<br />
		Thus, we have<br />
		\begin{align*}<br />
		\mathbf{v}_2=\mathbf{e}_2-\mathbf{e}_1=\begin{bmatrix}<br />
		  -1 \\<br />
		  1<br />
		\end{bmatrix}.<br />
		\end{align*}<br />
		Thus, the Gram-Schmidt orthogonalization process yields the orthogonal basis<br />
		\[\mathbf{v}_1=\begin{bmatrix}<br />
		  1 \\<br />
		  0<br />
		\end{bmatrix}, \mathbf{v}_2=\begin{bmatrix}<br />
		  -1 \\<br />
		  1<br />
		\end{bmatrix}.\]
<h4>Double Check</h4>
<p>		Let us verify that $\mathbf{v}_1, \mathbf{v}_2$ are orthogonal by computing their inner product directly as follows.<br />
		We have<br />
		\begin{align*}<br />
		\langle \mathbf{v}_1, \mathbf{v}_2\rangle=\mathbf{v}_1^{\trans} A\mathbf{v}_2=\begin{bmatrix}<br />
		  1 &#038; 0<br />
		\end{bmatrix}\begin{bmatrix}<br />
		  1 &#038; 1\\<br />
		  1&#038; 3<br />
		\end{bmatrix}\begin{bmatrix}<br />
		  -1 \\<br />
		  1<br />
		\end{bmatrix}=\begin{bmatrix}<br />
		  1 &#038; 0<br />
		\end{bmatrix}\begin{bmatrix}<br />
		  0 \\<br />
		  2<br />
		\end{bmatrix}=0.<br />
		\end{align*}</p>
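<p>		The same computation can also be reproduced numerically. Below is a minimal sketch (assuming NumPy) of the Gram-Schmidt process carried out with respect to the inner product $\langle \mathbf{x}, \mathbf{y}\rangle=\mathbf{x}^{\trans} A \mathbf{y}$.</p>
<pre><code>import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 3.0]])

def inner(x, y):
    """Inner product on R^2 induced by the positive definite matrix A."""
    return x @ A @ y

def gram_schmidt(vectors):
    """Orthogonalize the given vectors with respect to inner()."""
    ortho = []
    for v in vectors:
        w = v.copy()
        for u in ortho:
            w = w - (inner(u, v) / inner(u, u)) * u
        ortho.append(w)
    return ortho

e1 = np.array([1.0, 0.0])
e2 = np.array([0.0, 1.0])
v1, v2 = gram_schmidt([e1, e2])

print(v1, v2)                          # [1. 0.] [-1.  1.]
print(np.isclose(inner(v1, v2), 0.0))  # True
</code></pre>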
<button class="simplefavorite-button has-count" data-postid="4653" data-siteid="1" data-groupid="1" data-favoritecount="21" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">21</span></button><p>The post <a href="https://yutsumura.com/the-inner-product-on-r2-induced-by-a-positive-definite-matrix-and-gram-schmidt-orthogonalization/" target="_blank">The Inner Product on $\R^2$ induced by a Positive Definite Matrix and Gram-Schmidt Orthogonalization</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/the-inner-product-on-r2-induced-by-a-positive-definite-matrix-and-gram-schmidt-orthogonalization/feed/</wfw:commentRss>
		<slash:comments>1</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">4653</post-id>	</item>
	</channel>
</rss>
