<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	
	xmlns:georss="http://www.georss.org/georss"
	xmlns:geo="http://www.w3.org/2003/01/geo/wgs84_pos#"
	>

<channel>
	<title>orthogonal vectors &#8211; Problems in Mathematics</title>
	<atom:link href="https://yutsumura.com/tag/orthogonal-vectors/feed/" rel="self" type="application/rss+xml" />
	<link>https://yutsumura.com</link>
	<description></description>
	<lastBuildDate>Thu, 08 Feb 2018 05:52:55 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=5.3.4</generator>

<image>
	<url>https://i2.wp.com/yutsumura.com/wp-content/uploads/2016/12/cropped-question-logo.jpg?fit=32%2C32&#038;ssl=1</url>
	<title>orthogonal vectors &#8211; Problems in Mathematics</title>
	<link>https://yutsumura.com</link>
	<width>32</width>
	<height>32</height>
</image> 
<site xmlns="com-wordpress:feed-additions:1">114989322</site>	<item>
		<title>Dot Product, Lengths, and Distances of Complex Vectors</title>
		<link>https://yutsumura.com/dot-product-lengths-and-distances-of-complex-vectors/</link>
				<comments>https://yutsumura.com/dot-product-lengths-and-distances-of-complex-vectors/#respond</comments>
				<pubDate>Thu, 08 Feb 2018 05:52:55 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[complex vector]]></category>
		<category><![CDATA[distance]]></category>
		<category><![CDATA[dot product]]></category>
		<category><![CDATA[imaginary number]]></category>
		<category><![CDATA[length of a vector]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[orthogonal vectors]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=6836</guid>
				<description><![CDATA[<p>For this problem, use the complex vectors \[ \mathbf{w}_1 = \begin{bmatrix} 1 + i \\ 1 - i \\ 0 \end{bmatrix} , \, \mathbf{w}_2 = \begin{bmatrix} -i \\ 0 \\ 2 - i \end{bmatrix}&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/dot-product-lengths-and-distances-of-complex-vectors/" target="_blank">Dot Product, Lengths, and Distances of Complex Vectors</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 689</h2>
<p>For this problem, use the complex vectors<br />
\[ \mathbf{w}_1 = \begin{bmatrix} 1 + i \\ 1 - i \\ 0 \end{bmatrix} , \, \mathbf{w}_2 = \begin{bmatrix} -i \\ 0 \\ 2 - i \end{bmatrix} , \, \mathbf{w}_3 = \begin{bmatrix} 2+i \\ 1 - 3i \\ 2i \end{bmatrix} . \]
<p>Suppose $\mathbf{w}_4$ is another complex vector which is orthogonal to both $\mathbf{w}_2$ and $\mathbf{w}_3$, and satisfies $\mathbf{w}_1 \cdot \mathbf{w}_4 = 2i$ and $\| \mathbf{w}_4 \| = 3$.</p>
<p>Calculate the following expressions:</p>
<p><strong>(a)</strong> $ \mathbf{w}_1 \cdot \mathbf{w}_2 $. </p>
<p><strong>(b)</strong> $ \mathbf{w}_1 \cdot \mathbf{w}_3 $. </p>
<p><strong>(c)</strong> $((2+i)\mathbf{w}_1 - (1+i)\mathbf{w}_2 ) \cdot \mathbf{w}_4$.</p>
<p><strong>(d)</strong> $\| \mathbf{w}_1 \| , \| \mathbf{w}_2 \|$, and $\| \mathbf{w}_3 \|$.</p>
<p><strong>(e)</strong> $\| 3 \mathbf{w}_4 \|$.</p>
<p><strong>(f)</strong> What is the distance between $\mathbf{w}_2$ and $\mathbf{w}_3$?</p>
<p>&nbsp;<br />
<span id="more-6836"></span><br />

<h2>Solution.</h2>
<h3>(a) $ \mathbf{w}_1 \cdot \mathbf{w}_2 $. </h3>
<p>\[ \mathbf{w}_1 \cdot \mathbf{w}_2 = \begin{bmatrix} 1+i &#038; 1-i &#038; 0 \end{bmatrix} \begin{bmatrix} -i \\ 0 \\ 2-i \end{bmatrix} = (1+i)(-i) + 0 + 0 = 1 - i . \]
<h3>(b) $ \mathbf{w}_1 \cdot \mathbf{w}_3 $. </h3>
<p>\begin{align*} \mathbf{w}_1 \cdot \mathbf{w}_3 &#038;= \begin{bmatrix} 1+i &#038; 1-i &#038; 0 \end{bmatrix} \begin{bmatrix} 2+i \\ 1-3i \\ 2i \end{bmatrix} \\ &#038;= (1+i)(2+i) + (1-i)(1-3i) + 0 \\ &#038;= (1 + 3i) + (-2 - 4i) \\ &#038;= -1 - i . \end{align*}</p>
<h3>(c) $((2+i)\mathbf{w}_1 - (1+i)\mathbf{w}_2 ) \cdot \mathbf{w}_4$.</h3>
<p>\begin{align*} ((2+i)\mathbf{w}_1 - (1+i)\mathbf{w}_2 ) \cdot \mathbf{w}_4 &#038;= (2+i)( \mathbf{w}_1 \cdot \mathbf{w}_4) - (1+i) ( \mathbf{w}_2 \cdot \mathbf{w}_4 ) \\<br />
&#038;= (2+i) ( 2i ) - (1+i)(0) \\<br />
&#038;= -2 + 4i . \end{align*}</p>
<p>Note that $\mathbf{w}_2 \cdot \mathbf{w}_4=0$ because these vectors are orthogonal.</p>
<h3>(d) $\| \mathbf{w}_1 \| , \| \mathbf{w}_2 \|$, and $\| \mathbf{w}_3 \|$.</h3>
<p>For an arbitrary complex vector $\mathbf{v}$, its length is defined to be<br />
\[ \| \mathbf{v} \| = \sqrt{ \overline{\mathbf{v}}^\trans \mathbf{v} } . \]
<p>Thus,<br />
\[ \| \mathbf{w}_1 \| \, = \, \sqrt{ (1-i)(1+i) + (1+i)(1-i) + 0 } = \sqrt{ 2 + 2} = \sqrt{4} = 2 , \]
\[ \| \mathbf{w}_2 \| \, = \, \sqrt{ (i)(-i) + 0 + (2+i)(2-i) } = \sqrt{1 + 5} = \sqrt{6} , \]
\[ \| \mathbf{w}_3 \| \, = \, \sqrt{ (2-i)(2+i) + (1+3i)(1-3i) + (-2i)(2i) } = \sqrt{ 5 + 10 + 4} = \sqrt{19} . \]
<h3>(e) $\| 3 \mathbf{w}_4 \|$.</h3>
<p>$ \| 3 \mathbf{w}_4 \| = 3 \| \mathbf{w}_4 \| = 3 \cdot 3 = 9$.</p>
<h3>(f) What is the distance between $\mathbf{w}_2$ and $\mathbf{w}_3$?</h3>
<p>The distance between these vectors is given by $\| \mathbf{w}_2 - \mathbf{w}_3 \|$.  First we calculate this difference:<br />
\[ \mathbf{w}_2 - \mathbf{w}_3 \, = \, \begin{bmatrix} -i \\ 0 \\ 2 - i \end{bmatrix} - \begin{bmatrix} 2+i \\ 1 - 3i \\ 2i \end{bmatrix} \, = \, \begin{bmatrix} -2 - 2i \\ -1 + 3i \\ 2 - 3i \end{bmatrix} . \]
<p>Then the length of this difference vector is<br />
	\begin{align*}<br />
	\| \mathbf{w}_2 - \mathbf{w}_3 \| &#038;= \sqrt{ \left( \overline{ \mathbf{w}_2 - \mathbf{w}_3 } \right)^{\trans} \left(  \mathbf{w}_2 - \mathbf{w}_3 \right) } \\[6pt]
	&#038;= \sqrt{ \begin{bmatrix} -2 + 2i &#038; -1 - 3i &#038; 2 + 3i \end{bmatrix} \begin{bmatrix} -2 - 2i \\ -1 + 3i \\ 2 - 3i \end{bmatrix} } \\[6pt]
	&#038;= \sqrt{ (-2+2i)(-2-2i) + (-1-3i)(-1+3i) + (2+3i)(2-3i) } \\[6pt]
	&#038;= \sqrt{ 8 + 10 + 13 } \\[6pt]
	&#038;= \sqrt{31} . \end{align*}</p>
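<p>The computations above can be double-checked numerically. The following is a minimal NumPy sketch of this verification (an addition, not part of the original solution); note that parts (a) and (b) use the non-conjugated product, while the lengths in (d) and (f) conjugate the first factor, as <code>np.linalg.norm</code> does.</p>

```python
import numpy as np

# The three given complex vectors
w1 = np.array([1 + 1j, 1 - 1j, 0])
w2 = np.array([-1j, 0, 2 - 1j])
w3 = np.array([2 + 1j, 1 - 3j, 2j])

# (a), (b): the solution uses the bilinear (non-conjugated) product u^T v
print(w1 @ w2)  # (1-1j)
print(w1 @ w3)  # (-1-1j)

# (d): lengths conjugate the first factor, ||v|| = sqrt(conj(v)^T v),
# which is exactly what np.linalg.norm computes for complex vectors
print(np.linalg.norm(w1))  # 2.0
print(np.linalg.norm(w2))  # sqrt(6), approximately 2.449
print(np.linalg.norm(w3))  # sqrt(19), approximately 4.359

# (f): the distance between w2 and w3
print(np.linalg.norm(w2 - w3))  # sqrt(31), approximately 5.568
```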
<button class="simplefavorite-button has-count" data-postid="6836" data-siteid="1" data-groupid="1" data-favoritecount="8" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">8</span></button><p>The post <a href="https://yutsumura.com/dot-product-lengths-and-distances-of-complex-vectors/" target="_blank">Dot Product, Lengths, and Distances of Complex Vectors</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/dot-product-lengths-and-distances-of-complex-vectors/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">6836</post-id>	</item>
		<item>
		<title>Eigenvalues and Eigenvectors of The Cross Product Linear Transformation</title>
		<link>https://yutsumura.com/eigenvalues-and-eigenvectors-of-the-cross-product-linear-transformation/</link>
				<comments>https://yutsumura.com/eigenvalues-and-eigenvectors-of-the-cross-product-linear-transformation/#respond</comments>
				<pubDate>Fri, 27 Oct 2017 02:42:23 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[cross product]]></category>
		<category><![CDATA[dot product]]></category>
		<category><![CDATA[eigenvalue]]></category>
		<category><![CDATA[eigenvector]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[linear transformation]]></category>
		<category><![CDATA[matrix representation]]></category>
		<category><![CDATA[orthogonal vectors]]></category>
		<category><![CDATA[skew-symmetric matrix]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=5181</guid>
				<description><![CDATA[<p>We fix a nonzero vector $\mathbf{a}$ in $\R^3$ and define a map $T:\R^3\to \R^3$ by \[T(\mathbf{v})=\mathbf{a}\times \mathbf{v}\] for all $\mathbf{v}\in \R^3$. Here the right-hand side is the cross product of $\mathbf{a}$ and $\mathbf{v}$. (a)&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/eigenvalues-and-eigenvectors-of-the-cross-product-linear-transformation/" target="_blank">Eigenvalues and Eigenvectors of The Cross Product Linear Transformation</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 593</h2>
<p>	We fix a nonzero vector $\mathbf{a}$ in $\R^3$ and define a map $T:\R^3\to \R^3$ by<br />
	\[T(\mathbf{v})=\mathbf{a}\times \mathbf{v}\]
	for all $\mathbf{v}\in \R^3$.<br />
	Here the right-hand side is the cross product of $\mathbf{a}$ and $\mathbf{v}$.</p>
<p><strong>(a)</strong> Prove that $T:\R^3\to \R^3$ is a linear transformation.</p>
<p><strong>(b)</strong> Determine the eigenvalues and eigenvectors of $T$.</p>
<p>&nbsp;<br />
<span id="more-5181"></span><br />

<h2> Proof. </h2>
<h3>(a) Prove that $T:\R^3\to \R^3$ is a linear transformation.</h3>
<p>To prove that $T$ is a linear transformation, we need to verify the following equalities for all $\mathbf{u}, \mathbf{v}\in \R^3$ and $c\in \R$:</p>
<ol>
<li> $T(\mathbf{u}+\mathbf{v})=T(\mathbf{u})+T(\mathbf{v})$</li>
<li>$T(c\mathbf{v})=cT(\mathbf{v})$.</li>
</ol>
<p>		Both equalities follow from basic properties of the cross product.<br />
		We have<br />
		\begin{align*}<br />
		T(\mathbf{u}+\mathbf{v})&#038;=\mathbf{a}\times (\mathbf{u}+\mathbf{v})\\<br />
		&#038;=\mathbf{a}\times \mathbf{u}+\mathbf{a}\times \mathbf{v} &#038;&#038;\text{the cross product is distributive}\\<br />
		&#038;=T(\mathbf{u})+T(\mathbf{v}).<br />
		\end{align*}<br />
		As the cross product is compatible with scalar multiplication, we also have<br />
		\begin{align*}<br />
		T(c\mathbf{v})&#038;=\mathbf{a}\times (c\mathbf{v})=c(\mathbf{a}\times \mathbf{v}) =cT(\mathbf{v}).<br />
		\end{align*}<br />
		Therefore $T$ is a linear transformation.</p>
<h3>(b) Determine the eigenvalues and eigenvectors of $T$.</h3>
<p>Let $\lambda$ be an eigenvalue of the linear transformation $T$ and let $\mathbf{x}$ be an eigenvector corresponding to $\lambda$.<br />
		Then we have<br />
		\[T(\mathbf{x})=\mathbf{a}\times \mathbf{x}=\lambda \mathbf{x}.\]
		We take the dot product and obtain<br />
		\begin{align*}<br />
		(\mathbf{a}\times \mathbf{x})\cdot \mathbf{x}=\lambda \mathbf{x}\cdot \mathbf{x}=\lambda \|\mathbf{x}\|^2.<br />
		\end{align*}<br />
		Note that the vectors $\mathbf{a}\times \mathbf{x}$ and $\mathbf{x}$ are orthogonal, hence the left-hand side of the above equality is zero.<br />
		On the other hand, since $\mathbf{x}$ is an eigenvector, it is a nonzero vector. Hence the length $\|\mathbf{x}\|\neq 0$.<br />
		It follows that we have $0=\lambda \|\mathbf{x}\|^2$ with $\|\mathbf{x}\|\neq 0$, and thus $\lambda=0$.<br />
		Hence the only eigenvalue of $T$ is $\lambda=0$.</p>
<p>		This yields that we have $\mathbf{a}\times \mathbf{x}=0\cdot \mathbf{x}=\mathbf{0}$ for any eigenvector $\mathbf{x}$.<br />
		Since $\mathbf{a}$ is nonzero, the vanishing of this cross product indicates that the vectors $\mathbf{a}$ and $\mathbf{x}$ are parallel.<br />
		Hence $\mathbf{x}=r\mathbf{a}$ for a nonzero $r\in \R$.</p>
<p>		In summary, the eigenvalue of $T$ is $0$ and corresponding eigenvectors are $r\mathbf{a}$ for any nonzero real number $r\in \R$.</p>
<h2>Comment.</h2>
<p>				Let $\{\mathbf{e}_1, \mathbf{e}_2, \mathbf{e}_3\}$ be the standard basis for $\R^3$ and write $\mathbf{a}=\begin{bmatrix}<br />
		  a_1 \\<br />
		   a_2 \\<br />
		    a_3<br />
		  \end{bmatrix}$.<br />
		  Then we have<br />
		  \begin{align*}<br />
		T(\mathbf{e}_1)&#038;=\begin{bmatrix}<br />
		  a_1 \\<br />
		   a_2 \\<br />
		    a_3<br />
		  \end{bmatrix}\times \begin{bmatrix}<br />
		  1 \\<br />
		   0 \\<br />
		    0<br />
		  \end{bmatrix}<br />
		  =\begin{bmatrix}<br />
		  0 \\<br />
		   a_3 \\<br />
		    -a_2<br />
		  \end{bmatrix}\\[6pt]
		  T(\mathbf{e}_2)&#038;=\begin{bmatrix}<br />
		  a_1 \\<br />
		   a_2 \\<br />
		    a_3<br />
		  \end{bmatrix}\times \begin{bmatrix}<br />
		  0 \\<br />
		   1 \\<br />
		    0<br />
		  \end{bmatrix}<br />
		  =\begin{bmatrix}<br />
		  -a_3 \\<br />
		   0 \\<br />
		    a_1<br />
		  \end{bmatrix}\\[6pt]
		  T(\mathbf{e}_3)&#038;=\begin{bmatrix}<br />
		  a_1 \\<br />
		   a_2 \\<br />
		    a_3<br />
		  \end{bmatrix}\times \begin{bmatrix}<br />
		  0 \\<br />
		   0 \\<br />
		    1<br />
		  \end{bmatrix}<br />
		  =\begin{bmatrix}<br />
		  a_2 \\<br />
		   -a_1 \\<br />
		    0<br />
		  \end{bmatrix}.<br />
		\end{align*}<br />
		It follows that the matrix representation of the linear transformation $T$ with respect to the standard basis is given by<br />
		\[A=\begin{bmatrix}<br />
		  0 &#038; -a_3 &#038; a_2 \\<br />
		   a_3 &#038;0 &#038;-a_1 \\<br />
		   -a_2 &#038; a_1 &#038; 0<br />
		\end{bmatrix}.\]
		Note that the matrix $A$ is a real skew-symmetric matrix.<br />
		We know that each eigenvalue of a skew-symmetric matrix is either $0$ or purely imaginary.<br />
(See the post &#8220;<a href="//yutsumura.com/eigenvalues-of-real-skew-symmetric-matrix-are-zero-or-purely-imaginary-and-the-rank-is-even/" rel="noopener" target="_blank">Eigenvalues of Real Skew-Symmetric Matrix are Zero or Purely Imaginary and the Rank is Even</a>&#8220;.)<br />
		Also, if the size of a skew-symmetric matrix is odd, then it has $0$ as an eigenvalue.<br />
		(See the post &#8220;<a href="//yutsumura.com/the-determinant-of-a-skew-symmetric-matrix-is-zero/" rel="noopener" target="_blank">The Determinant of a Skew-Symmetric Matrix is Zero</a>&#8220;.)</p>
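<p>The matrix representation makes these eigenvalue claims easy to verify numerically. Here is a small NumPy sketch (an illustration with a sample vector, not part of the original post), taking $\mathbf{a}=(1,2,3)$.</p>

```python
import numpy as np

# Sample nonzero vector a = (1, 2, 3), chosen only for illustration
a = np.array([1.0, 2.0, 3.0])
a1, a2, a3 = a

# Matrix representation of T(v) = a x v in the standard basis
A = np.array([[0.0, -a3,  a2],
              [ a3, 0.0, -a1],
              [-a2,  a1, 0.0]])

# A v agrees with the cross product a x v for any v
v = np.array([4.0, -1.0, 2.0])
print(np.allclose(A @ v, np.cross(a, v)))  # True

# a itself is in the kernel: A a = a x a = 0, so the eigenvectors r*a
# correspond to the eigenvalue 0
print(np.allclose(A @ a, 0))  # True

# Over C, the eigenvalues of this real skew-symmetric matrix are
# 0 and +/- i*||a||, i.e. zero or purely imaginary
vals = np.linalg.eigvals(A)
print(np.sort(np.abs(vals.imag)))  # approximately [0, sqrt(14), sqrt(14)]
```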
<button class="simplefavorite-button has-count" data-postid="5181" data-siteid="1" data-groupid="1" data-favoritecount="20" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">20</span></button><p>The post <a href="https://yutsumura.com/eigenvalues-and-eigenvectors-of-the-cross-product-linear-transformation/" target="_blank">Eigenvalues and Eigenvectors of The Cross Product Linear Transformation</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/eigenvalues-and-eigenvectors-of-the-cross-product-linear-transformation/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">5181</post-id>	</item>
		<item>
		<title>Orthogonal Nonzero Vectors Are Linearly Independent</title>
		<link>https://yutsumura.com/orthogonal-nonzero-vectors-are-linearly-independent/</link>
				<comments>https://yutsumura.com/orthogonal-nonzero-vectors-are-linearly-independent/#respond</comments>
				<pubDate>Tue, 24 Oct 2017 04:57:29 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[linear combination]]></category>
		<category><![CDATA[linearly independent]]></category>
		<category><![CDATA[orthogonal]]></category>
		<category><![CDATA[orthogonal set]]></category>
		<category><![CDATA[orthogonal vectors]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=5158</guid>
				<description><![CDATA[<p>Let $S=\{\mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_k\}$ be a set of nonzero vectors in $\R^n$. Suppose that $S$ is an orthogonal set. (a) Show that $S$ is linearly independent. (b) If $k=n$, then prove that $S$&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/orthogonal-nonzero-vectors-are-linearly-independent/" target="_blank">Orthogonal Nonzero Vectors Are Linearly Independent</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 591</h2>
<p>		Let $S=\{\mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_k\}$ be a set of nonzero vectors in $\R^n$.<br />
Suppose that $S$ is an orthogonal set. </p>
<p><strong>(a)</strong> Show that $S$ is linearly independent.</p>
<p><strong>(b)</strong> If $k=n$, then prove that $S$ is a basis for $\R^n$.</p>
<p>&nbsp;<br />
<span id="more-5158"></span><br />

<h2> Proof. </h2>
<h3>(a) Show that $S$ is linearly independent.</h3>
<p>			Consider the linear combination<br />
			\[c_1\mathbf{v}_1+c_2\mathbf{v}_2+\cdots +c_k \mathbf{v}_k=\mathbf{0}.\]
			Our goal is to show that $c_1=c_2=\cdots=c_k=0$.</p>
<hr />
<p>			We compute the dot product of $\mathbf{v}_i$ and the above linear combination for each $i=1, 2, \dots, k$:<br />
			\begin{align*}<br />
		0&#038;=\mathbf{v}_i\cdot \mathbf{0}\\<br />
		&#038;=\mathbf{v}_i \cdot (c_1\mathbf{v}_1+c_2\mathbf{v}_2+\cdots +c_k \mathbf{v}_k)\\<br />
		&#038;=c_1\mathbf{v}_i \cdot \mathbf{v}_1+c_2\mathbf{v}_i \cdot \mathbf{v}_2+\cdots +c_k \mathbf{v}_i \cdot\mathbf{v}_k.<br />
		\end{align*}</p>
<p>		As $S$ is an orthogonal set, we have $\mathbf{v}_i\cdot \mathbf{v}_j=0$ if $i\neq j$.</p>
<p>		Hence all terms but the $i$-th one are zero, and thus we have<br />
		\[0=c_i\mathbf{v}_i\cdot \mathbf{v}_i=c_i \|\mathbf{v}_i\|^2.\]
<p>		Since $\mathbf{v}_i$ is a nonzero vector, its length $\|\mathbf{v}_i\|$ is nonzero.<br />
		It follows that $c_i=0$.</p>
<p>		As this computation holds for every $i=1, 2, \dots, k$, we conclude that $c_1=c_2=\cdots=c_k=0$.<br />
		Hence the set $S$ is linearly independent.</p>
<h3>(b) If $k=n$, then prove that $S$ is a basis for $\R^n$. </h3>
<p>Suppose that $k=n$. Then by part (a), the set $S$ consists of $n$ linearly independent vectors in the $n$-dimensional vector space $\R^n$. </p>
<p>		Since $n$ linearly independent vectors in an $n$-dimensional vector space automatically span it, $S$ is also a spanning set of $\R^n$, and hence $S$ is a basis for $\R^n$.</p>
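<p>The argument in part (a) can be illustrated numerically: for an orthogonal set, the Gram matrix is diagonal with the squared lengths on the diagonal, which forces full rank. The following NumPy sketch uses a hypothetical orthogonal set in $\R^3$ (not from the original post).</p>

```python
import numpy as np

# A hypothetical orthogonal set of three nonzero vectors in R^3
v1 = np.array([1.0, 1.0, 0.0])
v2 = np.array([1.0, -1.0, 0.0])
v3 = np.array([0.0, 0.0, 2.0])
S = np.column_stack([v1, v2, v3])

# Orthogonality: the Gram matrix S^T S is diagonal, with the squared
# lengths ||v_i||^2 on the diagonal -- exactly the quantities c_i ||v_i||^2
# that appear in the proof of part (a)
G = S.T @ S
print(np.allclose(G, np.diag([2.0, 2.0, 4.0])))  # True

# Nonzero diagonal entries force full rank, i.e. linear independence;
# with k = n = 3 vectors, S is a basis for R^3
print(np.linalg.matrix_rank(S))  # 3
```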
<button class="simplefavorite-button has-count" data-postid="5158" data-siteid="1" data-groupid="1" data-favoritecount="56" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">56</span></button><p>The post <a href="https://yutsumura.com/orthogonal-nonzero-vectors-are-linearly-independent/" target="_blank">Orthogonal Nonzero Vectors Are Linearly Independent</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/orthogonal-nonzero-vectors-are-linearly-independent/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">5158</post-id>	</item>
	</channel>
</rss>
