<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	
	xmlns:georss="http://www.georss.org/georss"
	xmlns:geo="http://www.w3.org/2003/01/geo/wgs84_pos#"
	>

<channel>
	<title>orthogonal complement &#8211; Problems in Mathematics</title>
	<atom:link href="https://yutsumura.com/tag/orthogonal-complement/feed/" rel="self" type="application/rss+xml" />
	<link>https://yutsumura.com</link>
	<description></description>
	<lastBuildDate>Fri, 17 Nov 2017 13:38:04 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=5.3.4</generator>

<image>
	<url>https://i2.wp.com/yutsumura.com/wp-content/uploads/2016/12/cropped-question-logo.jpg?fit=32%2C32&#038;ssl=1</url>
	<title>orthogonal complement &#8211; Problems in Mathematics</title>
	<link>https://yutsumura.com</link>
	<width>32</width>
	<height>32</height>
</image> 
<site xmlns="com-wordpress:feed-additions:1">114989322</site>	<item>
		<title>Projection to the subspace spanned by a vector</title>
		<link>https://yutsumura.com/projection-to-the-subspace-spanned-by-a-vector/</link>
				<comments>https://yutsumura.com/projection-to-the-subspace-spanned-by-a-vector/#respond</comments>
				<pubDate>Mon, 08 Aug 2016 22:31:09 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[basis]]></category>
		<category><![CDATA[image]]></category>
		<category><![CDATA[Johns Hopkins]]></category>
		<category><![CDATA[Johns Hopkins.LA]]></category>
		<category><![CDATA[kernel]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[linear transformation]]></category>
		<category><![CDATA[orthogonal complement]]></category>
		<category><![CDATA[projection]]></category>
		<category><![CDATA[rank]]></category>
		<category><![CDATA[subspace]]></category>
		<category><![CDATA[vector space]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=355</guid>
				<description><![CDATA[<p>Let $T: \R^3 \to \R^3$ be the linear transformation given by orthogonal projection to the line spanned by $\begin{bmatrix} 1 \\ 2 \\ 2 \end{bmatrix}$. (a) Find a formula for $T(\mathbf{x})$ for $\mathbf{x}\in \R^3$.&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/projection-to-the-subspace-spanned-by-a-vector/" target="_blank">Projection to the subspace spanned by a vector</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 60</h2>
<p>Let $T: \R^3 \to \R^3$ be the linear transformation given by orthogonal projection to the line spanned by $\begin{bmatrix}<br />
1 \\<br />
2 \\<br />
2<br />
\end{bmatrix}$.</p>
<p><strong>(a)</strong> Find a formula for $T(\mathbf{x})$ for $\mathbf{x}\in \R^3$.</p>
<p><strong>(b)</strong> Find a basis for the image subspace of $T$.</p>
<p><strong>(c)</strong> Find a basis for the kernel subspace of $T$.</p>
<p><strong>(d)</strong> Find the $3 \times 3$ matrix for $T$ with respect to the standard basis for $\R^3$.</p>
<p><strong>(e)</strong> Find a basis for the orthogonal complement of the kernel of $T$. (The orthogonal complement is the subspace of all vectors perpendicular to a given subspace, in this case, the kernel.)</p>
<p><strong>(f)</strong> Find a basis for the orthogonal complement of the image of $T$.</p>
<p><strong>(g)</strong> What is the rank of $T$?</p>
<p>(<em>Johns Hopkins University Exam</em>)</p>
<p><span id="more-355"></span><br />

<h2> Proof. </h2>
<h3>(a) Find a formula for $T(\mathbf{x})$ for $\mathbf{x}\in \R^3$</h3>
<p> For any vector $\mathbf{x}=\begin{bmatrix}<br />
x_1 \\<br />
x_2 \\<br />
x_3<br />
\end{bmatrix}\in \R^3$,<br />
we have $\mathbf{x}=T(\mathbf{x})+\mathbf{v}$, where $\mathbf{v}=\mathbf{x}-T(\mathbf{x})$, which is perpendicular to the vector $\begin{bmatrix}<br />
1 \\<br />
2 \\<br />
2<br />
\end{bmatrix}$.<br />
Let $W$ denote the line spanned by this vector. Since $T(\mathbf{x})\in W$, we have $T(\mathbf{x})=t\begin{bmatrix}<br />
1 \\<br />
2 \\<br />
2<br />
\end{bmatrix}$ for some number $t$.<br />
Thus $\mathbf{x}=t\begin{bmatrix}<br />
1 \\<br />
2 \\<br />
2<br />
\end{bmatrix}+\mathbf{v}$.<br />
To determine the number $t$, we take the inner product with $\begin{bmatrix}<br />
1 \\<br />
2 \\<br />
2<br />
\end{bmatrix}$ and obtain<br />
\begin{align*}<br />
\mathbf{x}\cdot \begin{bmatrix}<br />
1 \\<br />
2 \\<br />
2<br />
\end{bmatrix}<br />
=t\begin{bmatrix}<br />
1 \\<br />
2 \\<br />
2<br />
\end{bmatrix}\cdot \begin{bmatrix}<br />
1 \\<br />
2 \\<br />
2<br />
\end{bmatrix} + \mathbf{v} \cdot \begin{bmatrix}<br />
1 \\<br />
2 \\<br />
2<br />
\end{bmatrix} \\<br />
\Leftrightarrow \,\,\,\, x_1+2x_2+2x_3=9t<br />
\end{align*}<br />
Here $\mathbf{v} \cdot \begin{bmatrix}<br />
1 \\<br />
2 \\<br />
2<br />
\end{bmatrix}=0$ since $\mathbf{v}$ is perpendicular to $\begin{bmatrix}<br />
1 \\<br />
2 \\<br />
2<br />
\end{bmatrix}$.<br />
Therefore we have $t=\frac{1}{9}(x_1+2x_2+2x_3)$, and the formula is<br />
\[T(\mathbf{x})=\frac{1}{9}(x_1+2x_2+2x_3)\begin{bmatrix}<br />
1 \\<br />
2 \\<br />
2<br />
\end{bmatrix}.\]
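<p>As a sanity check (not part of the original solution), the formula can be verified numerically with NumPy; the test vector $\mathbf{x}$ below is an arbitrary choice:</p>

```python
import numpy as np

v = np.array([1.0, 2.0, 2.0])   # the spanning vector of the line W
x = np.array([3.0, -1.0, 5.0])  # an arbitrary test vector (assumption)

# Formula from part (a): T(x) = (1/9)(x_1 + 2 x_2 + 2 x_3) v
T_x = (x[0] + 2 * x[1] + 2 * x[2]) / 9 * v

# Textbook projection formula: proj_v(x) = (x.v / v.v) v, with v.v = 9
proj = (x @ v) / (v @ v) * v

print(np.allclose(T_x, proj))          # True
print(np.isclose((x - T_x) @ v, 0.0))  # True: the residual is perpendicular to v
```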
<p>&nbsp;</p>
<h3>(b) Find a basis for the image subspace of $T$</h3>
<p>Let $W$ be the subspace spanned by $\begin{bmatrix}<br />
1 \\<br />
2 \\<br />
2<br />
\end{bmatrix}$ in $\R^3$.<br />
Since $T$ is a projection to the subspace $W$, the image is $W$ itself.<br />
(Any vector in $W$ is mapped to itself by the projection $T$.) Since $W$ is spanned by just one vector $\begin{bmatrix}<br />
1 \\<br />
2 \\<br />
2<br />
\end{bmatrix}$, it is one-dimensional and a basis is $\left\{\, \begin{bmatrix}<br />
1 \\<br />
2 \\<br />
2<br />
\end{bmatrix} \,\right\}$.</p>
<h3>(c) Find a basis for the kernel subspace of $T$</h3>
<p>If $\mathbf{x}=\begin{bmatrix}<br />
x_1 \\<br />
x_2 \\<br />
x_3<br />
\end{bmatrix} \in \ker T$, then by the formula in part (a), we have $x_1+2x_2+2x_3=0$.<br />
Thus the vectors in the kernel of $T$ can be written as<br />
\[\mathbf{x}=\begin{bmatrix}<br />
-2s-2t \\<br />
s \\<br />
t<br />
\end{bmatrix}=<br />
\begin{bmatrix}<br />
-2 \\<br />
1 \\<br />
0<br />
\end{bmatrix}s+<br />
\begin{bmatrix}<br />
-2 \\<br />
0 \\<br />
1<br />
\end{bmatrix}t,\]
where $s$ and $t$ are free variables.<br />
Since the vectors $\begin{bmatrix}<br />
-2 \\<br />
1 \\<br />
0<br />
\end{bmatrix}$ and $\begin{bmatrix}<br />
-2 \\<br />
0 \\<br />
1<br />
\end{bmatrix}$ are linearly independent and they span the kernel, the basis of $\ker T$ is<br />
\[ \left \{\, \begin{bmatrix}<br />
-2 \\<br />
1 \\<br />
0<br />
\end{bmatrix}, \begin{bmatrix}<br />
-2 \\<br />
0 \\<br />
1<br />
\end{bmatrix}<br />
\,\right \}.\]
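<p>A quick numerical sketch (assuming NumPy) confirms that these two vectors are killed by $T$ and are linearly independent:</p>

```python
import numpy as np

v = np.array([1.0, 2.0, 2.0])

def T(x):
    """Orthogonal projection onto the line spanned by v (part (a) formula)."""
    return (x @ v) / 9 * v

k1 = np.array([-2.0, 1.0, 0.0])
k2 = np.array([-2.0, 0.0, 1.0])

print(np.allclose(T(k1), 0), np.allclose(T(k2), 0))      # True True
# Linear independence: the 3x2 matrix with these columns has rank 2.
print(np.linalg.matrix_rank(np.column_stack([k1, k2])))  # 2
```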
<h3>(d) Find the $3 \times 3$ matrix for $T$ with respect to the standard basis for $\R^3$</h3>
<p> The matrix for $T$ is given by $[T(\mathbf{e}_1), T(\mathbf{e}_2), T(\mathbf{e}_3)]$, where $\mathbf{e}_1, \mathbf{e}_2, \mathbf{e}_3$ are the standard basis unit vectors for $\R^3$.<br />
By the formula in part (a) we compute<br />
\[T(\mathbf{e}_1)=\frac{1}{9}\begin{bmatrix}<br />
1 \\<br />
2 \\<br />
2<br />
\end{bmatrix}, T(\mathbf{e}_2)=\frac{2}{9}\begin{bmatrix}<br />
1 \\<br />
2 \\<br />
2<br />
\end{bmatrix}, T(\mathbf{e}_3)=\frac{2}{9}\begin{bmatrix}<br />
1 \\<br />
2 \\<br />
2<br />
\end{bmatrix}.\]
Therefore the matrix for $T$ with respect to the standard basis is<br />
\[\frac{1}{9}\begin{bmatrix}<br />
1 &amp; 2 &amp; 2 \\<br />
2 &amp;4 &amp;4 \\<br />
2 &amp; 4 &amp; 4<br />
\end{bmatrix}.\]
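<p>Note that this matrix is one-ninth of the outer product of the spanning vector with itself. A numerical sketch (assuming NumPy) confirms the entries and the characteristic properties of a projection matrix:</p>

```python
import numpy as np

v = np.array([1.0, 2.0, 2.0])
M = np.outer(v, v) / 9           # matrix of T in the standard basis

expected = np.array([[1.0, 2.0, 2.0],
                     [2.0, 4.0, 4.0],
                     [2.0, 4.0, 4.0]]) / 9
print(np.allclose(M, expected))  # True
print(np.allclose(M @ M, M))     # True: a projection matrix is idempotent
print(np.linalg.matrix_rank(M))  # 1, matching part (g)
```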
<h3>(e) Find a basis for the orthogonal complement of the kernel of $T$</h3>
<p>Note that the kernel consists of vectors in $\R^3$ that are perpendicular to $\begin{bmatrix}<br />
1 \\<br />
2 \\<br />
2<br />
\end{bmatrix}$.<br />
Therefore any vector perpendicular to every vector in the kernel is parallel to $\begin{bmatrix}<br />
1 \\<br />
2 \\<br />
2<br />
\end{bmatrix}$.<br />
Thus a basis for the orthogonal complement is $\left\{\, \begin{bmatrix}<br />
1 \\<br />
2 \\<br />
2<br />
\end{bmatrix} \,\right \}$.</p>
<h3>(f) Find a basis for the orthogonal complement of the image of $T$</h3>
<p>The image of $T$ is the subspace $W$ spanned by $\begin{bmatrix}<br />
1 \\<br />
2 \\<br />
2<br />
\end{bmatrix}$.<br />
Thus the orthogonal complement of the image is the same as the kernel of $T$. Thus a basis is<br />
\[ \left \{\, \begin{bmatrix}<br />
-2 \\<br />
1 \\<br />
0<br />
\end{bmatrix}, \begin{bmatrix}<br />
-2 \\<br />
0 \\<br />
1<br />
\end{bmatrix}<br />
\,\right\}\]
as we saw in part (c).</p>
<h3>(g) What is the rank of $T$?</h3>
<p>The rank of $T$ is the dimension of the image of $T$. The image of $T$ is the subspace $W$, which is one-dimensional since it is spanned by a single vector. Thus the rank of $T$ is $1$.</p>
<p>The post <a href="https://yutsumura.com/projection-to-the-subspace-spanned-by-a-vector/" target="_blank">Projection to the subspace spanned by a vector</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/projection-to-the-subspace-spanned-by-a-vector/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">355</post-id>	</item>
		<item>
		<title>If the Kernel of a Matrix $A$ is Trivial, then $A^T A$ is Invertible</title>
		<link>https://yutsumura.com/if-the-kernel-of-a-matrix-a-is-trivial-then-at-a-is-invertible/</link>
				<comments>https://yutsumura.com/if-the-kernel-of-a-matrix-a-is-trivial-then-at-a-is-invertible/#respond</comments>
				<pubDate>Mon, 01 Aug 2016 20:18:37 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[homomorphism]]></category>
		<category><![CDATA[injective homomorphism]]></category>
		<category><![CDATA[invertible matrix]]></category>
		<category><![CDATA[kernel of a matrix]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[matrix]]></category>
		<category><![CDATA[null space]]></category>
		<category><![CDATA[orthogonal complement]]></category>
		<category><![CDATA[Stanford]]></category>
		<category><![CDATA[Stanford.LA]]></category>
		<category><![CDATA[transpose]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=253</guid>
				<description><![CDATA[<p>Let $A$ be an $m \times n$ real matrix. Then the kernel of $A$ is defined as $\ker(A)=\{ x\in \R^n \mid Ax=0 \}$. The kernel is also called the null space of $A$. Suppose&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/if-the-kernel-of-a-matrix-a-is-trivial-then-at-a-is-invertible/" target="_blank">If the Kernel of a Matrix $A$ is Trivial, then $A^T A$ is Invertible</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2>Problem 38</h2>
<p>Let $A$ be an $m \times n$ real matrix.<br />
Then the<em><strong> kernel</strong></em> of $A$ is defined as $\ker(A)=\{ x\in \R^n \mid Ax=0 \}$.</p>
<p>The kernel is also called the<em><strong> null space</strong></em> of $A$.<br />
Suppose that $A$ is an $m \times n$ real matrix such that $\ker(A)=0$. Prove that $A^{\trans}A$ is invertible.</p>
<p>(<em>Stanford University Linear Algebra Exam</em>)</p>
<p><span id="more-253"></span><br />

We will give two proofs.</p>
<p>Both proofs show that $\ker(A^{\trans}A)=0$. The first proof is a more or less direct computation.</p>
<p>For the second proof, you need to remember the relation between the transpose and the orthogonal complement of a vector space.</p>
<h2>1st proof</h2>
<p>Since the size of the transpose $A^{\trans}$ is $n\times m$, the product $A^{\trans}A$ is an $n\times n$ square matrix.<br />
To show that it is invertible, we show that the kernel of $A^{\trans}A$ is trivial.<br />
Then the result follows since $A^{\trans}A$ is an injective linear transformation from $\R^n$ to $\R^n$, thus an isomorphism. Hence $A^{\trans}A$ is invertible.</p>
<hr />
<p>To show that the kernel of $A^{\trans}A$ is trivial, denote $A=[A_1 A_2\dots A_n]$, where $A_i$ is the $i$-th column vector of $A$.<br />
Note that the column vectors $A_i$ are linearly independent since $\ker(A)=0$.</p>
<p>Then<br />
\[A^{\trans}=\begin{bmatrix}<br />
A_1^{\trans} \\<br />
A_2^{\trans} \\<br />
\vdots \\<br />
A_n^{\trans}<br />
\end{bmatrix}.\]
<p>Now we suppose that $A^{\trans}Ax=0$ for $x=\begin{bmatrix}<br />
x_1 \\<br />
x_2 \\<br />
\vdots \\<br />
x_n<br />
\end{bmatrix}\in \R^n$.<br />
Then we have<br />
\begin{align*}<br />
0&amp;=A^{\trans}A x\\[6pt]
&amp;=\begin{bmatrix}<br />
A_1^{\trans}A_1 &amp; A_1^{\trans}A_2 &amp; \cdots &amp; A_1^{\trans}A_n \\<br />
A_2^{\trans}A_1 &amp;A_2^{\trans}A_2 &amp; \cdots &amp; A_2^{\trans}A_n \\<br />
\vdots &amp; \vdots &amp; \ddots &amp; \vdots \\<br />
A_n^{\trans}A_1 &amp; A_n^{\trans}A_2 &amp; \cdots &amp; A_n^{\trans}A_n<br />
\end{bmatrix}x\\[6pt]
&amp;=\begin{bmatrix}<br />
A_1^{\trans}(x_1 A_1+\cdots +x_n A_n) \\<br />
A_2^{\trans}(x_1 A_1+\cdots+x_n A_n) \\<br />
\vdots \\<br />
A_n^{\trans}(x_1 A_1+\cdots+x_n A_n)<br />
\end{bmatrix}.<br />
\end{align*}</p>
<p>Therefore we have $A_i^{\trans}(x_1 A_1+\cdots+x_n A_n)=0$ for all $i=1,\dots, n$.<br />
Equivalently, we have the inner product $A_i\cdot Ax=0$ for all $i=1,\dots,n$.</p>
<p>If $Ax=x_1 A_1+\cdots+x_n A_n\neq 0$, then the vectors $A_1, A_2, \dots, A_n, Ax$ are linearly independent in $\R^n$: the vectors $A_1,\dots, A_n$ are linearly independent, and $Ax$ is a nonzero vector orthogonal to each $A_i$ since all the inner products $A_i \cdot Ax$ are zero.</p>
<p>However, this is a contradiction: there cannot be $n+1$ linearly independent vectors in the $n$-dimensional vector space $\R^n$. Thus, we must have $Ax=0$. Since the kernel of $A$ is trivial, this implies $x=0$. Therefore the kernel of $A^{\trans}A$ is trivial.</p>
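<p>The statement of the problem is easy to test numerically; the $4\times 2$ matrix below is a hypothetical example with linearly independent columns (a sketch, assuming NumPy):</p>

```python
import numpy as np

# A hypothetical 4x2 matrix with linearly independent columns, so ker(A) = 0.
A = np.array([[1.0, 0.0],
              [2.0, 1.0],
              [0.0, 3.0],
              [1.0, 1.0]])
assert np.linalg.matrix_rank(A) == A.shape[1]  # trivial kernel

G = A.T @ A                        # the n x n matrix A^T A
print(round(np.linalg.det(G), 6))  # 57.0: nonzero determinant, so A^T A is invertible
```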
<h2>2nd proof</h2>
<p>In the second proof, we still want to prove the kernel of $A^{\trans}A$ is trivial.</p>
<p>Let $x\in \ker(A^{\trans}A)$.<br />
Then we have $A^{\trans}(Ax)=0$ and thus $Ax \in \ker(A^{\trans})=\im(A)^{\perp}$.</p>
<p>On the other hand, clearly $Ax \in \im(A)$.<br />
Thus $Ax \in \im(A)\cap \im(A)^{\perp}=\{0\}$.<br />
So we have $Ax=0$, and $x=0$ since $\ker(A)=0$.</p>
<p>Therefore $\ker(A^{\trans}A)=0$, and since any injective homomorphism from $\R^n$ to itself is an isomorphism, we conclude that $A^{\trans}A$ is invertible.</p>
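<p>The key fact used in the 2nd proof, $\ker(A^{\trans})=\im(A)^{\perp}$, can also be illustrated numerically (a sketch with a hypothetical matrix, assuming NumPy):</p>

```python
import numpy as np

A = np.array([[1.0, 0.0],         # hypothetical 4x2 matrix with ker(A) = 0
              [2.0, 1.0],
              [0.0, 3.0],
              [1.0, 1.0]])

# A basis of ker(A^T): the left-singular vectors beyond rank(A) = 2.
U, s, Vt = np.linalg.svd(A)
N = U[:, 2:]                      # columns span ker(A^T)
print(np.allclose(A.T @ N, 0))    # True: these vectors lie in ker(A^T)

# Every such vector is orthogonal to im(A) = the column space of A:
x = np.array([0.5, -1.0])         # arbitrary test vector
print(np.allclose(N.T @ (A @ x), 0))  # True: Ax is perpendicular to ker(A^T)
```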
<p>The post <a href="https://yutsumura.com/if-the-kernel-of-a-matrix-a-is-trivial-then-at-a-is-invertible/" target="_blank">If the Kernel of a Matrix $A$ is Trivial, then $A^T A$ is Invertible</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/if-the-kernel-of-a-matrix-a-is-trivial-then-at-a-is-invertible/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">253</post-id>	</item>
	</channel>
</rss>
