<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	
	xmlns:georss="http://www.georss.org/georss"
	xmlns:geo="http://www.w3.org/2003/01/geo/wgs84_pos#"
	>

<channel>
	<title>basis of a vector space | Problems in Mathematics</title>
	<atom:link href="https://yutsumura.com/tag/basis-of-a-vector-space/feed/" rel="self" type="application/rss+xml" />
	<link>https://yutsumura.com</link>
	<description></description>
	<lastBuildDate>Sun, 19 Nov 2017 18:06:57 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=5.3.10</generator>

<image>
	<url>https://i2.wp.com/yutsumura.com/wp-content/uploads/2016/12/cropped-question-logo.jpg?fit=32%2C32&#038;ssl=1</url>
	<title>basis of a vector space | Problems in Mathematics</title>
	<link>https://yutsumura.com</link>
	<width>32</width>
	<height>32</height>
</image> 
<site xmlns="com-wordpress:feed-additions:1">114989322</site>	<item>
		<title>Every Basis of a Subspace Has the Same Number of Vectors</title>
		<link>https://yutsumura.com/every-basis-of-a-subspace-has-the-same-number-of-vectors/</link>
				<comments>https://yutsumura.com/every-basis-of-a-subspace-has-the-same-number-of-vectors/#comments</comments>
				<pubDate>Mon, 02 Oct 2017 05:32:27 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[basis]]></category>
		<category><![CDATA[basis of a vector space]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[linearly dependent]]></category>
		<category><![CDATA[linearly independent]]></category>
		<category><![CDATA[spanning set]]></category>
		<category><![CDATA[subspace]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=5006</guid>
				<description><![CDATA[<p>Let $V$ be a subspace of $\R^n$. Suppose that $B=\{\mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_k\}$ is a basis of the subspace $V$. Prove that every basis of $V$ consists of $k$ vectors in $V$. &#160; Hint.&#46;&#46;&#46;</p>
The post <a href="https://yutsumura.com/every-basis-of-a-subspace-has-the-same-number-of-vectors/">Every Basis of a Subspace Has the Same Number of Vectors</a> first appeared on <a href="https://yutsumura.com">Problems in Mathematics</a>.]]></description>
								<content:encoded><![CDATA[<h2> Problem 577</h2>
<p>		Let $V$ be a subspace of $\R^n$.<br />
		Suppose that $B=\{\mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_k\}$ is a basis of the subspace $V$.</p>
<p>		Prove that every basis of $V$ consists of $k$ vectors in $V$.</p>
<p>&nbsp;<br />
<span id="more-5006"></span><br />

<h2>Hint.</h2>
<p>You may use the following fact:</p>
<div style="padding: 16px; border: none 3px #4169e1; border-radius: 10px; background-color: #f0f8ff; margin-top: 30px; margin-bottom: 30px;">
<strong>Fact</strong>.<br />
If $S=\{\mathbf{v}_1, \dots, \mathbf{v}_m\}$ is a spanning set of a subspace $V$ of $\R^n$, then any set of $m+1$ or more vectors of $V$ is linearly dependent.
</div>
<p>For a proof of this fact, see the post &#8628;<br />
<a href="//yutsumura.com/if-there-are-more-vectors-than-a-spanning-set-then-vectors-are-linearly-dependent/" rel="noopener" target="_blank">If there are More Vectors Than a Spanning Set, then Vectors are Linearly Dependent</a></p>
<h2> Proof. </h2>
<p>			Let $B&#8217;=\{\mathbf{w}_1, \mathbf{w}_2, \dots, \mathbf{w}_l\}$ be an arbitrary basis of the subspace $V$.<br />
			Our goal is to show that $l=k$.</p>
<hr />
<p>			As $B$ is a basis, it is a spanning set for $V$ consisting of $k$ vectors.<br />
			By the fact stated above, a set of $k+1$ or more vectors of $V$ must be linearly dependent.<br />
			Since $B&#8217;$ is a basis, it is linearly independent.<br />
			It follows that $l\leq k$.</p>
<hr />
<p>			We now interchange the roles of $B$ and $B&#8217;$.<br />
			As $B&#8217;$ is a basis, it is a spanning set for $V$ consisting of $l$ vectors.<br />
			So it follows from the fact stated above that any set of $l+1$ or more vectors of $V$ must be linearly dependent.<br />
			Since $B$ is a basis, it is linearly independent.<br />
			Hence $k \leq l$.</p>
<hr />
<p>			Therefore we have both $l\leq k$ and $k \leq l$, which yields $l=k$, as required.</p>
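The theorem can be sanity-checked numerically. The sketch below (not part of the original post) fixes a hypothetical $2$-dimensional subspace of $\R^3$ and verifies that two different bases of it consist of the same number of vectors.

```python
import numpy as np

# Hypothetical example: V = span{(1,0,1), (0,1,1)}, a 2-dimensional subspace of R^3.
B1 = np.array([[1, 0, 1], [0, 1, 1]], dtype=float)    # one basis; rows are vectors
B2 = np.array([[1, 1, 2], [1, -1, 0]], dtype=float)   # another basis of the same V

# Each candidate basis is linearly independent (full row rank) ...
assert np.linalg.matrix_rank(B1) == 2
assert np.linalg.matrix_rank(B2) == 2
# ... and they span the same subspace: stacking them does not increase the rank.
assert np.linalg.matrix_rank(np.vstack([B1, B2])) == 2
# Both bases have the same number of vectors, as the theorem guarantees.
assert B1.shape[0] == B2.shape[0] == 2
```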
<button class="simplefavorite-button has-count" data-postid="5006" data-siteid="1" data-groupid="1" data-favoritecount="61" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">61</span></button>The post <a href="https://yutsumura.com/every-basis-of-a-subspace-has-the-same-number-of-vectors/">Every Basis of a Subspace Has the Same Number of Vectors</a> first appeared on <a href="https://yutsumura.com">Problems in Mathematics</a>.]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/every-basis-of-a-subspace-has-the-same-number-of-vectors/feed/</wfw:commentRss>
		<slash:comments>1</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">5006</post-id>	</item>
		<item>
		<title>Three Linearly Independent Vectors in $\R^3$ Form a Basis. Three Vectors Spanning $\R^3$ Form a Basis.</title>
		<link>https://yutsumura.com/three-linearly-independent-vectors-in-r3-form-a-basis-three-vectors-spanning-r3-form-a-basis/</link>
				<comments>https://yutsumura.com/three-linearly-independent-vectors-in-r3-form-a-basis-three-vectors-spanning-r3-form-a-basis/#comments</comments>
				<pubDate>Thu, 28 Sep 2017 03:58:25 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[basis]]></category>
		<category><![CDATA[basis of a vector space]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[linear combination]]></category>
		<category><![CDATA[linearly independent]]></category>
		<category><![CDATA[nonsingular matrix]]></category>
		<category><![CDATA[spanning set]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=4990</guid>
				<description><![CDATA[<p>Let $B=\{\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3\}$ be a set of three-dimensional vectors in $\R^3$. (a) Prove that if the set $B$ is linearly independent, then $B$ is a basis of the vector space $\R^3$. (b) Prove&#46;&#46;&#46;</p>
The post <a href="https://yutsumura.com/three-linearly-independent-vectors-in-r3-form-a-basis-three-vectors-spanning-r3-form-a-basis/">Three Linearly Independent Vectors in $\R^3$ Form a Basis. Three Vectors Spanning $\R^3$ Form a Basis.</a> first appeared on <a href="https://yutsumura.com">Problems in Mathematics</a>.]]></description>
								<content:encoded><![CDATA[<h2> Problem 574</h2>
<p>	Let $B=\{\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3\}$ be a set of three-dimensional vectors in $\R^3$.</p>
<p><strong>(a)</strong> Prove that if the set $B$ is linearly independent, then $B$ is a basis of the vector space $\R^3$.</p>
<p><strong>(b)</strong> Prove that if the set $B$ spans $\R^3$, then $B$ is a basis of $\R^3$.</p>
<p>&nbsp;<br />
<span id="more-4990"></span><br />

<h2>Definition (A Basis of a Subspace).</h2>
<p>A subset $B$ of a vector space $V$ is called a <strong>basis</strong> if $B$ is a linearly independent spanning set of $V$.</p>
<h2> Proof. </h2>
<h3>(a) Prove that if the set $B$ is linearly independent, then $B$ is a basis of the vector space $\R^3$.</h3>
<p>To show that $B$ is a basis, we need only prove that $B$ is a spanning set of $\R^3$ as we know that $B$ is linearly independent.<br />
		Let $\mathbf{b}\in \R^3$ be an arbitrary vector.<br />
		We prove that there exist $x_1, x_2, x_3$ such that<br />
		\[x_1\mathbf{v}_1+x_2\mathbf{v}_2+x_3\mathbf{v}_3=\mathbf{b}.\]
		This is equivalent to having a solution $\mathbf{x}=\begin{bmatrix}<br />
  x_1 \\<br />
   x_2 \\<br />
    x_3<br />
  \end{bmatrix}$ to the matrix equation<br />
		\[A\mathbf{x}=\mathbf{b}, \tag{*}\]
		where<br />
		\[A=[\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3]\]
		is the $3\times 3$ matrix whose column vectors are $\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3$.</p>
<p>		Since the vectors $\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3$ are linearly independent, the matrix $A$ is nonsingular.<br />
		It follows that the equation (*) has the unique solution $\mathbf{x}=A^{-1}\mathbf{b}$.<br />
		Hence $\mathbf{b}$ is a linear combination of the vectors in $B$.<br />
		This means that $B$ is a spanning set of $\R^3$, hence $B$ is a basis.</p>
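The argument in part (a) can be mirrored numerically: for a linearly independent triple, the matrix $A$ of column vectors is nonsingular, so $A\mathbf{x}=\mathbf{b}$ is solvable for every $\mathbf{b}$. A minimal Python/NumPy sketch (the vectors are illustrative, not from the post):

```python
import numpy as np

# An assumed linearly independent triple v1, v2, v3 in R^3.
v1, v2, v3 = [1, 0, 0], [1, 1, 0], [1, 1, 1]
A = np.column_stack([v1, v2, v3]).astype(float)

# Linear independence of the columns makes A nonsingular ...
assert abs(np.linalg.det(A)) > 1e-12

# ... so A x = b has the unique solution x = A^{-1} b for every b,
# i.e. every b is a linear combination of v1, v2, v3.
b = np.array([2.0, 3.0, 5.0])
x = np.linalg.solve(A, b)
assert np.allclose(A @ x, b)
```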
<h3>(b) Prove that if the set $B$ spans $\R^3$, then $B$ is a basis of $\R^3$.</h3>
<p>As we know that $B$ spans $\R^3$, it suffices to show that $B$ is linearly independent.<br />
		Note that the assumption that $B$ is a spanning set of $\R^3$ means that any vector $\mathbf{b}$ in $\R^3$ is a linear combination of $\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3$: there exist $c_1, c_2, c_3$ such that<br />
		\[c_1\mathbf{v}_1+c_2\mathbf{v}_2+c_3\mathbf{v}_3=\mathbf{b}.\]
		Equivalently, with $A=[\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3]$ as in part (a), the equation $A\mathbf{x}=\mathbf{b}$ has a solution for any $\mathbf{b}\in \R^3$.</p>
<p>
	  It follows that the equation<br />
	  \[A\mathbf{x}=\begin{bmatrix}<br />
	  1 \\<br />
	   0 \\<br />
	    0<br />
	  \end{bmatrix}\]
	  has a solution $\mathbf{x}=\mathbf{u}_1$.<br />
	  Similarly the equations<br />
	  \[A\mathbf{x}=\begin{bmatrix}<br />
	  0 \\<br />
	   1 \\<br />
	    0<br />
	  \end{bmatrix}, \quad A\mathbf{x}=\begin{bmatrix}<br />
	  0 \\<br />
	   0 \\<br />
	    1<br />
	  \end{bmatrix}\]
	  have solutions $\mathbf{u}_2, \mathbf{u}_3$, respectively.</p>
<hr />
<p>	  Define the $3\times 3$ matrix $A&#8217;$ by<br />
	  \[A&#8217;=[\mathbf{u}_1, \mathbf{u}_2, \mathbf{u}_3].\]
<p>	  Then it follows that<br />
	  \begin{align*}<br />
	AA&#8217;&#038;=A[\mathbf{u}_1, \mathbf{u}_2, \mathbf{u}_3]\\<br />
	&#038;=[A\mathbf{u}_1, A\mathbf{u}_2, A\mathbf{u}_3]\\<br />
	&#038;=\begin{bmatrix}<br />
	  1 &#038; 0 &#038; 0 \\<br />
	   0 &#038;1 &#038;0 \\<br />
	   0 &#038; 0 &#038; 1<br />
	\end{bmatrix}.<br />
	\end{align*}</p>
<hr />
<p>	As the identity matrix is nonsingular, the product $AA&#8217;$ is nonsingular.<br />
	Thus, the matrix $A$ is nonsingular as well.<br />
	This implies that the column vectors of $A$ are linearly independent.<br />
	Hence the set $B$ is linearly independent and we conclude that $B$ is a basis of $\R^3$.</p>
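Part (b)'s construction of $A'$ can also be sketched numerically: solving $A\mathbf{x}=\mathbf{e}_i$ for each standard basis vector and assembling the solutions as columns produces a right inverse, so $A$ is nonsingular. The spanning triple below is an assumed example.

```python
import numpy as np

# Assumed spanning triple; columns of A are v1, v2, v3.
A = np.array([[1., 2., 0.],
              [0., 1., 1.],
              [1., 0., 1.]])

# Solve A x = e_i for each standard basis vector to get u1, u2, u3 ...
U = np.column_stack([np.linalg.solve(A, e) for e in np.eye(3)])

# ... then A A' = I, so A is nonsingular and its columns are independent.
assert np.allclose(A @ U, np.eye(3))
```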
<h2> Related Question. </h2>
<p>Using the result of this problem, try the next problem about a basis for $\R^3$.</p>
<div style="padding: 16px; border: none 3px #4169e1; border-radius: 10px; background-color: #f0f8ff; margin-top: 30px; margin-bottom: 30px;">
<strong>Problem</strong>.<br />
Determine whether each of the following sets is a basis for $\R^3$.</p>
<p><strong>(a)</strong> $S=\left\{\,  \begin{bmatrix}<br />
	  1 \\<br />
	   0 \\<br />
	    -1<br />
	  \end{bmatrix}, \begin{bmatrix}<br />
	  2 \\<br />
	   1 \\<br />
	    -1<br />
	  \end{bmatrix}, \begin{bmatrix}<br />
	  -2 \\<br />
	   1 \\<br />
	    4<br />
	  \end{bmatrix} \,\right\}$</p>
<p><strong>(b)</strong> $S=\left\{\,  \begin{bmatrix}<br />
	  1 \\<br />
	   4 \\<br />
	    7<br />
	  \end{bmatrix}, \begin{bmatrix}<br />
	  2 \\<br />
	   5 \\<br />
	    8<br />
	  \end{bmatrix}, \begin{bmatrix}<br />
	  3 \\<br />
	   6 \\<br />
	    9<br />
	  \end{bmatrix} \,\right\}$</p>
<p><strong>(c)</strong> $S=\left\{\,  \begin{bmatrix}<br />
	  1 \\<br />
	   1 \\<br />
	    2<br />
	  \end{bmatrix}, \begin{bmatrix}<br />
	  0 \\<br />
	   1 \\<br />
	    7<br />
	  \end{bmatrix} \,\right\}$</p>
<p><strong>(d)</strong> $S=\left\{\,  \begin{bmatrix}<br />
	  1 \\<br />
	   2 \\<br />
	    5<br />
	  \end{bmatrix}, \begin{bmatrix}<br />
	  7 \\<br />
	   4 \\<br />
	    0<br />
	  \end{bmatrix}, \begin{bmatrix}<br />
	  3 \\<br />
	   8 \\<br />
	    6<br />
	  \end{bmatrix}, \begin{bmatrix}<br />
	  -1 \\<br />
	   9 \\<br />
	    10<br />
	  \end{bmatrix} \,\right\}$</p>
</div>
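One way to check such sets by machine, following the criterion proved above: a subset of $\R^3$ is a basis exactly when it consists of three linearly independent vectors, which NumPy can test via the rank. (This checker is an illustration, not part of the original post.)

```python
import numpy as np

def is_basis_of_R3(vectors):
    """A list of vectors is a basis of R^3 iff it contains exactly
    three vectors and they are linearly independent (rank 3)."""
    M = np.array(vectors, dtype=float)
    return M.shape[0] == 3 and np.linalg.matrix_rank(M) == 3

# Apply the check to each set from the problem above.
for label, S in [
    ("(a)", [[1, 0, -1], [2, 1, -1], [-2, 1, 4]]),
    ("(b)", [[1, 4, 7], [2, 5, 8], [3, 6, 9]]),
    ("(c)", [[1, 1, 2], [0, 1, 7]]),
    ("(d)", [[1, 2, 5], [7, 4, 0], [3, 8, 6], [-1, 9, 10]]),
]:
    print(label, is_basis_of_R3(S))
```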
<button class="simplefavorite-button has-count" data-postid="4990" data-siteid="1" data-groupid="1" data-favoritecount="461" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">461</span></button>The post <a href="https://yutsumura.com/three-linearly-independent-vectors-in-r3-form-a-basis-three-vectors-spanning-r3-form-a-basis/">Three Linearly Independent Vectors in $\R^3$ Form a Basis. Three Vectors Spanning $\R^3$ Form a Basis.</a> first appeared on <a href="https://yutsumura.com">Problems in Mathematics</a>.]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/three-linearly-independent-vectors-in-r3-form-a-basis-three-vectors-spanning-r3-form-a-basis/feed/</wfw:commentRss>
		<slash:comments>1</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">4990</post-id>	</item>
		<item>
		<title>Every $n$-Dimensional Vector Space is Isomorphic to the Vector Space $\R^n$</title>
		<link>https://yutsumura.com/every-n-dimensional-vector-space-is-isomorphic-to-the-vector-space-rn/</link>
				<comments>https://yutsumura.com/every-n-dimensional-vector-space-is-isomorphic-to-the-vector-space-rn/#comments</comments>
				<pubDate>Wed, 23 Aug 2017 00:19:51 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[basis of a vector space]]></category>
		<category><![CDATA[coordinate vector]]></category>
		<category><![CDATA[isomorphism of vector spaces]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[linear transformation]]></category>
		<category><![CDATA[vector space]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=4694</guid>
				<description><![CDATA[<p>Let $V$ be a vector space over the field of real numbers $\R$. Prove that if the dimension of $V$ is $n$, then $V$ is isomorphic to $\R^n$. &#160; Proof. Since $V$ is an&#46;&#46;&#46;</p>
The post <a href="https://yutsumura.com/every-n-dimensional-vector-space-is-isomorphic-to-the-vector-space-rn/">Every $n$-Dimensional Vector Space is Isomorphic to the Vector Space $\R^n$</a> first appeared on <a href="https://yutsumura.com">Problems in Mathematics</a>.]]></description>
								<content:encoded><![CDATA[<h2> Problem 545</h2>
<p>				 Let $V$ be a vector space over the field of real numbers $\R$.</p>
<p>			Prove that if the dimension of $V$ is $n$, then $V$ is isomorphic to $\R^n$.</p>
<p>&nbsp;<br />
<span id="more-4694"></span><br />

<h2> Proof. </h2>
<p>				Since $V$ is an $n$-dimensional vector space, it has a basis<br />
				\[B=\{\mathbf{v}_1, \dots, \mathbf{v}_n\},\]
				where each $\mathbf{v}_i$ is a vector in $V$.</p>
<hr />
<p>				Define a map $T: V\to \R^n$ by sending each vector $\mathbf{v}\in V$ to its coordinate vector $[\mathbf{v}]_B$ with respect to the basis $B$.<br />
				More explicitly, if<br />
				\[\mathbf{v}=c_1\mathbf{v}_1+\cdots+c_n \mathbf{v}_n \text{ with } c_1, \dots, c_n \in \R,\]
				then the coordinate vector with respect to $B$ is<br />
				\[[\mathbf{v}]_B=\begin{bmatrix}<br />
		  c_1 \\<br />
		   c_2 \\<br />
		    \vdots \\<br />
		   c_n<br />
		   \end{bmatrix} \in \R^n.\]
		   Then the map $T: V \to \R^n$ is defined by<br />
		   \[T(\mathbf{v})=\begin{bmatrix}<br />
		  c_1 \\<br />
		   c_2 \\<br />
		    \vdots \\<br />
		   c_n<br />
		   \end{bmatrix}.\]
<hr />
<p>		   It follows from the properties of the coordinate vectors that the map $T$ is a linear transformation.<br />
		   We show that $T$ is bijective, hence an isomorphism.</p>
<h3>$T$ is injective.</h3>
<p>		   To show that $T$ is injective, it suffices to show that the null space of $T$ is trivial: $\calN(T)=\{\mathbf{0}\}$.<br />
(See the post &#8220;<a href="//yutsumura.com/a-linear-transformation-is-injective-one-to-one-if-and-only-if-the-nullity-is-zero/" target="_blank">A Linear Transformation is Injective (One-To-One) if and only if the Nullity is Zero</a>&#8221; for a proof of this fact.)</p>
<p>		   If $\mathbf{v}\in \calN(T)$, then we have<br />
		   \[\mathbf{0}=T(\mathbf{v})=[\mathbf{v}]_B.\]
		   So the coordinate vector of $\mathbf{v}$ is zero, hence we have<br />
		   \[\mathbf{v}=0\mathbf{v}_1+\cdots +0\mathbf{v}_n=\mathbf{0}.\]
		   Thus, $\calN(T)=\{\mathbf{0}\}$, and $T$ is injective.</p>
<h3>$T$ is surjective.</h3>
<p>		   To show that $T$ is surjective, let<br />
		   \[\mathbf{a}=\begin{bmatrix}<br />
		  a_1 \\<br />
		   a_2 \\<br />
		    \vdots \\<br />
		   a_n<br />
		   \end{bmatrix}\]
		   be an arbitrary vector in $\R^n$.<br />
		   Then consider the vector<br />
		   \[\mathbf{v}:=a_1\mathbf{v}_1+\cdots+ a_n \mathbf{v}_n\]
		   in $V$.<br />
		   Then it follows from the definition of the linear transformation $T$ that<br />
		   \[T(\mathbf{v})=[\mathbf{v}]_B=\begin{bmatrix}<br />
		  a_1 \\<br />
		   a_2 \\<br />
		    \vdots \\<br />
		   a_n<br />
		   \end{bmatrix}=\mathbf{a}.\]
		   Therefore $T$ is surjective.</p>
<hr />
<p>		   In summary, $T: V \to \R^n$ is a bijective linear transformation, and hence $T$ is an isomorphism.<br />
		   Thus, we conclude that $V$ and $\R^n$ are isomorphic.</p>
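The coordinate map in the proof can be sketched concretely: once a basis is fixed, $T$ amounts to solving a linear system, and its inverse reassembles the linear combination. The basis of $\R^3$ below is an assumed example standing in for an abstract $n$-dimensional $V$.

```python
import numpy as np

# Assumed basis B of R^3; basis vectors v1, v2, v3 are the columns.
B = np.column_stack([[1., 1., 0.], [0., 1., 1.], [1., 0., 1.]])

def T(v):
    # Coordinates c solve B c = v; they exist and are unique since B is a basis.
    return np.linalg.solve(B, v)

def T_inv(c):
    # The inverse map reassembles v = c1 v1 + ... + cn vn.
    return B @ c

v = np.array([2., 3., 4.])
assert np.allclose(T_inv(T(v)), v)          # T_inv undoes T: bijectivity in action
assert np.allclose(T(B[:, 0]), [1, 0, 0])   # the basis vector v1 has coordinates e1
```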
<h2>Generalization </h2>
<p>We may use a more general field $\F$ in place of the field of real numbers $\R$.<br />
Then the general statement is as follows.</p>
<div style="padding: 16px; border: none 3px #4169e1; border-radius: 10px; background-color: #f0f8ff; margin-top: 30px; margin-bottom: 30px;">
If $V$ is an $n$-dimensional vector space over a field $\F$, then $V$ is isomorphic to $\F^n$.</div>
<p>The proof is identical to the one above except that we replace $\R$ by $\F$ everywhere.</p>
<button class="simplefavorite-button has-count" data-postid="4694" data-siteid="1" data-groupid="1" data-favoritecount="87" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">87</span></button>The post <a href="https://yutsumura.com/every-n-dimensional-vector-space-is-isomorphic-to-the-vector-space-rn/">Every $n$-Dimensional Vector Space is Isomorphic to the Vector Space $\R^n$</a> first appeared on <a href="https://yutsumura.com">Problems in Mathematics</a>.]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/every-n-dimensional-vector-space-is-isomorphic-to-the-vector-space-rn/feed/</wfw:commentRss>
		<slash:comments>1</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">4694</post-id>	</item>
		<item>
		<title>The Inner Product on $\R^2$ induced by a Positive Definite Matrix and Gram-Schmidt Orthogonalization</title>
		<link>https://yutsumura.com/the-inner-product-on-r2-induced-by-a-positive-definite-matrix-and-gram-schmidt-orthogonalization/</link>
				<comments>https://yutsumura.com/the-inner-product-on-r2-induced-by-a-positive-definite-matrix-and-gram-schmidt-orthogonalization/#comments</comments>
				<pubDate>Wed, 16 Aug 2017 21:55:59 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[basis of a vector space]]></category>
		<category><![CDATA[Gram-Schmidt orthogonalization process]]></category>
		<category><![CDATA[Gram-Schmidt process]]></category>
		<category><![CDATA[inner product]]></category>
		<category><![CDATA[inner product space]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[orthogonal basis]]></category>
		<category><![CDATA[positive definite matrix]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=4653</guid>
				<description><![CDATA[<p>Consider the $2\times 2$ real matrix \[A=\begin{bmatrix} 1 &#038; 1\\ 1&#038; 3 \end{bmatrix}.\] (a) Prove that the matrix $A$ is positive definite. (b) Since $A$ is positive definite by part (a), the formula \[\langle&#46;&#46;&#46;</p>
The post <a href="https://yutsumura.com/the-inner-product-on-r2-induced-by-a-positive-definite-matrix-and-gram-schmidt-orthogonalization/">The Inner Product on $\R^2$ induced by a Positive Definite Matrix and Gram-Schmidt Orthogonalization</a> first appeared on <a href="https://yutsumura.com">Problems in Mathematics</a>.]]></description>
								<content:encoded><![CDATA[<h2> Problem 539</h2>
<p>		   Consider the $2\times 2$ real matrix<br />
		   \[A=\begin{bmatrix}<br />
		  1 &#038; 1\\<br />
		  1&#038; 3<br />
		\end{bmatrix}.\]
<p><strong>(a)</strong> Prove that the matrix $A$ is positive definite.</p>
<p><strong>(b)</strong> Since $A$ is positive definite by part (a), the formula<br />
		\[\langle \mathbf{x}, \mathbf{y}\rangle:=\mathbf{x}^{\trans} A \mathbf{y}\]
		for $\mathbf{x}, \mathbf{y} \in \R^2$ defines an inner product on $\R^2$.<br />
		Consider $\R^2$ as an inner product space with this inner product.</p>
<p>		Prove that the unit vectors<br />
		\[\mathbf{e}_1=\begin{bmatrix}<br />
		  1 \\<br />
		  0<br />
		\end{bmatrix} \text{ and } \mathbf{e}_2=\begin{bmatrix}<br />
		  0 \\<br />
		  1<br />
		\end{bmatrix}\]
		are not orthogonal in the inner product space $\R^2$.</p>
<p><strong>(c)</strong> Find an orthogonal basis $\{\mathbf{v}_1, \mathbf{v}_2\}$ of $\R^2$ from the basis $\{\mathbf{e}_1, \mathbf{e}_2\}$ using the Gram-Schmidt orthogonalization process.</p>
<p>&nbsp;<br />
<span id="more-4653"></span><br />

<h2> Proof. </h2>
<h3>(a) Prove that the matrix $A$ is positive definite.</h3>
<p> We prove that for every nonzero vector $\mathbf{x}=\begin{bmatrix}<br />
		  x \\<br />
		  y<br />
		\end{bmatrix}\in \R^2$, we have $\mathbf{x}^{\trans} A \mathbf{x} > 0$.<br />
		We have<br />
		\begin{align*}<br />
		\mathbf{x}^{\trans} A \mathbf{x}&#038;=\begin{bmatrix}<br />
		  x &#038; y<br />
		\end{bmatrix} \begin{bmatrix}<br />
		  1 &#038; 1\\<br />
		  1&#038; 3<br />
		\end{bmatrix}\begin{bmatrix}<br />
		  x \\<br />
		  y<br />
		\end{bmatrix}=\begin{bmatrix}<br />
		  x &#038; y<br />
		\end{bmatrix}\begin{bmatrix}<br />
		  x+y \\<br />
		  x+3y<br />
		\end{bmatrix}\\[6pt]
		&#038;=x(x+y)+y(x+3y)=x^2+2xy+3y^2\\<br />
		&#038;=x^2+2xy+y^2+2y^2=(x+y)^2+2y^2.<br />
		\end{align*}</p>
<p>		Since $\mathbf{x}\neq \mathbf{0}$, at least one of $x, y$ is nonzero.<br />
		Thus the last expression is always positive.<br />
		Hence $A$ is a positive definite matrix.</p>
<h3>(b) Prove that $\mathbf{e}_1, \mathbf{e}_2$ are not orthogonal in the inner product space $\R^2$.</h3>
<p> Note that by the post &#8220;<a href="//yutsumura.com/a-symmetric-positive-definite-matrix-and-an-inner-product-on-a-vector-space/" target="_blank">A Symmetric Positive Definite Matrix and An Inner Product on a Vector Space</a>&#8221;, the formula $\langle \mathbf{x}, \mathbf{y}\rangle$ defines an inner product on $\R^2$.</p>
<p>		Two vectors $\mathbf{x}$ and $\mathbf{y}$ are said to be <strong>orthogonal</strong> if $\langle \mathbf{x}, \mathbf{y}\rangle=0$.</p>
<p>		The vectors $\mathbf{e}_1, \mathbf{e}_2$ are not orthogonal with this inner product since<br />
		\begin{align*}<br />
		\langle \mathbf{e}_1, \mathbf{e}_2\rangle=\begin{bmatrix}<br />
		  1 &#038; 0<br />
		\end{bmatrix}\begin{bmatrix}<br />
		  1 &#038; 1\\<br />
		  1&#038; 3<br />
		\end{bmatrix}\begin{bmatrix}<br />
		  0 \\<br />
		  1<br />
		\end{bmatrix}=\begin{bmatrix}<br />
		  1 &#038; 0<br />
		\end{bmatrix}\begin{bmatrix}<br />
		  1 \\<br />
		  3<br />
		\end{bmatrix}=1\neq 0.<br />
		\end{align*}</p>
<h3>(c) Find an orthogonal basis using the Gram-Schmidt orthogonalization process.</h3>
<p>By the Gram-Schmidt orthogonalization process, we have<br />
		\begin{align*}<br />
		\mathbf{v}_1&#038;=\mathbf{e}_1\\<br />
		\mathbf{v}_2&#038;=\mathbf{e}_2-\frac{\langle \mathbf{v}_1, \mathbf{e}_2 \rangle}{\langle \mathbf{v}_1, \mathbf{v}_1 \rangle}\mathbf{v}_1<br />
		=\mathbf{e}_2-\frac{\langle \mathbf{e}_1, \mathbf{e}_2 \rangle}{\langle \mathbf{e}_1, \mathbf{e}_1 \rangle}\mathbf{e}_1.<br />
		\end{align*}</p>
<p>		We compute<br />
		\begin{align*}<br />
		\langle \mathbf{e}_1, \mathbf{e}_1 \rangle=\begin{bmatrix}<br />
		  1 &#038; 0<br />
		\end{bmatrix}\begin{bmatrix}<br />
		  1 &#038; 1\\<br />
		  1&#038; 3<br />
		\end{bmatrix}\begin{bmatrix}<br />
		  1 \\<br />
		  0<br />
		\end{bmatrix}=\begin{bmatrix}<br />
		  1 &#038; 0<br />
		\end{bmatrix}\begin{bmatrix}<br />
		  1 \\<br />
		  1<br />
		\end{bmatrix}=1.<br />
		\end{align*}<br />
		We also have $\langle \mathbf{e}_1, \mathbf{e}_2\rangle=1$ from part (b).<br />
		Thus, we have<br />
		\begin{align*}<br />
		\mathbf{v}_2=\mathbf{e}_2-\mathbf{e}_1=\begin{bmatrix}<br />
		  -1 \\<br />
		  1<br />
		\end{bmatrix}.<br />
		\end{align*}<br />
		Hence the Gram-Schmidt orthogonalization process yields the orthogonal basis<br />
		\[\mathbf{v}_1=\begin{bmatrix}<br />
		  1 \\<br />
		  0<br />
		\end{bmatrix}, \mathbf{v}_2=\begin{bmatrix}<br />
		  -1 \\<br />
		  1<br />
		\end{bmatrix}.\]
<h4>Double Check</h4>
<p>		Let us verify that $\mathbf{v}_1, \mathbf{v}_2$ are orthogonal by computing their inner product directly as follows.<br />
		We have<br />
		\begin{align*}<br />
		\langle \mathbf{v}_1, \mathbf{v}_2\rangle=\mathbf{v}_1^{\trans} A\mathbf{v}_2=\begin{bmatrix}<br />
		  1 &#038; 0<br />
		\end{bmatrix}\begin{bmatrix}<br />
		  1 &#038; 1\\<br />
		  1&#038; 3<br />
		\end{bmatrix}\begin{bmatrix}<br />
		  -1 \\<br />
		  1<br />
		\end{bmatrix}=\begin{bmatrix}<br />
		  1 &#038; 0<br />
		\end{bmatrix}\begin{bmatrix}<br />
		  0 \\<br />
		  2<br />
		\end{bmatrix}=0.<br />
		\end{align*}</p>
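All three parts can be double-checked numerically in a few lines. The sketch below (not part of the original solution) verifies positive definiteness via the eigenvalues, the non-orthogonality of $\mathbf{e}_1, \mathbf{e}_2$, and the Gram-Schmidt output.

```python
import numpy as np

# The inner product from the problem: <x, y> = x^T A y.
A = np.array([[1., 1.],
              [1., 3.]])
ip = lambda x, y: x @ A @ y

# (a) A is symmetric with positive eigenvalues, hence positive definite.
assert np.all(np.linalg.eigvalsh(A) > 0)

# (b) e1 and e2 are not orthogonal in this inner product: <e1, e2> = 1.
e1, e2 = np.array([1., 0.]), np.array([0., 1.])
assert np.isclose(ip(e1, e2), 1.0)

# (c) Gram-Schmidt with respect to <.,.> reproduces v1 = e1, v2 = (-1, 1).
v1 = e1
v2 = e2 - (ip(v1, e2) / ip(v1, v1)) * v1
assert np.allclose(v2, [-1., 1.])
assert np.isclose(ip(v1, v2), 0.0)   # the new basis is orthogonal
```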
<button class="simplefavorite-button has-count" data-postid="4653" data-siteid="1" data-groupid="1" data-favoritecount="27" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">27</span></button>The post <a href="https://yutsumura.com/the-inner-product-on-r2-induced-by-a-positive-definite-matrix-and-gram-schmidt-orthogonalization/">The Inner Product on $\R^2$ induced by a Positive Definite Matrix and Gram-Schmidt Orthogonalization</a> first appeared on <a href="https://yutsumura.com">Problems in Mathematics</a>.]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/the-inner-product-on-r2-induced-by-a-positive-definite-matrix-and-gram-schmidt-orthogonalization/feed/</wfw:commentRss>
		<slash:comments>1</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">4653</post-id>	</item>
		<item>
		<title>All Linear Transformations that Take the Line $y=x$ to the Line $y=-x$</title>
		<link>https://yutsumura.com/all-linear-transformations-that-take-the-line-yx-to-the-line-y-x/</link>
				<comments>https://yutsumura.com/all-linear-transformations-that-take-the-line-yx-to-the-line-y-x/#respond</comments>
				<pubDate>Wed, 14 Jun 2017 00:28:30 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[basis]]></category>
		<category><![CDATA[basis of a vector space]]></category>
		<category><![CDATA[line]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[linear transformation]]></category>
		<category><![CDATA[plane]]></category>
		<category><![CDATA[vector space]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=3114</guid>
				<description><![CDATA[<p>Determine all linear transformations of the $2$-dimensional $x$-$y$ plane $\R^2$ that take the line $y=x$ to the line $y=-x$. &#160; Solution. Let $T:\R^2 \to \R^2$ be a linear transformation that maps the line $y=x$&#46;&#46;&#46;</p>
The post <a href="https://yutsumura.com/all-linear-transformations-that-take-the-line-yx-to-the-line-y-x/">All Linear Transformations that Take the Line $y=x$ to the Line $y=-x$</a> first appeared on <a href="https://yutsumura.com">Problems in Mathematics</a>.]]></description>
								<content:encoded><![CDATA[<h2> Problem 454</h2>
<p>	Determine all linear transformations of the $2$-dimensional $x$-$y$ plane $\R^2$ that take the line $y=x$ to the line $y=-x$.</p>
<p>&nbsp;<br />
<span id="more-3114"></span><br />

<h2>Solution.</h2>
<p>		Let $T:\R^2 \to \R^2$ be a linear transformation that maps the line $y=x$ to the line $y=-x$.<br />
		Note that the linear transformation $T$ is completely determined if the values of $T$ on basis vectors of the vector space $\R^2$ are known.</p>
<p>		Let<br />
		\[B=\left\{\,  \begin{bmatrix}<br />
	  1 \\<br />
	  0<br />
	\end{bmatrix}, \begin{bmatrix}<br />
	  1 \\<br />
	  1<br />
	\end{bmatrix} \,\right\}\]
	be a basis of $\R^2$.</p>
<hr />
<p>	The reason for this choice is as follows.<br />
	Since we know that $T$ takes the line $y=x$ to the line $y=-x$, the vector $\begin{bmatrix}<br />
	  1 \\<br />
	  1<br />
	\end{bmatrix}$ is mapped to some point on the line $y=-x$.<br />
	This is how we chose the vector $\begin{bmatrix}<br />
	  1 \\<br />
	  1<br />
	\end{bmatrix}$. The other basis vector could be any vector that is not a multiple of $\begin{bmatrix}<br />
	  1 \\<br />
	  1<br />
	\end{bmatrix}$, and we just chose the simple vector $\begin{bmatrix}<br />
	  1 \\<br />
	  0<br />
	\end{bmatrix}$.</p>
<hr />
<p>	Let<br />
	\[T\left(\,  \begin{bmatrix}<br />
	  1 \\<br />
	  0<br />
	\end{bmatrix} \,\right)=\begin{bmatrix}<br />
	  a \\<br />
	  b<br />
	\end{bmatrix}\]
	for some $a,b \in \R$.</p>
<p>	Since we know that the vector $T\left(\,  \begin{bmatrix}<br />
	  1 \\<br />
	  1<br />
	\end{bmatrix} \,\right)$ is on the line $y=-x$, let<br />
	\[T\left(\,  \begin{bmatrix}<br />
	  1 \\<br />
	  1<br />
	\end{bmatrix} \,\right)=\begin{bmatrix}<br />
	  c \\<br />
	  -c<br />
	\end{bmatrix}\]
	for some $c\in \R$.</p>
<hr />
<p>	We now find a formula for this linear transformation $T$.<br />
	Let $\begin{bmatrix}<br />
	  x \\<br />
	  y<br />
	\end{bmatrix}$ be an arbitrary vector in the plane $\R^2$.</p>
<p>	We express this vector as a linear combination of basis vectors:<br />
	\[\begin{bmatrix}<br />
	  x \\<br />
	  y<br />
	\end{bmatrix}=(x-y)\begin{bmatrix}<br />
	  1 \\<br />
	  0<br />
	\end{bmatrix}+y\begin{bmatrix}<br />
	  1 \\<br />
	  1<br />
	\end{bmatrix}.\]
<p>	Then we have<br />
	\begin{align*}<br />
	&#038;T\left(\,  \begin{bmatrix}<br />
	  x \\<br />
	  y<br />
	\end{bmatrix} \,\right)\\<br />
	&#038;=T\left(\,  (x-y)\begin{bmatrix}<br />
	  1 \\<br />
	  0<br />
	\end{bmatrix}+y\begin{bmatrix}<br />
	  1 \\<br />
	  1<br />
	\end{bmatrix} \,\right)\\<br />
	&#038;=(x-y)T\left(\,  \begin{bmatrix}<br />
	  1 \\<br />
	  0<br />
	\end{bmatrix}\,\right)+yT\left(\,  \begin{bmatrix}<br />
	  1 \\<br />
	  1<br />
	\end{bmatrix} \,\right) &#038;&#038; \text{since $T$ is a linear transformation}\\<br />
	&#038;=(x-y)\begin{bmatrix}<br />
	  a \\<br />
	  b<br />
	\end{bmatrix}+y\begin{bmatrix}<br />
	  c \\<br />
	  -c<br />
	\end{bmatrix}\\<br />
	&#038;=\begin{bmatrix}<br />
	  ax+(c-a)y \\<br />
	  bx-(c+b)y<br />
	\end{bmatrix}.<br />
	\end{align*}</p>
<hr />
<p>	We conclude that any linear transformation $T:\R^2\to \R^2$ that takes the line $y=x$ to the line $y=-x$ is of the form</p>
<div style="padding: 16px; border: none 3px #4169e1; border-radius: 10px; background-color: #f0f8ff; margin-top: 30px; margin-bottom: 30px;">
\[T\left(\,  \begin{bmatrix}<br />
	  x \\<br />
	  y<br />
	\end{bmatrix} \,\right)=\begin{bmatrix}<br />
	  ax+(c-a)y \\<br />
	  bx-(c+b)y<br />
	\end{bmatrix}\]
</div>
<p>	for some $a, b, c\in \R$.</p>
<h2> Remark. </h2>
<p>	Note that if $c=0$, then every point on the line $y=x$ is mapped to the origin, which lies on the line $y=-x$.<br />
	If we want to avoid this degenerate case, we need to assume that $c\neq 0$.</p>
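As a quick sanity check (not part of the original solution), one can verify numerically that the derived family really sends every point of $y=x$ to the line $y=-x$, for assumed sample values of $a$, $b$, $c$:

```python
import numpy as np

# The derived family T(x, y) = (a x + (c - a) y, b x - (c + b) y),
# written as a matrix, for sample parameters (c != 0 avoids the degenerate case).
a, b, c = 2.0, -1.0, 3.0
M = np.array([[a, c - a],
              [b, -(c + b)]])

for t in [-2.0, 0.5, 7.0]:
    image = M @ np.array([t, t])            # a point (t, t) on the line y = x
    assert np.isclose(image[0], -image[1])  # its image lies on the line y = -x
```

Indeed, $M$ sends $(t, t)$ to $(ct, -ct)$, which is on $y=-x$ for every $t$.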
<button class="simplefavorite-button has-count" data-postid="3114" data-siteid="1" data-groupid="1" data-favoritecount="34" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">34</span></button>The post <a href="https://yutsumura.com/all-linear-transformations-that-take-the-line-yx-to-the-line-y-x/">All Linear Transformations that Take the Line $y=x$ to the Line $y=-x$</a> first appeared on <a href="https://yutsumura.com">Problems in Mathematics</a>.]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/all-linear-transformations-that-take-the-line-yx-to-the-line-y-x/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">3114</post-id>	</item>
		<item>
		<title>Dimension of the Sum of Two Subspaces</title>
		<link>https://yutsumura.com/dimension-of-the-sum-of-two-subspaces/</link>
				<comments>https://yutsumura.com/dimension-of-the-sum-of-two-subspaces/#comments</comments>
				<pubDate>Tue, 06 Jun 2017 01:33:51 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[basis]]></category>
		<category><![CDATA[basis for a vector space]]></category>
		<category><![CDATA[basis of a vector space]]></category>
		<category><![CDATA[dimension]]></category>
		<category><![CDATA[dimension of a vector space]]></category>
		<category><![CDATA[finite dimensional vector space]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[subspace]]></category>
		<category><![CDATA[sum of subspaces]]></category>
		<category><![CDATA[vector space]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=3017</guid>
				<description><![CDATA[<p>Let $U$ and $V$ be finite dimensional subspaces in a vector space over a scalar field $K$. Then prove that \[\dim(U+V) \leq \dim(U)+\dim(V).\] &#160; Definition (The sum of subspaces). Recall that the sum of&#46;&#46;&#46;</p>
The post <a href="https://yutsumura.com/dimension-of-the-sum-of-two-subspaces/">Dimension of the Sum of Two Subspaces</a> first appeared on <a href="https://yutsumura.com">Problems in Mathematics</a>.]]></description>
								<content:encoded><![CDATA[<h2> Problem 440</h2>
<p> Let $U$ and $V$ be finite dimensional subspaces in a vector space over a scalar field $K$.<br />
	Then prove that<br />
	\[\dim(U+V) \leq \dim(U)+\dim(V).\]
<p>&nbsp;<br />
<span id="more-3017"></span><br />

<h2>Definition (The sum of subspaces).</h2>
<p>Recall that the sum of subspaces $U$ and $V$ is<br />
\[U+V=\{\mathbf{x}+\mathbf{y} \mid \mathbf{x}\in U, \mathbf{y}\in V\}.\]
The sum $U+V$ is a subspace.<br />
(See the post &#8220;<a href="//yutsumura.com/the-sum-of-subspaces-is-a-subspace-of-a-vector-space/" target="_blank">The sum of subspaces is a subspace of a vector space</a>&#8221; for a proof.)</p>
<h2> Proof. </h2>
<p>		Let $n=\dim(U)$ and $m=\dim(V)$.<br />
		Let<br />
\[B_1=\{\mathbf{u}_1, \dots, \mathbf{u}_n\}\]
 be a basis of the vector space $U$ and let<br />
		\[B_2=\{\mathbf{v}_1, \dots, \mathbf{v}_m\}\]
		be a basis of the vector space $V$.</p>
<p>		An arbitrary element of the vector space $U+V$ is of the form $\mathbf{x}+\mathbf{y}$, where $\mathbf{x}\in U$ and $\mathbf{y} \in V$.</p>
<hr />
<p>		Since $B_1$ is a basis of $U$, we can write<br />
		\[\mathbf{x}=r_1\mathbf{u}_1+\cdots +r_n \mathbf{u}_n\]
		for some scalars $r_1, \dots, r_n\in K$.<br />
		Also, since $B_2$ is a basis of $V$, we can write<br />
		\[\mathbf{y}=s_1\mathbf{v}_1+\cdots +s_m \mathbf{v}_m\]
		for some scalars $s_1, \dots, s_m\in K$.</p>
<hr />
<p>		Thus, we have<br />
		\begin{align*}<br />
	\mathbf{x}+\mathbf{y}&#038;=r_1\mathbf{u}_1+\cdots +r_n \mathbf{u}_n+s_1\mathbf{v}_1+\cdots +s_m \mathbf{v}_m,<br />
	\end{align*}<br />
	and hence $\mathbf{x}+\mathbf{y}$ is in the span $S:=\Span(\mathbf{u}_1, \dots, \mathbf{u}_n, \mathbf{v}_1, \dots, \mathbf{v}_m)$.</p>
<p>	Thus we have $U+V \subset S$, and it follows that \begin{align*}<br />
	\dim(U+V) \leq \dim(S)\leq n+m=\dim(U)+\dim(V).<br />
	\end{align*}<br />
	This completes the proof.</p>
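The inequality can also be illustrated numerically. The sketch below (an assumed example, not from the post) represents subspaces of $\R^4$ by spanning vectors; $\dim(U+V)$ is then the rank of the matrix whose rows are all the spanning vectors.

```python
# Illustrative check of dim(U+V) <= dim(U) + dim(V) in R^4.
# The spanning vectors are arbitrary choices; U and V deliberately overlap.
import numpy as np

U_basis = np.array([[1, 0, 0, 0], [0, 1, 0, 0]])   # dim U = 2
V_basis = np.array([[0, 1, 0, 0], [0, 0, 1, 0]])   # dim V = 2, shares e_2 with U

dim_U = np.linalg.matrix_rank(U_basis)
dim_V = np.linalg.matrix_rank(V_basis)
dim_sum = np.linalg.matrix_rank(np.vstack([U_basis, V_basis]))  # rank of combined spanning set

assert dim_sum <= dim_U + dim_V   # here 3 <= 4: strict because U and V overlap
```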
<h2> Related Question. </h2>
<div style="padding: 16px; border: none 3px #4169e1; border-radius: 10px; background-color: #f0f8ff; margin-top: 30px; margin-bottom: 30px;">Let $A$ and $B$ be $m\times n$ matrices.<br />
	Prove that<br />
	\[\rk(A+B) \leq \rk(A)+\rk(B).\]</div>
<p>See the post &#8628;<br />
<a href="//yutsumura.com/the-rank-of-the-sum-of-two-matrices/" target="_blank">The rank of the sum of two matrices</a><br />
for a proof of this problem.</p>
<button class="simplefavorite-button has-count" data-postid="3017" data-siteid="1" data-groupid="1" data-favoritecount="44" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">44</span></button>The post <a href="https://yutsumura.com/dimension-of-the-sum-of-two-subspaces/">Dimension of the Sum of Two Subspaces</a> first appeared on <a href="https://yutsumura.com">Problems in Mathematics</a>.]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/dimension-of-the-sum-of-two-subspaces/feed/</wfw:commentRss>
		<slash:comments>2</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">3017</post-id>	</item>
		<item>
		<title>Powers of a Matrix Cannot be a Basis of the Vector Space of Matrices</title>
		<link>https://yutsumura.com/powers-of-a-matrix-cannot-be-a-basis-of-the-vector-space-of-matrices/</link>
				<comments>https://yutsumura.com/powers-of-a-matrix-cannot-be-a-basis-of-the-vector-space-of-matrices/#respond</comments>
				<pubDate>Tue, 11 Apr 2017 19:51:24 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[basis]]></category>
		<category><![CDATA[basis of a vector space]]></category>
		<category><![CDATA[Cayley-Hamilton theorem]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[linearly dependent]]></category>
		<category><![CDATA[vector space]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=2667</guid>
				<description><![CDATA[<p>Let $n>1$ be a positive integer. Let $V=M_{n\times n}(\C)$ be the vector space over the complex numbers $\C$ consisting of all complex $n\times n$ matrices. The dimension of $V$ is $n^2$. Let $A \in&#46;&#46;&#46;</p>
The post <a href="https://yutsumura.com/powers-of-a-matrix-cannot-be-a-basis-of-the-vector-space-of-matrices/">Powers of a Matrix Cannot be a Basis of the Vector Space of Matrices</a> first appeared on <a href="https://yutsumura.com">Problems in Mathematics</a>.]]></description>
								<content:encoded><![CDATA[<h2> Problem 375</h2>
<p> Let $n>1$ be a positive integer. Let $V=M_{n\times n}(\C)$ be the vector space over the complex numbers $\C$ consisting of all complex $n\times n$ matrices. The dimension of $V$ is $n^2$.<br />
	 Let $A \in V$ and consider the set<br />
	 \[S_A=\{I=A^0, A, A^2, \dots, A^{n^2-1}\}\]
	 of $n^2$ elements.<br />
	 Prove that the set $S_A$ cannot be a basis of the vector space $V$ for any $A\in V$.</p>
<p>&nbsp;<br />
<span id="more-2667"></span></p>
<h2> Proof. </h2>
<p>	 	We prove that the set $S_A$ is linearly dependent, hence it cannot be a basis of $V$.<br />
	 	Since $A$ is an $n\times n$ matrix, its characteristic polynomial $p(t)=\det(tI-A)$ is a degree $n$ polynomial.</p>
<p>(Your preferred definition of the characteristic polynomial might be $\det(A-tI)$. It is straightforward to modify the following proof with this definition.)</p>
<hr />
<p>	 	Let us write it as<br />
	 	\[p(t)=t^n+a_{n-1}t^{n-1}+\cdots+a_1t+a_0.\]
	 	Then the Cayley-Hamilton theorem states that<br />
	 	\[p(A)=A^n+a_{n-1}A^{n-1}+\cdots+a_1A+a_0I=O\]
	 	is the zero matrix.</p>
<p>	 	Since the coefficient of $A^n$ is $1$, this is a nontrivial linear combination of $I, A, \dots, A^n$ that equals the zero matrix. Therefore the set<br />
	 	\[T:=\{I, A, \dots, A^n\}\]
	 	is linearly dependent.</p>
<p>	 	As $T$ is a subset of $S_A$, the set $S_A$ is also linearly dependent.<br />
	 	Therefore, $S_A$ is not a basis of $V$. This completes the proof.</p>
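The Cayley-Hamilton relation at the heart of the proof can be observed numerically. The sketch below (using numpy; the random matrix is an arbitrary example) evaluates the characteristic polynomial at $A$ by Horner's method and checks that the result is the zero matrix up to rounding error.

```python
# Numerical illustration of the Cayley-Hamilton theorem for a random 3x3 matrix.
import numpy as np

rng = np.random.default_rng(0)
n = 3
A = rng.standard_normal((n, n))

# Coefficients of det(tI - A), leading coefficient first (coeffs[0] == 1).
coeffs = np.poly(A)

# Evaluate p(A) = A^n + a_{n-1} A^{n-1} + ... + a_1 A + a_0 I via Horner's method.
pA = np.zeros((n, n))
for c in coeffs:
    pA = pA @ A + c * np.eye(n)

# p(A) vanishes, so {I, A, ..., A^n} is linearly dependent (and so is S_A).
assert np.allclose(pA, np.zeros((n, n)), atol=1e-8)
```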
<button class="simplefavorite-button has-count" data-postid="2667" data-siteid="1" data-groupid="1" data-favoritecount="33" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">33</span></button>The post <a href="https://yutsumura.com/powers-of-a-matrix-cannot-be-a-basis-of-the-vector-space-of-matrices/">Powers of a Matrix Cannot be a Basis of the Vector Space of Matrices</a> first appeared on <a href="https://yutsumura.com">Problems in Mathematics</a>.]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/powers-of-a-matrix-cannot-be-a-basis-of-the-vector-space-of-matrices/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">2667</post-id>	</item>
		<item>
		<title>Coordinate Vectors and Dimension of Subspaces (Span)</title>
		<link>https://yutsumura.com/coordinate-vectors-and-dimension-of-subspaces-span/</link>
				<comments>https://yutsumura.com/coordinate-vectors-and-dimension-of-subspaces-span/#respond</comments>
				<pubDate>Fri, 24 Mar 2017 03:48:48 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[basis]]></category>
		<category><![CDATA[basis of a vector space]]></category>
		<category><![CDATA[coordinate vectors]]></category>
		<category><![CDATA[leading 1 method]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[linearly independent]]></category>
		<category><![CDATA[span]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=2512</guid>
				<description><![CDATA[<p>Let $V$ be a vector space over $\R$ and let $B$ be a basis of $V$. Let $S=\{v_1, v_2, v_3\}$ be a set of vectors in $V$. If the coordinate vectors of these vectors&#46;&#46;&#46;</p>
The post <a href="https://yutsumura.com/coordinate-vectors-and-dimension-of-subspaces-span/">Coordinate Vectors and Dimension of Subspaces (Span)</a> first appeared on <a href="https://yutsumura.com">Problems in Mathematics</a>.]]></description>
								<content:encoded><![CDATA[<h2> Problem 350</h2>
<p>	Let $V$ be a vector space over $\R$ and let $B$ be a basis of $V$.<br />
		Let $S=\{v_1, v_2, v_3\}$ be a set of vectors in $V$. If the coordinate vectors of these vectors with respect to the basis $B$ are given as follows, then find the dimension of $V$ and the dimension of the span of $S$.<br />
		\[[v_1]_B=\begin{bmatrix}<br />
	  1 \\<br />
	   0 \\<br />
	    0 \\<br />
	   0<br />
	   \end{bmatrix}, [v_2]_B=\begin{bmatrix}<br />
	  0 \\<br />
	   1 \\<br />
	    0 \\<br />
	   0<br />
	   \end{bmatrix}, [v_3]_B=\begin{bmatrix}<br />
	  1 \\<br />
	   1 \\<br />
	    0 \\<br />
	   0<br />
	   \end{bmatrix}.\]
<p>&nbsp;<br />
<span id="more-2512"></span><br />

<h2>Solution.</h2>
<h3>Coordinate vectors</h3>
<p>	Let us first recall the definition of coordinate vectors.<br />
	Suppose that $V$ is a vector space over $\R$ and let $B=\{v_1, v_2, \dots, v_n\}$ be a basis of $V$ (hence the dimension of $V$ is $n$).</p>
<p>	Then any element of $v$ of $V$ can be written as<br />
	\[v=c_1v_1+c_2v_2+\cdots +c_n v_n,\]
	where $c_1, c_2, \dots, c_n$ are scalars in $\R$.</p>
<p>	This expression is unique and the coordinate vector of $v$ with respect to the basis $B$ is defined as<br />
	\[[v]_B=\begin{bmatrix}<br />
	  c_1 \\<br />
	   c_2 \\<br />
	    \vdots \\<br />
	   c_n<br />
	   \end{bmatrix}.\]
	   Thus the coordinate vector $[v]_B$ is just a usual $n$-dimensional vector.</p>
<h3>Main part of the solution</h3>
<p>	   Note that in the current problem, the coordinate vectors are $4$-dimensional vectors.<br />
	   This implies that the basis $B$ consists of four vectors. Hence the dimension of $V$ is $4$.</p>
<p>	   By the correspondence of the coordinate vectors, the dimension of $\Span(S)$ is the same as the dimension of $\Span(T)$, where<br />
	   \begin{align*}<br />
	T&#038;=\{[v_1]_B, [v_2]_B, [v_3]_B\}\\<br />
	&#038;=\left\{\, \begin{bmatrix}<br />
	  1 \\<br />
	   0 \\<br />
	    0 \\<br />
	   0<br />
	   \end{bmatrix}, \begin{bmatrix}<br />
	  0 \\<br />
	   1 \\<br />
	    0 \\<br />
	   0<br />
	   \end{bmatrix}, \begin{bmatrix}<br />
	  1 \\<br />
	   1 \\<br />
	    0 \\<br />
	   0<br />
	   \end{bmatrix}  \,\right\}.<br />
	\end{align*}</p>
<p>	To find the dimension of $\Span(T)$, we need to find a basis of $\Span(T)$.<br />
	One way to do this is to note that the third vector is the sum of the first two vectors. Also, it&#8217;s clear that the first two vectors are linearly independent.</p>
<p>	Thus, the set<br />
	\begin{align*}<br />
	\left\{\, \begin{bmatrix}<br />
	  1 \\<br />
	   0 \\<br />
	    0 \\<br />
	   0<br />
	   \end{bmatrix}, \begin{bmatrix}<br />
	  0 \\<br />
	   1 \\<br />
	    0 \\<br />
	   0<br />
	   \end{bmatrix} \,\right\}<br />
	\end{align*}<br />
	is a basis of $\Span(T)$, hence the dimension of $\Span(T)$ is $2$.<br />
	We conclude that the dimension of $\Span(S)$ is $2$ as well.<br />
(We can also conclude that the set $\{v_1, v_2\}$ is a basis of $\Span(S)$.)</p>
<h3>Another way to find a basis of $\Span(T)$</h3>
<p>	Here is another way to find a basis of $\Span(T)$. We can use the leading 1 method.<br />
	We form the matrix whose column vectors are the vectors in $T$:<br />
	\begin{align*}<br />
	\begin{bmatrix}<br />
	  1 &#038; 0 &#038; 1 \\<br />
	   0 &#038;1 &#038;1 \\<br />
	   0 &#038; 0 &#038; 0 \\<br />
	   0 &#038; 0 &#038; 0<br />
	\end{bmatrix}<br />
	\end{align*}</p>
<p>	Note that this matrix is already in reduced row echelon form.<br />
	The first two columns contain the leading 1&#8217;s. Thus the first two columns form a basis of $\Span(T)$ (this is the leading 1 method).</p>
<p>Remark: In general we apply elementary row operations to reduce the matrix into a matrix in reduced row echelon form.</p>
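The leading 1 computation above can be cross-checked numerically (a sketch using numpy's rank in place of explicit row reduction): the dimension of $\Span(T)$ is the rank of the matrix whose columns are the coordinate vectors.

```python
# Rank of the matrix whose columns are the coordinate vectors from the problem.
import numpy as np

M = np.array([[1, 0, 1],
              [0, 1, 1],
              [0, 0, 0],
              [0, 0, 0]])

# The rank equals the number of leading 1's in the reduced row echelon form.
assert np.linalg.matrix_rank(M) == 2   # dim Span(T) = dim Span(S) = 2
```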
<button class="simplefavorite-button has-count" data-postid="2512" data-siteid="1" data-groupid="1" data-favoritecount="33" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">33</span></button>The post <a href="https://yutsumura.com/coordinate-vectors-and-dimension-of-subspaces-span/">Coordinate Vectors and Dimension of Subspaces (Span)</a> first appeared on <a href="https://yutsumura.com">Problems in Mathematics</a>.]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/coordinate-vectors-and-dimension-of-subspaces-span/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">2512</post-id>	</item>
		<item>
		<title>The Subset Consisting of the Zero Vector is a Subspace and its Dimension is Zero</title>
		<link>https://yutsumura.com/the-subset-consisting-of-the-zero-vector-is-a-subspace-and-its-dimension-is-zero/</link>
				<comments>https://yutsumura.com/the-subset-consisting-of-the-zero-vector-is-a-subspace-and-its-dimension-is-zero/#respond</comments>
				<pubDate>Sat, 11 Feb 2017 23:39:26 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[basis]]></category>
		<category><![CDATA[basis for a vector space]]></category>
		<category><![CDATA[basis of a vector space]]></category>
		<category><![CDATA[dimension]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[linear combination]]></category>
		<category><![CDATA[linearly dependent]]></category>
		<category><![CDATA[linearly independent]]></category>
		<category><![CDATA[spanning set]]></category>
		<category><![CDATA[subspace]]></category>
		<category><![CDATA[subspace criteria]]></category>
		<category><![CDATA[vector]]></category>
		<category><![CDATA[vector space]]></category>
		<category><![CDATA[zero vector]]></category>
		<category><![CDATA[zero vector space]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=2155</guid>
				<description><![CDATA[<p>Let $V$ be a subset of the vector space $\R^n$ consisting only of the zero vector of $\R^n$. Namely $V=\{\mathbf{0}\}$. Then prove that $V$ is a subspace of $\R^n$. &#160; Proof. To prove that&#46;&#46;&#46;</p>
The post <a href="https://yutsumura.com/the-subset-consisting-of-the-zero-vector-is-a-subspace-and-its-dimension-is-zero/">The Subset Consisting of the Zero Vector is a Subspace and its Dimension is Zero</a> first appeared on <a href="https://yutsumura.com">Problems in Mathematics</a>.]]></description>
								<content:encoded><![CDATA[<h2> Problem 292</h2>
<p> Let $V$ be a subset of the vector space $\R^n$ consisting only of the zero vector of $\R^n$. Namely $V=\{\mathbf{0}\}$.<br />
 Then prove that $V$ is a subspace of $\R^n$.</p>
<p>&nbsp;<br />
<span id="more-2155"></span><br />

<h2> Proof. </h2>
<p>		To prove that $V=\{\mathbf{0}\}$ is a subspace of $\R^n$, we check the following subspace criteria.</p>
<div style="padding: 16px; border: none 3px #4169e1; border-radius: 10px; background-color: #f0f8ff; margin-top: 30px; margin-bottom: 30px;">
<strong>Subspace Criteria</strong><br />
(a) The zero vector $\mathbf{0} \in \R^n$ is in $V$.<br />
(b) If $\mathbf{x}, \mathbf{y} \in V$, then $\mathbf{x}+\mathbf{y}\in V$.<br />
(c) If $\mathbf{x} \in V$ and $c\in \R$, then $c\mathbf{x} \in V$.
</div>
<p>		 Condition (a) is clear since $V$ consists of the zero vector $\mathbf{0}$.</p>
<p>		 To check condition (b), note that the only element in $V=\{\mathbf{0}\}$ is $\mathbf{0}$. Thus if $\mathbf{x}, \mathbf{y} \in V$, then both $\mathbf{x}, \mathbf{y}$ are $\mathbf{0}$. Hence<br />
	\[\mathbf{x}+\mathbf{y} =\mathbf{0}+\mathbf{0}=\mathbf{0}\in V\]
	and condition (b) is met.</p>
<p>	To confirm condition (c), let $\mathbf{x}\in V$ and $c\in \R$. Then $\mathbf{x}=\mathbf{0}$.<br />
	We have<br />
	\[c\mathbf{x}=c\mathbf{0}=\mathbf{0}\in V\]
	and condition (c) is satisfied.</p>
<p>	We have checked all the subspace criteria, and hence the subset $V=\{\mathbf{0}\}$ consisting only of the zero vector is a subspace of $\R^n$.</p>
<h2>What&#8217;s the dimension of the zero vector space? </h2>
<p>What&#8217;s the dimension of the subspace $V=\{\mathbf{0}\}$?</p>
<p>	The dimension of a subspace is the number of vectors in a basis. So let us first find a basis of $V$.</p>
<p>	Note that a basis of $V$ is a linearly independent set of vectors in $V$ that spans $V$. Since $\mathbf{0}$ is the only vector in $V$, the set $S=\{\mathbf{0}\}$ is the only candidate for a basis. </p>
<p>However, $S$ is not a linearly independent set since, for example, we have a nontrivial linear combination $1\cdot \mathbf{0}=\mathbf{0}$.</p>
<p>	Therefore, the subspace $V=\{\mathbf{0}\}$ does not have a basis.<br />
	Hence the dimension of $V$ is zero.</p>
<button class="simplefavorite-button has-count" data-postid="2155" data-siteid="1" data-groupid="1" data-favoritecount="34" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">34</span></button>The post <a href="https://yutsumura.com/the-subset-consisting-of-the-zero-vector-is-a-subspace-and-its-dimension-is-zero/">The Subset Consisting of the Zero Vector is a Subspace and its Dimension is Zero</a> first appeared on <a href="https://yutsumura.com">Problems in Mathematics</a>.]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/the-subset-consisting-of-the-zero-vector-is-a-subspace-and-its-dimension-is-zero/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">2155</post-id>	</item>
		<item>
		<title>Basis For Subspace Consisting of Matrices Commute With a Given Diagonal Matrix</title>
		<link>https://yutsumura.com/basis-for-subspace-consisting-of-matrices-commute-with-a-given-diagonal-matrix/</link>
				<comments>https://yutsumura.com/basis-for-subspace-consisting-of-matrices-commute-with-a-given-diagonal-matrix/#respond</comments>
				<pubDate>Tue, 07 Feb 2017 02:19:39 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[basis]]></category>
		<category><![CDATA[basis for a vector space]]></category>
		<category><![CDATA[basis of a vector space]]></category>
		<category><![CDATA[diagonal matrix]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[matrix]]></category>
		<category><![CDATA[matrix multiplication]]></category>
		<category><![CDATA[matrix product]]></category>
		<category><![CDATA[subspace]]></category>
		<category><![CDATA[vector space]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=2122</guid>
				<description><![CDATA[<p>Let $V$ be the vector space of all $3\times 3$ real matrices. Let $A$ be the matrix given below and we define \[W=\{M\in V \mid AM=MA\}.\] That is, $W$ consists of matrices that commute&#46;&#46;&#46;</p>
The post <a href="https://yutsumura.com/basis-for-subspace-consisting-of-matrices-commute-with-a-given-diagonal-matrix/">Basis For Subspace Consisting of Matrices Commute With a Given Diagonal Matrix</a> first appeared on <a href="https://yutsumura.com">Problems in Mathematics</a>.]]></description>
								<content:encoded><![CDATA[<h2> Problem 287</h2>
<p>Let $V$ be the vector space of all $3\times 3$ real matrices.<br />
Let $A$ be the matrix given below and we define<br />
\[W=\{M\in V \mid AM=MA\}.\]
That is, $W$ consists of matrices that commute with $A$.<br />
Then $W$ is a subspace of $V$. </p>
<p>Determine which matrices are in the subspace $W$ and find the dimension of $W$.</p>
<p><strong>(a)</strong> \[A=\begin{bmatrix}<br />
  a &#038; 0 &#038; 0 \\<br />
   0 &#038;b &#038;0 \\<br />
   0 &#038; 0 &#038; c<br />
\end{bmatrix},\]
where $a, b, c$ are distinct real numbers.</p>
<p><strong>(b)</strong> \[A=\begin{bmatrix}<br />
  a &#038; 0 &#038; 0 \\<br />
   0 &#038;a &#038;0 \\<br />
   0 &#038; 0 &#038; b<br />
\end{bmatrix},\]
where $a, b$ are distinct real numbers.</p>
<p>&nbsp;<br />
<span id="more-2122"></span><br />

<h2> Solution. </h2>
<h3>(a) Diagonal matrix with distinct diagonal entries</h3>
<p> Let us first determine when a matrix $M$ commutes with $A$.<br />
	Let<br />
	\[M=\begin{bmatrix}<br />
	a_{1 1} &#038; a_{1 2} &#038; a_{1 3} \\<br />
	a_{2 1} &#038; a_{2 2} &#038; a_{2 3} \\<br />
	a_{3 1} &#038; a_{3 2} &#038; a_{3 3}<br />
	\end{bmatrix}\]
	and suppose that $AM=MA$:<br />
	\[\begin{bmatrix}<br />
  a &#038; 0 &#038; 0 \\<br />
   0 &#038; b &#038;0 \\<br />
   0 &#038; 0 &#038; c<br />
\end{bmatrix}<br />
\begin{bmatrix}<br />
	a_{1 1} &#038; a_{1 2} &#038; a_{1 3} \\<br />
	a_{2 1} &#038; a_{2 2} &#038; a_{2 3} \\<br />
	a_{3 1} &#038; a_{3 2} &#038; a_{3 3}<br />
	\end{bmatrix}<br />
	=<br />
	\begin{bmatrix}<br />
	a_{1 1} &#038; a_{1 2} &#038; a_{1 3} \\<br />
	a_{2 1} &#038; a_{2 2} &#038; a_{2 3} \\<br />
	a_{3 1} &#038; a_{3 2} &#038; a_{3 3}<br />
	\end{bmatrix}<br />
\begin{bmatrix}<br />
  a &#038; 0 &#038; 0 \\<br />
   0 &#038;b &#038;0 \\<br />
   0 &#038; 0 &#038; c<br />
\end{bmatrix}.\]
Computing matrix products, we obtain<br />
\[\begin{bmatrix}<br />
	aa_{1 1} &#038; aa_{1 2} &#038; aa_{1 3} \\<br />
	ba_{2 1} &#038; ba_{2 2} &#038; ba_{2 3} \\<br />
	ca_{3 1} &#038; ca_{3 2} &#038;c a_{3 3}<br />
	\end{bmatrix}<br />
	=<br />
	\begin{bmatrix}<br />
	a_{1 1}a &#038; a_{1 2}b &#038; a_{1 3}c \\<br />
	a_{2 1}a &#038; a_{2 2}b &#038; a_{2 3}c\\<br />
	a_{3 1}a &#038; a_{3 2}b &#038; a_{3 3}c<br />
	\end{bmatrix}. \tag{*}\]
	Comparing the $(1,2)$ entries, we obtain $aa_{1 2}=ba_{1 2}$.<br />
	Since $a\neq b$, we must have $a_{1 2}=0$.</p>
<p>	Similarly, comparing the remaining off-diagonal entries and noting that $a, b, c$ are distinct, we find that every off-diagonal entry $a_{i j}$, $i\neq j$, must be $0$.</p>
<p>	Thus, $M$ commutes with $A$ if and only if<br />
	\[M=\begin{bmatrix}<br />
	a_{1 1} &#038; 0 &#038; 0 \\<br />
	0 &#038; a_{2 2} &#038; 0 \\<br />
	0 &#038; 0 &#038; a_{3 3}<br />
	\end{bmatrix}.\]
<p>	Therefore, the subspace $W$ consists of all $3\times 3$ diagonal matrices:<br />
	\[W=\{M\in V\mid M \text{ is diagonal}\}.\]
	Then it is easy to see that the set $\{E_{1 1}, E_{2 2}, E_{3 3}\}$ is a basis for $W$, where $E_{i j}$ is the $3\times 3$ matrix whose $(i,j)$-entry is $1$ and the other entries are zero. Thus the dimension of $W$ is $3$.</p>
<h3>(b) Diagonal matrix with two equal diagonal entries</h3>
<p> Now consider the case<br />
	 \[A=\begin{bmatrix}<br />
  a &#038; 0 &#038; 0 \\<br />
   0 &#038;a &#038;0 \\<br />
   0 &#038; 0 &#038; b<br />
\end{bmatrix}.\]
Let<br />
\[M=\begin{bmatrix}<br />
	a_{1 1} &#038; a_{1 2} &#038; a_{1 3} \\<br />
	a_{2 1} &#038; a_{2 2} &#038; a_{2 3} \\<br />
	a_{3 1} &#038; a_{3 2} &#038; a_{3 3}<br />
	\end{bmatrix}\]
	and compare $AM$ with $MA$ as in part (a) (equivalently, replace $b, c$ in (*) by $a, b$, respectively) to obtain<br />
	\[\begin{bmatrix}<br />
	aa_{1 1} &#038; aa_{1 2} &#038; aa_{1 3} \\<br />
	aa_{2 1} &#038; aa_{2 2} &#038; aa_{2 3} \\<br />
	ba_{3 1} &#038; ba_{3 2} &#038;  ba_{3 3}<br />
	\end{bmatrix}<br />
	=<br />
	\begin{bmatrix}<br />
	a_{1 1}a &#038; a_{1 2}a &#038; a_{1 3}b \\<br />
	a_{2 1}a &#038; a_{2 2}a &#038; a_{2 3}b\\<br />
	a_{3 1}a &#038; a_{3 2}a &#038; a_{3 3}b<br />
	\end{bmatrix}. \]
	Comparing entries and noting $a\neq b$, we have<br />
	\[a_{1 3}=0, a_{2 3}=0, a_{3 1}=0, a_{3 2}=0.\]
<p>Thus, $M$ commutes with $A$ if and only if<br />
\[M=\begin{bmatrix}<br />
	a_{1 1} &#038; a_{1 2} &#038; 0 \\<br />
	a_{2 1} &#038; a_{2 2} &#038; 0 \\<br />
	0 &#038; 0 &#038; a_{3 3}<br />
	\end{bmatrix},\]
and hence the subspace $W$ consists of such matrices.<br />
From this, we see that the set<br />
\[\{E_{1 1}, E_{1 2}, E_{2 1}, E_{2 2}, E_{3 3}\}\]
is a basis for $W$, and we conclude that the dimension of $W$ is $5$.</p>
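The dimensions found in (a) and (b) can be cross-checked numerically (a sketch, not part of the original post): $\dim W$ equals the nullity of the linear map $M \mapsto AM - MA$ on $3\times 3$ matrices, which can be represented as a $9\times 9$ matrix using Kronecker products.

```python
# Commutant dimension via the nullity of M |-> AM - MA.
# With row-major vectorization, vec(AM - MA) = (kron(A, I) - kron(I, A^T)) vec(M).
import numpy as np

def commutant_dim(A):
    n = A.shape[0]
    L = np.kron(A, np.eye(n)) - np.kron(np.eye(n), A.T)
    return n * n - np.linalg.matrix_rank(L)   # nullity = dim of the commutant

A1 = np.diag([1.0, 2.0, 3.0])   # case (a): distinct diagonal entries
A2 = np.diag([1.0, 1.0, 2.0])   # case (b): a repeated diagonal entry

assert commutant_dim(A1) == 3   # diagonal matrices only
assert commutant_dim(A2) == 5   # 2x2 block plus one diagonal entry
```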
<button class="simplefavorite-button has-count" data-postid="2122" data-siteid="1" data-groupid="1" data-favoritecount="20" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">20</span></button>The post <a href="https://yutsumura.com/basis-for-subspace-consisting-of-matrices-commute-with-a-given-diagonal-matrix/">Basis For Subspace Consisting of Matrices Commute With a Given Diagonal Matrix</a> first appeared on <a href="https://yutsumura.com">Problems in Mathematics</a>.]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/basis-for-subspace-consisting-of-matrices-commute-with-a-given-diagonal-matrix/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">2122</post-id>	</item>
	</channel>
</rss>
