<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	
	xmlns:georss="http://www.georss.org/georss"
	xmlns:geo="http://www.w3.org/2003/01/geo/wgs84_pos#"
	>

<channel>
	<title>counterexample &#8211; Problems in Mathematics</title>
	<atom:link href="https://yutsumura.com/tag/counterexample/feed/" rel="self" type="application/rss+xml" />
	<link>https://yutsumura.com</link>
	<description></description>
	<lastBuildDate>Thu, 28 Dec 2017 03:50:51 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=5.3.6</generator>

<image>
	<url>https://i2.wp.com/yutsumura.com/wp-content/uploads/2016/12/cropped-question-logo.jpg?fit=32%2C32&#038;ssl=1</url>
	<title>counterexample &#8211; Problems in Mathematics</title>
	<link>https://yutsumura.com</link>
	<width>32</width>
	<height>32</height>
</image> 
<site xmlns="com-wordpress:feed-additions:1">114989322</site>	<item>
		<title>Prove that the Dot Product is Commutative: $\mathbf{v}\cdot \mathbf{w}= \mathbf{w} \cdot \mathbf{v}$</title>
		<link>https://yutsumura.com/prove-that-the-dot-product-is-commutative-mathbfvcdot-mathbfw-mathbfw-cdot-mathbfv/</link>
				<comments>https://yutsumura.com/prove-that-the-dot-product-is-commutative-mathbfvcdot-mathbfw-mathbfw-cdot-mathbfv/#respond</comments>
				<pubDate>Mon, 25 Dec 2017 00:39:57 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[counterexample]]></category>
		<category><![CDATA[dot product]]></category>
		<category><![CDATA[inner product]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[transpose]]></category>
		<category><![CDATA[transpose of a matrix]]></category>
		<category><![CDATA[vector]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=6285</guid>
				<description><![CDATA[<p>Let $\mathbf{v}$ and $\mathbf{w}$ be two $n \times 1$ column vectors. (a) Prove that $\mathbf{v}^\trans \mathbf{w} = \mathbf{w}^\trans \mathbf{v}$. (b) Provide an example to show that $\mathbf{v} \mathbf{w}^\trans$ is not always equal to $\mathbf{w}&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/prove-that-the-dot-product-is-commutative-mathbfvcdot-mathbfw-mathbfw-cdot-mathbfv/" target="_blank">Prove that the Dot Product is Commutative: $\mathbf{v}\cdot \mathbf{w}= \mathbf{w} \cdot \mathbf{v}$</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 637</h2>
<p>Let $\mathbf{v}$ and $\mathbf{w}$ be two $n \times 1$ column vectors. </p>
<p><strong>(a)</strong> Prove that $\mathbf{v}^\trans \mathbf{w} = \mathbf{w}^\trans \mathbf{v}$.  </p>
<p><strong>(b)</strong> Provide an example to show that $\mathbf{v} \mathbf{w}^\trans$ is not always equal to $\mathbf{w} \mathbf{v}^\trans$.</p>
<p>&nbsp;<br />
<span id="more-6285"></span><br />

<h2>Solution.</h2>
<h3>(a) Prove that $\mathbf{v}^\trans \mathbf{w} = \mathbf{w}^\trans \mathbf{v}$. </h3>
<p>	Suppose the vectors have components<br />
	\[\mathbf{v} = \begin{bmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{bmatrix} \, \mbox{  and  }  \mathbf{w} = \begin{bmatrix} w_1 \\ w_2 \\ \vdots \\ w_n \end{bmatrix}.\]
	Then,<br />
	\[\mathbf{v}^\trans \mathbf{w} = \begin{bmatrix} v_1 &#038; v_2 &#038; \cdots &#038; v_n \end{bmatrix} \begin{bmatrix} w_1 \\ w_2 \\ \vdots \\ w_n \end{bmatrix} = \sum_{i=1}^n v_i w_i,\]
	while<br />
	\[\mathbf{w}^\trans \mathbf{v} = \begin{bmatrix} w_1 &#038; w_2 &#038; \cdots &#038; w_n \end{bmatrix} \begin{bmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{bmatrix} = \sum_{i=1}^n w_i v_i.\]
	The two sums are equal since $v_i w_i = w_i v_i$ for each $i$. </p>
<h3>(b) Provide an example to show that $\mathbf{v} \mathbf{w}^\trans$ is not always equal to $\mathbf{w} \mathbf{v}^\trans$.</h3>
<p>For the counterexample, let $\mathbf{v} = \begin{bmatrix} 1 \\ 0 \end{bmatrix}$ and $\mathbf{w} = \begin{bmatrix} 0 \\ 1 \end{bmatrix}$.  Then<br />
	\[\mathbf{v} \mathbf{w}^\trans =  \begin{bmatrix} 1 \\ 0 \end{bmatrix}  \begin{bmatrix} 0 &#038; 1 \end{bmatrix} = \begin{bmatrix} 0 &#038; 1 \\ 0 &#038; 0 \end{bmatrix}\] while<br />
	\[\quad \mathbf{w} \mathbf{v}^\trans =  \begin{bmatrix} 0 \\ 1 \end{bmatrix}  \begin{bmatrix} 1 &#038; 0 \end{bmatrix} = \begin{bmatrix} 0 &#038; 0 \\ 1 &#038; 0 \end{bmatrix}.\]
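<p>Both parts can also be verified numerically. Below is a minimal sketch in plain Python, using the vectors from the counterexample (the helper functions <code>dot</code> and <code>outer</code> are ours, written just for this check):</p>

```python
def dot(v, w):
    # v^T w: the dot product of two column vectors stored as lists
    return sum(vi * wi for vi, wi in zip(v, w))

def outer(v, w):
    # v w^T: the outer product, an n x n matrix as nested lists
    return [[vi * wj for wj in w] for vi in v]

v, w = [1, 0], [0, 1]

# Part (a): the dot product is commutative.
assert dot(v, w) == dot(w, v)

# Part (b): the outer products differ.
assert outer(v, w) == [[0, 1], [0, 0]]
assert outer(w, v) == [[0, 0], [1, 0]]
```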
<h2>Comment.</h2>
<p>Recall that for two vectors $\mathbf{v}, \mathbf{w} \in \R^n$, the <strong>dot product</strong> (or <strong>inner product</strong>) of $\mathbf{v}, \mathbf{w}$ is defined to be<br />
\[\mathbf{v}\cdot \mathbf{w}:=\mathbf{v}^{\trans} \mathbf{w}.\]
<p>Part (a) of the problem shows that the dot product is commutative. That is, we have<br />
\[\mathbf{v}\cdot \mathbf{w}= \mathbf{w} \cdot \mathbf{v}.\]
<p>In fact, we have<br />
\begin{align*}<br />
\mathbf{v}\cdot \mathbf{w}= \mathbf{v}^\trans \mathbf{w}  \stackrel{\text{(a)}}{=} \mathbf{w}^\trans \mathbf{v} = \mathbf{w} \cdot \mathbf{v}.<br />
\end{align*}</p>
<hr />
<p>Also, notice that while $\mathbf{v} \mathbf{w}^\trans$ is not always equal to $\mathbf{w} \mathbf{v}^\trans$, we know that $(\mathbf{v} \mathbf{w}^\trans)^\trans = \mathbf{w} \mathbf{v}^\trans$.</p>
<button class="simplefavorite-button has-count" data-postid="6285" data-siteid="1" data-groupid="1" data-favoritecount="22" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">22</span></button><p>The post <a href="https://yutsumura.com/prove-that-the-dot-product-is-commutative-mathbfvcdot-mathbfw-mathbfw-cdot-mathbfv/" target="_blank">Prove that the Dot Product is Commutative: $\mathbf{v}\cdot \mathbf{w}= \mathbf{w} \cdot \mathbf{v}$</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/prove-that-the-dot-product-is-commutative-mathbfvcdot-mathbfw-mathbfw-cdot-mathbfv/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">6285</post-id>	</item>
		<item>
		<title>Does the Trace Commute with Matrix Multiplication? Is $\tr (A B) = \tr (A) \tr (B) $?</title>
		<link>https://yutsumura.com/does-the-trace-commute-with-matrix-multiplication-is-tr-a-b-tr-a-tr-b/</link>
				<comments>https://yutsumura.com/does-the-trace-commute-with-matrix-multiplication-is-tr-a-b-tr-a-tr-b/#respond</comments>
				<pubDate>Sun, 24 Dec 2017 19:12:18 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[counterexample]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[trace]]></category>
		<category><![CDATA[trace of a matrix]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=6274</guid>
				<description><![CDATA[<p>Let $A$ and $B$ be $n \times n$ matrices. Is it always true that $\tr (A B) = \tr (A) \tr (B) $? If it is true, prove it. If not, give a counterexample.&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/does-the-trace-commute-with-matrix-multiplication-is-tr-a-b-tr-a-tr-b/" target="_blank">Does the Trace Commute with Matrix Multiplication? Is $\tr (A B) = \tr (A) \tr (B) $?</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 634</h2>
<p>	Let $A$ and $B$ be $n \times n$ matrices. </p>
<p>Is it always true that $\tr (A B) = \tr (A) \tr (B) $? </p>
<p>If it is true, prove it.  If not, give a counterexample.</p>
<p>&nbsp;<br />
<span id="more-6274"></span></p>
<h2>Solution.</h2>
<p>	There are many counterexamples. </p>
<p>For one, take<br />
	\[A = \begin{bmatrix} 1 &#038; 0 \\ 0 &#038; 0 \end{bmatrix}  \text{ and } B = \begin{bmatrix} 0 &#038; 0 \\ 0 &#038; 1 \end{bmatrix}.\]
<p>Then $\tr(A)=1, \tr(B)=1$, and hence $\tr(A) \tr(B) = 1$, while $\tr(AB) = 0$ as $AB = \begin{bmatrix} 0 &#038; 0 \\ 0 &#038; 0 \end{bmatrix}$.</p>
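<p>A quick numerical check of this counterexample, as a sketch in plain Python (the helper functions <code>matmul</code> and <code>tr</code> are ours):</p>

```python
def matmul(A, B):
    # product of two square matrices stored as nested lists
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def tr(A):
    # trace: the sum of the diagonal entries
    return sum(A[i][i] for i in range(len(A)))

A = [[1, 0], [0, 0]]
B = [[0, 0], [0, 1]]

assert tr(A) * tr(B) == 1       # tr(A) tr(B) = 1
assert tr(matmul(A, B)) == 0    # but tr(AB) = 0, since AB = O
```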
<button class="simplefavorite-button has-count" data-postid="6274" data-siteid="1" data-groupid="1" data-favoritecount="40" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">40</span></button><p>The post <a href="https://yutsumura.com/does-the-trace-commute-with-matrix-multiplication-is-tr-a-b-tr-a-tr-b/" target="_blank">Does the Trace Commute with Matrix Multiplication? Is $\tr (A B) = \tr (A) \tr (B) $?</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/does-the-trace-commute-with-matrix-multiplication-is-tr-a-b-tr-a-tr-b/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">6274</post-id>	</item>
		<item>
		<title>Is the Sum of a Nilpotent Matrix and an Invertible Matrix Invertible?</title>
		<link>https://yutsumura.com/is-the-sum-of-a-nilpotent-matrix-and-an-invertible-matrix-invertible/</link>
				<comments>https://yutsumura.com/is-the-sum-of-a-nilpotent-matrix-and-an-invertible-matrix-invertible/#comments</comments>
				<pubDate>Wed, 11 Oct 2017 02:10:07 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[counterexample]]></category>
		<category><![CDATA[determinant of a matrix]]></category>
		<category><![CDATA[invertible matrix]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[nilpotent]]></category>
		<category><![CDATA[nilpotent matrix]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=5063</guid>
				<description><![CDATA[<p>A square matrix $A$ is called nilpotent if some power of $A$ is the zero matrix. Namely, $A$ is nilpotent if there exists a positive integer $k$ such that $A^k=O$, where $O$ is the&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/is-the-sum-of-a-nilpotent-matrix-and-an-invertible-matrix-invertible/" target="_blank">Is the Sum of a Nilpotent Matrix and an Invertible Matrix Invertible?</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 582</h2>
<p>	 A square matrix $A$ is called <strong>nilpotent</strong> if some power of $A$ is the zero matrix.<br />
Namely, $A$ is nilpotent if there exists a positive integer $k$ such that $A^k=O$, where $O$ is the zero matrix.</p>
<p>	 Suppose that $A$ is a nilpotent matrix and let $B$ be an invertible matrix of the same size as $A$.<br />
	 Is the matrix $B-A$ invertible? If so prove it. Otherwise, give a counterexample. </p>
<p>&nbsp;<br />
<span id="more-5063"></span><br />

<h2>Solution.</h2>
<p>	 	We claim that the matrix $B-A$ is not necessarily invertible.<br />
	 	Consider the matrix<br />
	 	\[A=\begin{bmatrix}<br />
	 	0 &#038; -1 \\0&#038; 0<br />
	 	\end{bmatrix}.\]
	 	This matrix is nilpotent as we have<br />
	 	\[A^2=\begin{bmatrix}<br />
	 	0 &#038; -1 \\0&#038; 0<br />
	 	\end{bmatrix}<br />
	 	\begin{bmatrix}<br />
	 	0 &#038; -1 \\0&#038; 0<br />
	 	\end{bmatrix}<br />
	 	=<br />
	 	\begin{bmatrix}<br />
	 	0 &#038; 0 \\0&#038; 0<br />
	 	\end{bmatrix}.\]
<p>	 	Also consider the matrix<br />
	 	\[B=\begin{bmatrix}<br />
	 	1 &#038; 0 \\1&#038; 1<br />
	 	\end{bmatrix}.\]
	 	Since the determinant of the matrix $B$ is $1$, it is invertible.</p>
<p>	 	So the matrices $A$ and $B$ satisfy the assumptions of the problem.<br />
	 	However, the matrix<br />
	 	\[B-A=\begin{bmatrix}<br />
	 	1 &#038; 1 \\1&#038; 1<br />
	 	\end{bmatrix}\]
	 	is not invertible as its determinant is $0$.<br />
	 	Hence we have found a counterexample.</p>
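<p>The counterexample can be checked by direct computation; here is a sketch in plain Python (the helper function <code>det2</code> is ours):</p>

```python
def det2(M):
    # determinant of a 2 x 2 matrix stored as nested lists
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

A = [[0, -1], [0, 0]]   # nilpotent: A^2 = O
B = [[1, 0], [1, 1]]    # invertible: det(B) = 1

BA = [[B[i][j] - A[i][j] for j in range(2)] for i in range(2)]
assert BA == [[1, 1], [1, 1]]
assert det2(B) == 1      # B is invertible
assert det2(BA) == 0     # but B - A is singular
```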
<h2> Related Question. </h2>
<p>Here is another problem about a nilpotent matrix.</p>
<div style="padding: 16px; border: none 3px #4169e1; border-radius: 10px; background-color: #f0f8ff; margin-top: 30px; margin-bottom: 30px;">
<strong>Problem</strong>.<br />
Let $A$ be an $n\times n$ nilpotent matrix. Then prove that $I-A, I+A$ are both nonsingular matrices, where $I$ is the $n\times n$ identity matrix.
</div>
<p>The solution is given in the post &#8628;<br />
<a href="//yutsumura.com/nilpotent-matrices-and-non-singularity-of-such-matrices/" rel="noopener" target="_blank">Nilpotent Matrices and Non-Singularity of Such Matrices</a></p>
<button class="simplefavorite-button has-count" data-postid="5063" data-siteid="1" data-groupid="1" data-favoritecount="54" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">54</span></button><p>The post <a href="https://yutsumura.com/is-the-sum-of-a-nilpotent-matrix-and-an-invertible-matrix-invertible/" target="_blank">Is the Sum of a Nilpotent Matrix and an Invertible Matrix Invertible?</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/is-the-sum-of-a-nilpotent-matrix-and-an-invertible-matrix-invertible/feed/</wfw:commentRss>
		<slash:comments>1</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">5063</post-id>	</item>
		<item>
		<title>True or False. Every Diagonalizable Matrix is Invertible</title>
		<link>https://yutsumura.com/true-or-false-every-diagonalizable-matrix-is-invertible/</link>
				<comments>https://yutsumura.com/true-or-false-every-diagonalizable-matrix-is-invertible/#respond</comments>
				<pubDate>Mon, 05 Jun 2017 06:49:55 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[algebraic multiplicity]]></category>
		<category><![CDATA[counterexample]]></category>
		<category><![CDATA[defective matrix]]></category>
		<category><![CDATA[diagonal matrix]]></category>
		<category><![CDATA[diagonalizable]]></category>
		<category><![CDATA[diagonalization]]></category>
		<category><![CDATA[eigenvalue]]></category>
		<category><![CDATA[eigenvector]]></category>
		<category><![CDATA[geometric multiplicity]]></category>
		<category><![CDATA[invertible matrix]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[true or false]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=3010</guid>
				<description><![CDATA[<p>Is every diagonalizable matrix invertible? &#160; Solution. The answer is No. Counterexample We give a counterexample. Consider the $2\times 2$ zero matrix. The zero matrix is a diagonal matrix, and thus it is diagonalizable.&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/true-or-false-every-diagonalizable-matrix-is-invertible/" target="_blank">True or False. Every Diagonalizable Matrix is Invertible</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 439</h2>
<p> Is every diagonalizable matrix invertible?</p>
<p>&nbsp;<br />
<span id="more-3010"></span><br />

<h2> Solution. </h2>
<p>The answer is No.</p>
<h3>Counterexample</h3>
<p>We give a counterexample. Consider the $2\times 2$ zero matrix.<br />
		The zero matrix is a diagonal matrix, and thus it is diagonalizable.<br />
		However, the zero matrix is not invertible as its determinant is zero.</p>
<h3>More Theoretical Explanation</h3>
<p>Let us give a more theoretical explanation.<br />
		If an $n\times n$ matrix $A$ is diagonalizable, then there exists an invertible matrix $P$ such that<br />
		\[P^{-1}AP=\begin{bmatrix}<br />
				 \lambda_1  &#038; 0 &#038; \cdots &#038; 0 \\<br />
				0 &#038; \lambda_2 &#038; \cdots &#038; 0 \\<br />
				\vdots  &#038; \vdots  &#038; \ddots &#038; \vdots  \\<br />
				0 &#038; 0 &#038; \cdots &#038; \lambda_n<br />
				\end{bmatrix},\]
				where $\lambda_1, \dots, \lambda_n$ are eigenvalues of $A$.<br />
				Now take the determinant of both sides.<br />
			The determinant of the left-hand side is<br />
			\begin{align*}<br />
	\det(P^{-1}AP)=\det(P)^{-1}\det(A)\det(P)=\det(A).<br />
	\end{align*}<br />
	On the other hand, the determinant of the right-hand side is the product<br />
	\[\lambda_1\lambda_2\cdots \lambda_n\]
	since the right-hand side is a diagonal matrix.<br />
	Hence we obtain<br />
	\[\det(A)=\lambda_1\lambda_2\cdots \lambda_n.\]
	(Note that it is always true that the determinant of a matrix is the product of its eigenvalues regardless of diagonalizability.<br />
 See the post &#8220;<a href="//yutsumura.com/determinant-trace-and-eigenvalues-of-a-matrix/" target="_blank">Determinant/trace and eigenvalues of a matrix</a>&#8220;.)</p>
<p>	Thus if one of the eigenvalues of $A$ is zero, then the determinant of $A$ is zero, and so $A$ is not invertible.</p>
<p>	The true statement is:</p>
<div style="padding: 16px; border: none 3px #4169e1; border-radius: 10px; background-color: #f0f8ff; margin-top: 30px; margin-bottom: 30px;">a diagonalizable matrix is invertible if and only if all of its eigenvalues are nonzero.</div>
<h3>Is Every Invertible Matrix Diagonalizable?</h3>
<p>	Note that it is not true that every invertible matrix is diagonalizable.</p>
<p>	For example, consider the matrix<br />
	\[A=\begin{bmatrix}<br />
	  1 &#038; 1\\<br />
	  0&#038; 1<br />
	\end{bmatrix}.\]
	The determinant of $A$ is $1$, hence $A$ is invertible.<br />
	The characteristic polynomial of $A$ is<br />
	\begin{align*}<br />
	p(t)=\det(A-tI)=\begin{vmatrix}<br />
	  1-t &#038; 1\\<br />
	  0&#038; 1-t<br />
	\end{vmatrix}=(1-t)^2.<br />
	\end{align*}<br />
	Thus, the eigenvalue of $A$ is $1$ with algebraic multiplicity $2$.<br />
	We have<br />
	\[A-I=\begin{bmatrix}<br />
	  0 &#038; 1\\<br />
	  0&#038; 0<br />
	\end{bmatrix}\]
	and thus eigenvectors corresponding to the eigenvalue $1$ are<br />
	\[a\begin{bmatrix}<br />
	  1 \\<br />
	  0<br />
	\end{bmatrix}\]
	for any nonzero scalar $a$.<br />
	Thus, the geometric multiplicity of the eigenvalue $1$ is $1$.<br />
	Since the geometric multiplicity is strictly less than the algebraic multiplicity, the matrix $A$ is defective and not diagonalizable.</p>
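<p>This computation can be summarized in a short plain-Python sketch (the helper functions <code>det2</code> and <code>rank2</code> are ours; for a $2\times 2$ matrix the rank is $2$, $1$, or $0$ according to whether the determinant is nonzero, the matrix is nonzero with zero determinant, or the matrix is zero):</p>

```python
def det2(M):
    # determinant of a 2 x 2 matrix
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

def rank2(M):
    # rank of a 2 x 2 matrix
    if det2(M) != 0:
        return 2
    return 1 if any(x != 0 for row in M for x in row) else 0

A = [[1, 1], [0, 1]]
assert det2(A) == 1                      # A is invertible

A_minus_I = [[0, 1], [0, 0]]
geometric_mult = 2 - rank2(A_minus_I)    # nullity of A - I
assert geometric_mult == 1               # < algebraic multiplicity 2,
                                         # so A is defective
```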
<h3>Is There a Matrix that is Not Diagonalizable and Not Invertible?</h3>
<p>Finally, note that there is a matrix which is neither diagonalizable nor invertible.<br />
	For example, the matrix $\begin{bmatrix}<br />
  0 &#038; 1\\<br />
  0&#038; 0<br />
\end{bmatrix}$ is such a matrix.</p>
<h2>Summary </h2>
<p>All four combinations are possible.</p>
<ol>
<li>Diagonalizable, but not invertible.<br />
Example: \[\begin{bmatrix}<br />
  0 &#038; 0\\<br />
  0&#038; 0<br />
\end{bmatrix}.\]</li>
<li>Invertible, but not diagonalizable.<br />
Example: \[\begin{bmatrix}<br />
  1 &#038; 1\\<br />
  0&#038; 1<br />
\end{bmatrix}\]</li>
<li>Not diagonalizable and not invertible.<br />
Example: \[\begin{bmatrix}<br />
  0 &#038; 1\\<br />
  0&#038; 0<br />
\end{bmatrix}.\]</li>
<li>Diagonalizable and invertible.<br />
Example: \[\begin{bmatrix}<br />
  1 &#038; 0\\<br />
  0&#038; 1<br />
\end{bmatrix}.\]</li>
</ol>
<button class="simplefavorite-button has-count" data-postid="3010" data-siteid="1" data-groupid="1" data-favoritecount="73" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">73</span></button><p>The post <a href="https://yutsumura.com/true-or-false-every-diagonalizable-matrix-is-invertible/" target="_blank">True or False. Every Diagonalizable Matrix is Invertible</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/true-or-false-every-diagonalizable-matrix-is-invertible/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">3010</post-id>	</item>
		<item>
		<title>Order of Product of Two Elements in a Group</title>
		<link>https://yutsumura.com/order-of-product-of-two-elements-in-a-group/</link>
				<comments>https://yutsumura.com/order-of-product-of-two-elements-in-a-group/#comments</comments>
				<pubDate>Tue, 28 Mar 2017 03:50:08 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Group Theory]]></category>
		<category><![CDATA[abelian group]]></category>
		<category><![CDATA[counterexample]]></category>
		<category><![CDATA[group]]></category>
		<category><![CDATA[group theory]]></category>
		<category><![CDATA[nonabelian group]]></category>
		<category><![CDATA[order]]></category>
		<category><![CDATA[symmetric group]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=2535</guid>
				<description><![CDATA[<p>Let $G$ be a group. Let $a$ and $b$ be elements of $G$. If the order of $a, b$ are $m, n$ respectively, then is it true that the order of the product $ab$&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/order-of-product-of-two-elements-in-a-group/" target="_blank">Order of Product of Two Elements in a Group</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 354</h2>
<p> Let $G$ be a group. Let $a$ and $b$ be elements of $G$.<br />
If the orders of $a$ and $b$ are $m$ and $n$, respectively, is it true that the order of the product $ab$ divides $mn$? If so, give a proof. If not, give a counterexample.</p>
<p>&nbsp;<br />
<span id="more-2535"></span><br />

<h2> Proof. </h2>
<p>		We claim that it is not true. As a counterexample, consider $G=S_3$, the symmetric group on three letters.<br />
		Let $a=(1\, 2)$ and $b=(1 \,3)$ be transpositions in $S_3$.<br />
		The orders of $a$ and $b$ are both $2$.</p>
<p>		Consider the product<br />
		\[ab=(1\, 2)(1 \,3)=(1 \, 3 \, 2).\]
		Then it is straightforward to check that the order of $ab$ is $3$, which does not divide $4$, the product of the orders of $a$ and $b$.</p>
<p>		Therefore, the group $G=S_3$ and elements $a=(1\, 2), b=(1 \,3)\in G$ serve as a counterexample.</p>
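<p>The orders can be checked by composing permutations directly; here is a sketch in plain Python, encoding each permutation of the three letters as a tuple on indices $\{0,1,2\}$ (the helper functions <code>compose</code> and <code>order</code> are ours):</p>

```python
def compose(p, q):
    # (pq)(i) = p(q(i)); permutations stored as tuples
    return tuple(p[q[i]] for i in range(len(q)))

def order(p):
    # smallest positive k with p^k equal to the identity
    identity = tuple(range(len(p)))
    k, q = 1, p
    while q != identity:
        q = compose(q, p)
        k += 1
    return k

a = (1, 0, 2)   # the transposition (1 2)
b = (2, 1, 0)   # the transposition (1 3)
assert order(a) == 2 and order(b) == 2
assert order(compose(a, b)) == 3   # 3 does not divide 2 * 2 = 4
```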
<h2> Remark. (Abelian group case) </h2>
<p>If we further assume that $G$ is an abelian group, then the statement is true.<br />
Here is the proof if $G$ is abelian.</p>
<p>Let $e$ be the identity element of $G$.<br />
\begin{align*}<br />
(ab)^{mn} &#038;=a^{mn}b^{mn} &#038;&#038; \text{ since $G$ is abelian}\\<br />
&#038;=(a^m)^n(b^n)^m\\<br />
&#038;=e^n e^m &#038;&#038; \text{since the orders of $a$ and $b$ are $m$ and $n$, respectively}\\<br />
&#038;=e.<br />
\end{align*}<br />
	Thus the order of $ab$ divides $mn$.</p>
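<p>For a concrete abelian check, take the additive group of integers modulo $12$; the following sketch in plain Python verifies the divisibility for one pair of elements (the helper function <code>additive_order</code> is ours):</p>

```python
def additive_order(a, n):
    # order of a in the additive group of integers modulo n
    k, s = 1, a % n
    while s != 0:
        s = (s + a) % n
        k += 1
    return k

m, k = additive_order(4, 12), additive_order(6, 12)
assert (m, k) == (3, 2)
# The order of the sum 4 + 6 = 10 divides mn = 6.
assert (m * k) % additive_order(10, 12) == 0
```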
<h2> Related Question. </h2>
<p>If the group is abelian, then the statement is true.</p>
<div style="padding: 16px; border: none 3px #4169e1; border-radius: 10px; background-color: #f0f8ff; margin-top: 30px; margin-bottom: 30px;">
<strong>Problem</strong>. Let $G$ be an abelian group with the identity element $1$.<br />
Let $a, b$ be elements of $G$ with order $m$ and $n$, respectively.<br />
If $m$ and $n$ are relatively prime, then show that the order of the element $ab$ is $mn$.
</div>
<p>See the post &#8220;<a href="//yutsumura.com/order-of-the-product-of-two-elements-in-an-abelian-group/" target="_blank">Order of the Product of Two Elements in an Abelian Group</a>&#8221; for a proof of this problem.</p>
<p>More generally, we can prove the following.</p>
<div style="padding: 16px; border: none 3px #4169e1; border-radius: 10px; background-color: #f0f8ff; margin-top: 30px; margin-bottom: 30px;">
<strong>Problem</strong>. Let $G$ be an abelian group.<br />
	Let $a$ and $b$ be elements in $G$ of order $m$ and $n$, respectively.<br />
	Prove that there exists an element $c$ in $G$ such that the order of $c$ is the least common multiple of $m$ and $n$.</p>
<p>Also determine whether the statement is true if $G$ is a non-abelian group.
</p></div>
<p>A proof of this problem is given in the post &#8220;<a href="//yutsumura.com/the-existence-of-an-element-in-an-abelian-group-of-order-the-least-common-multiple-of-two-elements/" target="_blank">The Existence of an Element in an Abelian Group of Order the Least Common Multiple of Two Elements</a>&#8220;.</p>
<button class="simplefavorite-button has-count" data-postid="2535" data-siteid="1" data-groupid="1" data-favoritecount="66" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">66</span></button><p>The post <a href="https://yutsumura.com/order-of-product-of-two-elements-in-a-group/" target="_blank">Order of Product of Two Elements in a Group</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/order-of-product-of-two-elements-in-a-group/feed/</wfw:commentRss>
		<slash:comments>2</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">2535</post-id>	</item>
		<item>
		<title>12 Examples of Subsets that Are Not Subspaces of Vector Spaces</title>
		<link>https://yutsumura.com/10-examples-of-subsets-that-are-not-subspaces-of-vector-spaces/</link>
				<comments>https://yutsumura.com/10-examples-of-subsets-that-are-not-subspaces-of-vector-spaces/#comments</comments>
				<pubDate>Thu, 16 Mar 2017 01:38:48 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[continuous function]]></category>
		<category><![CDATA[counterexample]]></category>
		<category><![CDATA[derivative]]></category>
		<category><![CDATA[determinant]]></category>
		<category><![CDATA[determinant of a matrix]]></category>
		<category><![CDATA[general vector space]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[matrix]]></category>
		<category><![CDATA[subspace]]></category>
		<category><![CDATA[subspace criteria]]></category>
		<category><![CDATA[vector space]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=2443</guid>
				<description><![CDATA[<p>Each of the following sets are not a subspace of the specified vector space. For each set, give a reason why it is not a subspace. (1) \[S_1=\left \{\, \begin{bmatrix} x_1 \\ x_2 \\&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/10-examples-of-subsets-that-are-not-subspaces-of-vector-spaces/" target="_blank">12 Examples of Subsets that Are Not Subspaces of Vector Spaces</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 338</h2>
<p> Each of the following sets is not a subspace of the specified vector space. For each set, give a reason why it is not a subspace.<br />
<strong>(1)</strong> \[S_1=\left \{\, \begin{bmatrix}<br />
	  x_1 \\<br />
	   x_2 \\<br />
	    x_3<br />
	  \end{bmatrix} \in \R^3 \quad \middle | \quad x_1\geq 0 \,\right \}\]
	  in the vector space $\R^3$.</p>
<hr />
<p><strong>(2)</strong> \[S_2=\left \{\, \begin{bmatrix}<br />
	  x_1 \\<br />
	   x_2 \\<br />
	    x_3<br />
	  \end{bmatrix} \in \R^3 \quad \middle | \quad x_1-4x_2+5x_3=2 \,\right \}\]
	  in the vector space $\R^3$.</p>
<hr />
<p><strong>(3)</strong> \[S_3=\left \{\, \begin{bmatrix}<br />
	  x \\<br />
	  y<br />
	\end{bmatrix}\in \R^2 \quad \middle | \quad y=x^2 \,\right \}\]
	in the vector space $\R^2$.</p>
<hr />
<p><strong>(4)</strong> Let $P_4$ be the vector space of all polynomials of degree $4$ or less with real coefficients.<br />
	\[S_4=\{ f(x)\in P_4 \mid f(1) \text{ is an integer}\}\]
	in the vector space $P_4$.</p>
<hr />
<p><strong>(5)</strong> \[S_5=\{ f(x)\in P_4 \mid f(1) \text{ is a rational number}\}\]
	in the vector space $P_4$.</p>
<hr />
<p><strong>(6)</strong>  Let $M_{2 \times 2}$ be the vector space of all $2\times 2$ real matrices.<br />
	\[S_6=\{ A\in M_{2\times 2} \mid \det(A) \neq 0\} \]
	in the vector space $M_{2\times 2}$.</p>
<hr />
<p><strong>(7)</strong> \[S_7=\{ A\in M_{2\times 2} \mid \det(A)=0\} \]
	in the vector space $M_{2\times 2}$.</p>
<p>(<em>Linear Algebra Exam Problem, the Ohio State University</em>)</p>
<hr />
<p><strong>(8)</strong> Let $C[a, b]$ be the vector space of all real continuous functions defined on the interval $[a, b]$.<br />
	\[S_8=\{ f(x)\in C[-2,2] \mid f(-1)f(1)=0\} \]
	in the vector space $C[-2, 2]$.</p>
<hr />
<p><strong>(9)</strong> \[S_9=\{ f(x) \in C[-1, 1] \mid f(x)\geq 0 \text{ for all } -1\leq x \leq 1\}\]
	in the vector space $C[-1, 1]$.</p>
<hr />
<p><strong>(10)</strong> Let $C^2[a, b]$ be the vector space of all real-valued functions $f(x)$ defined on $[a, b]$, where $f(x), f'(x)$, and $f^{\prime\prime}(x)$ are continuous on $[a, b]$. Here $f'(x), f^{\prime\prime}(x)$ are the first and second derivative of $f(x)$.<br />
	\[S_{10}=\{ f(x) \in C^2[-1, 1] \mid f^{\prime\prime}(x)+f(x)=\sin(x) \text{ for all } -1\leq x \leq 1\}\]
	in the vector space $C^2[-1, 1]$.</p>
<hr />
<p><strong>(11)</strong> Let $S_{11}$ be the set of real polynomials of degree exactly $k$, where $k \geq 1$ is an integer, in the vector space $P_k$.</p>
<hr />
<p><strong>(12)</strong> Let $V$ be a vector space and $W \subset V$ a vector subspace.  Define the subset $S_{12}$ to be the <strong>complement</strong> of $W$,<br />
\[ S_{12} = V \setminus W = \{ \mathbf{v} \in V \mid \mathbf{v} \not\in W \}.\]
<p>&nbsp;<br />
<span id="more-2443"></span><br />

<h2>Solution.</h2>
<p>	Recall the following subspace criteria.<br />
	A subset $W$ of a vector space $V$ over the scalar field $K$ is a subspace of $V$ if and only if the following three criteria are met.</p>
<div style="padding: 16px; border: none 3px #4169e1; border-radius: 10px; background-color: #f0f8ff; margin-top: 30px; margin-bottom: 30px;">
<ol>
<li> The subset $W$ contains the zero vector of $V$.</li>
<li>If $u, v\in W$, then $u+v\in W$.</li>
<li>If $u\in W$ and $a\in K$, then $au\in W$.</li>
</ol>
</div>
<p>	Thus, to prove a subset $W$ is not a subspace, we just need to find a counterexample of any of the three criteria.<br />
	&nbsp;&nbsp;</p>
<h3>Solution (1). $S_1=\{ \mathbf{x} \in \R^3 \mid x_1\geq 0  \}$</h3>
<p>	 The subset $S_1$ does not satisfy condition 3. For example, consider the vector<br />
		 \[\mathbf{x}=\begin{bmatrix}<br />
	  1 \\<br />
	   0 \\<br />
	    0<br />
	  \end{bmatrix}.\]
	  Since $x_1=1\geq 0$, the vector $\mathbf{x}$ is in $S_1$. Now consider the scalar product of $\mathbf{x}$ and the scalar $-1$. We have<br />
	  \[(-1)\cdot\mathbf{x}=\begin{bmatrix}<br />
	  -1 \\<br />
	   0 \\<br />
	    0<br />
	  \end{bmatrix},\]
	  and the first entry is $-1$, hence $-\mathbf{x}$ is not in $S_1$. Thus $S_1$ does not satisfy condition 3 and it is not a subspace of $\R^3$.<br />
	  (You can check that conditions 1, 2 are met.)<br />
	  &nbsp;&nbsp;</p>
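<p>This failure of condition 3 is easy to verify mechanically; here is a sketch in plain Python (the membership test <code>in_S1</code> is ours):</p>

```python
def in_S1(x):
    # membership test for S_1 = { x in R^3 : x_1 >= 0 }
    return x[0] >= 0

x = [1, 0, 0]
assert in_S1(x)

neg_x = [-c for c in x]      # the scalar product (-1) x
assert not in_S1(neg_x)      # condition 3 fails
```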
<h3>Solution (2). $S_2= \{ \mathbf{x}\in \R^3\mid x_1-4x_2+5x_3=2  \}$</h3>
<p> The zero vector of the vector space $\R^3$ is<br />
	  \[\mathbf{0}=\begin{bmatrix}<br />
	  0 \\<br />
	   0 \\<br />
	    0<br />
	  \end{bmatrix}.\]
	  Since the zero vector $\mathbf{0}$ does not satisfy the defining relation $x_1-4x_2+5x_3=2$, it is not in $S_2$. Hence condition 1 is not met, and $S_2$ is not a subspace of $\R^3$.<br />
	  (You can check that conditions 2, 3 are not met as well.)<br />
	  &nbsp;&nbsp;</p>
<h3>Solution (3). $S_3=\{\mathbf{x}\in \R^2 \mid y=x^2  \}$</h3>
<p> Consider vectors<br />
	  \[\begin{bmatrix}<br />
	  1 \\<br />
	  1<br />
	\end{bmatrix} \text{ and } \begin{bmatrix}<br />
	  -1 \\<br />
	  1<br />
	\end{bmatrix}.\]
	These are vectors in $S_3$ since both vectors satisfy the defining relation $y=x^2$.</p>
<p>	However, their sum<br />
	\[\begin{bmatrix}<br />
	  1 \\<br />
	  1<br />
	\end{bmatrix} + \begin{bmatrix}<br />
	  -1 \\<br />
	  1<br />
	\end{bmatrix}<br />
	=<br />
	\begin{bmatrix}<br />
	  0 \\<br />
	  2<br />
	  \end{bmatrix}\]
	is not in $S_3$ since $2\neq 0^2$.<br />
	Hence condition 2 is not met, and thus $S_3$ is not a subspace of $\R^2$.<br />
	(You can check that condition 1 is met but condition 3 is not.)<br />
	&nbsp;&nbsp;</p>
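<p>The failure of closure under addition can also be verified in a few lines of Python (a sketch; the helper name <code>in_S3</code> is ours):</p>

```python
# Membership test for S3 = { (x, y) in R^2 : y = x^2 }.
def in_S3(v):
    x, y = v
    return y == x ** 2

u, w = (1, 1), (-1, 1)
s = (u[0] + w[0], u[1] + w[1])  # componentwise sum
print(in_S3(u), in_S3(w))  # True True: both lie on the parabola
print(s, in_S3(s))         # (0, 2) False: condition 2 fails
```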
<h3>Solution (4). $S_4=\{ f(x)\in P_4 \mid f(1) \text{ is an integer}\}$</h3>
<p>Consider the polynomial $f(x)=x$. Since the degree of $f(x)$ is $1$ and $f(1)=1$ is an integer, it is in $S_4$. Consider the scalar product of $f(x)$ and the scalar $1/2\in \R$.<br />
	 Evaluating the scalar product at $x=1$, we have<br />
	 \begin{align*}<br />
	\frac{1}{2}f(1)=\frac{1}{2},<br />
	\end{align*}<br />
	which is not an integer.<br />
	Thus $(1/2)f(x)$ is not in $S_4$, hence condition 3 is not met. Thus $S_4$ is not a subspace of $P_4$.<br />
	(You can check that conditions 1, 2 are met.)<br />
	&nbsp;&nbsp;</p>
<h3>Solution (5). $S_5=\{ f(x)\in P_4 \mid f(1) \text{ is a rational number}\}$</h3>
<p> Let $f(x)=x$. Then $f(x)$ is a degree $1$ polynomial and $f(1)=1$ is a rational number.<br />
	However, the scalar product $\sqrt{2} f(x)$ of $f(x)$ and the scalar $\sqrt{2} \in \R$ is not in $S_5$ since<br />
	\[\sqrt{2}f(1)=\sqrt{2},\]
	which is not a rational number. Hence condition 3 is not met and $S_5$ is not a subspace of $P_4$.<br />
	(You can check that conditions 1, 2 are met.)<br />
	&nbsp;&nbsp;</p>
<h3>Solution (6). $S_6=\{ A\in M_{2\times 2} \mid \det(A) \neq 0\}$</h3>
<p> The zero vector of the vector space $M_{2 \times 2}$ is the $2\times 2$ zero matrix $O$.<br />
	Since the determinant of the zero matrix $O$ is $0$, it is not in $S_6$. Thus, condition 1 is not met and $S_6$ is not a subspace of $M_{2 \times 2}$.<br />
	(You can check that conditions 2, 3 are not met as well.)<br />
	&nbsp;&nbsp;</p>
<h3>Solution (7). $S_7=\{ A\in M_{2\times 2} \mid \det(A)=0\}$</h3>
<p>Consider the matrices<br />
	\[A=\begin{bmatrix}<br />
	  1 &#038; 0\\<br />
	  0&#038; 0<br />
	\end{bmatrix} \text{ and } B=\begin{bmatrix}<br />
	  0 &#038; 0\\<br />
	  0&#038; 1<br />
	\end{bmatrix}.\]
	The determinants of $A$ and $B$ are both $0$, hence they belong to $S_7$.<br />
	However, their sum<br />
	\[A+B=\begin{bmatrix}<br />
	  1 &#038; 0\\<br />
	  0&#038; 1<br />
	\end{bmatrix}\]
	has the determinant $1$, hence the sum $A+B$ is not in $S_7$.<br />
	So condition 2 is not met and $S_7$ is not a subspace of $M_{2 \times 2}$.<br />
	(You can check that conditions 1, 3 are met.)<br />
	&nbsp;&nbsp;</p>
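<p>A machine check of this counterexample (a sketch, not part of the original solution; the helper name <code>det2</code> is ours):</p>

```python
# det [[a, b], [c, d]] = ad - bc for a 2x2 matrix (helper name is ours).
def det2(m):
    (a, b), (c, d) = m
    return a * d - b * c

A = [[1, 0], [0, 0]]
B = [[0, 0], [0, 1]]
S = [[A[i][j] + B[i][j] for j in range(2)] for i in range(2)]
print(det2(A), det2(B))  # 0 0: both matrices lie in S7
print(det2(S))           # 1: the sum A + B escapes S7
```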
<h3>Solution (8). $S_8=\{ f(x)\in C[-2,2] \mid f(-1)f(1)=0\}$</h3>
<p> Consider the continuous functions<br />
	\[f(x)=x-1 \text{ and } g(x)=x+1.\]
	(These are polynomials, hence they are continuous.)<br />
	We have<br />
	\begin{align*}<br />
	&#038;f(-1)f(1)=(-2)\cdot(0)=0 \text{ and }\\<br />
	&#038;g(-1)g(1)=(0)\cdot 2=0.<br />
	\end{align*}<br />
	So these functions are in $S_8$.</p>
<p>	However, their sum $h(x):=f(x)+g(x)$ does not belong to $S_8$ since we have<br />
	\begin{align*}<br />
	h(-1)h(1)&#038;=\big(f(-1)+g(-1)\big) \big(f(1)+g(1) \big)\\<br />
	&#038;=(-2+0)(0+2)=-4\neq 0.<br />
	\end{align*}<br />
	Therefore, condition 2 is not met and $S_8$ is not a subspace of $C[-2, 2]$.<br />
	(You can check that conditions 1, 3 are met.)<br />
&nbsp;&nbsp;</p>
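<p>The evaluations above are easy to confirm numerically (a sketch, not part of the original solution):</p>

```python
# The two continuous functions and their sum, as in the solution above.
f = lambda x: x - 1
g = lambda x: x + 1
h = lambda x: f(x) + g(x)

print(f(-1) * f(1))  # 0: f is in S8
print(g(-1) * g(1))  # 0: g is in S8
print(h(-1) * h(1))  # -4: f + g is not in S8, so condition 2 fails
```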
<h3>Solution (9). $S_9=\{ f(x) \in C[-1, 1] \mid f(x)\geq 0 \text{ for all } -1\leq x \leq 1\}$</h3>
<p>Let $f(x)=x^2$, an upward-opening parabola.<br />
	Then $f(x)$ is continuous and non-negative for $-1 \leq x \leq 1$. Hence $f(x)=x^2$ is in $S_9$.<br />
	However, the scalar product $(-1)f(x)$ of $f(x)$ and the scalar $-1$ is not in $S_9$ since, say,<br />
	\[(-1)f(1)=-1\]
	is negative.<br />
	So condition 3 is not met and $S_9$ is not a subspace of $C[-1, 1]$.<br />
	(You can check that conditions 1, 2 are met.)<br />
	&nbsp;&nbsp;</p>
<h3>Solution (10). $S_{10}=\{ f(x) \in C^2[-1, 1] \mid f^{\prime\prime}(x)+f(x)=\sin(x) \text{ for all } -1\leq x \leq 1\}$</h3>
<p>The zero vector of the vector space $C^2[-1, 1]$ is the zero function $\theta(x)=0$.<br />
	The second derivative of the zero function is still the zero function.<br />
	Thus,<br />
	\[\theta^{\prime\prime}(x)+\theta(x)=0\]
	and since $\sin(x)$ is not the zero function, $\theta(x)$ is not in $S_{10}$.<br />
	Hence condition 1 is not met, and $S_{10}$ is not a subspace of $C^2[-1, 1]$.</p>
<p>	(You can check that conditions 2, 3 are not met as well.<br />
For example, consider the function $f(x)=-\frac{1}{2}x\cos(x)\in S_{10}$.)</p>
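<p>That $f(x)=-\frac{1}{2}x\cos(x)$ really solves the differential equation can be spot-checked numerically with a finite-difference second derivative (a sketch, not part of the original solution; the helper <code>second_derivative</code> is ours):</p>

```python
import math

# f(x) = -(1/2) x cos(x), the particular solution mentioned above.
f = lambda x: -0.5 * x * math.cos(x)

def second_derivative(func, x, h=1e-4):
    # Central finite-difference approximation of func''(x).
    return (func(x + h) - 2 * func(x) + func(x - h)) / h ** 2

# Residual of the equation f'' + f = sin at sample points in [-1, 1].
for x in (-1.0, -0.3, 0.5, 1.0):
    residual = second_derivative(f, x) + f(x) - math.sin(x)
    assert abs(residual) < 1e-5
print("f''(x) + f(x) = sin(x) holds at all sample points")
```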
<h3>Solution (11). Let $S_{11}$ be the set of real polynomials of degree exactly $k$.</h3>
<p>The set $S_{11}$ is not a vector subspace of $\mathbf{P}_k$.  One reason is that the zero polynomial $\mathbf{0}$ does not have degree $k$ (the degree of the zero polynomial is undefined), and so does not lie in $S_{11}$.  The set $S_{11}$ is also not closed under addition when $k\geq 1$.  Consider the two polynomials $f(x) = x^k + 1$ and $g(x) = - x^k + 1$.  Both of these polynomials lie in $S_{11}$, but $f(x) + g(x) = 2$ has degree $0$ and so does not lie in $S_{11}$.</p>
<h3>Solution (12). The complement of a subspace</h3>
<p>The complement $S_{12}= V \setminus W$ is not a vector subspace.  Specifically, if $\mathbf{0} \in V$ is the zero vector, then we know $\mathbf{0} \in W$ because $W$ is a subspace.  But then $\mathbf{0} \not\in V \setminus W$, and so $V \setminus W$ cannot be a vector subspace. </p>
<h2>Linear Algebra Midterm Exam 2 Problems and Solutions </h2>
<ul>
<li><a href="//yutsumura.com/true-or-false-problems-of-vector-spaces-and-linear-transformations/" target="_blank">True or False Problems and Solutions</a>: True or False problems of vector spaces and linear transformations</li>
<li>Problem 1 and its solution (current problem): See (7) in the post &#8220;12 examples of subsets that are not subspaces of vector spaces&#8221;</li>
<li><a href="//yutsumura.com/determine-whether-trigonometry-functions-sin2x-cos2x-1-are-linearly-independent-or-dependent/" target="_blank">Problem 2 and its solution</a>: Determine whether trigonometry functions $\sin^2(x), \cos^2(x), 1$ are linearly independent or dependent</li>
<li><a href="//yutsumura.com/orthonormal-basis-of-null-space-and-row-space/" target="_blank">Problem 3 and its solution</a>: Orthonormal basis of null space and row space</li>
<li><a href="//yutsumura.com/basis-of-span-in-vector-space-of-polynomials-of-degree-2-or-less/" target="_blank">Problem 4 and its solution</a>: Basis of span in vector space of polynomials of degree 2 or less</li>
<li><a href="//yutsumura.com/determine-value-of-linear-transformation-from-r3-to-r2/" target="_blank">Problem 5 and its solution</a>: Determine value of linear transformation from $R^3$ to $R^2$</li>
<li><a href="//yutsumura.com/rank-and-nullity-of-linear-transformation-from-r3-to-r2/" target="_blank">Problem 6 and its solution</a>: Rank and nullity of linear transformation from $R^3$ to $R^2$</li>
<li><a href="//yutsumura.com/find-matrix-representation-of-linear-transformation-from-r2-to-r2/" target="_blank">Problem 7 and its solution</a>: Find matrix representation of linear transformation from $R^2$ to $R^2$</li>
<li><a href="//yutsumura.com/hyperplane-through-origin-is-subspace-of-4-dimensional-vector-space/" target="_blank">Problem 8 and its solution</a>: Hyperplane through origin is subspace of 4-dimensional vector space</li>
</ul>
<button class="simplefavorite-button has-count" data-postid="2443" data-siteid="1" data-groupid="1" data-favoritecount="37" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">37</span></button><p>The post <a href="https://yutsumura.com/10-examples-of-subsets-that-are-not-subspaces-of-vector-spaces/" target="_blank">12 Examples of Subsets that Are Not Subspaces of Vector Spaces</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/10-examples-of-subsets-that-are-not-subspaces-of-vector-spaces/feed/</wfw:commentRss>
		<slash:comments>8</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">2443</post-id>	</item>
		<item>
		<title>True or False. The Intersection of Bases is a Basis of the Intersection of Subspaces</title>
		<link>https://yutsumura.com/true-or-false-the-intersection-of-bases-is-a-basis-of-the-intersection-of-subspaces/</link>
				<comments>https://yutsumura.com/true-or-false-the-intersection-of-bases-is-a-basis-of-the-intersection-of-subspaces/#respond</comments>
				<pubDate>Thu, 12 Jan 2017 20:45:24 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[basis]]></category>
		<category><![CDATA[counterexample]]></category>
		<category><![CDATA[dimension]]></category>
		<category><![CDATA[dimension of a vector space]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[linearly independent]]></category>
		<category><![CDATA[subspace]]></category>
		<category><![CDATA[true or false]]></category>
		<category><![CDATA[vector]]></category>
		<category><![CDATA[vector space]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=1916</guid>
				<description><![CDATA[<p>Determine whether the following is true or false. If it is true, then give a proof. If it is false, then give a counterexample. Let $W_1$ and $W_2$ be subspaces of the vector space&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/true-or-false-the-intersection-of-bases-is-a-basis-of-the-intersection-of-subspaces/" target="_blank">True or False. The Intersection of Bases is a Basis of the Intersection of Subspaces</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 253</h2>
<p>Determine whether the following is true or false. If it is true, then give a proof. If it is false, then give a counterexample. </p>
<p>Let $W_1$ and $W_2$ be subspaces of the vector space $\R^n$.<br />
If $B_1$ and $B_2$ are bases for $W_1$ and $W_2$, respectively, then $B_1\cap B_2$ is a basis of the subspace $W_1\cap W_2$.</p>
<p>&nbsp;<br />
<span id="more-1916"></span></p>
<h2>Solution.</h2>
<p>	The statement is false. We give a counterexample.<br />
	Let us consider the vector space $\R^2$, the plane.</p>
<p>	Let $W_1$ and $W_2$ be $\R^2$ itself. Then they are subspaces of $\R^2$.<br />
	Let<br />
	\[B_1=\left\{\begin{bmatrix}<br />
  1 \\<br />
  0<br />
\end{bmatrix}, \begin{bmatrix}<br />
  0 \\<br />
  1<br />
\end{bmatrix} \right\}\text{ and }<br />
B_2=\left\{ \begin{bmatrix}<br />
  -1 \\<br />
  0<br />
\end{bmatrix}, \begin{bmatrix}<br />
  0 \\<br />
  -1<br />
\end{bmatrix} \right\}\]
be bases of $W_1$ and $W_2$, respectively.</p>
<p>(Note. $B_1$ consists of two linearly independent vectors in the 2-dimensional vector space $W_1=\R^2$, hence $B_1$ is a basis of $W_1$. Similarly, $B_2$ is a basis of $W_2$.)</p>
<hr />
<p>Since $W_1$ and $W_2$ are both $\R^2$, we have $W_1\cap W_2=\R^2$.<br />
However, the intersection $B_1\cap B_2$ is the empty set, and the empty set is not a basis of $W_1\cap W_2=\R^2$ (the span of the empty set is the zero subspace, not $\R^2$). Thus, we have found a counterexample to the statement.</p>
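<p>Representing the basis vectors as tuples, the empty intersection is immediate to verify (a sketch, not part of the original solution):</p>

```python
# The two bases from the counterexample, as sets of tuples.
B1 = {(1, 0), (0, 1)}
B2 = {(-1, 0), (0, -1)}
print(B1 & B2)  # set(): the intersection contains no vectors,
                # so it cannot span the intersection of W1 and W2, which is R^2
```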
<button class="simplefavorite-button has-count" data-postid="1916" data-siteid="1" data-groupid="1" data-favoritecount="23" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">23</span></button><p>The post <a href="https://yutsumura.com/true-or-false-the-intersection-of-bases-is-a-basis-of-the-intersection-of-subspaces/" target="_blank">True or False. The Intersection of Bases is a Basis of the Intersection of Subspaces</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/true-or-false-the-intersection-of-bases-is-a-basis-of-the-intersection-of-subspaces/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">1916</post-id>	</item>
		<item>
		<title>Is the Determinant of a Matrix Additive?</title>
		<link>https://yutsumura.com/is-the-determinant-of-a-matrix-additive/</link>
				<comments>https://yutsumura.com/is-the-determinant-of-a-matrix-additive/#respond</comments>
				<pubDate>Fri, 18 Nov 2016 06:13:31 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[counterexample]]></category>
		<category><![CDATA[determinant]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[matrix]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=1410</guid>
				<description><![CDATA[<p>Let $A$ and $B$ be $n\times n$ matrices, where $n$ is an integer greater than $1$. Is it true that \[\det(A+B)=\det(A)+\det(B)?\] If so, then give a proof. If not, then give a counterexample. &#160;&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/is-the-determinant-of-a-matrix-additive/" target="_blank">Is the Determinant of a Matrix Additive?</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 186</h2>
<p>Let $A$ and $B$ be $n\times n$ matrices, where $n$ is an integer greater than $1$. </p>
<p>Is it true that<br />
\[\det(A+B)=\det(A)+\det(B)?\]
If so, then give a proof. If not, then give a counterexample.<br />
&nbsp;<br />
<span id="more-1410"></span><br />

<h2>Solution.</h2>
<p>	We claim that the statement is false.<br />
	As a counterexample, consider the matrices<br />
	\[A=\begin{bmatrix}<br />
  1 &#038; 0\\<br />
  0&#038; 0<br />
\end{bmatrix} \text{ and } B=\begin{bmatrix}<br />
  0 &#038; 0\\<br />
  0&#038; 1<br />
\end{bmatrix}.\]
Then<br />
\[A+B=\begin{bmatrix}<br />
  1 &#038; 0\\<br />
  0&#038; 1<br />
\end{bmatrix}\]
and we have<br />
\[\det(A+B)=\begin{vmatrix}<br />
  1 &#038; 0\\<br />
  0&#038; 1<br />
\end{vmatrix}=1.\]
<p>On the other hand, the determinants of $A$ and $B$ are<br />
\[\det(A)=0 \text{ and } \det(B)=0,\]
and hence<br />
\[\det(A)+\det(B)=0\neq 1=\det(A+B).\]
<p>Therefore, the statement is false and in general we have<br />
\[\det(A+B)\neq \det(A)+\det(B).\]
<h2> Remark. </h2>
<p>When we computed the $2\times 2$ matrices, we used the formula<br />
\[\begin{vmatrix}<br />
  a &#038; b\\<br />
  c&#038; d<br />
\end{vmatrix}=ad-bc.\]
<p>This problem showed that the determinant does not preserve the addition.<br />
However, the determinant is multiplicative.<br />
In general, the following is true:<br />
\[\det(AB)=\det(A)\det(B).\]
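<p>Both the failure of additivity and the multiplicativity noted above can be confirmed on this example in a few lines of Python (a sketch, not part of the original solution; the helper names <code>det2</code> and <code>matmul2</code> are ours):</p>

```python
# 2x2 determinant and matrix product (helper names are ours).
def det2(m):
    (a, b), (c, d) = m
    return a * d - b * c

def matmul2(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[1, 0], [0, 0]]
B = [[0, 0], [0, 1]]
S = [[A[i][j] + B[i][j] for j in range(2)] for i in range(2)]

print(det2(S), det2(A) + det2(B))                # 1 0: additivity fails
print(det2(matmul2(A, B)) == det2(A) * det2(B))  # True: multiplicativity holds
```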
<button class="simplefavorite-button has-count" data-postid="1410" data-siteid="1" data-groupid="1" data-favoritecount="22" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">22</span></button><p>The post <a href="https://yutsumura.com/is-the-determinant-of-a-matrix-additive/" target="_blank">Is the Determinant of a Matrix Additive?</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/is-the-determinant-of-a-matrix-additive/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">1410</post-id>	</item>
		<item>
		<title>If the Matrix Product $AB=0$, then is $BA=0$ as Well?</title>
		<link>https://yutsumura.com/if-the-matrix-product-ab0-then-is-ba0-as-well/</link>
				<comments>https://yutsumura.com/if-the-matrix-product-ab0-then-is-ba0-as-well/#respond</comments>
				<pubDate>Fri, 02 Sep 2016 03:10:54 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[counterexample]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[matrix]]></category>
		<category><![CDATA[matrix multiplication]]></category>
		<category><![CDATA[matrix product]]></category>
		<category><![CDATA[zero matrix]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=903</guid>
				<description><![CDATA[<p>Let $A$ and $B$ be $n\times n$ matrices. Suppose that the matrix product $AB=O$, where $O$ is the $n\times n$ zero matrix. Is it true that the matrix product with opposite order $BA$ is&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/if-the-matrix-product-ab0-then-is-ba0-as-well/" target="_blank">If the Matrix Product $AB=0$, then is $BA=0$ as Well?</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<br />
<h2> Problem 98</h2>
<p>   Let $A$ and $B$ be $n\times n$ matrices. Suppose that the matrix product $AB=O$, where $O$ is the $n\times n$ zero matrix.</p>
<p> Is it true that the matrix product with opposite order $BA$ is also the zero matrix?<br />
If so, give a proof. If not, give a counterexample.</p>
<p>&nbsp;<br />
<span id="more-903"></span></p>
<h2>Solution.</h2>
<p>     	The statement is not true in general. We give a counterexample.<br />
     	Consider the following $2\times 2$ matrices.<br />
     	\[A=\begin{bmatrix}<br />
  0 &#038; 1\\<br />
  0&#038; 1<br />
\end{bmatrix} \text{ and } B=\begin{bmatrix}<br />
  1 &#038; 1\\<br />
  0&#038; 0<br />
\end{bmatrix}.\]
<p>Then we compute<br />
\[AB=\begin{bmatrix}<br />
  0 &#038; 1\\<br />
  0&#038; 1<br />
\end{bmatrix}<br />
\begin{bmatrix}<br />
  1 &#038; 1\\<br />
  0&#038; 0<br />
\end{bmatrix}<br />
=\begin{bmatrix}<br />
  0 &#038; 0\\<br />
  0&#038; 0<br />
\end{bmatrix}=O.\]
Thus the matrix product $AB$ is the $2\times 2$ zero matrix $O$.</p>
<hr />
<p>On the other hand, we compute<br />
\[BA=\begin{bmatrix}<br />
  1 &#038; 1\\<br />
  0&#038; 0<br />
\end{bmatrix}<br />
\begin{bmatrix}<br />
  0 &#038; 1\\<br />
  0&#038; 1<br />
\end{bmatrix}=\begin{bmatrix}<br />
  0 &#038; 2\\<br />
  0&#038; 0<br />
\end{bmatrix}.\]
<p>Thus the matrix product $BA$ is not the zero matrix.<br />
Therefore the statement is not true in general.</p>
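<p>Both products are easy to recompute by machine (a sketch, not part of the original solution; the helper name <code>matmul2</code> is ours):</p>

```python
# 2x2 matrix product (helper name is ours).
def matmul2(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[0, 1], [0, 1]]
B = [[1, 1], [0, 0]]
print(matmul2(A, B))  # [[0, 0], [0, 0]]: AB is the zero matrix
print(matmul2(B, A))  # [[0, 2], [0, 0]]: BA is not the zero matrix
```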
<button class="simplefavorite-button has-count" data-postid="903" data-siteid="1" data-groupid="1" data-favoritecount="72" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">72</span></button><p>The post <a href="https://yutsumura.com/if-the-matrix-product-ab0-then-is-ba0-as-well/" target="_blank">If the Matrix Product $AB=0$, then is $BA=0$ as Well?</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/if-the-matrix-product-ab0-then-is-ba0-as-well/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">903</post-id>	</item>
		<item>
		<title>Is the Product of a Nilpotent Matrix and an Invertible Matrix Nilpotent?</title>
		<link>https://yutsumura.com/is-the-product-of-a-nilpotent-matrix-and-an-invertible-matrix-nilpotent/</link>
				<comments>https://yutsumura.com/is-the-product-of-a-nilpotent-matrix-and-an-invertible-matrix-nilpotent/#respond</comments>
				<pubDate>Sat, 20 Aug 2016 03:11:56 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[counterexample]]></category>
		<category><![CDATA[invertible matrix]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[nilpotent matrix]]></category>
		<category><![CDATA[upper triangular matrix]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=631</guid>
				<description><![CDATA[<p>A square matrix $A$ is called nilpotent if there exists a positive integer $k$ such that $A^k=O$, where $O$ is the zero matrix. (a) If $A$ is a nilpotent $n \times n$ matrix and&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/is-the-product-of-a-nilpotent-matrix-and-an-invertible-matrix-nilpotent/" target="_blank">Is the Product of a Nilpotent Matrix and an Invertible Matrix Nilpotent?</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2>Problem 77</h2>
<p>A square matrix $A$ is called <strong><em>nilpotent</em></strong> if there exists a positive integer $k$ such that $A^k=O$, where $O$ is the zero matrix.</p>
<p><strong>(a)</strong> If $A$ is a nilpotent $n \times n$ matrix and $B$ is an $n\times n$ matrix such that $AB=BA$. Show that the product $AB$ is nilpotent.</p>
<p><strong>(b)</strong> Let $P$ be an invertible $n \times n$ matrix and let $N$ be a nilpotent $n\times n$ matrix. Is the product $PN$ nilpotent? If so, prove it. If not, give a counterexample.</p>
<p>&nbsp;</p>
<p><span id="more-631"></span><br />
&nbsp;<br />

<h2>Hint.</h2>
<p>For (b), the statement is false. Try to find a counterexample.<br />
A typical nilpotent matrix is an upper triangular matrix whose diagonal entries are all zero.</p>
<h2>Proof.</h2>
<h3>(a) Show that $AB$ is nilpotent</h3>
<p>Since $A$ is nilpotent, there exists a positive integer $k$ such that $A^k=O$. Then we have<br />
\[(AB)^k=(AB)(AB)\cdots (AB)=A^kB^k=OB^k=O.\]
<p>Here in the second equality, we repeatedly used the assumption that $AB=BA$ to move every factor of $A$ to the left.<br />
Thus we have $(AB)^k=O$, hence the product matrix $AB$ is nilpotent.</p>
<p>&nbsp;</p>
<h3>(b) Is $PN$ nilpotent?</h3>
<p>In general, the product $PN$ of an invertible matrix $P$ and a nilpotent matrix $N$ is not nilpotent.<br />
Here is a counterexample.<br />
Let<br />
\[P=\begin{bmatrix}<br />
1 &amp; 0 &amp; 0 \\<br />
1 &amp;1 &amp;0 \\<br />
0 &amp; 0 &amp; 1<br />
\end{bmatrix} \text{ and }<br />
N=\begin{bmatrix}<br />
0 &amp; 1 &amp; 1 \\<br />
0 &amp;0 &amp;1 \\<br />
0 &amp; 0 &amp; 0<br />
\end{bmatrix}.\]
<p>Then the matrix $P$ is invertible since $\det(P)=1$.<br />
(Note that $P$ is a lower triangular matrix. So the determinant is the product of diagonal entries.)</p>
<hr />
<p>Also, it is easy to see by direct computation that $N^3=O$, hence $N$ is nilpotent. Indeed,<br />
\[N^2=\begin{bmatrix}<br />
0 &amp; 0 &amp; 1 \\<br />
0 &amp;0 &amp;0 \\<br />
0 &amp; 0 &amp; 0<br />
\end{bmatrix} \] and<br />
\[<br />
N^3=N^2N=\begin{bmatrix}<br />
0 &amp; 0 &amp; 1 \\<br />
0 &amp;0 &amp;0 \\<br />
0 &amp; 0 &amp; 0<br />
\end{bmatrix}<br />
\begin{bmatrix}<br />
0 &amp; 1 &amp; 1 \\<br />
0 &amp;0 &amp;1 \\<br />
0 &amp; 0 &amp; 0<br />
\end{bmatrix}=O.\]
<hr />
<p>Now the product $PN$ is<br />
\[PN=\begin{bmatrix}<br />
0 &amp; 1 &amp; 1 \\<br />
0 &amp;1 &amp;2 \\<br />
0 &amp; 0 &amp; 0<br />
\end{bmatrix}.\]
We show that $PN$ is not nilpotent.<br />
We have<br />
\[(PN)^2=\begin{bmatrix}<br />
0 &amp; 1 &amp; 2 \\<br />
0 &amp;1 &amp;2 \\<br />
0 &amp; 0 &amp; 0<br />
\end{bmatrix}\]
\[(PN)^3=(PN)^2(PN)=\begin{bmatrix}<br />
0 &amp; 1 &amp; 2 \\<br />
0 &amp;1 &amp;2 \\<br />
0 &amp; 0 &amp; 0<br />
\end{bmatrix}\begin{bmatrix}<br />
0 &amp; 1 &amp; 1 \\<br />
0 &amp;1 &amp;2 \\<br />
0 &amp; 0 &amp; 0<br />
\end{bmatrix}<br />
=\begin{bmatrix}<br />
0 &amp; 1 &amp; 2 \\<br />
0 &amp;1 &amp;2 \\<br />
0 &amp; 0 &amp; 0<br />
\end{bmatrix}.\]
<p>Since $(PN)^3=(PN)^2$, it follows by induction that<br />
\[(PN)^k=\begin{bmatrix}<br />
0 &amp; 1 &amp; 2 \\<br />
0 &amp;1 &amp;2 \\<br />
0 &amp; 0 &amp; 0<br />
\end{bmatrix}\neq O \text{ for all } k \geq 2.\]
<p>Thus $PN$ is not nilpotent. In conclusion, the product $PN$ of the invertible matrix $P$ and the nilpotent matrix $N$ is not nilpotent.</p>
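<p>The computations $N^3=O$ and $(PN)^3=(PN)^2\neq O$ can be verified by machine (a sketch, not part of the original solution; the helper name <code>matmul</code> is ours):</p>

```python
# Square matrix product of matching size (helper name is ours).
def matmul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

P = [[1, 0, 0], [1, 1, 0], [0, 0, 1]]
N = [[0, 1, 1], [0, 0, 1], [0, 0, 0]]

N3 = matmul(matmul(N, N), N)
print(N3)  # all zeros: N is nilpotent

PN = matmul(P, N)
PN2 = matmul(PN, PN)
PN3 = matmul(PN2, PN)
print(PN2 == PN3)  # True: the powers of PN stabilize at a nonzero matrix
```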
<button class="simplefavorite-button has-count" data-postid="631" data-siteid="1" data-groupid="1" data-favoritecount="20" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">20</span></button><p>The post <a href="https://yutsumura.com/is-the-product-of-a-nilpotent-matrix-and-an-invertible-matrix-nilpotent/" target="_blank">Is the Product of a Nilpotent Matrix and an Invertible Matrix Nilpotent?</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/is-the-product-of-a-nilpotent-matrix-and-an-invertible-matrix-nilpotent/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">631</post-id>	</item>
	</channel>
</rss>
