<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	
	xmlns:georss="http://www.georss.org/georss"
	xmlns:geo="http://www.w3.org/2003/01/geo/wgs84_pos#"
	>

<channel>
	<title>symmetric matrix &#8211; Problems in Mathematics</title>
	<atom:link href="https://yutsumura.com/tag/symmetric-matrix/feed/" rel="self" type="application/rss+xml" />
	<link>https://yutsumura.com</link>
	<description></description>
	<lastBuildDate>Mon, 12 Feb 2018 07:19:41 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=5.3.4</generator>

<image>
	<url>https://i2.wp.com/yutsumura.com/wp-content/uploads/2016/12/cropped-question-logo.jpg?fit=32%2C32&#038;ssl=1</url>
	<title>symmetric matrix &#8211; Problems in Mathematics</title>
	<link>https://yutsumura.com</link>
	<width>32</width>
	<height>32</height>
</image> 
<site xmlns="com-wordpress:feed-additions:1">114989322</site>	<item>
		<title>Find All Symmetric Matrices satisfying the Equation</title>
		<link>https://yutsumura.com/find-all-symmetric-matrices-satisfying-the-equation/</link>
				<comments>https://yutsumura.com/find-all-symmetric-matrices-satisfying-the-equation/#respond</comments>
				<pubDate>Mon, 12 Feb 2018 16:05:59 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[exam]]></category>
		<category><![CDATA[free variable]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[Ohio State]]></category>
		<category><![CDATA[Ohio State.LA]]></category>
		<category><![CDATA[symmetric matrix]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=6882</guid>
				<description><![CDATA[<p>Find all $2\times 2$ symmetric matrices $A$ satisfying $A\begin{bmatrix} 1 \\ -1 \end{bmatrix} = \begin{bmatrix} 2 \\ 3 \end{bmatrix}$. Express your solution using free variable(s). &#160; Solution. Let $A=\begin{bmatrix} a &#038; b\\ c&#038; d&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/find-all-symmetric-matrices-satisfying-the-equation/" target="_blank">Find All Symmetric Matrices satisfying the Equation</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 697</h2>
<p>Find all $2\times 2$ symmetric matrices $A$ satisfying $A\begin{bmatrix}<br />
  1 \\<br />
  -1<br />
\end{bmatrix}<br />
=<br />
\begin{bmatrix}<br />
  2 \\<br />
  3<br />
\end{bmatrix}$. Express your solution using free variable(s).</p>
<p>&nbsp;<br />
<span id="more-6882"></span></p>
<h2>Solution.</h2>
<p>	Let $A=\begin{bmatrix}<br />
  a &#038; b\\<br />
  c&#038; d<br />
\end{bmatrix}$ be a $2\times 2$ matrix satisfying the conditions. Then as $A$ is symmetric, we have $A^{\trans}=A$. This yields that $b=c$.<br />
So, we find all matrices $A=\begin{bmatrix}<br />
  a &#038; b\\<br />
  b&#038; d<br />
\end{bmatrix}$ satisfying $A\begin{bmatrix}<br />
  1 \\<br />
  -1<br />
\end{bmatrix}<br />
=<br />
\begin{bmatrix}<br />
  2 \\<br />
  3<br />
\end{bmatrix}$.<br />
We have<br />
\begin{align*}<br />
\begin{bmatrix}<br />
  2 \\<br />
  3<br />
\end{bmatrix}=\begin{bmatrix}<br />
  a &#038; b\\<br />
  b&#038; d<br />
\end{bmatrix}\begin{bmatrix}<br />
  1 \\<br />
  -1<br />
\end{bmatrix}<br />
=\begin{bmatrix}<br />
  a-b \\<br />
  b-d<br />
\end{bmatrix}.<br />
\end{align*}<br />
Hence, we need $a-b=2$ and $b-d=3$.<br />
Equivalently, $a=b+2, d=b-3$. So, we have<br />
\begin{align*}<br />
A=\begin{bmatrix}<br />
  a &#038; b\\<br />
  b&#038; d<br />
\end{bmatrix}=\begin{bmatrix}<br />
  b+2 &#038; b\\<br />
  b&#038; b-3<br />
\end{bmatrix}=b\begin{bmatrix}<br />
  1 &#038; 1\\<br />
  1&#038; 1<br />
\end{bmatrix}+\begin{bmatrix}<br />
  2 &#038; 0\\<br />
  0&#038; -3<br />
\end{bmatrix},<br />
\end{align*}<br />
where $b$ is a free variable.</p>
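As a sanity check (not part of the original exam solution), the family of matrices found above can be verified numerically; a minimal sketch in NumPy, with arbitrary sample values of the free variable $b$:

```python
import numpy as np

# Illustrative check (an assumption-free restatement of the solution):
# for several values of the free variable b, the matrix
# A = b*[[1,1],[1,1]] + [[2,0],[0,-3]] is symmetric and maps (1,-1) to (2,3).
for b in [-2.0, 0.0, 1.5, 7.0]:
    A = b * np.array([[1.0, 1.0], [1.0, 1.0]]) + np.array([[2.0, 0.0], [0.0, -3.0]])
    assert np.allclose(A, A.T)                                  # A is symmetric
    assert np.allclose(A @ np.array([1.0, -1.0]), [2.0, 3.0])   # A(1,-1) = (2,3)
print("all checks passed")
```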
<h2>Common Mistake</h2>
<p>This is a midterm exam problem of Linear Algebra at the Ohio State University.</p>
<p>One common mistake is not using the assumption that $A$ is symmetric, or using it incorrectly.<br />
A matrix $A$ is symmetric if $A^{\trans}=A$. For a 2 by 2 matrix, this yields that the off-diagonal entries must be the same.<br />
However, note that the diagonal entries can be distinct. Some students assumed the same diagonal entries and concluded that there are no matrices satisfying the conditions.</p>
<button class="simplefavorite-button has-count" data-postid="6882" data-siteid="1" data-groupid="1" data-favoritecount="28" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">28</span></button><p>The post <a href="https://yutsumura.com/find-all-symmetric-matrices-satisfying-the-equation/" target="_blank">Find All Symmetric Matrices satisfying the Equation</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/find-all-symmetric-matrices-satisfying-the-equation/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">6882</post-id>	</item>
		<item>
		<title>If a Symmetric Matrix is in Reduced Row Echelon Form, then Is it Diagonal?</title>
		<link>https://yutsumura.com/if-a-symmetric-matrix-is-in-reduced-row-echelon-form-then-is-it-diagonal/</link>
				<comments>https://yutsumura.com/if-a-symmetric-matrix-is-in-reduced-row-echelon-form-then-is-it-diagonal/#respond</comments>
				<pubDate>Mon, 25 Dec 2017 05:52:57 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[diagonal matrix]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[reduced row echelon form matrix]]></category>
		<category><![CDATA[symmetric matrix]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=6331</guid>
				<description><![CDATA[<p>Recall that a matrix $A$ is symmetric if $A^\trans = A$, where $A^\trans$ is the transpose of $A$. Is it true that if $A$ is a symmetric matrix and in reduced row echelon form,&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/if-a-symmetric-matrix-is-in-reduced-row-echelon-form-then-is-it-diagonal/" target="_blank">If a Symmetric Matrix is in Reduced Row Echelon Form, then Is it Diagonal?</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 647</h2>
<p>Recall that a matrix $A$ is <strong>symmetric</strong> if $A^\trans = A$, where $A^\trans$ is the transpose of $A$. </p>
<p>Is it true that if $A$ is a symmetric matrix and in reduced row echelon form, then $A$ is diagonal?  If so, prove it. </p>
<p>Otherwise, provide a counterexample.</p>
<p>&nbsp;<br />
<span id="more-6331"></span></p>
<h2> Proof. </h2>
<p>	This is true. </p>
<p>If $A$ is in reduced row echelon form, then every entry below the diagonal must be 0, since the leading entry of each row lies strictly to the right of the leading entry of the row above it.<br />
That is, the entry $a_{i j} = 0$ for all $i > j$.  </p>
<p>If $A$ is, additionally, symmetric, then for $i &lt; j$ we also have $a_{j i} = a_{i j} = 0$.  </p>
<p>These two facts together mean that $a_{i j} = 0$ whenever $i \neq j$. </p>
<p>In this case, the only non-zero terms in the matrix $A$ lie on the diagonal, and so $A$ is a diagonal matrix.</p>
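The proposition can also be spot-checked by brute force; a small illustrative sketch (assuming SymPy is available) that enumerates all $2\times 2$ matrices with entries in $\{0,1\}$ and confirms that every symmetric fixed point of `rref` is diagonal:

```python
from itertools import product
from sympy import Matrix

# Brute-force illustration (not the proof): among all 2x2 matrices with
# entries in {0, 1}, every one that is both symmetric and a fixed point of
# rref (i.e., already in reduced row echelon form) is diagonal.
for entries in product([0, 1], repeat=4):
    M = Matrix(2, 2, list(entries))
    in_rref = (M.rref()[0] == M)
    symmetric = (M.T == M)
    if in_rref and symmetric:
        assert M[0, 1] == 0 and M[1, 0] == 0   # off-diagonal entries vanish
print("every symmetric RREF matrix in the sample is diagonal")
```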
<button class="simplefavorite-button has-count" data-postid="6331" data-siteid="1" data-groupid="1" data-favoritecount="35" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">35</span></button><p>The post <a href="https://yutsumura.com/if-a-symmetric-matrix-is-in-reduced-row-echelon-form-then-is-it-diagonal/" target="_blank">If a Symmetric Matrix is in Reduced Row Echelon Form, then Is it Diagonal?</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/if-a-symmetric-matrix-is-in-reduced-row-echelon-form-then-is-it-diagonal/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">6331</post-id>	</item>
		<item>
		<title>Prove that $\mathbf{v} \mathbf{v}^\trans$ is a Symmetric Matrix for any Vector $\mathbf{v}$</title>
		<link>https://yutsumura.com/prove-that-mathbfv-mathbfvtrans-is-a-symmetric-matrix-for-any-vector-mathbfv/</link>
				<comments>https://yutsumura.com/prove-that-mathbfv-mathbfvtrans-is-a-symmetric-matrix-for-any-vector-mathbfv/#respond</comments>
				<pubDate>Mon, 25 Dec 2017 02:13:56 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[symmetric matrix]]></category>
		<category><![CDATA[transpose]]></category>
		<category><![CDATA[transpose of a vector]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=6301</guid>
				<description><![CDATA[<p>Let $\mathbf{v}$ be an $n \times 1$ column vector. Prove that $\mathbf{v} \mathbf{v}^\trans$ is a symmetric matrix. &#160; Definition (Symmetric Matrix). A matrix $A$ is called symmetric if $A^{\trans}=A$. In terms of entries, an&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/prove-that-mathbfv-mathbfvtrans-is-a-symmetric-matrix-for-any-vector-mathbfv/" target="_blank">Prove that $\mathbf{v} \mathbf{v}^\trans$ is a Symmetric Matrix for any Vector $\mathbf{v}$</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 640</h2>
<p>Let $\mathbf{v}$ be an $n \times 1$ column vector.</p>
<p>Prove that $\mathbf{v} \mathbf{v}^\trans$ is a symmetric matrix.</p>
<p>&nbsp;<br />
<span id="more-6301"></span></p>
<h2>Definition (Symmetric Matrix).</h2>
<p>A matrix $A$ is called <strong>symmetric</strong> if $A^{\trans}=A$.</p>
<p>In terms of entries, an $n\times n$ matrix $A=(a_{ij})$ is symmetric if $a_{ij}=a_{ji}$ for all $1 \leq i, j \leq n$.</p>
<h2> Proof. </h2>
<p>	Let $\mathbf{v} = \begin{bmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{bmatrix}$.  Then we have<br />
	\begin{align*}<br />
\mathbf{v} \mathbf{v}^\trans = \begin{bmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{bmatrix}\begin{bmatrix}<br />
  v_1 &#038; v_2 &#038; \cdots &#038; v_n<br />
  \end{bmatrix}=<br />
  \begin{bmatrix} v_1 v_1 &#038; v_1 v_2 &#038; \cdots &#038; v_1 v_n \\ v_2 v_1 &#038; v_2 v_2 &#038; \cdots &#038; v_2 v_n \\ \vdots &#038; \vdots &#038; \ddots &#038; \vdots \\ v_n v_1 &#038; v_n v_2 &#038; \cdots &#038; v_n v_n \end{bmatrix}.<br />
\end{align*}</p>
<p>	  In particular, the $(i, j)$-th component is<br />
\[(\mathbf{v} \mathbf{v}^\trans)_{i j} = v_i v_j = v_j v_i = (\mathbf{v} \mathbf{v}^\trans)_{j i}.\]
	This shows that the matrix $\mathbf{v} \mathbf{v}^\trans$ is symmetric.</p>
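A quick numerical companion to the proof (an illustration only): for a few random column vectors, the outer product equals its own transpose.

```python
import numpy as np

# For several random n x 1 column vectors v, check that the outer product
# v v^T is equal to its transpose, i.e., symmetric.
rng = np.random.default_rng(0)
for n in [1, 3, 5]:
    v = rng.standard_normal((n, 1))   # n x 1 column vector
    M = v @ v.T                       # n x n outer product
    assert np.allclose(M, M.T)        # symmetric in every case
print("v v^T is symmetric in every sampled case")
```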
<button class="simplefavorite-button has-count" data-postid="6301" data-siteid="1" data-groupid="1" data-favoritecount="22" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">22</span></button><p>The post <a href="https://yutsumura.com/prove-that-mathbfv-mathbfvtrans-is-a-symmetric-matrix-for-any-vector-mathbfv/" target="_blank">Prove that $\mathbf{v} \mathbf{v}^\trans$ is a Symmetric Matrix for any Vector $\mathbf{v}$</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/prove-that-mathbfv-mathbfvtrans-is-a-symmetric-matrix-for-any-vector-mathbfv/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">6301</post-id>	</item>
		<item>
		<title>Eigenvalues of $2\times 2$ Symmetric Matrices are Real by Considering Characteristic Polynomials</title>
		<link>https://yutsumura.com/eigenvalues-of-2times-2-symmetric-matrices-are-real-by-considering-characteristic-polynomials/</link>
				<comments>https://yutsumura.com/eigenvalues-of-2times-2-symmetric-matrices-are-real-by-considering-characteristic-polynomials/#respond</comments>
				<pubDate>Thu, 16 Nov 2017 02:35:35 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[characteristic polynomial]]></category>
		<category><![CDATA[discriminant]]></category>
		<category><![CDATA[eigenvalue]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[quadratic formula]]></category>
		<category><![CDATA[real eigenvalue]]></category>
		<category><![CDATA[symmetric matrix]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=5352</guid>
				<description><![CDATA[<p>Let $A$ be a $2\times 2$ real symmetric matrix. Prove that all the eigenvalues of $A$ are real numbers by considering the characteristic polynomial of $A$. &#160; Proof. Let $A=\begin{bmatrix} a&#038; b \\ c&#038;&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/eigenvalues-of-2times-2-symmetric-matrices-are-real-by-considering-characteristic-polynomials/" target="_blank">Eigenvalues of $2\times 2$ Symmetric Matrices are Real by Considering Characteristic Polynomials</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 609</h2>
<p>		Let $A$ be a $2\times 2$ real symmetric matrix.<br />
		Prove that all the eigenvalues of $A$ are real numbers by considering the characteristic polynomial of $A$.</p>
<p>&nbsp;<br />
<span id="more-5352"></span></p>
<h2> Proof. </h2>
<p>			Let $A=\begin{bmatrix}<br />
			   a&#038; b \\<br />
			      c&#038; d<br />
	        \end{bmatrix}$.<br />
	        Then as $A$ is a symmetric matrix, we have $A^{\trans}=A$.<br />
	        This implies that<br />
	        \[\begin{bmatrix}<br />
	        a&#038; c \\<br />
	        b&#038; d<br />
	        \end{bmatrix}=\begin{bmatrix}<br />
	        a&#038; b \\<br />
	        c&#038; d<br />
	        \end{bmatrix}.\]
	        Hence we have $b=c$ by comparing entries.</p>
<hr />
<p>	        Now, we find the characteristic polynomial $p(t)$ of $A$.<br />
	       We have<br />
	       \begin{align*}<br />
	       p(t)&#038;=\det(A-t I)=\begin{vmatrix}<br />
	       	a-t &#038; b\\<br />
	       	b&#038; d-t<br />
	       \end{vmatrix}\\[6pt]
	       &#038;=(a-t)(d-t)-b^2\\<br />
	       &#038;=t^2-(a+d)t+ad-b^2.<br />
	    \end{align*}</p>
<p>		Note that the eigenvalues of $A$ are roots of the characteristic polynomial $p(t)$. Hence, it suffices to show that the roots of $p(t)$ are real numbers.<br />
		A quadratic polynomial with real coefficients has only real roots if and only if its discriminant is nonnegative.<br />
		The discriminant of $p(t)$ is given by<br />
		\begin{align*}<br />
		(a+d)^2-4(ad-b^2)&#038;=a^2+2ad+d^2-4ad+4b^2\\<br />
		&#038;=a^2-2ad+d^2+4b^2\\<br />
		&#038;=(a-d)^2+4b^2.	\end{align*}<br />
	       Observe that the last expression is the sum of two squares of real numbers. Hence the discriminant of $p(t)$ is nonnegative.</p>
<p>	       We conclude that every $2\times 2$ symmetric matrix has only real eigenvalues.</p>
<h3>Remark</h3>
<p>We also could find the eigenvalues directly. By the quadratic formula, the eigenvalues of $A$ are<br />
	       \[\frac{a+d\pm\sqrt{(a+d)^2-4(ad-b^2)}}{2}=\frac{a+d\pm \sqrt{(a-d)^2+4b^2}}{2}\]
	and as the number inside the square root (the discriminant) is nonnegative, we conclude that the eigenvalues are real.</p>
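The discriminant computation can be spot-checked numerically; a hedged sketch (random sample, not a proof) confirming both that $(a-d)^2+4b^2 \geq 0$ and that the computed eigenvalues have zero imaginary part:

```python
import numpy as np

# For random real symmetric 2x2 matrices, the discriminant of the
# characteristic polynomial is (a-d)^2 + 4b^2 >= 0, and the eigenvalues
# computed by NumPy are real.
rng = np.random.default_rng(1)
for _ in range(100):
    a, b, d = rng.standard_normal(3)
    A = np.array([[a, b], [b, d]])
    disc = (a - d) ** 2 + 4 * b ** 2
    assert disc >= 0                      # sum of two squares
    eig = np.linalg.eigvals(A)
    assert np.allclose(np.imag(eig), 0)   # eigenvalues are real
print("all sampled eigenvalues are real")
```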
<button class="simplefavorite-button has-count" data-postid="5352" data-siteid="1" data-groupid="1" data-favoritecount="25" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">25</span></button><p>The post <a href="https://yutsumura.com/eigenvalues-of-2times-2-symmetric-matrices-are-real-by-considering-characteristic-polynomials/" target="_blank">Eigenvalues of $2\times 2$ Symmetric Matrices are Real by Considering Characteristic Polynomials</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/eigenvalues-of-2times-2-symmetric-matrices-are-real-by-considering-characteristic-polynomials/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">5352</post-id>	</item>
		<item>
		<title>The Inverse Matrix of a Symmetric Matrix whose Diagonal Entries are All Positive</title>
		<link>https://yutsumura.com/the-inverse-matrix-of-a-symmetric-matrix-whose-diagonal-entries-are-all-positive/</link>
				<comments>https://yutsumura.com/the-inverse-matrix-of-a-symmetric-matrix-whose-diagonal-entries-are-all-positive/#comments</comments>
				<pubDate>Sat, 04 Nov 2017 03:27:55 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[diagonal entry]]></category>
		<category><![CDATA[inverse matrix]]></category>
		<category><![CDATA[inverse matrix of a 2 by 2 matrix]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[symmetric matrix]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=5227</guid>
				<description><![CDATA[<p>Let $A$ be a real symmetric matrix whose diagonal entries are all positive real numbers. Is it true that all of the diagonal entries of the inverse matrix $A^{-1}$ are also positive? If&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/the-inverse-matrix-of-a-symmetric-matrix-whose-diagonal-entries-are-all-positive/" target="_blank">The Inverse Matrix of a Symmetric Matrix whose Diagonal Entries are All Positive</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 599</h2>
<p>Let $A$ be a real symmetric matrix whose diagonal entries are all positive real numbers.</p>
<p>	Is it true that all of the diagonal entries of the inverse matrix $A^{-1}$ are also positive?<br />
	If so, prove it. Otherwise, give a counterexample.</p>
<p>&nbsp;<br />
<span id="more-5227"></span></p>
<h2> Solution. </h2>
<p>		The statement is in general false. We give a counterexample.</p>
<p>		Let us consider the following $2\times 2$ matrix:<br />
		\[A=\begin{bmatrix}<br />
	  1 &#038; 2\\<br />
	  2&#038; 1<br />
	\end{bmatrix}.\]
	The matrix $A$ satisfies the required conditions, that is, $A$ is symmetric and its diagonal entries are positive.</p>
<p>	The determinant $\det(A)=(1)(1)-(2)(2)=-3$ and the inverse of $A$ is given by<br />
	\[A^{-1}=\frac{1}{-3}\begin{bmatrix}<br />
	  1 &#038; -2\\<br />
	  -2&#038; 1<br />
	\end{bmatrix}=\begin{bmatrix}<br />
	  -1/3 &#038; 2/3\\<br />
	  2/3&#038; -1/3<br />
	\end{bmatrix}\]
	by the formula for the inverse of a $2\times 2$ matrix.</p>
<p>	This shows that the diagonal entries of the inverse matrix $A^{-1}$ are negative.</p>
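The counterexample is easy to verify by machine; a minimal NumPy sketch (an illustration of the computation above):

```python
import numpy as np

# Verify the counterexample: A = [[1,2],[2,1]] is symmetric with positive
# diagonal entries, yet both diagonal entries of A^{-1} are negative.
A = np.array([[1.0, 2.0], [2.0, 1.0]])
assert np.allclose(A, A.T) and A[0, 0] > 0 and A[1, 1] > 0
Ainv = np.linalg.inv(A)
assert np.allclose(Ainv, [[-1/3, 2/3], [2/3, -1/3]])   # matches the hand computation
assert Ainv[0, 0] < 0 and Ainv[1, 1] < 0               # negative diagonal entries
print("counterexample verified")
```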
<button class="simplefavorite-button has-count" data-postid="5227" data-siteid="1" data-groupid="1" data-favoritecount="24" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">24</span></button><p>The post <a href="https://yutsumura.com/the-inverse-matrix-of-a-symmetric-matrix-whose-diagonal-entries-are-all-positive/" target="_blank">The Inverse Matrix of a Symmetric Matrix whose Diagonal Entries are All Positive</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/the-inverse-matrix-of-a-symmetric-matrix-whose-diagonal-entries-are-all-positive/feed/</wfw:commentRss>
		<slash:comments>2</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">5227</post-id>	</item>
		<item>
		<title>The set of $2\times 2$ Symmetric Matrices is a Subspace</title>
		<link>https://yutsumura.com/the-set-of-2times-2-symmetric-matrices-is-a-subspace/</link>
				<comments>https://yutsumura.com/the-set-of-2times-2-symmetric-matrices-is-a-subspace/#comments</comments>
				<pubDate>Tue, 17 Oct 2017 05:28:04 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[basis]]></category>
		<category><![CDATA[dimension of a vector space]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[subspace]]></category>
		<category><![CDATA[symmetric matrix]]></category>
		<category><![CDATA[vector space]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=5112</guid>
				<description><![CDATA[<p>Let $V$ be the vector space over $\R$ of all real $2\times 2$ matrices. Let $W$ be the subset of $V$ consisting of all symmetric matrices. (a) Prove that $W$ is a subspace of&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/the-set-of-2times-2-symmetric-matrices-is-a-subspace/" target="_blank">The set of $2\times 2$ Symmetric Matrices is a Subspace</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 586</h2>
<p>	Let $V$ be the vector space over $\R$ of all real $2\times 2$ matrices.<br />
	Let $W$ be the subset of $V$ consisting of all symmetric matrices.</p>
<p><strong>(a)</strong> Prove that $W$ is a subspace of $V$.</p>
<p><strong>(b)</strong> Find a basis of $W$.</p>
<p><strong>(c)</strong> Determine the dimension of $W$.</p>
<p>&nbsp;<br />
<span id="more-5112"></span></p>
<h2> Proof. </h2>
<p>Recall that $A$ is <strong>symmetric</strong> if $A^{\trans}=A$.</p>
<h3>(a) Prove that $W$ is a subspace of $V$.</h3>
<p>	 We verify the following subspace criteria:</p>
<div style="padding: 16px; border: none 3px #4169e1; border-radius: 10px; background-color: #f0f8ff; margin-top: 30px; margin-bottom: 30px;">
<strong>Subspace Criteria</strong>.</p>
<ol>
<li>The zero vector of $V$ is in $W$.</li>
<li>For any $A, B\in W$, the sum $A+B\in W$.</li>
<li>For any $A\in W$ and $r\in \R$, the scalar product $rA\in W$.</li>
</ol>
</div>
<p>		The zero vector in $V$ is the $2\times 2$ zero matrix $O$.<br />
		It is clear that $O^{\trans}=O$, and hence $O$ is symmetric.<br />
		Thus $O\in W$ and condition 1 is met.</p>
<hr />
<p>		Let $A, B$ be arbitrary elements in $W$.<br />
		That is, $A$ and $B$ are symmetric matrices.<br />
		We show that the sum $A+B$ is also symmetric.<br />
		We have<br />
		\begin{align*}<br />
		(A+B)^{\trans}=A^{\trans}+B^{\trans}=A+B.<br />
		\end{align*}<br />
		The second equality follows as $A, B$ are symmetric.<br />
		Hence $A+B$ is symmetric and $A+B\in W$.<br />
		Condition 2 is met.</p>
<hr />
<p>		To check condition 3, let $A\in W$ and $r\in \R$.<br />
		We have<br />
		\begin{align*}<br />
		(rA)^{\trans}=rA^{\trans}=rA,<br />
		\end{align*}<br />
		where the second equality follows since $A$ is symmetric.<br />
		This implies that $rA$ is symmetric, and hence $rA\in W$.<br />
		So condition 3 is met, and we conclude that $W$ is a subspace of $V$ by subspace criteria.</p>
<h3>(b) Find a basis of $W$.</h3>
<p>Let<br />
		\[A=\begin{bmatrix}<br />
		  a_{11} &#038; a_{12}\\<br />
		  a_{21}&#038; a_{22}<br />
		\end{bmatrix}\]
		be an arbitrary element in the subspace $W$.<br />
		Then since $A^{\trans}=A$, we have<br />
		\[\begin{bmatrix}<br />
		  a_{11} &#038; a_{21}\\<br />
		  a_{12}&#038; a_{22}<br />
		\end{bmatrix}=\begin{bmatrix}<br />
		  a_{11} &#038; a_{12}\\<br />
		  a_{21}&#038; a_{22}<br />
		\end{bmatrix}.\]
		This implies that $a_{12}=a_{21}$, and hence<br />
		\begin{align*}<br />
		A&#038;=\begin{bmatrix}<br />
		  a_{11} &#038; a_{12}\\<br />
		  a_{12}&#038; a_{22}<br />
		\end{bmatrix}\\[6pt]
		&#038;=a_{11}\begin{bmatrix}<br />
		  1 &#038; 0\\<br />
		  0&#038; 0<br />
		\end{bmatrix}+a_{12}\begin{bmatrix}<br />
		  0 &#038; 1\\<br />
		  1&#038; 0<br />
		\end{bmatrix}+a_{22}\begin{bmatrix}<br />
		  0 &#038; 0\\<br />
		  0&#038; 1<br />
		\end{bmatrix}.<br />
		\end{align*}</p>
<p>			Let $B=\{v_1, v_2, v_3\}$, where $v_1, v_2, v_3$ are the $2\times 2$ matrices appearing in the above expression of $A$ as a linear combination.<br />
			Note that these matrices are symmetric.<br />
			Hence we showed that any element in $W$ is a linear combination of matrices in $B$.<br />
			Thus $B$ is a spanning set for the subspace $W$.</p>
<hr />
<p>			We show that $B$ is linearly independent.<br />
			Suppose that we have<br />
			\[c_1v_1+c_2v_2+c_3v_3=\begin{bmatrix}<br />
		  0 &#038; 0\\<br />
		  0&#038; 0<br />
		\end{bmatrix}.\]
		Then it follows that<br />
		\[\begin{bmatrix}<br />
		  c_1 &#038; c_2\\<br />
		  c_2&#038; c_3<br />
		\end{bmatrix}=\begin{bmatrix}<br />
		  0 &#038; 0\\<br />
		  0&#038; 0<br />
		\end{bmatrix}.\]
		Thus $c_1=c_2=c_3=0$, and the set $B$ is linearly independent.</p>
<p>		As $B$ is a linearly independent spanning set, we conclude that $B$ is a basis for the subspace $W$.</p>
<h3>(c) Determine the dimension of $W$.</h3>
<p> Recall that the dimension of a subspace is the number of vectors in a basis of the subspace.</p>
<p>		In part (b), we found that $B=\{v_1, v_2, v_3\}$ is a basis for the subspace $W$.<br />
		As $B$ consists of three vectors, the dimension of $W$ is $3$.</p>
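Parts (b) and (c) can be checked numerically; a hedged sketch (one arbitrary symmetric matrix, chosen for illustration) confirming that the three basis matrices reproduce it and are linearly independent:

```python
import numpy as np

# The basis matrices v1, v2, v3 found in part (b).
v1 = np.array([[1.0, 0.0], [0.0, 0.0]])
v2 = np.array([[0.0, 1.0], [1.0, 0.0]])
v3 = np.array([[0.0, 0.0], [0.0, 1.0]])

# An arbitrary symmetric matrix decomposes with coefficients (a11, a12, a22).
A = np.array([[4.0, -2.0], [-2.0, 7.0]])
assert np.allclose(A, 4.0 * v1 + (-2.0) * v2 + 7.0 * v3)

# The flattened basis matrices have rank 3, so B is linearly independent
# and dim W = 3.
stacked = np.stack([v1.ravel(), v2.ravel(), v3.ravel()])
assert np.linalg.matrix_rank(stacked) == 3
print("B spans the sample and is independent; dim W = 3")
```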
<h2> Related Question (Skew-Symmetric Matrices) </h2>
<p>A matrix $A$ is called <strong>skew-symmetric</strong> if $A^{\trans}=-A$.</p>
<div style="padding: 16px; border: none 3px #4169e1; border-radius: 10px; background-color: #f0f8ff; margin-top: 30px; margin-bottom: 30px;">
<strong>Problem</strong>.<br />
Let $V$ be the vector space of all $2\times 2$ matrices.<br />
Let $W$ be a subset of $V$ consisting of all $2\times 2$ skew-symmetric matrices. </p>
<p>Prove that $W$ is a subspace of $V$ and also find a basis and dimension of $W$.
</p></div>
<p>The solution is given in the post &#8628;<br />
<a href="//yutsumura.com/subspace-of-skew-symmetric-matrices-and-its-dimension/" rel="noopener" target="_blank">Subspace of Skew-Symmetric Matrices and Its Dimension</a></p>
<button class="simplefavorite-button has-count" data-postid="5112" data-siteid="1" data-groupid="1" data-favoritecount="48" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">48</span></button><p>The post <a href="https://yutsumura.com/the-set-of-2times-2-symmetric-matrices-is-a-subspace/" target="_blank">The set of $2\times 2$ Symmetric Matrices is a Subspace</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/the-set-of-2times-2-symmetric-matrices-is-a-subspace/feed/</wfw:commentRss>
		<slash:comments>1</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">5112</post-id>	</item>
		<item>
		<title>Linear Algebra Midterm 1 at the Ohio State University (3/3)</title>
		<link>https://yutsumura.com/linear-algebra-midterm-1-at-the-ohio-state-university-33/</link>
				<comments>https://yutsumura.com/linear-algebra-midterm-1-at-the-ohio-state-university-33/#comments</comments>
				<pubDate>Mon, 25 Sep 2017 04:28:52 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[exam]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[linear combination]]></category>
		<category><![CDATA[linearly dependent]]></category>
		<category><![CDATA[nonsingular matrix]]></category>
		<category><![CDATA[Ohio State]]></category>
		<category><![CDATA[Ohio State.LA]]></category>
		<category><![CDATA[symmetric matrix]]></category>
		<category><![CDATA[true or false]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=4948</guid>
				<description><![CDATA[<p>The following problems are Midterm 1 problems of Linear Algebra (Math 2568) at the Ohio State University in Autumn 2017. There were 9 problems that covered Chapter 1 of our textbook (Johnson, Riess, Arnold).&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/linear-algebra-midterm-1-at-the-ohio-state-university-33/" target="_blank">Linear Algebra Midterm 1 at the Ohio State University (3/3)</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 572</h2>
<p>The following problems are Midterm 1 problems of Linear Algebra (Math 2568) at the Ohio State University in Autumn 2017.<br />
There were 9 problems that covered Chapter 1 of <a href="http://amzn.to/2yn7XFa" rel="noopener" target="_blank">our textbook (Johnson, Riess, Arnold)</a>.<br />
The time limit was 55 minutes.</p>
<hr />
<p>This post is Part 3 and contains Problem 7, 8, and 9.<br />
Check out <a href="//yutsumura.com/linear-algebra-midterm-1-at-the-ohio-state-university-13/" rel="noopener" target="_blank">Part 1</a> and <a href="//yutsumura.com/linear-algebra-midterm-1-at-the-ohio-state-university-23/" rel="noopener" target="_blank">Part 2</a> for the rest of the exam problems.</p>
<hr />
<p><strong>Problem 7</strong>. Let $A=\begin{bmatrix}<br />
		  -3 &#038; -4\\<br />
		  8&#038; 9<br />
		\end{bmatrix}$ and $\mathbf{v}=\begin{bmatrix}<br />
		  -1 \\<br />
		  2<br />
		\end{bmatrix}$.</p>
<p><strong>(a)</strong> Calculate $A\mathbf{v}$ and find the number $\lambda$ such that $A\mathbf{v}=\lambda \mathbf{v}$.</p>
<p><strong>(b)</strong> Without forming $A^3$, calculate the vector $A^3\mathbf{v}$.</p>
<hr />
<p><strong>Problem 8</strong>. Prove that if $A$ and $B$ are $n\times n$ nonsingular matrices, then the product $AB$ is also nonsingular. </p>
<hr />
<p><strong>Problem 9</strong>.<br />
		Determine whether each of the following sentences is true or false.</p>
<p><strong>(a)</strong> There is a $3\times 3$ homogeneous system that has exactly three solutions.</p>
<p><strong>(b)</strong> If $A$ and $B$ are $n\times n$ symmetric matrices, then the sum $A+B$ is also symmetric.</p>
<p><strong>(c)</strong> If $n$-dimensional vectors $\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3$ are linearly dependent, then the vectors $\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3, \mathbf{v}_4$ are also linearly dependent for any $n$-dimensional vector $\mathbf{v}_4$.</p>
<p><strong>(d)</strong> If the coefficient matrix of a system of linear equations is singular, then the system is inconsistent.</p>
<p><strong>(e)</strong> The vectors<br />
		\[\mathbf{v}_1=\begin{bmatrix}<br />
		  1 \\<br />
		   0 \\<br />
		    1<br />
		  \end{bmatrix}, \mathbf{v}_2=\begin{bmatrix}<br />
		  0 \\<br />
		   1 \\<br />
		    0<br />
		  \end{bmatrix}, \mathbf{v}_3=\begin{bmatrix}<br />
		  0 \\<br />
		   0 \\<br />
		    1<br />
		  \end{bmatrix}\]
		  are linearly independent.</p>
<p>&nbsp;<br />
<span id="more-4948"></span></p>
<h2>Solution of Problem 7.</h2>
<h3>(a) Calculate $A\mathbf{v}$.</h3>
<p> We calculate<br />
				\[A\mathbf{v}=\begin{bmatrix}<br />
		  -3 &#038; -4\\<br />
		  8&#038; 9<br />
		\end{bmatrix}\begin{bmatrix}<br />
		  -1 \\<br />
		  2<br />
		\end{bmatrix}=\begin{bmatrix}<br />
		  -5 \\<br />
		  10<br />
		\end{bmatrix}=5\begin{bmatrix}<br />
		  -1 \\<br />
		  2<br />
		\end{bmatrix}.\]
		Thus we see that $\lambda=5$.</p>
<h3>(b) Without forming $A^3$, calculate the vector $A^3\mathbf{v}$.</h3>
<p> From part (a), we know that $A\mathbf{v}=5\mathbf{v}$.<br />
		Using the properties of matrix multiplication, we compute<br />
		\begin{align*}<br />
		A^3\mathbf{v}&#038;=A^2(A\mathbf{v})=A^2(5\mathbf{v})=5A^2\mathbf{v}=5A(A\mathbf{v})=5A(5\mathbf{v})\\<br />
		&#038;=5^2A\mathbf{v}=5^2(5\mathbf{v})=5^3\mathbf{v}\\<br />
		&#038;=5^3\begin{bmatrix}<br />
		  -1 \\<br />
		  2<br />
		\end{bmatrix}=\begin{bmatrix}<br />
		  -125 \\<br />
		  250<br />
		\end{bmatrix}.<br />
		\end{align*}<br />
		Hence we obtain $A^3\mathbf{v}=\begin{bmatrix}<br />
		  -125 \\<br />
		  250<br />
		\end{bmatrix}$.</p>
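The shortcut used above (once $A\mathbf{v}=\lambda\mathbf{v}$ is known, $A^k\mathbf{v}=\lambda^k\mathbf{v}$ follows without ever forming $A^k$) can be sanity-checked numerically. The following NumPy sketch is an illustration, not part of the original exam solution:

```python
import numpy as np

A = np.array([[-3, -4],
              [8, 9]])
v = np.array([-1, 2])

# Part (a): A v = 5 v, so lambda = 5.
Av = A @ v
assert np.array_equal(Av, 5 * v)

# Part (b): apply A three times instead of forming A^3.
A3v = A @ (A @ (A @ v))
assert np.array_equal(A3v, 125 * v)   # 5^3 v = [-125, 250]
```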
<h2> Proof of Problem 8. </h2>
<p>		Let $A$ and $B$ be nonsingular matrices.<br />
			Suppose that $(AB)\mathbf{v}=\mathbf{0}$ for some $n$-dimensional vector $\mathbf{v}$.<br />
			Then we have<br />
			\[A(B\mathbf{v})=\mathbf{0}.\]
			It follows that the vector $B\mathbf{v}$ is a solution of $A\mathbf{x}=\mathbf{0}$.<br />
			As the matrix $A$ is nonsingular, any solution must be the zero vector.<br />
			Hence we have $B\mathbf{v}=\mathbf{0}$.</p>
<p>		   This equation says that the vector $\mathbf{v}$ is a solution of $B\mathbf{x}=\mathbf{0}$.<br />
		   As the matrix $B$ is nonsingular, any solution must be the zero vector.<br />
		   This implies that $\mathbf{v}=\mathbf{0}$.</p>
<p>		   This proves that if $(AB)\mathbf{v}=\mathbf{0}$, then $\mathbf{v}=\mathbf{0}$.<br />
		   This is equivalent to saying that the matrix $AB$ is nonsingular, by definition.</p>
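The argument can be illustrated numerically; the two nonsingular matrices below are example choices (an assumption for illustration), not taken from the problem:

```python
import numpy as np

# Two nonsingular 2x2 matrices (each has determinant 1).
A = np.array([[2.0, 1.0], [1.0, 1.0]])
B = np.array([[1.0, 3.0], [0.0, 1.0]])
AB = A @ B

# Full rank means the only solution of (AB)x = 0 is x = 0,
# i.e. AB is nonsingular.
assert np.linalg.matrix_rank(AB) == 2
# Consistency check: det(AB) = det(A) det(B) is nonzero.
assert abs(np.linalg.det(AB) - np.linalg.det(A) * np.linalg.det(B)) < 1e-12
```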
<h2>Solution of Problem 9.</h2>
<h3>True or False: (a) There is a $3\times 3$ homogeneous system that has exactly three solutions.</h3>
<p><strong>False</strong>. A system of linear equations has either no solution, exactly one solution, or infinitely many solutions; it can never have exactly three.</p>
<h3>True or False: (b) If $A$ and $B$ are $n\times n$ symmetric matrices, then the sum $A+B$ is also symmetric.</h3>
<p><strong> True</strong>. Since $A$ and $B$ are symmetric, we have $A^{\trans}=A$ and $B^{\trans}=B$.<br />
		It follows that<br />
		\[(A+B)^{\trans}=A^{\trans}+B^{\trans}=A+B.\]
		Thus the sum $A+B$ is symmetric.</p>
<h3>True or False: (c) If $\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3$ are linearly dependent, then $\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3, \mathbf{v}_4$ are linearly dependent.</h3>
<p> <strong>True</strong>. Since the vectors $\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3$ are linearly dependent, there exist scalars $c_1, c_2, c_3$, not all zero, such that<br />
		\[c_1\mathbf{v}_1+c_2\mathbf{v}_2+c_3\mathbf{v}_3=\mathbf{0}.\]
		Let $\mathbf{v}_4$ be any $n$-dimensional vector.<br />
		Then we have the linear combination<br />
		\[c_1\mathbf{v}_1+c_2\mathbf{v}_2+c_3\mathbf{v}_3+0\mathbf{v}_4=\mathbf{0}\]
		whose coefficients are not all zero, since at least one of $c_1, c_2, c_3$ is nonzero.<br />
		This implies that the vectors $\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3, \mathbf{v}_4$ are also linearly dependent.</p>
<h3>True or False: (d) If the coefficient matrix of a system of linear equations is singular, then the system is inconsistent.</h3>
<p> <strong>False</strong>. The system could be consistent even though the coefficient matrix is singular.<br />
		For example, consider the system<br />
		\begin{align*}<br />
		x_1+2x_2&#038;=3\\<br />
		2x_1+4x_2&#038;=6.<br />
		\end{align*}<br />
		The coefficient matrix of the system is $A=\begin{bmatrix}<br />
		  1 &#038; 2\\<br />
		  2&#038; 4<br />
		\end{bmatrix}$.<br />
		This is singular since, for example, it is row equivalent to $\begin{bmatrix}<br />
		  1 &#038; 2\\<br />
		  0&#038; 0<br />
		\end{bmatrix}$.<br />
		However, the system has a solution, for example, $x_1=1, x_2=1$.<br />
		Hence the system is consistent even though its coefficient matrix is singular.</p>
<h3>True or False: (e) The vectors<br />
		\[\mathbf{v}_1=\begin{bmatrix}<br />
		  1 \\<br />
		   0 \\<br />
		    1<br />
		  \end{bmatrix}, \mathbf{v}_2=\begin{bmatrix}<br />
		  0 \\<br />
		   1 \\<br />
		    0<br />
		  \end{bmatrix}, \mathbf{v}_3=\begin{bmatrix}<br />
		  0 \\<br />
		   0 \\<br />
		    1<br />
		  \end{bmatrix}\]
		  are linearly independent.</h3>
<p><strong> True</strong>. Consider the linear combination<br />
		\[c_1\mathbf{v}_1+c_2\mathbf{v}_2+c_3\mathbf{v}_3=\mathbf{0}\]
		for some scalars $c_1, c_2, c_3$.<br />
		Then this can be written as<br />
		\[A\begin{bmatrix}<br />
		  c_1 \\<br />
		   c_2 \\<br />
		    c_3<br />
		  \end{bmatrix}=\begin{bmatrix}<br />
		  0 \\<br />
		   0 \\<br />
		    0<br />
		  \end{bmatrix},\]
		  where<br />
		  \[A=[\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3]=\begin{bmatrix}<br />
		  1 &#038; 0 &#038; 0 \\<br />
		   0 &#038;1 &#038;0 \\<br />
		   1 &#038; 0 &#038; 1<br />
		\end{bmatrix}.\]
		  So the scalars $c_1, c_2, c_3$ form a solution of the system $A\mathbf{x}=\mathbf{0}$.</p>
<p>		  The augmented matrix of the system is<br />
		  \begin{align*}<br />
		 \left[\begin{array}{rrr|r}<br />
		 1 &#038; 0 &#038; 0 &#038;   0 \\<br />
		  0 &#038;1 &#038;  0 &#038; 0  \\<br />
		  1 &#038; 0 &#038; 1 &#038; 0<br />
		    \end{array} \right] \xrightarrow{R_3-R_1}<br />
		     \left[\begin{array}{rrr|r}<br />
		 1 &#038; 0 &#038; 0 &#038;   0 \\<br />
		  0 &#038;1 &#038;  0 &#038; 0  \\<br />
		  0 &#038; 0 &#038; 1 &#038; 0<br />
		    \end{array} \right].<br />
		\end{align*}<br />
		Thus the solution is $c_1=0, c_2=0, c_3=0$.<br />
		It follows that the vectors $\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3$ are linearly independent.</p>
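The row-reduction conclusion can be cross-checked with a rank computation; this NumPy sketch uses the same matrix $A=[\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3]$ as above:

```python
import numpy as np

# Columns are v1, v2, v3 from part (e).
A = np.array([[1, 0, 0],
              [0, 1, 0],
              [1, 0, 1]])

# Rank 3 means A x = 0 has only the trivial solution,
# i.e. the columns are linearly independent.
assert np.linalg.matrix_rank(A) == 3
```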
<h2>Go to Part 1 and Part 2 </h2>
<p>Go to <a href="//yutsumura.com/linear-algebra-midterm-1-at-the-ohio-state-university-13/" rel="noopener" target="_blank">Part 1</a> for Problem 1, 2, and 3.</p>
<p>Go to <a href="//yutsumura.com/linear-algebra-midterm-1-at-the-ohio-state-university-23/" rel="noopener" target="_blank">Part 2</a> for Problem 4, 5, and 6.</p>
<p>The post <a href="https://yutsumura.com/linear-algebra-midterm-1-at-the-ohio-state-university-33/" target="_blank">Linear Algebra Midterm 1 at the Ohio State University (3/3)</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/linear-algebra-midterm-1-at-the-ohio-state-university-33/feed/</wfw:commentRss>
		<slash:comments>2</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">4948</post-id>	</item>
		<item>
		<title>7 Problems on Skew-Symmetric Matrices</title>
		<link>https://yutsumura.com/7-problems-on-skew-symmetric-matrices/</link>
				<comments>https://yutsumura.com/7-problems-on-skew-symmetric-matrices/#respond</comments>
				<pubDate>Fri, 15 Sep 2017 04:21:05 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[hermitian matrix]]></category>
		<category><![CDATA[length of a vector]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[skew-symmetric matrix]]></category>
		<category><![CDATA[symmetric matrix]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=4904</guid>
				<description><![CDATA[<p>Let $A$ and $B$ be $n\times n$ skew-symmetric matrices. Namely $A^{\trans}=-A$ and $B^{\trans}=-B$. (a) Prove that $A+B$ is skew-symmetric. (b) Prove that $cA$ is skew-symmetric for any scalar $c$. (c) Let $P$ be an&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/7-problems-on-skew-symmetric-matrices/" target="_blank">7 Problems on Skew-Symmetric Matrices</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 564</h2>
<p>		Let $A$ and $B$ be $n\times n$ skew-symmetric matrices. Namely $A^{\trans}=-A$ and $B^{\trans}=-B$.</p>
<p><strong>(a)</strong> Prove that $A+B$ is skew-symmetric.</p>
<p><strong>(b)</strong> Prove that $cA$ is skew-symmetric for any scalar $c$.</p>
<p><strong>(c)</strong> Let $P$ be an $m\times n$ matrix. Prove that $P^{\trans}AP$ is skew-symmetric.</p>
<p><strong>(d)</strong> Suppose that $A$ is real skew-symmetric. Prove that $iA$ is a Hermitian matrix.</p>
<p><strong>(e)</strong> Prove that if $AB=-BA$, then $AB$ is a skew-symmetric matrix.</p>
<p><strong>(f)</strong> Let $\mathbf{v}$ be an $n$-dimensional column vector. Prove that $\mathbf{v}^{\trans}A\mathbf{v}=0$.</p>
<p><strong>(g)</strong> Suppose that $A$ is a real skew-symmetric matrix and $A^2\mathbf{v}=\mathbf{0}$ for some vector $\mathbf{v}\in \R^n$. Then prove that $A\mathbf{v}=\mathbf{0}$.</p>
<p>&nbsp;<br />
<span id="more-4904"></span><br />

<h2> Proof. </h2>
<h3>(a) Prove that $A+B$ is skew-symmetric.</h3>
<p> We have<br />
		\begin{align*}<br />
		(A+B)^{\trans}=A^{\trans}+B^{\trans}=(-A)+(-B)=-(A+B).<br />
		\end{align*}<br />
		Hence $A+B$ is skew-symmetric.</p>
<h3>(b) Prove that $cA$ is skew-symmetric for any scalar $c$.</h3>
<p> We compute<br />
		\begin{align*}<br />
		(cA)^{\trans}=cA^{\trans}=c(-A)=-cA.<br />
		\end{align*}<br />
		Thus, $cA$ is skew-symmetric.</p>
<h3>(c) Let $P$ be an $m\times n$ matrix. Prove that $P^{\trans}AP$ is skew-symmetric.</h3>
<p> Using the properties of transpose, we have<br />
		\begin{align*}<br />
		(P^{\trans}AP)^{\trans}&#038;=P^{\trans}A^{\trans}(P^{\trans})^{\trans}=P^{\trans}A^{\trans}P\\<br />
		&#038;=P^{\trans}(-A)P=-(P^{\trans}AP).<br />
		\end{align*}<br />
		This implies that $P^{\trans}AP$ is skew-symmetric.</p>
<h3>(d) Suppose that $A$ is real skew-symmetric. Prove that $iA$ is a Hermitian matrix.</h3>
<p>Note that since $A$ is real, we have $\bar{A}=A$.<br />
		Then we have<br />
		\begin{align*}<br />
		(\overline{iA})^{\trans}=(\bar{i}\bar{A})^{\trans}=(-iA)^{\trans}=(-i)A^{\trans}=(-i)(-A)=iA.<br />
		\end{align*}<br />
		It follows that $iA$ is Hermitian.</p>
<h3>(e) Prove that if $AB=-BA$, then $AB$ is a skew-symmetric matrix.</h3>
<p>We calculate<br />
		\begin{align*}<br />
		(AB)^{\trans}&#038;=B^{\trans}A^{\trans}=(-B)(-A)\\<br />
		&#038;=BA=-AB,<br />
		\end{align*}<br />
		where the last step follows from the assumption $AB=-BA$.<br />
		This proves that $AB$ is skew-symmetric.</p>
<h3>(f) Let $\mathbf{v}$ be an $n$-dimensional column vector. Prove that $\mathbf{v}^{\trans}A\mathbf{v}=0$.</h3>
<p> Observe that $\mathbf{v}^{\trans}A\mathbf{v}$ is a $1\times 1$ matrix, or just a number.<br />
		So we have<br />
		\begin{align*}<br />
		\mathbf{v}^{\trans}A\mathbf{v}&#038;=(\mathbf{v}^{\trans}A\mathbf{v})^{\trans}=\mathbf{v}^{\trans}A^{\trans}(\mathbf{v}^{\trans})^{\trans}\\<br />
		&#038;=\mathbf{v}^{\trans}A^{\trans}\mathbf{v}=\mathbf{v}^{\trans}(-A)\mathbf{v}=-(\mathbf{v}^{\trans}A\mathbf{v}).<br />
		\end{align*}<br />
		This yields that $2\mathbf{v}^{\trans}A\mathbf{v}=0$, and hence $\mathbf{v}^{\trans}A\mathbf{v}=0$.</p>
<h3>(g) Suppose that $A$ is a real skew-symmetric matrix and $A^2\mathbf{v}=\mathbf{0}$ for some vector $\mathbf{v}\in \R^n$. Then prove that $A\mathbf{v}=\mathbf{0}$.</h3>
<p>Let us compute the squared length of the vector $A\mathbf{v}$.<br />
		We have<br />
		\begin{align*}<br />
		\|A\mathbf{v}\|^2&#038;=(A\mathbf{v})^{\trans}(A\mathbf{v})=\mathbf{v}^{\trans}A^{\trans}A\mathbf{v}\\<br />
		&#038;=\mathbf{v}^{\trans}(-A)A\mathbf{v}=-\mathbf{v}^{\trans}A^2\mathbf{v}\\<br />
		&#038;=-\mathbf{v}^{\trans}\mathbf{0} &#038;&#038;\text{by the assumption $A^2\mathbf{v}=\mathbf{0}$}\\<br />
		&#038;=0.<br />
		\end{align*}<br />
		Since $\|A\mathbf{v}\|^2=0$, the length of $A\mathbf{v}$ is zero, and hence $A\mathbf{v}=\mathbf{0}$.</p>
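Several of the identities proved above lend themselves to a quick numerical spot-check. In the sketch below the skew-symmetric matrix is generated randomly, an illustrative assumption rather than anything from the problems:

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
A = M - M.T                        # M - M^T is always skew-symmetric
v = rng.standard_normal(4)

assert np.allclose(A.T, -A)        # A^T = -A
# Part (f): v^T A v = 0 for every v (zero up to floating-point error).
assert abs(v @ A @ v) < 1e-12
# Part (d): iA is Hermitian, i.e. equal to its conjugate transpose.
H = 1j * A
assert np.allclose(H.conj().T, H)
```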
<p>The post <a href="https://yutsumura.com/7-problems-on-skew-symmetric-matrices/" target="_blank">7 Problems on Skew-Symmetric Matrices</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/7-problems-on-skew-symmetric-matrices/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">4904</post-id>	</item>
		<item>
		<title>Construction of a Symmetric Matrix whose Inverse Matrix is Itself</title>
		<link>https://yutsumura.com/construction-of-a-symmetric-matrix-whose-inverse-matrix-is-itself/</link>
				<comments>https://yutsumura.com/construction-of-a-symmetric-matrix-whose-inverse-matrix-is-itself/#respond</comments>
				<pubDate>Wed, 06 Sep 2017 03:08:57 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[dot product]]></category>
		<category><![CDATA[inner product]]></category>
		<category><![CDATA[inverse matrix]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[symmetric matrix]]></category>
		<category><![CDATA[transpose]]></category>
		<category><![CDATA[transpose matrix]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=4822</guid>
				<description><![CDATA[<p>Let $\mathbf{v}$ be a nonzero vector in $\R^n$. Then the dot product $\mathbf{v}\cdot \mathbf{v}=\mathbf{v}^{\trans}\mathbf{v}\neq 0$. Set $a:=\frac{2}{\mathbf{v}^{\trans}\mathbf{v}}$ and define the $n\times n$ matrix $A$ by \[A=I-a\mathbf{v}\mathbf{v}^{\trans},\] where $I$ is the $n\times n$ identity matrix.&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/construction-of-a-symmetric-matrix-whose-inverse-matrix-is-itself/" target="_blank">Construction of a Symmetric Matrix whose Inverse Matrix is Itself</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 556</h2>
<p>		Let $\mathbf{v}$ be a nonzero vector in $\R^n$.<br />
		Then the dot product $\mathbf{v}\cdot \mathbf{v}=\mathbf{v}^{\trans}\mathbf{v}\neq 0$.<br />
		Set $a:=\frac{2}{\mathbf{v}^{\trans}\mathbf{v}}$ and define the $n\times n$ matrix $A$ by<br />
		\[A=I-a\mathbf{v}\mathbf{v}^{\trans},\]
		where $I$ is the $n\times n$ identity matrix.</p>
<p>		Prove that $A$ is a symmetric matrix and $AA=I$.<br />
		Conclude that the inverse matrix is $A^{-1}=A$.</p>
<p>&nbsp;<br />
<span id="more-4822"></span><br />

<h2> Proof. </h2>
<h3>$A$ is symmetric</h3>
<p>			We first show that the matrix $A$ is symmetric.<br />
			We calculate using properties of transpose<br />
			\begin{align*}<br />
		A^{\trans}&#038;=(I-a\mathbf{v}\mathbf{v}^{\trans})^{\trans} &#038;&#038; \text{by definition of $A$}\\<br />
		&#038;=I^{\trans}-(a\mathbf{v}\mathbf{v}^{\trans})^{\trans}\\<br />
		&#038;=I-a(\mathbf{v}^{\trans})^{\trans}\mathbf{v}^{\trans}\\<br />
		&#038;=I-a\mathbf{v}\mathbf{v}^{\trans}\\<br />
		&#038;=A &#038;&#038; \text{by definition of $A$}.<br />
		\end{align*}<br />
		Hence we have $A^{\trans}=A$, and thus $A$ is symmetric.</p>
<h3>$AA=I$ and $A^{-1}=A$</h3>
<p>		Next, we prove that $AA=I$.</p>
<p>		We compute<br />
		\begin{align*}<br />
		AA&#038;=(I-a\mathbf{v}\mathbf{v}^{\trans})(I-a\mathbf{v}\mathbf{v}^{\trans})\\<br />
		&#038;=I(I-a\mathbf{v}\mathbf{v}^{\trans})-a\mathbf{v}\mathbf{v}^{\trans}(I-a\mathbf{v}\mathbf{v}^{\trans})\\<br />
		&#038;=I-a\mathbf{v}\mathbf{v}^{\trans}-a\mathbf{v}\mathbf{v}^{\trans}+a^2\mathbf{v}\mathbf{v}^{\trans}\mathbf{v}\mathbf{v}^{\trans}\\<br />
		&#038;=I-2a\mathbf{v}\mathbf{v}^{\trans}+a^2\mathbf{v}\mathbf{v}^{\trans}\mathbf{v}\mathbf{v}^{\trans}. \tag{*}<br />
		\end{align*}</p>
<p>		Note that we have<br />
		\begin{align*}<br />
		\mathbf{v}\mathbf{v}^{\trans}\mathbf{v}\mathbf{v}^{\trans}&#038;=\mathbf{v}(\mathbf{v}^{\trans}\mathbf{v})\mathbf{v}^{\trans}\\<br />
		&#038;=\mathbf{v}\left(\,  \frac{2}{a} \,\right)\mathbf{v}^{\trans} &#038;&#038;\text{since $a=\frac{2}{\mathbf{v}^{\trans}\mathbf{v}}\neq 0$, so $\mathbf{v}^{\trans}\mathbf{v}=\frac{2}{a}$}\\<br />
		&#038;=\frac{2}{a}\mathbf{v}\mathbf{v}^{\trans}.<br />
		\end{align*}</p>
<p>		Plugging this relation into (*), we obtain<br />
		\begin{align*}<br />
		AA&#038;=I-2a\mathbf{v}\mathbf{v}^{\trans}+a^2\frac{2}{a}\mathbf{v}\mathbf{v}^{\trans}=I.<br />
		\end{align*}<br />
		Thus we get $AA=I$.<br />
		This implies that the inverse matrix of $A$ is $A$ itself: $A^{-1}=A$.</p>
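A brief numerical check of the two conclusions, using an example vector $\mathbf{v}$ chosen for illustration. (The matrix $A=I-\frac{2}{\mathbf{v}^{\trans}\mathbf{v}}\mathbf{v}\mathbf{v}^{\trans}$ constructed here is the Householder reflection across the hyperplane orthogonal to $\mathbf{v}$.)

```python
import numpy as np

v = np.array([1.0, 2.0, 2.0])      # example nonzero vector (assumption)
a = 2.0 / (v @ v)                  # a = 2 / (v^T v)
A = np.eye(3) - a * np.outer(v, v) # A = I - a v v^T

assert np.allclose(A, A.T)             # A is symmetric
assert np.allclose(A @ A, np.eye(3))   # A A = I, hence A^{-1} = A
```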
<p>The post <a href="https://yutsumura.com/construction-of-a-symmetric-matrix-whose-inverse-matrix-is-itself/" target="_blank">Construction of a Symmetric Matrix whose Inverse Matrix is Itself</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/construction-of-a-symmetric-matrix-whose-inverse-matrix-is-itself/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">4822</post-id>	</item>
		<item>
		<title>A Symmetric Positive Definite Matrix and An Inner Product on a Vector Space</title>
		<link>https://yutsumura.com/a-symmetric-positive-definite-matrix-and-an-inner-product-on-a-vector-space/</link>
				<comments>https://yutsumura.com/a-symmetric-positive-definite-matrix-and-an-inner-product-on-a-vector-space/#comments</comments>
				<pubDate>Wed, 16 Aug 2017 03:17:31 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[inner product]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[positive definite]]></category>
		<category><![CDATA[positive definite matrix]]></category>
		<category><![CDATA[symmetric matrix]]></category>
		<category><![CDATA[vector space]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=4648</guid>
				<description><![CDATA[<p>(a) Suppose that $A$ is an $n\times n$ real symmetric positive definite matrix. Prove that \[\langle \mathbf{x}, \mathbf{y}\rangle:=\mathbf{x}^{\trans}A\mathbf{y}\] defines an inner product on the vector space $\R^n$. (b) Let $A$ be an $n\times n$&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/a-symmetric-positive-definite-matrix-and-an-inner-product-on-a-vector-space/" target="_blank">A Symmetric Positive Definite Matrix and An Inner Product on a Vector Space</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 538</h2>
<p><strong>(a)</strong> Suppose that $A$ is an $n\times n$ real symmetric positive definite matrix.<br />
 Prove that<br />
			\[\langle \mathbf{x}, \mathbf{y}\rangle:=\mathbf{x}^{\trans}A\mathbf{y}\]
			defines an inner product on the vector space $\R^n$.</p>
<p><strong>(b)</strong> Let $A$ be an $n\times n$ real matrix. Suppose that<br />
			\[\langle \mathbf{x}, \mathbf{y}\rangle:=\mathbf{x}^{\trans}A\mathbf{y}\]
			defines an inner product on the vector space $\R^n$.</p>
<p>		    Prove that $A$ is symmetric and positive definite.</p>
<p>&nbsp;<br />
<span id="more-4648"></span><br />

<h2>Definitions.</h2>
<h3>Inner Product on a Real Vector Space</h3>
<p>Let $V$ be a real vector space. An <strong>inner product</strong> on $V$ is a function that assigns a real number $\langle \mathbf{u}, \mathbf{v}\rangle$ to each pair of vectors $\mathbf{u}$ and $\mathbf{v}$ in $V$ satisfying the following properties.<br />
	For any vectors $\mathbf{u}, \mathbf{v}, \mathbf{w}$ and a real number $r\in \R$, </p>
<div style="padding: 16px; border: none 3px #4169e1; border-radius: 10px; background-color: #f0f8ff; margin-top: 30px; margin-bottom: 30px;">
<ul>
<li><strong>Symmetry</strong>  \[\langle \mathbf{u}, \mathbf{v}\rangle=\langle  \mathbf{v}, \mathbf{u}\rangle\]</li>
<li><strong>Linearity in the first argument</strong><br />
	\begin{align*}<br />
		\langle r\mathbf{u}, \mathbf{v}\rangle &#038;=r\langle \mathbf{u}, \mathbf{v}\rangle\\<br />
		\langle \mathbf{u}+ \mathbf{v}, \mathbf{w}\rangle &#038;=\langle \mathbf{u}, \mathbf{w}\rangle+ \langle \mathbf{v}, \mathbf{w}\rangle<br />
		\end{align*}</li>
<li><strong>Positive-Definiteness</strong><br />
			\begin{align*}<br />
		\langle \mathbf{u}, \mathbf{u}\rangle &#038;\geq 0 \\<br />
		\langle \mathbf{u}, \mathbf{u}\rangle &#038;=0 \text{ if and only if } \mathbf{u}=\mathbf{0}<br />
		\end{align*}</li>
</ul>
</div>
<h3>A Positive-Definite Matrix</h3>
<p>			A real symmetric $n\times n$ matrix $A$ is called <strong>positive definite</strong> if $\mathbf{x}^{\trans}A\mathbf{x} > 0$ for each nonzero vector $\mathbf{x}\in \R^n$.</p>
<h2> Proof. </h2>
<h3>(a) If $A$ is positive definite, then $\langle \mathbf{x}, \mathbf{y}\rangle:=\mathbf{x}^{\trans}A\mathbf{y}$ defines an inner product</h3>
<p> First of all, note that we can write<br />
				\[\langle \mathbf{x}, \mathbf{y}\rangle=\mathbf{x}^{\trans}A\mathbf{y}=\mathbf{x}\cdot (A\mathbf{y}),\]
				where the &#8220;dot&#8221; is the dot product of $\R^n$.<br />
				Thus, $\langle \mathbf{x}, \mathbf{y}\rangle$ is a real number.</p>
<hr />
<p>				We verify the three properties of an inner product.<br />
				Since the dot product is commutative, we have<br />
				\begin{align*}<br />
		\langle \mathbf{x}, \mathbf{y}\rangle&#038;=\mathbf{x}\cdot (A\mathbf{y})= (A\mathbf{y})\cdot \mathbf{x}\\<br />
		&#038;=(A\mathbf{y})^{\trans}\mathbf{x}=\mathbf{y}^{\trans}A^{\trans}\mathbf{x}\\<br />
		&#038;=\mathbf{y}^{\trans}A\mathbf{x} &#038;&#038;\text{since $A$ is symmetric}\\<br />
		&#038;=\langle \mathbf{y}, \mathbf{x} \rangle.<br />
		\end{align*}<br />
		Thus, the function $\langle\,,\,\rangle$ is symmetric.</p>
<hr />
<p>		Next, for any vectors $\mathbf{x}, \mathbf{y}, \mathbf{z}$ and any real number $r$, we have<br />
		\begin{align*}<br />
		\langle r\mathbf{x}, \mathbf{y}\rangle &#038;=(r\mathbf{x})^{\trans}A\mathbf{y}=r\mathbf{x}^{\trans}A\mathbf{y}=r\langle \mathbf{x}, \mathbf{y}\rangle<br />
		\end{align*}<br />
				and<br />
				\begin{align*}<br />
		\langle \mathbf{x}+\mathbf{y}, \mathbf{z}\rangle &#038;=(\mathbf{x}+\mathbf{y})^{\trans}A\mathbf{z}=(\mathbf{x}^{\trans}+\mathbf{y}^{\trans})A\mathbf{z}\\<br />
		&#038;=\mathbf{x}^{\trans}A\mathbf{z}+\mathbf{y}^{\trans}A\mathbf{z}=\langle \mathbf{x}, \mathbf{z}\rangle+\langle \mathbf{y}, \mathbf{z}\rangle.<br />
		\end{align*}<br />
		Thus, the linearity in the first argument is satisfied.</p>
<hr />
<p>		If $\mathbf{x}$ is a nonzero vector in $\R^n$, then we have<br />
		\begin{align*}<br />
		\langle \mathbf{x}, \mathbf{x}\rangle=\mathbf{x}^{\trans}A\mathbf{x} > 0<br />
		\end{align*}<br />
		since $A$ is positive definite.<br />
		We also have<br />
		\begin{align*}<br />
		\langle \mathbf{0}, \mathbf{0}\rangle=\mathbf{0}^{\trans}A\mathbf{0}=0.<br />
		\end{align*}<br />
		It follows that $\langle \mathbf{x}, \mathbf{x}\rangle \geq 0$ for any vector $\mathbf{x}\in \R^n$.</p>
<hr />
<p>		Suppose that $\langle \mathbf{x}, \mathbf{x}\rangle=0$.<br />
		Then we have<br />
		\begin{align*}<br />
		\langle \mathbf{x}, \mathbf{x}\rangle=\mathbf{x}^{\trans}A\mathbf{x}=0.<br />
		\end{align*}<br />
		Since $A$ is positive definite, this happens if and only if $\mathbf{x}=\mathbf{0}$.<br />
		Hence $\langle \mathbf{x}, \mathbf{x}\rangle=0$ if and only if $\mathbf{x}=\mathbf{0}$.<br />
		This proves the positive-definiteness of the function $\langle\,,\,\rangle$.</p>
<hr />
<p>		This completes the verification of the three properties, and hence $\langle \mathbf{x}, \mathbf{y}\rangle:=\mathbf{x}^{\trans}A\mathbf{y}$ defines an inner product on $\R^n$.</p>
<h3>(b) If $\langle \mathbf{x}, \mathbf{y}\rangle:=\mathbf{x}^{\trans}A\mathbf{y}$ defines an inner product, then $A$ is symmetric positive definite</h3>
<p> Suppose that $\langle \mathbf{x}, \mathbf{y}\rangle:=\mathbf{x}^{\trans}A\mathbf{y}$ is an inner product on $\R^n$.<br />
		Let us write $A=(a_{ij})$.</p>
<hr />
<p>		We first prove that $A$ is symmetric.<br />
		Let $\mathbf{e}_i$ denote the $i$-th standard unit vector, that is,<br />
		\[\mathbf{e}_i=\begin{bmatrix}<br />
		  0 \\<br />
		   \vdots \\<br />
		    1 \\<br />
		   \vdots \\<br />
		   0<br />
		   \end{bmatrix},\]
		   where the $1$ is in the $i$-th position and all other entries are zero.</p>
<hr />
<p>				We compute<br />
		\begin{align*}<br />
		\langle \mathbf{e}_i, \mathbf{e}_j\rangle &#038;=\mathbf{e}^{\trans}_iA\mathbf{e}_j\\<br />
		&#038;=\begin{bmatrix}<br />
			0 &#038; \dots &#038; 1 &#038; \dots &#038; 0<br />
		\end{bmatrix}<br />
		\begin{bmatrix}<br />
		  a_{1 j} \\<br />
		   a_{2 j} \\<br />
		    \vdots \\<br />
		   a_{n j}<br />
		   \end{bmatrix}=a_{ij}.<br />
		\end{align*}<br />
		Similarly, $\langle \mathbf{e}_j, \mathbf{e}_i\rangle =a_{j i}$. By the symmetry of the inner product, it follows that<br />
		\[a_{ij}=\langle \mathbf{e}_i, \mathbf{e}_j\rangle =\langle \mathbf{e}_j, \mathbf{e}_i\rangle =a_{j i}.\]
		Therefore, the matrix $A$ is symmetric.</p>
<hr />
<p>		Next, we show that $A$ is positive definite.<br />
		Let $\mathbf{x}$ be a nonzero vector in $\R^n$.<br />
		Then we have<br />
		\[\mathbf{x}^{\trans}A\mathbf{x}=\langle \mathbf{x}, \mathbf{x}\rangle > 0\]
		by positive-definiteness property of the inner product.<br />
		This proves that $A$ is a positive definite matrix.</p>
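Both directions can be spot-checked numerically; the symmetric positive definite matrix and test vectors below are example choices, not part of the problem:

```python
import numpy as np

# Example symmetric matrix; its eigenvalues are 2 ± sqrt(2) > 0,
# so it is positive definite.
A = np.array([[1.0, 1.0],
              [1.0, 3.0]])
inner = lambda u, w: u @ A @ w     # <u, w> = u^T A w

x = np.array([3.0, -1.0])
y = np.array([0.5, 2.0])

assert np.isclose(inner(x, y), inner(y, x))   # symmetry
assert inner(x, x) > 0 and inner(y, y) > 0    # positive-definiteness
# Recovering the entries as in part (b): <e_i, e_j> = a_ij.
e1, e2 = np.eye(2)
assert np.isclose(inner(e1, e2), A[0, 1])
```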
<h2> Related Question. </h2>
<p>A concrete example of a positive-definite matrix is given in the next problem.</p>
<div style="padding: 16px; border: none 3px #4169e1; border-radius: 10px; background-color: #f0f8ff; margin-top: 30px; margin-bottom: 30px;">
<strong>Problem</strong>.<br />
   Consider the $2\times 2$ real matrix<br />
		   \[A=\begin{bmatrix}<br />
		  1 &#038; 1\\<br />
		  1&#038; 3<br />
		\end{bmatrix}.\]
<p><strong>(a)</strong> Prove that the matrix $A$ is positive definite.</p>
<p><strong>(b)</strong> Since $A$ is positive definite by part (a), the formula<br />
		\[\langle \mathbf{x}, \mathbf{y}\rangle:=\mathbf{x}^{\trans} A \mathbf{y}\]
		for $\mathbf{x}, \mathbf{y} \in \R^2$ defines an inner product on $\R^2$.<br />
		Consider $\R^2$ as an inner product space with this inner product.</p>
<p>		Prove that the unit vectors<br />
		\[\mathbf{e}_1=\begin{bmatrix}<br />
		  1 \\<br />
		  0<br />
		\end{bmatrix} \text{ and } \mathbf{e}_2=\begin{bmatrix}<br />
		  0 \\<br />
		  1<br />
		\end{bmatrix}\]
		are not orthogonal in the inner product space $\R^2$.</p>
<p><strong>(c)</strong> Find an orthogonal basis $\{\mathbf{v}_1, \mathbf{v}_2\}$ of $\R^2$ from the basis $\{\mathbf{e}_1, \mathbf{e}_2\}$ using the Gram-Schmidt orthogonalization process.</p>
</div>
<p>See the post &#8628;<br />
<a href="//yutsumura.com/the-inner-product-on-r2-induced-by-a-positive-definite-matrix-and-gram-schmidt-orthogonalization/" target="_blank">The Inner Product on $\R^2$ induced by a Positive Definite Matrix and Gram-Schmidt Orthogonalization</a><br />
for proofs.</p>
<p>The post <a href="https://yutsumura.com/a-symmetric-positive-definite-matrix-and-an-inner-product-on-a-vector-space/" target="_blank">A Symmetric Positive Definite Matrix and An Inner Product on a Vector Space</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/a-symmetric-positive-definite-matrix-and-an-inner-product-on-a-vector-space/feed/</wfw:commentRss>
		<slash:comments>1</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">4648</post-id>	</item>
	</channel>
</rss>
