<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	
	xmlns:georss="http://www.georss.org/georss"
	xmlns:geo="http://www.w3.org/2003/01/geo/wgs84_pos#"
	>

<channel>
	<title>positive definite matrix &#8211; Problems in Mathematics</title>
	<atom:link href="https://yutsumura.com/tag/positive-definite-matrix/feed/" rel="self" type="application/rss+xml" />
	<link>https://yutsumura.com</link>
	<description></description>
	<lastBuildDate>Sat, 09 Sep 2017 03:05:03 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=5.3.6</generator>

<image>
	<url>https://i2.wp.com/yutsumura.com/wp-content/uploads/2016/12/cropped-question-logo.jpg?fit=32%2C32&#038;ssl=1</url>
	<title>positive definite matrix &#8211; Problems in Mathematics</title>
	<link>https://yutsumura.com</link>
	<width>32</width>
	<height>32</height>
</image> 
<site xmlns="com-wordpress:feed-additions:1">114989322</site>	<item>
		<title>Prove $\mathbf{x}^{\trans}A\mathbf{x} \geq 0$ and determine those $\mathbf{x}$ such that $\mathbf{x}^{\trans}A\mathbf{x}=0$</title>
		<link>https://yutsumura.com/prove-mathbfxtransamathbfx-geq-0-and-determine-those-mathbfx-such-that-mathbfxtransamathbfx0/</link>
				<comments>https://yutsumura.com/prove-mathbfxtransamathbfx-geq-0-and-determine-those-mathbfx-such-that-mathbfxtransamathbfx0/#respond</comments>
				<pubDate>Fri, 08 Sep 2017 13:18:40 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[matrix product]]></category>
		<category><![CDATA[positive definite matrix]]></category>
		<category><![CDATA[positive semi-definite matrix]]></category>
		<category><![CDATA[vector]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=4844</guid>
				<description><![CDATA[<p>For each of the following matrices $A$, prove that $\mathbf{x}^{\trans}A\mathbf{x} \geq 0$ for all vectors $\mathbf{x}$ in $\R^2$. Also, determine those vectors $\mathbf{x}\in \R^2$ such that $\mathbf{x}^{\trans}A\mathbf{x}=0$. (a) $A=\begin{bmatrix} 4 &#038; 2\\ 2&#038; 1&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/prove-mathbfxtransamathbfx-geq-0-and-determine-those-mathbfx-such-that-mathbfxtransamathbfx0/" target="_blank">Prove $\mathbf{x}^{\trans}A\mathbf{x} \geq 0$ and determine those $\mathbf{x}$ such that $\mathbf{x}^{\trans}A\mathbf{x}=0$</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 559</h2>
<p>		For each of the following matrices $A$, prove that $\mathbf{x}^{\trans}A\mathbf{x} \geq 0$ for all vectors $\mathbf{x}$ in $\R^2$. Also, determine those vectors $\mathbf{x}\in \R^2$ such that $\mathbf{x}^{\trans}A\mathbf{x}=0$.</p>
<p><strong>(a)</strong> $A=\begin{bmatrix}<br />
		  4 &#038; 2\\<br />
		  2&#038; 1<br />
		\end{bmatrix}$.</p>
<p>&nbsp;<br />
<strong>(b)</strong> $A=\begin{bmatrix}<br />
		  2 &#038; 1\\<br />
		  1&#038; 3<br />
		\end{bmatrix}$.</p>
<p>&nbsp;<br />
<span id="more-4844"></span><br />

<h2> Proof. </h2>
<h3>(a) $A=\begin{bmatrix}<br />
		  4 &#038; 2\\<br />
		  2&#038; 1<br />
		\end{bmatrix}$.</h3>
<p>Let $\mathbf{x}=\begin{bmatrix}<br />
		  x \\<br />
		  y<br />
		\end{bmatrix}$ be a vector in $\R^2$. Then we have<br />
		\begin{align*}<br />
		\mathbf{x}^{\trans}A\mathbf{x}&#038;=\begin{bmatrix}<br />
		  x &#038; y<br />
		\end{bmatrix}\begin{bmatrix}<br />
		  4 &#038; 2\\<br />
		  2&#038; 1<br />
		\end{bmatrix}\begin{bmatrix}<br />
		  x \\<br />
		  y<br />
		\end{bmatrix}=\begin{bmatrix}<br />
		  x &#038; y<br />
		\end{bmatrix}\begin{bmatrix}<br />
		  4x+2y \\<br />
		  2x+y<br />
		\end{bmatrix}\\[6pt]
		&#038;=x(4x+2y)+y(2x+y)=4x^2+2xy+2xy+y^2\\<br />
		&#038;=4x^2+4xy+y^2\\<br />
		&#038;=(2x+y)^2 \geq 0.<br />
		\end{align*}<br />
		Thus we have $\mathbf{x}^{\trans}A\mathbf{x} \geq 0$.<br />
		Note that $\mathbf{x}^{\trans}A\mathbf{x}=0$ if and only if $2x+y=0$.<br />
		Thus those vectors $\mathbf{x}$ such that $\mathbf{x}^{\trans}A\mathbf{x}=0$ are<br />
		\[\mathbf{x}=\begin{bmatrix}<br />
		  x \\<br />
		  -2x<br />
		\end{bmatrix}=x\begin{bmatrix}<br />
		  1 \\<br />
		  -2<br />
		\end{bmatrix}\]
		for any real number $x$.</p>
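<p>As an illustrative numerical check (not part of the original proof), the identity $\mathbf{x}^{\trans}A\mathbf{x}=(2x+y)^2$ and the vanishing locus spanned by $(1, -2)^{\trans}$ can be verified with NumPy:</p>

```python
import numpy as np

# A from part (a); the quadratic form x^T A x should equal (2x + y)^2.
A = np.array([[4.0, 2.0], [2.0, 1.0]])

def quad(v):
    """Evaluate the quadratic form v^T A v."""
    return v @ A @ v

# Check the algebraic identity on random vectors.
rng = np.random.default_rng(0)
for _ in range(100):
    x, y = rng.normal(size=2)
    assert np.isclose(quad(np.array([x, y])), (2 * x + y) ** 2)

# Vectors on the line spanned by (1, -2) make the form vanish.
for t in (-3.0, 0.5, 2.0):
    assert np.isclose(quad(t * np.array([1.0, -2.0])), 0.0)
```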
<h3>(b) $A=\begin{bmatrix}<br />
		  2 &#038; 1\\<br />
		  1&#038; 3<br />
		\end{bmatrix}$.</h3>
<p> Let $\mathbf{x}=\begin{bmatrix}<br />
		  x \\<br />
		  y<br />
		\end{bmatrix}$ be a vector in $\R^2$.<br />
		We compute<br />
		\begin{align*}<br />
		\mathbf{x}^{\trans}A\mathbf{x}&#038;=\begin{bmatrix}<br />
		  x &#038; y<br />
		\end{bmatrix}<br />
		\begin{bmatrix}<br />
		  2 &#038; 1\\<br />
		  1&#038; 3<br />
		\end{bmatrix}<br />
		\begin{bmatrix}<br />
		  x \\<br />
		  y<br />
		\end{bmatrix}=<br />
		\begin{bmatrix}<br />
		  x &#038; y<br />
		\end{bmatrix}<br />
		\begin{bmatrix}<br />
		  2x+y \\<br />
		  x+3y<br />
		\end{bmatrix}\\[6pt]
		&#038;=x(2x+y)+y(x+3y)=2x^2+xy+xy+3y^2\\<br />
		&#038;=2x^2+2xy+3y^2\\<br />
		&#038;=x^2+(x+y)^2+2y^2 \geq 0. \tag{*}<br />
		\end{align*}<br />
		Thus we obtain $\mathbf{x}^{\trans}A\mathbf{x} \geq 0$.<br />
		It follows from (*) that $\mathbf{x}^{\trans}A\mathbf{x}=0$ if and only if<br />
		\[x^2+(x+y)^2+2y^2=0.\]
		Since each of $x^2, (x+y)^2, 2y^2$ is nonnegative, we must have $x=y=0$.<br />
		Therefore $\mathbf{x}^{\trans}A\mathbf{x}=0$ if and only if $\mathbf{x}=\mathbf{0}$.</p>
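<p>The decomposition (*) can likewise be checked numerically; this is only a sanity check of the algebra above, not part of the proof:</p>

```python
import numpy as np

# A from part (b); x^T A x should equal x^2 + (x + y)^2 + 2 y^2.
A = np.array([[2.0, 1.0], [1.0, 3.0]])

def quad(v):
    return v @ A @ v

# Verify the sum-of-squares decomposition on random vectors.
rng = np.random.default_rng(1)
for _ in range(100):
    x, y = rng.normal(size=2)
    assert np.isclose(quad(np.array([x, y])), x**2 + (x + y) ** 2 + 2 * y**2)

# The form vanishes only at the origin; nonzero vectors give positive values.
assert quad(np.zeros(2)) == 0.0
for v in (np.array([1.0, 0.0]), np.array([0.0, 1.0]), np.array([1.0, -1.0])):
    assert quad(v) > 0
```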
<button class="simplefavorite-button has-count" data-postid="4844" data-siteid="1" data-groupid="1" data-favoritecount="23" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">23</span></button><p>The post <a href="https://yutsumura.com/prove-mathbfxtransamathbfx-geq-0-and-determine-those-mathbfx-such-that-mathbfxtransamathbfx0/" target="_blank">Prove $\mathbf{x}^{\trans}A\mathbf{x} \geq 0$ and determine those $\mathbf{x}$ such that $\mathbf{x}^{\trans}A\mathbf{x}=0$</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/prove-mathbfxtransamathbfx-geq-0-and-determine-those-mathbfx-such-that-mathbfxtransamathbfx0/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">4844</post-id>	</item>
		<item>
		<title>The Inner Product on $\R^2$ induced by a Positive Definite Matrix and Gram-Schmidt Orthogonalization</title>
		<link>https://yutsumura.com/the-inner-product-on-r2-induced-by-a-positive-definite-matrix-and-gram-schmidt-orthogonalization/</link>
				<comments>https://yutsumura.com/the-inner-product-on-r2-induced-by-a-positive-definite-matrix-and-gram-schmidt-orthogonalization/#comments</comments>
				<pubDate>Wed, 16 Aug 2017 21:55:59 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[basis of a vector space]]></category>
		<category><![CDATA[Gram-Schmidt orthogonalization process]]></category>
		<category><![CDATA[Gram-Schmidt process]]></category>
		<category><![CDATA[inner product]]></category>
		<category><![CDATA[inner product space]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[orthogonal basis]]></category>
		<category><![CDATA[positive definite matrix]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=4653</guid>
				<description><![CDATA[<p>Consider the $2\times 2$ real matrix \[A=\begin{bmatrix} 1 &#038; 1\\ 1&#038; 3 \end{bmatrix}.\] (a) Prove that the matrix $A$ is positive definite. (b) Since $A$ is positive definite by part (a), the formula \[\langle&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/the-inner-product-on-r2-induced-by-a-positive-definite-matrix-and-gram-schmidt-orthogonalization/" target="_blank">The Inner Product on $\R^2$ induced by a Positive Definite Matrix and Gram-Schmidt Orthogonalization</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 539</h2>
<p>		   Consider the $2\times 2$ real matrix<br />
		   \[A=\begin{bmatrix}<br />
		  1 &#038; 1\\<br />
		  1&#038; 3<br />
		\end{bmatrix}.\]
<p><strong>(a)</strong> Prove that the matrix $A$ is positive definite.</p>
<p><strong>(b)</strong> Since $A$ is positive definite by part (a), the formula<br />
		\[\langle \mathbf{x}, \mathbf{y}\rangle:=\mathbf{x}^{\trans} A \mathbf{y}\]
		for $\mathbf{x}, \mathbf{y} \in \R^2$ defines an inner product on $\R^2$.<br />
		Consider $\R^2$ as an inner product space with this inner product.</p>
<p>		Prove that the unit vectors<br />
		\[\mathbf{e}_1=\begin{bmatrix}<br />
		  1 \\<br />
		  0<br />
		\end{bmatrix} \text{ and } \mathbf{e}_2=\begin{bmatrix}<br />
		  0 \\<br />
		  1<br />
		\end{bmatrix}\]
		are not orthogonal in the inner product space $\R^2$.</p>
<p><strong>(c)</strong> Find an orthogonal basis $\{\mathbf{v}_1, \mathbf{v}_2\}$ of $\R^2$ from the basis $\{\mathbf{e}_1, \mathbf{e}_2\}$ using the Gram-Schmidt orthogonalization process.</p>
<p>&nbsp;<br />
<span id="more-4653"></span><br />

<h2> Proof. </h2>
<h3>(a) Prove that the matrix $A$ is positive definite.</h3>
<p> We prove that for every nonzero vector $\mathbf{x}=\begin{bmatrix}<br />
		  x \\<br />
		  y<br />
		\end{bmatrix}\in \R^2$, we have $\mathbf{x}^{\trans} A \mathbf{x} > 0$.<br />
		We have<br />
		\begin{align*}<br />
		\mathbf{x}^{\trans} A \mathbf{x}&#038;=\begin{bmatrix}<br />
		  x &#038; y<br />
		\end{bmatrix} \begin{bmatrix}<br />
		  1 &#038; 1\\<br />
		  1&#038; 3<br />
		\end{bmatrix}\begin{bmatrix}<br />
		  x \\<br />
		  y<br />
		\end{bmatrix}=\begin{bmatrix}<br />
		  x &#038; y<br />
		\end{bmatrix}\begin{bmatrix}<br />
		  x+y \\<br />
		  x+3y<br />
		\end{bmatrix}\\[6pt]
		&#038;=x(x+y)+y(x+3y)=x^2+2xy+3y^2\\<br />
		&#038;=x^2+2xy+y^2+2y^2=(x+y)^2+2y^2.<br />
		\end{align*}</p>
<p>		Since $\mathbf{x}\neq \mathbf{0}$, at least one of $x, y$ is nonzero.<br />
		If $y\neq 0$, then $2y^2 > 0$; if $y=0$, then $x\neq 0$ and $(x+y)^2=x^2 > 0$.<br />
		Thus the last expression $(x+y)^2+2y^2$ is always positive.<br />
		Hence $A$ is a positive definite matrix.</p>
<h3>(b) Prove that $\mathbf{e}_1, \mathbf{e}_2$ are not orthogonal in the inner product space $\R^2$.</h3>
<p> Note that by the post &#8220;<a href="//yutsumura.com/a-symmetric-positive-definite-matrix-and-an-inner-product-on-a-vector-space/" target="_blank">A Symmetric Positive Definite Matrix and An Inner Product on a Vector Space</a>&#8221;, the formula $\langle \mathbf{x}, \mathbf{y}\rangle$ defines an inner product on $\R^2$.</p>
<p>		Two vectors $\mathbf{x}$ and $\mathbf{y}$ are said to be <strong>orthogonal</strong> if $\langle \mathbf{x}, \mathbf{y}\rangle=0$.</p>
<p>		The vectors $\mathbf{e}_1, \mathbf{e}_2$ are not orthogonal with this inner product since<br />
		\begin{align*}<br />
		\langle \mathbf{e}_1, \mathbf{e}_2\rangle=\begin{bmatrix}<br />
		  1 &#038; 0<br />
		\end{bmatrix}\begin{bmatrix}<br />
		  1 &#038; 1\\<br />
		  1&#038; 3<br />
		\end{bmatrix}\begin{bmatrix}<br />
		  0 \\<br />
		  1<br />
		\end{bmatrix}=\begin{bmatrix}<br />
		  1 &#038; 0<br />
		\end{bmatrix}\begin{bmatrix}<br />
		  1 \\<br />
		  3<br />
		\end{bmatrix}=1\neq 0.<br />
		\end{align*}</p>
<h3>(c) Find an orthogonal basis using the Gram-Schmidt orthogonalization process.</h3>
<p>By the Gram-Schmidt orthogonalization process, we have<br />
		\begin{align*}<br />
		\mathbf{v}_1&#038;=\mathbf{e}_1\\<br />
		\mathbf{v}_2&#038;=\mathbf{e}_2-\frac{\langle \mathbf{v}_1, \mathbf{e}_2 \rangle}{\langle \mathbf{v}_1, \mathbf{v}_1 \rangle}\mathbf{v}_1<br />
		=\mathbf{e}_2-\frac{\langle \mathbf{e}_1, \mathbf{e}_2 \rangle}{\langle \mathbf{e}_1, \mathbf{e}_1 \rangle}\mathbf{e}_1.<br />
		\end{align*}</p>
<p>		We compute<br />
		\begin{align*}<br />
		\langle \mathbf{e}_1, \mathbf{e}_1 \rangle=\begin{bmatrix}<br />
		  1 &#038; 0<br />
		\end{bmatrix}\begin{bmatrix}<br />
		  1 &#038; 1\\<br />
		  1&#038; 3<br />
		\end{bmatrix}\begin{bmatrix}<br />
		  1 \\<br />
		  0<br />
		\end{bmatrix}=\begin{bmatrix}<br />
		  1 &#038; 0<br />
		\end{bmatrix}\begin{bmatrix}<br />
		  1 \\<br />
		  1<br />
		\end{bmatrix}=1.<br />
		\end{align*}<br />
		We also have $\langle \mathbf{e}_1, \mathbf{e}_2\rangle=1$ from part (b).<br />
		Thus, we have<br />
		\begin{align*}<br />
		\mathbf{v}_2=\mathbf{e}_2-\mathbf{e}_1=\begin{bmatrix}<br />
		  -1 \\<br />
		  1<br />
		\end{bmatrix}.<br />
		\end{align*}<br />
		Hence the Gram-Schmidt orthogonalization process yields the orthogonal basis<br />
		\[\mathbf{v}_1=\begin{bmatrix}<br />
		  1 \\<br />
		  0<br />
		\end{bmatrix}, \mathbf{v}_2=\begin{bmatrix}<br />
		  -1 \\<br />
		  1<br />
		\end{bmatrix}.\]
<h4>Double Check</h4>
<p>		Let us verify that $\mathbf{v}_1, \mathbf{v}_2$ are orthogonal by computing their inner product directly as follows.<br />
		We have<br />
		\begin{align*}<br />
		\langle \mathbf{v}_1, \mathbf{v}_2\rangle=\mathbf{v}_1^{\trans} A\mathbf{v}_2=\begin{bmatrix}<br />
		  1 &#038; 0<br />
		\end{bmatrix}\begin{bmatrix}<br />
		  1 &#038; 1\\<br />
		  1&#038; 3<br />
		\end{bmatrix}\begin{bmatrix}<br />
		  -1 \\<br />
		  1<br />
		\end{bmatrix}=\begin{bmatrix}<br />
		  1 &#038; 0<br />
		\end{bmatrix}\begin{bmatrix}<br />
		  0 \\<br />
		  2<br />
		\end{bmatrix}=0.<br />
		\end{align*}</p>
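<p>For readers who want to experiment, the whole Gram-Schmidt computation under the induced inner product $\langle \mathbf{x}, \mathbf{y}\rangle=\mathbf{x}^{\trans}A\mathbf{y}$ can be reproduced numerically; this sketch simply mirrors the steps of the proof:</p>

```python
import numpy as np

# Inner product <x, y> = x^T A y induced by the positive definite matrix A.
A = np.array([[1.0, 1.0], [1.0, 3.0]])

def ip(x, y):
    return x @ A @ y

e1 = np.array([1.0, 0.0])
e2 = np.array([0.0, 1.0])

# e1 and e2 are not orthogonal in this inner product: <e1, e2> = 1.
assert ip(e1, e2) == 1.0

# One Gram-Schmidt step with respect to <, >.
v1 = e1
v2 = e2 - (ip(v1, e2) / ip(v1, v1)) * v1

assert np.allclose(v2, np.array([-1.0, 1.0]))
assert np.isclose(ip(v1, v2), 0.0)  # orthogonality in the A-inner product
```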
<button class="simplefavorite-button has-count" data-postid="4653" data-siteid="1" data-groupid="1" data-favoritecount="21" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">21</span></button><p>The post <a href="https://yutsumura.com/the-inner-product-on-r2-induced-by-a-positive-definite-matrix-and-gram-schmidt-orthogonalization/" target="_blank">The Inner Product on $\R^2$ induced by a Positive Definite Matrix and Gram-Schmidt Orthogonalization</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/the-inner-product-on-r2-induced-by-a-positive-definite-matrix-and-gram-schmidt-orthogonalization/feed/</wfw:commentRss>
		<slash:comments>1</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">4653</post-id>	</item>
		<item>
		<title>A Symmetric Positive Definite Matrix and An Inner Product on a Vector Space</title>
		<link>https://yutsumura.com/a-symmetric-positive-definite-matrix-and-an-inner-product-on-a-vector-space/</link>
				<comments>https://yutsumura.com/a-symmetric-positive-definite-matrix-and-an-inner-product-on-a-vector-space/#comments</comments>
				<pubDate>Wed, 16 Aug 2017 03:17:31 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[inner product]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[positive definite]]></category>
		<category><![CDATA[positive definite matrix]]></category>
		<category><![CDATA[symmetric matrix]]></category>
		<category><![CDATA[vector space]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=4648</guid>
				<description><![CDATA[<p>(a) Suppose that $A$ is an $n\times n$ real symmetric positive definite matrix. Prove that \[\langle \mathbf{x}, \mathbf{y}\rangle:=\mathbf{x}^{\trans}A\mathbf{y}\] defines an inner product on the vector space $\R^n$. (b) Let $A$ be an $n\times n$&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/a-symmetric-positive-definite-matrix-and-an-inner-product-on-a-vector-space/" target="_blank">A Symmetric Positive Definite Matrix and An Inner Product on a Vector Space</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 538</h2>
<p><strong>(a)</strong> Suppose that $A$ is an $n\times n$ real symmetric positive definite matrix.<br />
 Prove that<br />
			\[\langle \mathbf{x}, \mathbf{y}\rangle:=\mathbf{x}^{\trans}A\mathbf{y}\]
			defines an inner product on the vector space $\R^n$.</p>
<p><strong>(b)</strong> Let $A$ be an $n\times n$ real matrix. Suppose that<br />
			\[\langle \mathbf{x}, \mathbf{y}\rangle:=\mathbf{x}^{\trans}A\mathbf{y}\]
			defines an inner product on the vector space $\R^n$.</p>
<p>		    Prove that $A$ is symmetric and positive definite.</p>
<p>&nbsp;<br />
<span id="more-4648"></span><br />

<h2>Definitions.</h2>
<h3>Inner Product on a Real Vector Space</h3>
<p>Let $V$ be a real vector space. An <strong>inner product</strong> on $V$ is a function that assigns a real number $\langle \mathbf{u}, \mathbf{v}\rangle$ to each pair of vectors $\mathbf{u}$ and $\mathbf{v}$ in $V$ satisfying the following properties.<br />
	For any vectors $\mathbf{u}, \mathbf{v}, \mathbf{w}\in V$ and any real number $r\in \R$, </p>
<div style="padding: 16px; border: none 3px #4169e1; border-radius: 10px; background-color: #f0f8ff; margin-top: 30px; margin-bottom: 30px;">
<ul>
<li><strong>Symmetry</strong>  \[\langle \mathbf{u}, \mathbf{v}\rangle=\langle  \mathbf{v}, \mathbf{u}\rangle\]</li>
<li><strong>Linearity in the first argument</strong><br />
	\begin{align*}<br />
		\langle r\mathbf{u}, \mathbf{v}\rangle &#038;=r\langle \mathbf{u}, \mathbf{v}\rangle\\<br />
		\langle \mathbf{u}+ \mathbf{v}, \mathbf{w}\rangle &#038;=\langle \mathbf{u}, \mathbf{w}\rangle+ \langle \mathbf{v}, \mathbf{w}\rangle<br />
		\end{align*}</li>
<li><strong>Positive-Definiteness</strong><br />
			\begin{align*}<br />
		\langle \mathbf{u}, \mathbf{u}\rangle &#038;\geq 0 \\<br />
		\langle \mathbf{u}, \mathbf{u}\rangle &#038;=0 \text{ if and only if } \mathbf{u}=\mathbf{0}<br />
		\end{align*}</li>
</ul>
</div>
<h3>A Positive-Definite Matrix</h3>
<p>			A real symmetric $n\times n$ matrix $A$ is called <strong>positive definite</strong> if $\mathbf{x}^{\trans}A\mathbf{x} > 0$ for each nonzero vector $\mathbf{x}\in \R^n$.</p>
<h2> Proof. </h2>
<h3>(a) If $A$ is positive definite, then $\langle \mathbf{x}, \mathbf{y}\rangle:=\mathbf{x}^{\trans}A\mathbf{y}$ defines an inner product</h3>
<p> First of all, note that we can write<br />
				\[\langle \mathbf{x}, \mathbf{y}\rangle=\mathbf{x}^{\trans}A\mathbf{y}=\mathbf{x}\cdot (A\mathbf{y}),\]
				where the &#8220;dot&#8221; is the dot product of $\R^n$.<br />
				Thus, $\langle \mathbf{x}, \mathbf{y}\rangle$ is a real number.</p>
<hr />
<p>				We verify the three properties of an inner product.<br />
				Since the dot product is commutative, we have<br />
				\begin{align*}<br />
		\langle \mathbf{x}, \mathbf{y}\rangle&#038;=\mathbf{x}\cdot (A\mathbf{y})= (A\mathbf{y})\cdot \mathbf{x}\\<br />
		&#038;=(A\mathbf{y})^{\trans}\mathbf{x}=\mathbf{y}^{\trans}A^{\trans}\mathbf{x}\\<br />
		&#038;=\mathbf{y}^{\trans}A\mathbf{x} &#038;&#038;\text{since $A$ is symmetric}\\<br />
		&#038;=\langle \mathbf{y}, \mathbf{x} \rangle.<br />
		\end{align*}<br />
		Thus, the function $\langle\,,\,\rangle$ is symmetric.</p>
<hr />
<p>		Next, for any vectors $\mathbf{x}, \mathbf{y}, \mathbf{z}$ and any real number $r$, we have<br />
		\begin{align*}<br />
		\langle r\mathbf{x}, \mathbf{y}\rangle &#038;=(r\mathbf{x})^{\trans}A\mathbf{y}=r\mathbf{x}^{\trans}A\mathbf{y}=r\langle \mathbf{x}, \mathbf{y}\rangle<br />
		\end{align*}<br />
				and<br />
				\begin{align*}<br />
		\langle \mathbf{x}+\mathbf{y}, \mathbf{z}\rangle &#038;=(\mathbf{x}+\mathbf{y})^{\trans}A\mathbf{z}=(\mathbf{x}^{\trans}+\mathbf{y}^{\trans})A\mathbf{z}\\<br />
		&#038;=\mathbf{x}^{\trans}A\mathbf{z}+\mathbf{y}^{\trans}A\mathbf{z}=\langle \mathbf{x}, \mathbf{z}\rangle+\langle \mathbf{y}, \mathbf{z}\rangle.<br />
		\end{align*}<br />
		Thus, the linearity in the first argument is satisfied.</p>
<hr />
<p>		If $\mathbf{x}$ is a nonzero vector in $\R^n$, then we have<br />
		\begin{align*}<br />
		\langle \mathbf{x}, \mathbf{x}\rangle=\mathbf{x}^{\trans}A\mathbf{x} > 0<br />
		\end{align*}<br />
		since $A$ is positive definite.<br />
		We also have<br />
		\begin{align*}<br />
		\langle \mathbf{0}, \mathbf{0}\rangle=\mathbf{0}^{\trans}A\mathbf{0}=0.<br />
		\end{align*}<br />
		It follows that $\langle \mathbf{x}, \mathbf{x}\rangle \geq 0$ for any vector $\mathbf{x}\in \R^n$.</p>
<hr />
<p>		Suppose that $\langle \mathbf{x}, \mathbf{x}\rangle=0$.<br />
		Then we have<br />
		\begin{align*}<br />
		\langle \mathbf{x}, \mathbf{x}\rangle=\mathbf{x}^{\trans}A\mathbf{x}=0.<br />
		\end{align*}<br />
		Since $A$ is positive definite, this happens if and only if $\mathbf{x}=\mathbf{0}$.<br />
		Hence $\langle \mathbf{x}, \mathbf{x}\rangle=0$ if and only if $\mathbf{x}=\mathbf{0}$.<br />
		This proves the positive-definiteness of the function $\langle\,,\,\rangle$.</p>
<hr />
<p>		This completes the verification of the three properties, and hence $\langle \mathbf{x}, \mathbf{y}\rangle:=\mathbf{x}^{\trans}A\mathbf{y}$ defines an inner product on $\R^n$.</p>
<h3>(b) If $\langle \mathbf{x}, \mathbf{y}\rangle:=\mathbf{x}^{\trans}A\mathbf{y}$ defines an inner product, then $A$ is symmetric positive definite</h3>
<p> Suppose that $\langle \mathbf{x}, \mathbf{y}\rangle:=\mathbf{x}^{\trans}A\mathbf{y}$ is an inner product on $\R^n$.<br />
		Let us write $A=(a_{ij})$.</p>
<hr />
<p>		We first prove that $A$ is symmetric.<br />
		Let $\mathbf{e}_i$ denote the $i$-th standard unit vector, that is,<br />
		\[\mathbf{e}_i=\begin{bmatrix}<br />
		  0 \\<br />
		   \vdots \\<br />
		    1 \\<br />
		   \vdots \\<br />
		   0<br />
		   \end{bmatrix},\]
		   where the $1$ is in the $i$-th entry and all other entries are zero.</p>
<hr />
<p>				We compute<br />
		\begin{align*}<br />
		\langle \mathbf{e}_i, \mathbf{e}_j\rangle &#038;=\mathbf{e}^{\trans}_iA\mathbf{e}_j\\<br />
		&#038;=\begin{bmatrix}<br />
			0 &#038; \dots &#038; 1 &#038; \dots &#038; 0<br />
		\end{bmatrix}<br />
		\begin{bmatrix}<br />
		  a_{1 j} \\<br />
		   a_{2 j} \\<br />
		    \vdots \\<br />
		   a_{n j}<br />
		   \end{bmatrix}=a_{ij}.<br />
		\end{align*}<br />
		Similarly, $\langle \mathbf{e}_j, \mathbf{e}_i\rangle =a_{j i}$. By the symmetry of the inner product, it follows that<br />
		\[a_{ij}=\langle \mathbf{e}_i, \mathbf{e}_j\rangle =\langle \mathbf{e}_j, \mathbf{e}_i\rangle =a_{j i}.\]
		Therefore, the matrix $A$ is symmetric.</p>
<hr />
<p>		Next, we show that $A$ is positive definite.<br />
		Let $\mathbf{x}$ be a nonzero vector in $\R^n$.<br />
		Then we have<br />
		\[\mathbf{x}^{\trans}A\mathbf{x}=\langle \mathbf{x}, \mathbf{x}\rangle > 0\]
		by the positive-definiteness property of the inner product.<br />
		This proves that $A$ is a positive definite matrix.</p>
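<p>Both directions of the proof can be illustrated numerically. The sketch below builds a random symmetric positive definite matrix as $M^{\trans}M+I$ (a standard construction, chosen here only for illustration) and checks the entry identity $\langle \mathbf{e}_i, \mathbf{e}_j\rangle=a_{ij}$ together with the inner product axioms:</p>

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4

# Build a random symmetric positive definite matrix: M^T M + I.
M = rng.normal(size=(n, n))
A = M.T @ M + np.eye(n)

def ip(x, y):
    return x @ A @ y

# <e_i, e_j> recovers the (i, j) entry of A, as in the proof of part (b).
I = np.eye(n)
for i in range(n):
    for j in range(n):
        assert np.isclose(ip(I[i], I[j]), A[i, j])

# The inner product axioms hold numerically.
x, y, z = rng.normal(size=(3, n))
r = 1.7
assert np.isclose(ip(x, y), ip(y, x))                         # symmetry
assert np.isclose(ip(r * x + y, z), r * ip(x, z) + ip(y, z))  # linearity
assert ip(x, x) > 0                                           # positivity
```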
<h2> Related Question. </h2>
<p>A concrete example of a positive-definite matrix is given in the next problem.</p>
<div style="padding: 16px; border: none 3px #4169e1; border-radius: 10px; background-color: #f0f8ff; margin-top: 30px; margin-bottom: 30px;">
<strong>Problem</strong>.<br />
   Consider the $2\times 2$ real matrix<br />
		   \[A=\begin{bmatrix}<br />
		  1 &#038; 1\\<br />
		  1&#038; 3<br />
		\end{bmatrix}.\]
<p><strong>(a)</strong> Prove that the matrix $A$ is positive definite.</p>
<p><strong>(b)</strong> Since $A$ is positive definite by part (a), the formula<br />
		\[\langle \mathbf{x}, \mathbf{y}\rangle:=\mathbf{x}^{\trans} A \mathbf{y}\]
		for $\mathbf{x}, \mathbf{y} \in \R^2$ defines an inner product on $\R^2$.<br />
		Consider $\R^2$ as an inner product space with this inner product.</p>
<p>		Prove that the unit vectors<br />
		\[\mathbf{e}_1=\begin{bmatrix}<br />
		  1 \\<br />
		  0<br />
		\end{bmatrix} \text{ and } \mathbf{e}_2=\begin{bmatrix}<br />
		  0 \\<br />
		  1<br />
		\end{bmatrix}\]
		are not orthogonal in the inner product space $\R^2$.</p>
<p><strong>(c)</strong> Find an orthogonal basis $\{\mathbf{v}_1, \mathbf{v}_2\}$ of $\R^2$ from the basis $\{\mathbf{e}_1, \mathbf{e}_2\}$ using the Gram-Schmidt orthogonalization process.
</div>
<p>See the post &#8628;<br />
<a href="//yutsumura.com/the-inner-product-on-r2-induced-by-a-positive-definite-matrix-and-gram-schmidt-orthogonalization/" target="_blank">The Inner Product on $\R^2$ induced by a Positive Definite Matrix and Gram-Schmidt Orthogonalization</a><br />
for proofs.</p>
<button class="simplefavorite-button has-count" data-postid="4648" data-siteid="1" data-groupid="1" data-favoritecount="55" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">55</span></button><p>The post <a href="https://yutsumura.com/a-symmetric-positive-definite-matrix-and-an-inner-product-on-a-vector-space/" target="_blank">A Symmetric Positive Definite Matrix and An Inner Product on a Vector Space</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/a-symmetric-positive-definite-matrix-and-an-inner-product-on-a-vector-space/feed/</wfw:commentRss>
		<slash:comments>1</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">4648</post-id>	</item>
		<item>
		<title>A Positive Definite Matrix Has a Unique Positive Definite Square Root</title>
		<link>https://yutsumura.com/a-positive-definite-matrix-has-a-unique-positive-definite-square-root/</link>
				<comments>https://yutsumura.com/a-positive-definite-matrix-has-a-unique-positive-definite-square-root/#comments</comments>
				<pubDate>Wed, 19 Jul 2017 00:39:26 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[diagonalization of a matrix]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[positive definite matrix]]></category>
		<category><![CDATA[positive semi-definite matrix]]></category>
		<category><![CDATA[square root matrix]]></category>
		<category><![CDATA[square root of a matrix]]></category>
		<category><![CDATA[symmetric matrix]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=3855</guid>
				<description><![CDATA[<p>Prove that a positive definite matrix has a unique positive definite square root. &#160; In this post, we review several definitions (a square root of a matrix, a positive definite matrix) and solve the&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/a-positive-definite-matrix-has-a-unique-positive-definite-square-root/" target="_blank">A Positive Definite Matrix Has a Unique Positive Definite Square Root</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 514</h2>
<p>Prove that a positive definite matrix has a unique positive definite square root.</p>
<p>&nbsp;<br />
<span id="more-3855"></span><br />

<p>In this post, we review several definitions (a square root of a matrix, a positive definite matrix) and solve the above problem.</p>
<p>After the proof, several extra problems about square roots of a matrix are given.</p>
<h2>Definitions (Square Roots of a Matrix) </h2>
<p>Let $A$ be a square matrix. If there exists a matrix $B$ such that $B^2=A$, then we call $B$ a <strong>square root</strong> of the matrix $A$.</p>
<h3>Examples</h3>
<p>	For example, if $A=\begin{bmatrix}<br />
	  2 &#038; 2\\<br />
	  2&#038; 2<br />
	\end{bmatrix}$, then it is straightforward to see that<br />
	\[\begin{bmatrix}<br />
	  1 &#038; 1\\<br />
	  1&#038; 1<br />
	\end{bmatrix} \text{ and } \begin{bmatrix}<br />
	  -1 &#038; -1\\<br />
	  -1&#038; -1<br />
	\end{bmatrix}\]
	are square roots of $A$.<br />
		(The less trivial question is whether these are the only square roots of $A$.<br />
See the post &#8220;<a href="//yutsumura.com/find-all-the-square-roots-of-a-given-2-by-2-matrix/" target="_blank">Find All the Square Roots of a Given 2 by 2 Matrix</a>&#8221; .)</p>
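<p>The two example square roots above are easy to confirm directly; this quick check is only illustrative:</p>

```python
import numpy as np

A = np.array([[2.0, 2.0], [2.0, 2.0]])
B1 = np.array([[1.0, 1.0], [1.0, 1.0]])
B2 = -B1

# Both B1 and B2 square to A, so both are square roots of A.
assert np.allclose(B1 @ B1, A)
assert np.allclose(B2 @ B2, A)
```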
<p>	Some matrices do not have a square root at all.<br />
	For example, the matrix $A=\begin{bmatrix}<br />
	  0 &#038; 1\\<br />
	  0&#038; 0<br />
	\end{bmatrix}$ does not have a square root matrix.<br />
 (See the post &#8220;<a href="//yutsumura.com/noinfinitely-many-square-roots-of-2-by-2-matrices/" target="_blank">No/Infinitely Many Square Roots of 2 by 2 Matrices</a>&#8221; Part (a).)</p>
<p>	On the other hand, some matrices have infinitely many square roots.<br />
	For example, the $2\times 2$ identity matrix has infinitely many distinct square roots.<br />
	 (See the post &#8220;<a href="//yutsumura.com/noinfinitely-many-square-roots-of-2-by-2-matrices/" target="_blank">No/Infinitely Many Square Roots of 2 by 2 Matrices</a>&#8221; Part (b).)</p>
<h2>Analogy with Positive Real Number </h2>
<p>	For a positive real number $a$, there are two square roots $\pm \sqrt{a}$.<br />
	Here $\sqrt{a}$ is the unique positive number whose square is $a$.</p>
<p>The corresponding notion of positive number in matrices is <strong>positive definite</strong>.</p>
<h3>Definition (Positive Definite Matrix)</h3>
<p>An $n\times n$ real symmetric matrix $A$ is said to be <strong>positive definite</strong> if $\mathbf{v}^{\trans}A\mathbf{v}$ is positive for every nonzero vector $\mathbf{v}\in \R^n$.</p>
<p>We will use the following fact in the proof.</p>
<div style="padding: 16px; border: none 3px #4169e1; border-radius: 10px; background-color: #f0f8ff; margin-top: 30px; margin-bottom: 30px;">
<strong>Fact</strong>.<br />
 A real symmetric matrix $A$ is positive definite if and only if the eigenvalues of $A$ are all positive.</div>
<p>For a proof of this fact, see the post &#8220;<a href="//yutsumura.com/positive-definite-real-symmetric-matrix-and-its-eigenvalues/" target="_blank">Positive definite Real Symmetric Matrix and its Eigenvalues</a>&#8221;.</p>
<h3>Problem</h3>
<p>Just like a positive real number $a$ has a unique positive square root $\sqrt{a}$, we can prove the following (which is the problem of this post).</p>
<div style="padding: 16px; border: none 3px #4169e1; border-radius: 10px; background-color: #f0f8ff; margin-top: 30px; margin-bottom: 30px;">
<strong>Problem</strong>.<br />
 A positive definite matrix has a unique positive definite square root.
</div>
<h2> Proof. </h2>
<p>	Let $A$ be a positive definite $n\times n$ matrix.</p>
<h3><strong>Existence</strong> of a Square Root</h3>
<p>We first show that there exists a positive definite matrix $B$ such that $B^2=A$.</p>
<p>	Let $\lambda_1, \dots, \lambda_n$ be eigenvalues of $A$.<br />
		Since $A$ is a real symmetric matrix, it is diagonalizable by an orthogonal matrix $S$.<br />
	That is, we have<br />
	\[S^{\trans}AS=D,\]
	where $D$ is the diagonal matrix<br />
	\[D=\begin{bmatrix}<br />
	  \lambda_1 &#038; 0 &#038; 0 &#038;   0 \\<br />
	  0 &#038;\lambda_2 &#038;  0 &#038; 0  \\<br />
	  \vdots &#038; \cdots &#038; \ddots &#038; \vdots \\<br />
	  0 &#038; 0 &#038; \cdots &#038; \lambda_n<br />
	\end{bmatrix}.\]
	(Note that $S^{-1}=S^{\trans}$ since $S$ is orthogonal.)</p>
<p>	Recall that the eigenvalues of a positive definite matrix are positive real numbers by <strong>Fact</strong>.<br />
	So eigenvalues $\lambda_i$ of $A$ are positive real numbers.<br />
	Hence $\sqrt{\lambda_i}$ is a positive real number for $i=1, \dots, n$.</p>
<p>	Define the matrix<br />
	\[B=SD&#8217;S^{\trans},\]
	where $D&#8217;$ is the diagonal matrix<br />
	\[D&#8217;=\begin{bmatrix}<br />
	  \sqrt{\lambda_1} &#038; 0 &#038; \cdots &#038; 0 \\<br />
	  0 &#038;\sqrt{\lambda_2} &#038; \cdots &#038; 0  \\<br />
	  \vdots &#038; \vdots &#038; \ddots &#038; \vdots \\<br />
	  0 &#038; 0 &#038; \cdots &#038; \sqrt{\lambda_n}<br />
	\end{bmatrix}.\]</p>
<p>	Then $B$ is a symmetric matrix because<br />
	\[B^{\trans}=(SD&#8217;S^{\trans})^{\trans}=(S^{\trans})^{\trans}D&#8217;^{\trans}S^{\trans}=SD&#8217;S^{\trans}=B.\]</p>
<p>	Since $B=SD&#8217;S^{\trans}$ with $S$ orthogonal, the matrix $B$ is similar to $D&#8217;$, and hence the eigenvalues of $B$ are the positive numbers $\sqrt{\lambda_1}, \dots, \sqrt{\lambda_n}$.<br />
	Thus $B$ is a symmetric matrix with positive eigenvalues, and hence $B$ is positive definite.</p>
<p>	The matrix $B$ is a square root of $A$ since we have<br />
	\begin{align*}<br />
	B^2=(SD&#8217;S^{\trans})(SD&#8217;S^{\trans})=SD&#8217;^2S^{\trans}=SDS^{\trans}=A.<br />
	\end{align*}</p>
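<p>As a quick numerical sanity check of this construction (an illustration added here, not part of the original proof; the $2\times 2$ matrix and the helper functions are assumptions chosen for the sketch), we can build $B=SD&#8217;S^{\trans}$ from an explicit eigendecomposition and verify that $B^2=A$:</p>

```python
from math import sqrt, isclose

def matmul(X, Y):
    # Product of two 2x2 matrices given as nested lists.
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def transpose(X):
    return [[X[j][i] for i in range(2)] for j in range(2)]

# A is symmetric with eigenvalues 3 and 1 (both positive), hence positive
# definite.  Its unit eigenvectors (1,1)/sqrt(2) and (1,-1)/sqrt(2) form
# the columns of the orthogonal matrix S.
A = [[2.0, 1.0], [1.0, 2.0]]
r = 1 / sqrt(2)
S = [[r, r], [r, -r]]
D_prime = [[sqrt(3.0), 0.0], [0.0, sqrt(1.0)]]  # D' = diag(sqrt(lambda_i))

# B = S D' S^T is the positive definite square root from the proof.
B = matmul(matmul(S, D_prime), transpose(S))
B_squared = matmul(B, B)

# Check that B^2 = A entrywise and that B is symmetric.
assert all(isclose(B_squared[i][j], A[i][j]) for i in range(2) for j in range(2))
assert isclose(B[0][1], B[1][0])
```

<p>Here $B=\frac{1}{2}\begin{bmatrix} \sqrt{3}+1 &#038; \sqrt{3}-1 \\ \sqrt{3}-1 &#038; \sqrt{3}+1\end{bmatrix}$, which is symmetric with positive eigenvalues $\sqrt{3}$ and $1$.</p>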
<h3> <strong>Uniqueness</strong> of a Square Root</h3>
<p>Now we prove the uniqueness of the square root.<br />
	Suppose that $C$ is another positive definite square root of the matrix $A$.</p>
<p>	Since $C$ is a real symmetric matrix, there is an orthogonal matrix $P$ such that<br />
	\[P^{\trans}CP=T.\]
	Here $T$ is the diagonal matrix<br />
	\[T=\begin{bmatrix}<br />
		  \mu_1 &#038; 0 &#038; \cdots &#038; 0 \\<br />
		  0 &#038;\mu_2 &#038; \cdots &#038; 0  \\<br />
		  \vdots &#038; \vdots &#038; \ddots &#038; \vdots \\<br />
		  0 &#038; 0 &#038; \cdots &#038; \mu_n<br />
		\end{bmatrix},\]
	where $\mu_1, \dots, \mu_n$ are eigenvalues of $C$.</p>
<p>	Since $C^2=A$, we have<br />
	\begin{align*}<br />
	P^{\trans}AP&#038;=P^{\trans}C^2P=(P^{\trans}CP)^2=T^2\\[6pt]
	&#038;=\begin{bmatrix}<br />
		  \mu_1^2 &#038; 0 &#038; \cdots &#038; 0 \\<br />
		  0 &#038;\mu_2^2 &#038; \cdots &#038; 0  \\<br />
		  \vdots &#038; \vdots &#038; \ddots &#038; \vdots \\<br />
		  0 &#038; 0 &#038; \cdots &#038; \mu_n^2<br />
		\end{bmatrix}.<br />
	\end{align*}<br />
	Thus, the matrix $P$ diagonalizes $A$, and it follows that $\mu_1^2, \dots, \mu_n^2$ are equal to $\lambda_1, \dots, \lambda_n$ up to permutation.<br />
	Hence we may modify $P$ (by interchanging column vectors) and assume without loss of generality that $\mu_i^2=\lambda_i$ for $i=1, \dots, n$.<br />
	Since the eigenvalues $\mu_i$ of the positive definite matrix $C$ are positive, this gives $\mu_i=\sqrt{\lambda_i}$, and thus $P^{\trans}CP=D&#8217;$, or equivalently,<br />
	\[ C=PD&#8217;P^{\trans}.\]</p>
<p>	Since $B^2=A=C^2$, we have<br />
	\begin{align*}<br />
	&#038;(SD&#8217;S^{\trans})^2=(PD&#8217;P^{\trans})^2\\<br />
	&#038;\Leftrightarrow  SD&#8217;^2S^{\trans}=PD&#8217;^2P^{\trans}\\<br />
	&#038;\Leftrightarrow SDS^{\trans}=PDP^{\trans}\\<br />
	&#038;\Leftrightarrow (P^{\trans}S) D=D (P^{\trans}S).<br />
	\end{align*}</p>
<p>		Let $Q:=P^{\trans}S$. Then the last equality reads $QD=DQ$; that is, $Q$ commutes with $D$.</p>
<p>		Without loss of generality (after grouping repeated eigenvalues together), we may assume that the matrix $D$ is the block diagonal matrix<br />
		\[<br />
	D=<br />
	\left[\begin{array}{c|c|c|c}<br />
	  \lambda_1 I_1 &#038; 0 &#038;\cdots  &#038;0\\<br />
	  \hline<br />
	  0 &#038; \lambda_2 I_2 &#038; \cdots &#038; 0\\<br />
	  \hline<br />
	  \vdots &#038; \vdots &#038; \ddots&#038; \vdots\\<br />
	  \hline<br />
	  0 &#038; 0 &#038; \cdots &#038; \lambda_k I_k<br />
	\end{array}<br />
	\right] ,<br />
	\]
		where $\lambda_1, \dots, \lambda_k$ are the distinct eigenvalues of $A$ and each $I_j$ is an identity matrix. (These $\lambda_j$ are a relabeling of the previous eigenvalues $\lambda_i$. The size of the identity matrix $I_j$ is the algebraic multiplicity of $\lambda_j$.)</p>
<p>		Express the matrix $Q$ as a block matrix with the same partition as $D$, and write it as<br />
		\[Q=\left[\begin{array}{c|c|c|c}<br />
	  Q_{11} &#038; Q_{12} &#038;\cdots  &#038;Q_{1 k}\\<br />
	  \hline<br />
	  Q_{21} &#038; Q_{22}&#038; \cdots &#038; Q_{2k}\\<br />
	  \hline<br />
	  \vdots &#038; \vdots &#038; \ddots&#038; \vdots\\<br />
	  \hline<br />
	  Q_{k1} &#038; Q_{k2} &#038; \cdots &#038; Q_{k k}<br />
	\end{array}<br />
	\right].<br />
	\]</p>
<p>	Comparing the $(i,j)$-block of both sides of $QD=DQ$ yields<br />
	\[\lambda_j Q_{ij}=\lambda_i Q_{ij}.\]</p>
<p>	Since the eigenvalues $\lambda_1, \dots, \lambda_k$ are distinct, it follows that $Q_{ij}$ is the zero matrix whenever $i\neq j$.<br />
	Thus, $Q$ is the block diagonal matrix<br />
		\[Q=\left[\begin{array}{c|c|c|c}<br />
	  Q_{11} &#038; 0 &#038;\cdots  &#038;0\\<br />
	  \hline<br />
	  0 &#038; Q_{22}&#038; \cdots &#038; 0\\<br />
	  \hline<br />
	  \vdots &#038; \vdots &#038; \ddots&#038; \vdots\\<br />
	  \hline<br />
	  0 &#038; 0 &#038; \cdots &#038; Q_{k k}<br />
	\end{array}<br />
	\right].<br />
	\]
	Note that $D&#8217;$ is also a block diagonal matrix with the same partition; its $j$-th diagonal block is $\sqrt{\lambda_j}\, I_j$.<br />
	It follows that<br />
	\begin{align*}<br />
	QD&#8217;&#038;=\left[\begin{array}{c|c|c|c}<br />
	  Q_{11} &#038; 0 &#038;\cdots  &#038;0\\<br />
	  \hline<br />
	  0 &#038; Q_{22}&#038; \cdots &#038; 0\\<br />
	  \hline<br />
	  \vdots &#038; \vdots &#038; \ddots&#038; \vdots\\<br />
	  \hline<br />
	  0 &#038; 0 &#038; \cdots &#038; Q_{k k}<br />
	\end{array}<br />
	\right]
	\left[\begin{array}{c|c|c|c}<br />
	  \sqrt{\lambda_1} I_1 &#038; 0 &#038;\cdots  &#038;0\\<br />
	  \hline<br />
	  0 &#038; \sqrt{\lambda_2} I_2 &#038; \cdots &#038; 0\\<br />
	  \hline<br />
	  \vdots &#038; \vdots &#038; \ddots&#038; \vdots\\<br />
	  \hline<br />
	  0 &#038; 0 &#038; \cdots &#038; \sqrt{\lambda_k} I_k<br />
	\end{array}<br />
	\right]\\[6pt]
	&#038;=\left[\begin{array}{c|c|c|c}<br />
	  \sqrt{\lambda_1} Q_{11} &#038; 0 &#038;\cdots  &#038;0\\<br />
	  \hline<br />
	  0 &#038; \sqrt{\lambda_2} Q_{22} &#038; \cdots &#038; 0\\<br />
	  \hline<br />
	  \vdots &#038; \vdots &#038; \ddots&#038; \vdots\\<br />
	  \hline<br />
	  0 &#038; 0 &#038; \cdots &#038; \sqrt{\lambda_k} Q_{kk}<br />
	\end{array}<br />
	\right]\\[6pt]
	&#038;=\left[\begin{array}{c|c|c|c}<br />
	  \sqrt{\lambda_1} I_1 &#038; 0 &#038;\cdots  &#038;0\\<br />
	  \hline<br />
	  0 &#038; \sqrt{\lambda_2} I_2 &#038; \cdots &#038; 0\\<br />
	  \hline<br />
	  \vdots &#038; \vdots &#038; \ddots&#038; \vdots\\<br />
	  \hline<br />
	  0 &#038; 0 &#038; \cdots &#038; \sqrt{\lambda_k} I_k<br />
	\end{array}<br />
	\right]
	\left[\begin{array}{c|c|c|c}<br />
	  Q_{11} &#038; 0 &#038;\cdots  &#038;0\\<br />
	  \hline<br />
	  0 &#038; Q_{22}&#038; \cdots &#038; 0\\<br />
	  \hline<br />
	  \vdots &#038; \vdots &#038; \ddots&#038; \vdots\\<br />
	  \hline<br />
	  0 &#038; 0 &#038; \cdots &#038; Q_{k k}<br />
	\end{array}<br />
	\right]
	=D&#8217;Q.<br />
	\end{align*}<br />
	It yields that<br />
	\[D&#8217;=Q^{\trans}D&#8217;Q\]
	since $Q=P^{\trans}S$ is an orthogonal matrix.</p>
<p>		Then we obtain<br />
		\begin{align*}<br />
	B&#038;=SD&#8217;S^{\trans}=S(Q^{\trans}D&#8217;Q)S^{\trans}\\<br />
	&#038;=SS^{\trans} PD&#8217; P^{\trans} S S^{\trans}\\<br />
	&#038;=PD&#8217;P^{\trans}=C.<br />
	\end{align*}</p>
<p>	Therefore, any positive definite square root of $A$ must be equal to the matrix $B$.<br />
	This completes the proof of the uniqueness, and hence the proof of the problem.</p>
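<p>The uniqueness statement can also be checked numerically. The sketch below (an illustration added here, not part of the original proof; the $2\times 2$ matrix and the helper functions are assumptions chosen for the example) builds $SD&#8217;S^{\trans}$ from two different orthogonal diagonalizations of the same matrix $A$ and confirms that both produce the same square root:</p>

```python
from math import sqrt

def matmul(X, Y):
    # Product of two 2x2 matrices given as nested lists.
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def transpose(X):
    return [[X[j][i] for i in range(2)] for j in range(2)]

def sqrt_from(S, eigs):
    # Build S D' S^T where D' = diag(sqrt(eigenvalues)).
    D_prime = [[sqrt(eigs[0]), 0.0], [0.0, sqrt(eigs[1])]]
    return matmul(matmul(S, D_prime), transpose(S))

# A = [[2, 1], [1, 2]] has eigenvalues 3 and 1.  The two orthogonal
# matrices below both diagonalize A: the second reorders the eigenpairs
# and flips the sign of one eigenvector.
r = 1 / sqrt(2)
B1 = sqrt_from([[r, r], [r, -r]], (3.0, 1.0))
B2 = sqrt_from([[-r, r], [r, r]], (1.0, 3.0))

# Both diagonalizations yield the same positive definite square root.
assert all(abs(B1[i][j] - B2[i][j]) < 1e-12
           for i in range(2) for j in range(2))
```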
<h2> Remark. </h2>
<p>	The statement of the problem remains true if &#8220;positive definite&#8221; is replaced by &#8220;positive semi-definite&#8221;.</p>
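<p>For instance (a hypothetical example added for illustration, not from the original post), the same construction applies verbatim to a singular positive semi-definite matrix, where one of the square-rooted eigenvalues is $\sqrt{0}=0$:</p>

```python
from math import sqrt

def matmul(X, Y):
    # Product of two 2x2 matrices given as nested lists.
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# A = [[1, 1], [1, 1]] is positive semi-definite but not definite:
# its eigenvalues are 2 and 0.  The construction B = S D' S^T still
# works, taking sqrt(0) = 0 on the diagonal of D'.
A = [[1.0, 1.0], [1.0, 1.0]]
r = 1 / sqrt(2)
S = [[r, r], [r, -r]]          # orthonormal eigenvectors of A as columns
S_t = [[S[j][i] for i in range(2)] for j in range(2)]
D_prime = [[sqrt(2.0), 0.0], [0.0, 0.0]]

B = matmul(matmul(S, D_prime), S_t)
B_squared = matmul(B, B)

# B^2 recovers A even though A is singular.
assert all(abs(B_squared[i][j] - A[i][j]) < 1e-12
           for i in range(2) for j in range(2))
```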
<h2> Related Questions About Square Roots of a Matrix</h2>
<p>If you want to solve more problems about square roots of a matrix, then try the following problems.</p>
<div style="padding: 16px; border: none 3px #4169e1; border-radius: 10px; background-color: #f0f8ff; margin-top: 30px; margin-bottom: 30px;">
<strong>Problem</strong>. Does there exist  a $3 \times 3$ real matrix $B$ such that $B^2=A$ where<br />
\[A=\begin{bmatrix}<br />
1 &amp; -1 &amp; 0 \\<br />
-1 &amp;2 &amp;-1 \\<br />
0 &amp; -1 &amp; 1<br />
\end{bmatrix}\,\,\,\,?\]
</div>
<p>This is part of Princeton University&#8217;s Linear Algebra exam problems.<br />
See the post &#8628;<br />
<a href="//yutsumura.com/a-square-root-matrix-of-a-symmetric-matrix/" target="_blank">A Square Root Matrix of a Symmetric Matrix</a><br />
for a solution.</p>
<div style="padding: 16px; border: none 3px #4169e1; border-radius: 10px; background-color: #f0f8ff; margin-top: 30px; margin-bottom: 30px;">
<strong>Problem</strong>. Find a square root of the matrix<br />
\[A=\begin{bmatrix}<br />
  1 &#038; 3 &#038; -3 \\<br />
   0 &#038;4 &#038;5 \\<br />
   0 &#038; 0 &#038; 9<br />
\end{bmatrix}.\]
How many square roots does this matrix have?
</div>
<p>This is one of Berkeley&#8217;s qualifying exam problems.<br />
See the post &#8628;<br />
<a href="//yutsumura.com/square-root-of-a-diagonal-matrix-how-many-square-roots-exist/" target="_blank">Square Root of an Upper Triangular Matrix. How Many Square Roots Exist?</a><br />
for a solution.</p>
<p>The post <a href="https://yutsumura.com/a-positive-definite-matrix-has-a-unique-positive-definite-square-root/" target="_blank">A Positive Definite Matrix Has a Unique Positive Definite Square Root</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/a-positive-definite-matrix-has-a-unique-positive-definite-square-root/feed/</wfw:commentRss>
		<slash:comments>6</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">3855</post-id>	</item>
	</channel>
</rss>
