<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	
	xmlns:georss="http://www.georss.org/georss"
	xmlns:geo="http://www.w3.org/2003/01/geo/wgs84_pos#"
	>

<channel>
	<title>Vandermonde matrix &#8211; Problems in Mathematics</title>
	<atom:link href="https://yutsumura.com/tag/vandermonde-matrix/feed/" rel="self" type="application/rss+xml" />
	<link>https://yutsumura.com</link>
	<description></description>
	<lastBuildDate>Sun, 19 Nov 2017 03:19:14 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=5.3.6</generator>

<image>
	<url>https://i2.wp.com/yutsumura.com/wp-content/uploads/2016/12/cropped-question-logo.jpg?fit=32%2C32&#038;ssl=1</url>
	<title>Vandermonde matrix &#8211; Problems in Mathematics</title>
	<link>https://yutsumura.com</link>
	<width>32</width>
	<height>32</height>
</image> 
<site xmlns="com-wordpress:feed-additions:1">114989322</site>	<item>
		<title>Exponential Functions Form a Basis of a Vector Space</title>
		<link>https://yutsumura.com/exponential-functions-form-a-basis-of-a-vector-space/</link>
				<comments>https://yutsumura.com/exponential-functions-form-a-basis-of-a-vector-space/#comments</comments>
				<pubDate>Fri, 20 Oct 2017 22:28:13 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[basis for a vector space]]></category>
		<category><![CDATA[derivative]]></category>
		<category><![CDATA[exponential function]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[linearly independent]]></category>
		<category><![CDATA[subspace]]></category>
		<category><![CDATA[Vandermonde determinant]]></category>
		<category><![CDATA[Vandermonde matrix]]></category>
		<category><![CDATA[vector space]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=5143</guid>
				<description><![CDATA[<p>Let $C[-1, 1]$ be the vector space over $\R$ of all continuous functions defined on the interval $[-1, 1]$. Let \[V:=\{f(x)\in C[-1,1] \mid f(x)=a e^x+b e^{2x}+c e^{3x}, a, b, c\in \R\}\] be a subset&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/exponential-functions-form-a-basis-of-a-vector-space/" target="_blank">Exponential Functions Form a Basis of a Vector Space</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 590</h2>
<p>		Let $C[-1, 1]$ be the vector space over $\R$ of all continuous functions defined on the interval $[-1, 1]$. Let<br />
		\[V:=\{f(x)\in C[-1,1] \mid f(x)=a e^x+b e^{2x}+c e^{3x}, a, b, c\in \R\}\]
		be a subset in $C[-1, 1]$.</p>
<p><strong>(a)</strong> Prove that $V$ is a subspace of $C[-1, 1]$.</p>
<p><strong>(b)</strong> Prove that the set $B=\{e^x, e^{2x}, e^{3x}\}$ is a basis of $V$.</p>
<p><strong>(c)</strong> Prove that<br />
		\[B&#8217;=\{e^x-2e^{3x}, e^x+e^{2x}+2e^{3x}, 3e^{2x}+e^{3x}\}\]
		is a basis for $V$.</p>
<p>&nbsp;<br />
<span id="more-5143"></span><br />

<h2> Proof. </h2>
<h3>(a) Prove that $V$ is a subspace of $C[-1, 1]$.</h3>
<p>Note that each function in the subset $V$ is a linear combination of the functions $e^x, e^{2x}, e^{3x}$.<br />
			Namely, we have<br />
			\[V=\Span\{e^x, e^{2x}, e^{3x}\}\]
			and we know that the span is always a subspace. Hence $V$ is a subspace of $C[-1,1]$.</p>
<h3>(b) Prove that the set $B=\{e^x, e^{2x}, e^{3x}\}$ is a basis of $V$.</h3>
<p> We noted in part (a) that $V=\Span(B)$. So it suffices to show that $B$ is linearly independent.<br />
			Consider the linear combination<br />
			\[c_1e^x+c_2 e^{2x}+c_3 e^{3x}=\theta(x),\]
			where $\theta(x)$ is the zero function (the zero vector in $V$).<br />
			Taking the derivative, we get<br />
			\[c_1e^x+2c_2 e^{2x}+3c_3 e^{3x}=\theta(x).\]
			Taking the derivative again, we obtain<br />
			\[c_1e^x+4c_2 e^{2x}+9c_3 e^{3x}=\theta(x).\]
<p>			Evaluating at $x=0$, we obtain the system of linear equations<br />
			\begin{align*}<br />
		c_1+c_2+c_3&#038;=0\\<br />
		c_1+2c_2+3c_3&#038;=0\\<br />
		c_1+4c_2+9c_3&#038;=0.<br />
		\end{align*}</p>
<hr />
<p>		We reduce the augmented matrix for this system as follows:<br />
		\begin{align*}<br />
		 \left[\begin{array}{rrr|r}<br />
		 1 &#038; 1 &#038; 1 &#038;   0 \\<br />
		  1 &#038;2 &#038;  3 &#038; 0  \\<br />
		  1 &#038; 4 &#038; 9 &#038; 0<br />
		    \end{array} \right]
		    \xrightarrow[R_3-R_1]{R_2-R_1}<br />
		     \left[\begin{array}{rrr|r}<br />
		 1 &#038; 1 &#038; 1 &#038;   0 \\<br />
		  0 &#038;1 &#038;  2 &#038; 0  \\<br />
		  0 &#038; 3 &#038; 8 &#038; 0<br />
		    \end{array} \right]
		    \xrightarrow[R_3-3R_2]{R_1-R_2}\\[6pt]
		     \left[\begin{array}{rrr|r}<br />
		 1 &#038; 0 &#038; -1 &#038;   0 \\<br />
		  0 &#038;1 &#038;  2 &#038; 0  \\<br />
		  0 &#038; 0 &#038; 2 &#038; 0<br />
		    \end{array} \right]
		    \xrightarrow{\frac{1}{2}R_3}<br />
		     \left[\begin{array}{rrr|r}<br />
		 1 &#038; 0 &#038; -1 &#038;   0 \\<br />
		  0 &#038;1 &#038;  2 &#038; 0  \\<br />
		  0 &#038; 0 &#038; 1 &#038; 0<br />
		    \end{array} \right]
		    \xrightarrow[R_2-2R_3]{R_1+R_3}<br />
		     \left[\begin{array}{rrr|r}<br />
		 1 &#038; 0 &#038; 0 &#038;   0 \\<br />
		  0 &#038;1 &#038;  0 &#038; 0  \\<br />
		  0 &#038; 0 &#038; 1 &#038; 0<br />
		    \end{array} \right].<br />
		\end{align*}<br />
		It follows that the solution of the system is $c_1=c_2=c_3=0$.<br />
		Hence the set $B$ is linearly independent, and thus $B$ is a basis for $V$.</p>
<h4>Another approach.</h4>
<p>		Alternatively, we can show that the coefficient matrix is nonsingular by using the Vandermonde determinant formula as follows.<br />
		Observe that the coefficient matrix of the system is a Vandermonde matrix:<br />
		\[A:=\begin{bmatrix}<br />
		  1 &#038; 1 &#038; 1 \\<br />
		   1 &#038;2 &#038;3 \\<br />
		   1^2 &#038; 2^2 &#038; 3^2<br />
		\end{bmatrix}.\]
		The Vandermonde determinant formula yields that<br />
		\[\det(A)=(3-1)(3-2)(2-1)=2\neq 0.\]
		Hence the coefficient matrix $A$ is nonsingular.<br />
		Thus we obtain the solution $c_1=c_2=c_3=0$.</p>
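<p>As a quick numerical sanity check (a NumPy sketch, not part of the original solution), we can confirm that this Vandermonde matrix has determinant $2$ and that the homogeneous system has only the trivial solution:</p>

```python
import numpy as np

# Coefficient matrix obtained by evaluating the function and its first two
# derivatives at x = 0; it is the Vandermonde matrix on the nodes 1, 2, 3.
A = np.array([[1, 1, 1],
              [1, 2, 3],
              [1, 4, 9]], dtype=float)

det_A = np.linalg.det(A)   # Vandermonde formula gives (2-1)(3-1)(3-2) = 2

# A nonsingular coefficient matrix forces c_1 = c_2 = c_3 = 0.
solution = np.linalg.solve(A, np.zeros(3))
```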
<h3>(c) Prove that $B&#8217;=\{e^x-2e^{3x}, e^x+e^{2x}+2e^{3x}, 3e^{2x}+e^{3x}\}$ is a basis for $V$.</h3>
<p> We consider the coordinate vectors of vectors in $B&#8217;$ with respect to the basis $B$.<br />
		The coordinate vectors with respect to basis $B$ are<br />
		\[[e^x-2e^{3x}]_B=\begin{bmatrix}<br />
		  1 \\<br />
		   0 \\<br />
		    -2<br />
		  \end{bmatrix}, [e^x+e^{2x}+2e^{3x}]_B=\begin{bmatrix}<br />
		  1 \\<br />
		   1 \\<br />
		    2<br />
		  \end{bmatrix}, [3e^{2x}+e^{3x}]_B=\begin{bmatrix}<br />
		  0 \\<br />
		   3 \\<br />
		    1<br />
		  \end{bmatrix}.\]
		  Let $\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3$ be these vectors and let $T=\{\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3\}$.<br />
		  Then we know that $B&#8217;$ is a basis for $V$ if and only if $T$ is a basis for $\R^3$.</p>
<hr />
<p>		  We claim that $T$ is linearly independent.<br />
		Consider the matrix whose column vectors are $\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3$:<br />
		\begin{align*}<br />
		\begin{bmatrix}<br />
		  1 &#038; 1 &#038; 0 \\<br />
		   0 &#038;1 &#038;3 \\<br />
		   -2 &#038; 2 &#038; 1<br />
		\end{bmatrix}<br />
		\xrightarrow{R_3+2R_1}<br />
		\begin{bmatrix}<br />
		  1 &#038; 1 &#038; 0 \\<br />
		   0 &#038;1 &#038;3 \\<br />
		   0 &#038; 4 &#038; 1<br />
		\end{bmatrix}<br />
		\xrightarrow[R_3-4R_2]{R_1-R_2}\\[6pt]
		\begin{bmatrix}<br />
		  1 &#038; 0 &#038; -3 \\<br />
		   0 &#038;1 &#038;3 \\<br />
		   0 &#038; 0 &#038; -11<br />
		\end{bmatrix}<br />
		\xrightarrow{-\frac{1}{11}R_3}<br />
		\begin{bmatrix}<br />
		  1 &#038; 0 &#038; -3 \\<br />
		   0 &#038;1 &#038;3 \\<br />
		   0 &#038; 0 &#038; 1<br />
		\end{bmatrix}<br />
		\xrightarrow[R_2-3R_3]{R_1+3R_3}<br />
		\begin{bmatrix}<br />
		  1 &#038; 0 &#038; 0 \\<br />
		   0 &#038;1 &#038;0 \\<br />
		   0 &#038; 0 &#038; 1<br />
		\end{bmatrix}.<br />
		\end{align*}</p>
<hr />
<p>		Thus, the matrix is nonsingular and hence the column vectors $\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3$ are linearly independent.<br />
		As $T$ consists of three linearly independent vectors in the three-dimensional vector space $\R^3$, we conclude that $T$ is a basis for $\R^3$.<br />
		Therefore, by the correspondence of the coordinates, we see that $B&#8217;$ is a basis for $V$.</p>
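<p>The nonsingularity of the coordinate matrix can also be double-checked numerically; the following NumPy sketch (an addition, not part of the original argument) computes its determinant:</p>

```python
import numpy as np

# Columns are the coordinate vectors of the functions in B' with respect
# to the basis B = {e^x, e^{2x}, e^{3x}}.
M = np.array([[ 1, 1, 0],
              [ 0, 1, 3],
              [-2, 2, 1]], dtype=float)

det_M = np.linalg.det(M)   # nonzero, so the columns are linearly independent
```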
<h2> Related Question. </h2>
<p>If you know the Wronskian, then you may use the Wronskian to prove that the exponential functions $e^x, e^{2x}, e^{3x}$ are linearly independent.</p>
<p>See the post<br />
<a href="//yutsumura.com/using-the-wronskian-for-exponential-functions-determine-whether-the-set-is-linearly-independent/" rel="noopener" target="_blank">Using the Wronskian for Exponential Functions, Determine Whether the Set is Linearly Independent</a> for the details.</p>
<hr />
<p>Try the following, more general question.</p>
<div style="padding: 16px; border: none 3px #4169e1; border-radius: 10px; background-color: #f0f8ff; margin-top: 30px; margin-bottom: 30px;">
<strong>Problem</strong>.<br />
Let $c_1, c_2,\dots, c_n$ be mutually distinct real numbers.</p>
<p>Show that exponential functions<br />
\[e^{c_1x}, e^{c_2x}, \dots, e^{c_nx}\]
are linearly independent over $\R$.
</p></div>
<p>The solution is given in the post &#8628;<br />
<a href="//yutsumura.com/exponential-functions-are-linearly-independent/" rel="noopener" target="_blank">Exponential Functions are Linearly Independent</a></p>
<button class="simplefavorite-button has-count" data-postid="5143" data-siteid="1" data-groupid="1" data-favoritecount="29" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">29</span></button><p>The post <a href="https://yutsumura.com/exponential-functions-form-a-basis-of-a-vector-space/" target="_blank">Exponential Functions Form a Basis of a Vector Space</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/exponential-functions-form-a-basis-of-a-vector-space/feed/</wfw:commentRss>
		<slash:comments>1</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">5143</post-id>	</item>
		<item>
		<title>Determinant of a General Circulant Matrix</title>
		<link>https://yutsumura.com/determinant-of-a-general-circulant-matrix/</link>
				<comments>https://yutsumura.com/determinant-of-a-general-circulant-matrix/#comments</comments>
				<pubDate>Tue, 11 Apr 2017 04:18:28 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[circulant matrix]]></category>
		<category><![CDATA[determinant]]></category>
		<category><![CDATA[determinant of a matrix]]></category>
		<category><![CDATA[eigenvalue]]></category>
		<category><![CDATA[eigenvector]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[linearly independent]]></category>
		<category><![CDATA[root of unity]]></category>
		<category><![CDATA[Vandermonde determinant]]></category>
		<category><![CDATA[Vandermonde matrix]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=2662</guid>
				<description><![CDATA[<p>Let \[A=\begin{bmatrix} a_0 &#038; a_1 &#038; \dots &#038; a_{n-2} &#038;a_{n-1} \\ a_{n-1} &#038; a_0 &#038; \dots &#038; a_{n-3} &#038; a_{n-2} \\ a_{n-2} &#038; a_{n-1} &#038; \dots &#038; a_{n-4} &#038; a_{n-3} \\ \vdots &#038; \vdots&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/determinant-of-a-general-circulant-matrix/" target="_blank">Determinant of a General Circulant Matrix</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 374</h2>
<p> Let \[A=\begin{bmatrix}<br />
	  a_0 &#038; a_1 &#038; \dots &#038; a_{n-2} &#038;a_{n-1} \\<br />
	  a_{n-1} &#038; a_0 &#038; \dots &#038; a_{n-3} &#038; a_{n-2} \\<br />
	  a_{n-2} &#038; a_{n-1} &#038; \dots &#038; a_{n-4} &#038; a_{n-3} \\<br />
	  \vdots &#038; \vdots &#038; \dots &#038; \vdots &#038; \vdots \\<br />
	  a_{2} &#038; a_3 &#038; \dots &#038; a_{0} &#038; a_{1}\\<br />
	   a_{1} &#038; a_2 &#038; \dots &#038; a_{n-1} &#038; a_{0}<br />
	  \end{bmatrix}\]
	  be a complex $n \times n$ matrix.<br />
	  Such a matrix is called a <strong>circulant</strong> matrix.<br />
	  Then prove that the determinant of the circulant matrix $A$ is given by<br />
	  \[\det(A)=\prod_{k=0}^{n-1}(a_0+a_1\zeta^k+a_2 \zeta^{2k}+\cdots+a_{n-1}\zeta^{k(n-1)}),\]
	  where $\zeta=e^{2 \pi i/n}$ is a primitive $n$-th root of unity.</p>
<p>&nbsp;<br />
<span id="more-2662"></span></p>
<h2> Proof. </h2>
<p>	  	Let $\omega$ be any $n$-th root of unity.<br />
	  	Consider the vector<br />
	  	\[\mathbf{v}=\begin{bmatrix}<br />
	  1 \\<br />
	   \omega \\<br />
	    \omega^2 \\<br />
	   \vdots \\<br />
	   \omega^{n-1}<br />
	   \end{bmatrix}.\]
	   We show that the vector $\mathbf{v}$ is an eigenvector of $A$.<br />
	   We compute<br />
	   \begin{align*}<br />
	A\mathbf{v}=\<br />
	\begin{bmatrix}<br />
	  a_0 &#038; a_1 &#038; \dots &#038; a_{n-2} &#038;a_{n-1} \\<br />
	  a_{n-1} &#038; a_0 &#038; \dots &#038; a_{n-3} &#038; a_{n-2} \\<br />
	  a_{n-2} &#038; a_{n-1} &#038; \dots &#038; a_{n-4} &#038; a_{n-3} \\<br />
	  \vdots &#038; \vdots &#038; \dots &#038; \vdots &#038; \vdots \\<br />
	  a_{2} &#038; a_3 &#038; \dots &#038; a_{0} &#038; a_{1}\\<br />
	   a_{1} &#038; a_2 &#038; \dots &#038; a_{n-1} &#038; a_{0}<br />
	  \end{bmatrix}<br />
	  \begin{bmatrix}<br />
	  1 \\<br />
	   \omega \\<br />
	    \omega^2 \\<br />
	   \vdots \\<br />
	   \omega^{n-1}\\<br />
	   \end{bmatrix}.<br />
	\end{align*}</p>
<p>	The first entry of the vector  $A\mathbf{v}$ is<br />
	\[a_0+a_1\omega+a_2\omega^2+\cdots + a_{n-2}\omega^{n-2}+a_{n-1}\omega^{n-1}=:\lambda.\]
	We define $\lambda$ to be this number.</p>
<hr />
<p>	The second entry is<br />
	\begin{align*}<br />
	&#038;a_{n-1}+a_0\omega+\cdots+a_{n-3}\omega^{n-2}+a_{n-2}\omega^{n-1}\\<br />
	&#038;=(a_{n-1}\omega^{n-1}+a_0+\cdots+a_{n-3}\omega^{n-3}+a_{n-2}\omega^{n-2})\omega \\<br />
	&#038;=(a_0+\cdots+a_{n-3}\omega^{n-3}+a_{n-2}\omega^{n-2}+a_{n-1}\omega^{n-1})\omega \\<br />
	&#038;=\lambda \omega.<br />
	\end{align*}</p>
<hr />
<p>	Similarly the $i$-th entry of the vector $A\mathbf{v}$ is<br />
	\begin{align*}<br />
	&#038;a_{n-i+1}+a_{n-i+2}\omega +\cdots+ a_{n-i}\omega^{n-1}\\<br />
	&#038;= (a_{n-i+1}\omega^{n-i+1}+a_{n-i+2}\omega^{n-i+2} +\cdots+ a_{n-i}\omega^{n-i})\omega^{i-1}\\<br />
	&#038;=\lambda \omega^{i-1} .<br />
	\end{align*}<br />
	Therefore we obtain<br />
	\[A\mathbf{v}=\begin{bmatrix}<br />
	  \lambda \\<br />
	   \lambda\omega \\<br />
	   \lambda \omega^2 \\<br />
	   \vdots \\<br />
	   \lambda\omega^{n-1}<br />
	   \end{bmatrix}=\lambda \mathbf{v}.\]
<hr />
<p>	   Since $\mathbf{v}$ is a nonzero vector, it follows that $\lambda$ is an eigenvalue of $A$ and $\mathbf{v}$ is an eigenvector corresponding to $\lambda$.</p>
<hr />
<p>	   The above argument holds for any $n$-th root of unity $\omega$.<br />
	   We take $\omega=\zeta^k$, where $k$ runs from $0$ to $n-1$.<br />
	   It follows that the vector<br />
	   \[\mathbf{v}_k:=\begin{bmatrix}<br />
	  1 \\<br />
	   \zeta^k \\<br />
	    \zeta^{2k} \\<br />
	   \vdots \\<br />
	   \zeta^{k(n-1)}<br />
	   \end{bmatrix}\]
	   is an eigenvector corresponding to the eigenvalue<br />
	   \[\lambda_k:=a_0+a_1\zeta^k+a_2\zeta^{2k}+\cdots + a_{n-2}\zeta^{k(n-2)}+a_{n-1}\zeta^{k(n-1)}\]
	   for each $k=0,1, \dots, n-1$.</p>
<hr />
<p>	   We claim that the vectors $\mathbf{v}_k$ are linearly independent.</p>
<p>	   To see this, form a matrix whose column vectors are these vectors. That is, we consider<br />
	   \[B=\begin{bmatrix}<br />
	 1&#038; 1 &#038; 1 &#038; \dots &#038; 1 &#038;1 \\<br />
	  1&#038;\zeta &#038; \zeta^{2} &#038; \dots &#038; \zeta^{n-2} &#038; \zeta^{n-1} \\<br />
	  1&#038;\zeta^{2} &#038; \zeta^{4} &#038; \dots &#038; \zeta^{2(n-2)} &#038; \zeta^{2(n-1)} \\<br />
	 1&#038; \zeta^{3} &#038; \zeta^{6} &#038; \dots &#038; \zeta^{3(n-2)} &#038; \zeta^{3(n-1)} \\<br />
	 \vdots&#038; \vdots &#038; \vdots  &#038; \dots &#038; \vdots &#038; \vdots \\<br />
	 1&#038; \zeta^{n-1} &#038; \zeta^{2(n-1)} &#038; \dots &#038; \zeta^{(n-1)(n-2)} &#038;\zeta^{(n-1)(n-1)}<br />
	  \end{bmatrix}.\]
<hr />
<p>	  This is a Vandermonde matrix and its determinant is<br />
	  \[\det(B)=\prod_{i < j }(\zeta^j-\zeta^i)\neq 0.\]
	  Thus, the matrix $B$ is nonsingular, hence its column vectors are linearly independent.
	  
	  Since $A$ therefore has $n$ linearly independent eigenvectors, the numbers $\lambda_k$, $k=0, 1, \dots, n-1$, counted with multiplicity, are all the eigenvalues of $A$.
	  
	  Since the determinant is the product of all the eigenvalues of $A$, we have
	  \begin{align*}
	\det(A)&#038;=\prod_{k=0}^{n-1}\lambda_k\\
	&#038;=\prod_{k=0}^{n-1}(a_0+a_1\zeta^k+a_2 \zeta^{2k}+\cdots+a_{n-1}\zeta^{k(n-1)}),
	\end{align*}
	as required. This completes the proof.
</p>
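<p>The determinant formula can be checked numerically for a small random circulant matrix; the sketch below (NumPy, added for illustration and not part of the original post) compares both sides:</p>

```python
import numpy as np

# A random 5x5 circulant matrix with first row a_0, ..., a_{n-1};
# A[i][j] = a_{(j-i) mod n}, matching the matrix in the problem.
n = 5
rng = np.random.default_rng(0)
a = rng.standard_normal(n)
A = np.array([[a[(j - i) % n] for j in range(n)] for i in range(n)])

zeta = np.exp(2j * np.pi / n)           # primitive n-th root of unity
eigenvalues = [sum(a[j] * zeta**(k * j) for j in range(n))
               for k in range(n)]        # lambda_k from the proof
det_formula = np.prod(eigenvalues)       # product of the eigenvalues
det_direct = np.linalg.det(A)
```

Since $A$ is real, the product of eigenvalues is real up to floating-point error.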
<button class="simplefavorite-button has-count" data-postid="2662" data-siteid="1" data-groupid="1" data-favoritecount="23" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">23</span></button><p>The post <a href="https://yutsumura.com/determinant-of-a-general-circulant-matrix/" target="_blank">Determinant of a General Circulant Matrix</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/determinant-of-a-general-circulant-matrix/feed/</wfw:commentRss>
		<slash:comments>6</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">2662</post-id>	</item>
		<item>
		<title>Exponential Functions are Linearly Independent</title>
		<link>https://yutsumura.com/exponential-functions-are-linearly-independent/</link>
				<comments>https://yutsumura.com/exponential-functions-are-linearly-independent/#comments</comments>
				<pubDate>Thu, 18 Aug 2016 03:56:47 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[calculus]]></category>
		<category><![CDATA[exponential function]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[linear independent]]></category>
		<category><![CDATA[Vandermonde determinant]]></category>
		<category><![CDATA[Vandermonde matrix]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=576</guid>
				<description><![CDATA[<p>Let $c_1, c_2,\dots, c_n$ be mutually distinct real numbers. Show that exponential functions \[e^{c_1x}, e^{c_2x}, \dots, e^{c_nx}\] are linearly independent over $\R$. Hint. Consider a linear combination \[a_1 e^{c_1 x}+a_2 e^{c_2x}+\cdots + a_ne^{c_nx}=0.\] Differentiate this&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/exponential-functions-are-linearly-independent/" target="_blank">Exponential Functions are Linearly Independent</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2>Problem 73</h2>
<p>Let $c_1, c_2,\dots, c_n$ be mutually distinct real numbers.</p>
<p>Show that exponential functions<br />
\[e^{c_1x}, e^{c_2x}, \dots, e^{c_nx}\]
are linearly independent over $\R$.</p>
<p><span id="more-576"></span><br />

<h2>Hint.</h2>
<ol>
<li>Consider a linear combination \[a_1 e^{c_1 x}+a_2 e^{c_2x}+\cdots + a_ne^{c_nx}=0.\]</li>
<li>Differentiate this equality $n-1$ times and you will get $n$ equations.</li>
<li>Write a matrix equation for the system. You will see the Vandermonde matrix.</li>
</ol>
<h2>Proof.</h2>
<p>Suppose that we have a linear combination of these functions that is zero.<br />
Namely, suppose we have<br />
\[a_1 e^{c_1 x}+a_2 e^{c_2x}+\cdots + a_ne^{c_nx}=0\]
for some real numbers $a_1, a_2, \dots, a_n$.</p>
<p>We want to show that the coefficients $a_1, a_2, \dots, a_n$ are all zero.</p>
<hr />
<p>By differentiating the equation, we obtain<br />
\[a_1c_1e^{c_1x}+a_2c_2e^{c_2x}+\cdots +a_n c_n e^{c_n x}=0.\]
Differentiating repeatedly we further obtain the equalities<br />
\begin{align*}<br />
&amp; a_1c_1^2e^{c_1x}+a_2c_2^2e^{c_2x}+\cdots +a_n c_n^2 e^{c_n x}=0\\<br />
&amp; a_1c_1^3e^{c_1x}+a_2c_2^3e^{c_2x}+\cdots +a_n c_n^3 e^{c_n x}=0\\<br />
&amp; \dots \\<br />
&amp; a_1c_1^{n-1}e^{c_1x}+a_2c_2^{n-1}e^{c_2x}+\cdots +a_n c_n^{n-1}<br />
e^{c_n x}=0\\<br />
\end{align*}</p>
<p>We rewrite these $n$ equations into the following matrix equation.<br />
\[\begin{bmatrix}<br />
1 &amp; 1 &amp; \dots &amp;1 \\<br />
c_1 &amp; c_2 &amp; \dots &amp; c_n \\[3pt]
c_1^2 &amp; c_2^2 &amp; \dots &amp; c_n^2 \\[3pt]
\vdots &amp; \vdots &amp; \vdots &amp; \vdots \\[3pt]
c_1^{n-1} &amp; c_2^{n-1} &amp; \dots &amp; c_n^{n-1}<br />
\end{bmatrix}<br />
\begin{bmatrix}<br />
a_1 e^{c_1x} \\<br />
a_2 e^{c_2x} \\<br />
\vdots \\<br />
a_n e^{c_nx}<br />
\end{bmatrix}=\begin{bmatrix}<br />
0 \\<br />
\vdots \\<br />
0<br />
\end{bmatrix} \tag{*}<br />
\]
<hr />
<p>The determinant of the left matrix is<br />
\[\det \begin{bmatrix}<br />
1 &amp; 1 &amp; \dots &amp;1 \\<br />
c_1 &amp; c_2 &amp; \dots &amp; c_n \\[3pt]
c_1^2 &amp; c_2^2 &amp; \dots &amp; c_n^2 \\[3pt]
\vdots &amp; \vdots &amp; \vdots &amp; \vdots \\[3pt]
c_1^{n-1} &amp; c_2^{n-1} &amp; \dots &amp; c_n^{n-1}<br />
\end{bmatrix}=\prod_{i&lt;j}(c_j-c_i)\]
by the Vandermonde determinant.</p>
<p>Since by assumption $c_1,\dots, c_n$ are distinct, the determinant is not zero.<br />
Therefore by multiplying equality (*) by the inverse on the left, we obtain<br />
\[\begin{bmatrix}<br />
a_1 e^{c_1x} \\<br />
a_2 e^{c_2x} \\<br />
\vdots \\<br />
a_n e^{c_nx}<br />
\end{bmatrix}=\begin{bmatrix}<br />
0 \\<br />
\vdots \\<br />
0<br />
\end{bmatrix}. \]
Since the functions $e^{c_i x}$ are always positive, we must have $a_1=a_2=\cdots=a_n=0$ as required.<br />
Therefore the functions $e^{c_1 x}, \dots, e^{c_n x}$ are linearly independent.</p>
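<p>For a concrete instance (a NumPy sketch with sample distinct exponents, added for illustration), the Vandermonde determinant indeed equals the product of differences and is nonzero:</p>

```python
import numpy as np
from itertools import combinations

c = np.array([0.5, 1.0, 2.0, 3.5])    # sample mutually distinct exponents
n = len(c)

# Matrix of the system obtained by differentiating n-1 times:
# row i holds the powers c_1^i, ..., c_n^i.
V = np.vander(c, increasing=True).T

det_V = np.linalg.det(V)
product = np.prod([c[j] - c[i] for i, j in combinations(range(n), 2)])
```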
<h2>Comment.</h2>
<p>The determinant considered above is, up to the positive factor $e^{(c_1+c_2+\cdots+c_n)x}$, the Wronskian for the set of functions $\{e^{c_1x}, e^{c_2x}, \dots, e^{c_nx}\}$.</p>
<h2> Related Question. </h2>
<p>The following problems are more concrete versions of the current problem.</p>
<div style="padding: 16px; border: none 3px #4169e1; border-radius: 10px; background-color: #f0f8ff; margin-top: 30px; margin-bottom: 30px;">
<strong>Problem</strong>.<br />
Let $C[-1, 1]$ be the vector space over $\R$ of all continuous functions defined on the interval $[-1, 1]$. Let<br />
		\[V:=\{f(x)\in C[-1,1] \mid f(x)=a e^x+b e^{2x}+c e^{3x}, a, b, c\in \R\}\]
		be a subset in $C[-1, 1]$.</p>
<p><strong>(a)</strong> Prove that $V$ is a subspace of $C[-1, 1]$.<br />
<strong>(b)</strong> Prove that the set $B=\{e^x, e^{2x}, e^{3x}\}$ is a basis of $V$.<br />
<strong>(c)</strong> Prove that<br />
		\[B&#8217;=\{e^x-2e^{3x}, e^x+e^{2x}+2e^{3x}, 3e^{2x}+e^{3x}\}\]
		is a basis for $V$.
</div>
<p>See the post &#8628;<br />
<a href="//yutsumura.com/exponential-functions-form-a-basis-of-a-vector-space/" rel="noopener" target="_blank">Exponential Functions Form a Basis of a Vector Space</a><br />
for the solution.</p>
<div style="padding: 16px; border: none 3px #4169e1; border-radius: 10px; background-color: #f0f8ff; margin-top: 30px; margin-bottom: 30px;">
By calculating the Wronskian, determine whether the set of exponential functions<br />
	\[\{e^x, e^{2x}, e^{3x}\}\]
	is linearly independent on the interval $[-1, 1]$.</div>
<p>The solution is given in the post &#8628;<br />
<a href="//yutsumura.com/using-the-wronskian-for-exponential-functions-determine-whether-the-set-is-linearly-independent/" target="_blank">Using the Wronskian for Exponential Functions, Determine Whether the Set is Linearly Independent</a></p>
<button class="simplefavorite-button has-count" data-postid="576" data-siteid="1" data-groupid="1" data-favoritecount="25" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">25</span></button><p>The post <a href="https://yutsumura.com/exponential-functions-are-linearly-independent/" target="_blank">Exponential Functions are Linearly Independent</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/exponential-functions-are-linearly-independent/feed/</wfw:commentRss>
		<slash:comments>1</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">576</post-id>	</item>
		<item>
		<title>If Every Trace of a Power of a Matrix is Zero, then the Matrix is Nilpotent</title>
		<link>https://yutsumura.com/if-every-trace-of-a-power-of-a-matrix-is-zero-then-the-matrix-is-nilpotent/</link>
				<comments>https://yutsumura.com/if-every-trace-of-a-power-of-a-matrix-is-zero-then-the-matrix-is-nilpotent/#respond</comments>
				<pubDate>Tue, 26 Jul 2016 22:10:00 +0000</pubDate>
		<dc:creator><![CDATA[Yu]]></dc:creator>
				<category><![CDATA[Linear Algebra]]></category>
		<category><![CDATA[eigenvalue]]></category>
		<category><![CDATA[Jordan canonical form]]></category>
		<category><![CDATA[linear algebra]]></category>
		<category><![CDATA[matrix]]></category>
		<category><![CDATA[nilpotent matrix]]></category>
		<category><![CDATA[system of linear equations]]></category>
		<category><![CDATA[trace of a matrix]]></category>
		<category><![CDATA[upper triangular matrix]]></category>
		<category><![CDATA[Vandermonde matrix]]></category>

		<guid isPermaLink="false">https://yutsumura.com/?p=181</guid>
				<description><![CDATA[<p>Let $A$ be an $n \times n$ matrix such that $\tr(A^n)=0$ for all $n \in \N$. Then prove that $A$ is a nilpotent matrix. Namely there exist a positive integer $m$ such that $A^m$&#46;&#46;&#46;</p>
<p>The post <a href="https://yutsumura.com/if-every-trace-of-a-power-of-a-matrix-is-zero-then-the-matrix-is-nilpotent/" target="_blank">If Every Trace of a Power of a Matrix is Zero, then the Matrix is Nilpotent</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></description>
								<content:encoded><![CDATA[<h2> Problem 21</h2>
<p>Let $A$ be an $n \times n$ matrix such that $\tr(A^n)=0$ for all $n \in \N$.<br />
Then prove that $A$ is a nilpotent matrix. Namely, there exists a positive integer $m$ such that $A^m$ is the zero matrix.</p>
<p><span id="more-181"></span><br />

</p>
<h2> Steps. </h2>
<ol>
<li>Use the Jordan canonical form of the matrix $A$.</li>
<li>We want to show that all eigenvalues are zero. (Review <a href="//yutsumura.com/nilpotent-matrix-and-eigenvalues-of-the-matrix/">Nilpotent matrix and eigenvalues of the matrix</a>)</li>
<li>Seeking a contradiction, assume some eigenvalues are not zero.</li>
<li>Using  $\tr(A^n)=0$, create a system of linear equations.</li>
<li>Calculate the determinant of the coefficient matrix of the system using the Vandermonde matrix formula.</li>
<li>Find a contradiction.</li>
</ol>
<h2> Proof. </h2>
<p>We first want to prove that all the eigenvalues of $A$ must be zero.<br />
Seeking a contradiction, assume that some of the eigenvalues of $A$ are not zero.<br />
So assume that $\lambda_i$, $i=1, \dots, r$, are the distinct nonzero eigenvalues of $A$ and each $m_i \geq 1$ is the multiplicity of $\lambda_i$.</p>
<p>We use the Jordan canonical form of the matrix $A$.<br />
 There exists an invertible matrix $S$ such that $S^{-1}AS=T$, where $T$ is an upper triangular matrix. The diagonal entries of $T$ are the eigenvalues of $A$.</p>
<p>Then we have for any positive integer $n$,<br />
\begin{align*}<br />
0 &amp;= \tr(A^n)=\tr((STS^{-1})^n)=\tr(ST^n S^{-1})=\tr(T^{n}) \\<br />
&amp;=m_1 \lambda_1^n +m_2 \lambda_2^n+ \cdots + m_r \lambda_r^n.<br />
\end{align*}</p>
<p>Note that since $T$ is an upper triangular matrix, the nonzero diagonal entries of $T^n$ are $\lambda_i^n$ appearing $m_i$ times.<br />
Changing $n$ from $1$ to $r$, we obtain the following system of linear equations. (Think of the $m_i$ as variables.)<br />
\begin{align*}<br />
m_1 \lambda_1^1 +m_2 \lambda_2^1+ \cdots + m_r \lambda_r^1 &amp;=0 \\<br />
m_1 \lambda_1^2 +m_2 \lambda_2^2+ \cdots + m_r \lambda_r^2 &amp;=0 \\<br />
&amp; \vdots \\<br />
m_1 \lambda_1^r +m_2 \lambda_2^r+ \cdots + m_r \lambda_r^r &amp;=0 \\<br />
\end{align*}<br />
Equivalently, we have the matrix equation<br />
\begin{align*}<br />
\begin{bmatrix}<br />
\lambda_1^1 &amp; \lambda_2^1 &amp; \cdots &amp; \lambda_r^1 \\<br />
\lambda_1^2 &amp; \lambda_2^2&amp; \cdots &amp; \lambda_r^2 \\<br />
\vdots &amp; \vdots &amp; \vdots &amp; \vdots \\<br />
\lambda_1^r &amp; \lambda_2^r&amp;\cdots &amp; \lambda_r^r \\<br />
\end{bmatrix}<br />
\begin{bmatrix}<br />
m_1 \\<br />
m_2 \\<br />
\vdots \\<br />
m_r<br />
\end{bmatrix}<br />
=<br />
\begin{bmatrix}\tag{*}<br />
0 \\<br />
0 \\<br />
\vdots \\<br />
0<br />
\end{bmatrix}.<br />
\end{align*}</p>
<p>Let $B$ denote the matrix above whose entries are powers of eigenvalues $\lambda_i$.<br />
We calculate the determinant of $B$.<br />
\begin{align*}<br />
\det(B)&amp;=\lambda_1 \lambda_2 \cdots \lambda_r<br />
\det\begin{bmatrix}<br />
1 &amp; 1 &amp; \cdots &amp; 1 \\<br />
\lambda_1 &amp; \lambda_2&amp; \cdots &amp; \lambda_r \\<br />
\vdots &amp; \vdots &amp; \vdots &amp; \vdots \\<br />
\lambda_1^{r-1} &amp; \lambda_2^{r-1}&amp;\cdots &amp; \lambda_r^{r-1} \\<br />
\end{bmatrix} \\[6pt]
&amp;=\lambda_1 \lambda_2 \cdots \lambda_r \prod_{1 \leq i&lt;j \leq r}(\lambda_j-\lambda_i) \neq 0.<br />
\end{align*}<br />
(Note that the matrix above is a Vandermonde matrix.)</p>
<p>Thus the matrix $B$ is invertible. This means that the matrix equation (*) has the unique solution<br />
\begin{align*}<br />
\begin{bmatrix}<br />
m_1 \\<br />
m_2 \\<br />
\vdots \\<br />
m_r<br />
\end{bmatrix}<br />
=<br />
\begin{bmatrix}<br />
0 \\<br />
0 \\<br />
\vdots \\<br />
0<br />
\end{bmatrix}.<br />
\end{align*}<br />
But this is a contradiction because each multiplicity $m_i$ is greater than zero.</p>
<p>Thus we proved that all eigenvalues of $A$ are zero.</p>
<p>Recall that a matrix is nilpotent if and only if its eigenvalues are zero.<br />
See the post &#8628;<br />
 <a href="//yutsumura.com/nilpotent-matrix-and-eigenvalues-of-the-matrix/">Nilpotent Matrix and Eigenvalues of the Matrix</a><br />
for a proof of this fact.</p>
<p>Hence it follows from this fact that $A$ is a nilpotent matrix.</p>
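<p>As a small numerical illustration of the converse direction (a NumPy sketch, not part of the proof): a strictly upper triangular matrix is nilpotent, and, consistently with the statement, the traces of all its powers vanish:</p>

```python
import numpy as np

# A strictly upper triangular matrix has all eigenvalues zero,
# hence it is nilpotent and tr(A^k) = 0 for every k >= 1.
A = np.array([[0, 1, 4],
              [0, 0, 2],
              [0, 0, 0]], dtype=float)

traces = [np.trace(np.linalg.matrix_power(A, k)) for k in range(1, 4)]
A_cubed = np.linalg.matrix_power(A, 3)   # the zero matrix
```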
<button class="simplefavorite-button has-count" data-postid="181" data-siteid="1" data-groupid="1" data-favoritecount="24" style="">Click here if solved <i class="sf-icon-star-empty"></i><span class="simplefavorite-button-count" style="">24</span></button><p>The post <a href="https://yutsumura.com/if-every-trace-of-a-power-of-a-matrix-is-zero-then-the-matrix-is-nilpotent/" target="_blank">If Every Trace of a Power of a Matrix is Zero, then the Matrix is Nilpotent</a> first appeared on <a href="https://yutsumura.com/" target="_blank">Problems in Mathematics</a>.</p>]]></content:encoded>
							<wfw:commentRss>https://yutsumura.com/if-every-trace-of-a-power-of-a-matrix-is-zero-then-the-matrix-is-nilpotent/feed/</wfw:commentRss>
		<slash:comments>0</slash:comments>
						<post-id xmlns="com-wordpress:feed-additions:1">181</post-id>	</item>
	</channel>
</rss>
