Exponential Functions Form a Basis of a Vector Space

Problem 590

Let $C[-1, 1]$ be the vector space over $\R$ of all continuous functions defined on the interval $[-1, 1]$. Let
\[V:=\{f(x)\in C[-1,1] \mid f(x)=a e^x+b e^{2x}+c e^{3x}, a, b, c\in \R\}\] be a subset in $C[-1, 1]$.

(a) Prove that $V$ is a subspace of $C[-1, 1]$.

(b) Prove that the set $B=\{e^x, e^{2x}, e^{3x}\}$ is a basis of $V$.

(c) Prove that
\[B'=\{e^x-2e^{3x}, e^x+e^{2x}+2e^{3x}, 3e^{2x}+e^{3x}\}\] is a basis for $V$.


Proof.

(a) Prove that $V$ is a subspace of $C[-1, 1]$.

Note that each function in the subset $V$ is a linear combination of the functions $e^x, e^{2x}, e^{3x}$.
Namely, we have
\[V=\Span\{e^x, e^{2x}, e^{3x}\}\] and we know that the span is always a subspace. Hence $V$ is a subspace of $C[-1,1]$.

(b) Prove that the set $B=\{e^x, e^{2x}, e^{3x}\}$ is a basis of $V$.

We noted in part (a) that $V=\Span(B)$. So it suffices to show that $B$ is linearly independent.
Consider the linear combination
\[c_1e^x+c_2 e^{2x}+c_3 e^{3x}=\theta(x),\] where $\theta(x)$ is the zero function (the zero vector in $V$).
Taking the derivative, we get
\[c_1e^x+2c_2 e^{2x}+3c_3 e^{3x}=\theta(x).\] Taking the derivative again, we obtain
\[c_1e^x+4c_2 e^{2x}+9c_3 e^{3x}=\theta(x).\]

Evaluating at $x=0$, we obtain the system of linear equations
\begin{align*}
c_1+c_2+c_3&=0\\
c_1+2c_2+3c_3&=0\\
c_1+4c_2+9c_3&=0.
\end{align*}


We reduce the augmented matrix for this system as follows:
\begin{align*}
\left[\begin{array}{rrr|r}
1 & 1 & 1 & 0 \\
1 &2 & 3 & 0 \\
1 & 4 & 9 & 0
\end{array} \right] \xrightarrow[R_3-R_1]{R_2-R_1}
\left[\begin{array}{rrr|r}
1 & 1 & 1 & 0 \\
0 &1 & 2 & 0 \\
0 & 3 & 8 & 0
\end{array} \right] \xrightarrow[R_3-3R_2]{R_1-R_2}\\[6pt] \left[\begin{array}{rrr|r}
1 & 0 & -1 & 0 \\
0 &1 & 2 & 0 \\
0 & 0 & 2 & 0
\end{array} \right] \xrightarrow{\frac{1}{2}R_3}
\left[\begin{array}{rrr|r}
1 & 0 & -1 & 0 \\
0 &1 & 2 & 0 \\
0 & 0 & 1 & 0
\end{array} \right] \xrightarrow[R_2-2R_3]{R_1+R_3}
\left[\begin{array}{rrr|r}
1 & 0 & 0 & 0 \\
0 &1 & 0 & 0 \\
0 & 0 & 1 & 0
\end{array} \right].
\end{align*}
It follows that the solution of the system is $c_1=c_2=c_3=0$.
Hence the set $B$ is linearly independent, and thus $B$ is a basis for $V$.
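As a quick numerical sanity check (illustrative only, not part of the proof), one can verify with NumPy that the coefficient matrix of the system above is nonsingular, so the homogeneous system admits only the trivial solution:

```python
import numpy as np

# Coefficient matrix from evaluating the linear combination and its
# first two derivatives at x = 0.
A = np.array([[1, 1, 1],
              [1, 2, 3],
              [1, 4, 9]], dtype=float)

# A nonzero determinant (equivalently, full rank) means the homogeneous
# system A c = 0 forces c1 = c2 = c3 = 0.
print(np.linalg.det(A))           # 2.0 up to floating-point error
print(np.linalg.matrix_rank(A))   # 3
```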

Another approach.

Alternatively, we can show that the coefficient matrix is nonsingular by using the Vandermonde determinant formula as follows.
Observe that the coefficient matrix of the system is a Vandermonde matrix:
\[A:=\begin{bmatrix}
1 & 1 & 1 \\
1 &2 &3 \\
1^2 & 2^2 & 3^2
\end{bmatrix}.\] The Vandermonde determinant formula yields that
\[\det(A)=(3-1)(3-2)(2-1)=2\neq 0.\] Hence the coefficient matrix $A$ is nonsingular.
Thus we obtain the solution $c_1=c_2=c_3=0$.
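A sketch of this check in SymPy (illustrative; the node list $\{1, 2, 3\}$ comes from the exponents of the three exponential functions):

```python
from math import prod

from sympy import Matrix

nodes = [1, 2, 3]  # exponents in e^x, e^{2x}, e^{3x}

# Vandermonde matrix: row k holds the k-th powers of the nodes.
A = Matrix([[c**k for c in nodes] for k in range(3)])

# Vandermonde determinant formula: product of (c_j - c_i) over i < j.
formula = prod(cj - ci for i, ci in enumerate(nodes) for cj in nodes[i + 1:])

print(A.det(), formula)  # both equal 2
```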

(c) Prove that $B'=\{e^x-2e^{3x}, e^x+e^{2x}+2e^{3x}, 3e^{2x}+e^{3x}\}$ is a basis for $V$.

We consider the coordinate vectors of the vectors in $B'$ with respect to the basis $B$. They are
\[[e^x-2e^{3x}]_B=\begin{bmatrix}
1 \\
0 \\
-2
\end{bmatrix}, [e^x+e^{2x}+2e^{3x}]_B=\begin{bmatrix}
1 \\
1 \\
2
\end{bmatrix}, [3e^{2x}+e^{3x}]_B=\begin{bmatrix}
0 \\
3 \\
1
\end{bmatrix}.\] Let $\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3$ be these vectors and let $T=\{\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3\}$.
Then we know that $B'$ is a basis for $V$ if and only if $T$ is a basis for $\R^3$.


We claim that $T$ is linearly independent.
Consider the matrix whose column vectors are $\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3$:
\begin{align*}
\begin{bmatrix}
1 & 1 & 0 \\
0 &1 &3 \\
-2 & 2 & 1
\end{bmatrix}
\xrightarrow{R_3+2R_1}
\begin{bmatrix}
1 & 1 & 0 \\
0 &1 &3 \\
0 & 4 & 1
\end{bmatrix}
\xrightarrow[R_3-4R_2]{R_1-R_2}\\[6pt] \begin{bmatrix}
1 & 0 & -3 \\
0 &1 &3 \\
0 & 0 & -11
\end{bmatrix}
\xrightarrow{-\frac{1}{11}R_3}
\begin{bmatrix}
1 & 0 & -3 \\
0 &1 &3 \\
0 & 0 & 1
\end{bmatrix}
\xrightarrow[R_2-3R_3]{R_1+3R_3}
\begin{bmatrix}
1 & 0 & 0 \\
0 &1 &0 \\
0 & 0 & 1
\end{bmatrix}.
\end{align*}


Thus, the matrix is nonsingular and hence the column vectors $\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3$ are linearly independent.
As $T$ consists of three linearly independent vectors in the three-dimensional vector space $\R^3$, we conclude that $T$ is a basis for $\R^3$.
Therefore, by the correspondence of the coordinates, we see that $B'$ is a basis for $V$.
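The nonsingularity of the coordinate matrix can likewise be confirmed numerically (an optional check, not part of the argument):

```python
import numpy as np

# Columns are the B-coordinate vectors of the three functions in B'.
T = np.array([[1, 1, 0],
              [0, 1, 3],
              [-2, 2, 1]], dtype=float)

# Full rank (nonzero determinant) means the columns are linearly
# independent, so B' is a basis of the 3-dimensional space V.
print(np.linalg.det(T))          # -11.0 up to floating-point error
print(np.linalg.matrix_rank(T))  # 3
```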

Related Question.

If you know the Wronskian, you may use it to prove that the exponential functions $e^x, e^{2x}, e^{3x}$ are linearly independent.

See the post
Using the Wronskian for Exponential Functions, Determine Whether the Set is Linearly Independent for the details.
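A SymPy sketch of the Wronskian computation for these three functions (illustrative only; the post above gives the full argument):

```python
from sympy import Matrix, exp, simplify, symbols

x = symbols('x')
funcs = [exp(x), exp(2 * x), exp(3 * x)]

# Wronskian matrix: row k holds the k-th derivatives of the functions.
W = Matrix([[f.diff(x, k) for f in funcs] for k in range(3)])

w = simplify(W.det())
print(w)  # 2*exp(6*x), never zero, so the functions are independent
```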


Try the following, more general question.

Problem.
Let $c_1, c_2,\dots, c_n$ be mutually distinct real numbers.

Show that the exponential functions
\[e^{c_1x}, e^{c_2x}, \dots, e^{c_nx}\] are linearly independent over $\R$.
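Before reading the solution, one can probe the general claim numerically: evaluating a vanishing linear combination and its derivatives at $x=0$ produces a Vandermonde matrix in the distinct values $c_1, \dots, c_n$, which has full rank. The sample values below are arbitrary distinct choices for illustration:

```python
import numpy as np

c = np.array([-1.5, 0.0, 0.5, 2.0])  # arbitrary mutually distinct values

# Row k holds c_i^k, the coefficients of the k-th derivative at x = 0.
V = np.vander(c, increasing=True).T

print(np.linalg.matrix_rank(V))  # 4: full rank, only the trivial solution
```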

The solution is given in the post
Exponential Functions are Linearly Independent.

