Determine Whether a Set of Functions $f(x)$ such that $f(x)=f(1-x)$ is a Subspace


Problem 285

Let $V$ be the vector space over $\R$ of all real-valued functions on the interval $[0, 1]$ and let
\[W=\{ f(x)\in V \mid f(x)=f(1-x) \text{ for } x\in [0,1]\}\] be a subset of $V$. Determine whether the subset $W$ is a subspace of the vector space $V$.
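To get a feel for $W$, note that its elements are exactly the functions whose values are symmetric about $x=\tfrac{1}{2}$. For instance (an illustrative example, not part of the problem statement), the function $f(x)=x(1-x)$ belongs to $W$ since
\[f(1-x)=(1-x)\bigl(1-(1-x)\bigr)=(1-x)x=f(x),\] while $g(x)=x$ does not, as $g(1-x)=1-x\neq x$ for $x\neq \tfrac{1}{2}$.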

 


Proof.

We claim that $W$ is a subspace of $V$.
To show the claim, we verify the following subspace criteria.

Subspace Criteria.

  1. The zero vector in $V$ is in $W$.
  2. For any two elements $f(x), g(x) \in W$, we have $f(x)+g(x) \in W$.
  3. For any scalar $c$ and any element $f(x) \in W$, we have $cf(x) \in W$.
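
Conditions 2 and 3 are often combined into a single closure check (a standard equivalent formulation, not part of the original problem): for any $f(x), g(x)\in W$ and any scalar $c\in \R$,
\[cf(x)+g(x)\in W.\] We verify the three conditions separately below.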

The zero vector of $V$ is the zero function $\theta(x)=0$.
Since we have
\[\theta(x)=0=\theta(1-x)\] for any $x\in [0, 1]$, the zero vector $\theta$ is in $W$; hence condition 1 is met.


Let $f(x), g(x)$ be arbitrary elements in $W$. Then these functions satisfy
\[f(x)=f(1-x) \text{ and } g(x)=g(1-x)\] for any $x\in [0,1]$.
We want to show that the sum $h(x):=f(x)+g(x)$ is in $W$. This follows since, for every $x\in [0,1]$, we have
\[h(x)=f(x)+g(x)=f(1-x)+g(1-x)=h(1-x).\] Thus, condition 2 is satisfied.


Finally, we check condition 3. Let $c$ be a scalar and let $f(x)$ be an element in $W$.
Then we have
\[f(x)=f(1-x)\] for every $x\in [0,1]$. Multiplying both sides by $c$ yields
\[cf(x)=cf(1-x),\] and this shows that the scalar multiple $cf(x)$ is in $W$.

Therefore condition 3 holds, and we have proved the subspace criteria for $W$. Thus $W$ is a subspace of the vector space $V$.
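
As an informal numerical sanity check (our addition, not part of the proof), one can sample functions on $[0,1]$ and confirm the three closure properties on a grid; the sample functions below are chosen for illustration only. A minimal Python sketch:

import numpy as np

# Sample grid in [0, 1]; note that x in [0, 1] implies 1 - x in [0, 1].
x = np.linspace(0.0, 1.0, 101)

# Two illustrative functions satisfying f(x) = f(1 - x) (symmetric about x = 1/2).
f = lambda t: t * (1 - t)            # f(1 - x) = (1 - x) x = f(x)
g = lambda t: np.cos(2 * np.pi * t)  # cos(2*pi*(1 - t)) = cos(2*pi*t)

def in_W(h, tol=1e-12):
    # Numerically check h(x) == h(1 - x) on the grid.
    return np.allclose(h(x), h(1 - x), atol=tol)

c = 3.7  # an arbitrary scalar

assert in_W(lambda t: 0 * t)          # condition 1: the zero function is in W
assert in_W(f) and in_W(g)            # the sample functions lie in W
assert in_W(lambda t: f(t) + g(t))    # condition 2: closure under addition
assert in_W(lambda t: c * f(t))       # condition 3: closure under scalar multiplication
print("All subspace-criteria checks passed on the sample grid.")

Of course, such a check on finitely many sample points is only a sanity test; the proof above is what establishes the result for all of $W$.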


