The Rotation Matrix is an Orthogonal Transformation


Problem 684

Let $\R^2$ be the vector space of $2$-dimensional column vectors. This vector space has an inner product defined by $\langle \mathbf{v} , \mathbf{w} \rangle = \mathbf{v}^{\trans} \mathbf{w}$. A linear transformation $T : \R^2 \rightarrow \R^2$ is called an orthogonal transformation if for all $\mathbf{v} , \mathbf{w} \in \R^2$,
\[\langle T(\mathbf{v}) , T(\mathbf{w}) \rangle = \langle \mathbf{v} , \mathbf{w} \rangle.\]

For a fixed angle $\theta \in [0, 2 \pi )$ , define the matrix
\[ [T] = \begin{bmatrix} \cos (\theta) & - \sin ( \theta ) \\ \sin ( \theta ) & \cos ( \theta ) \end{bmatrix} \] and the linear transformation $T : \R^2 \rightarrow \R^2$ by
\[T( \mathbf{v} ) = [T] \mathbf{v}.\]

Prove that $T$ is an orthogonal transformation.
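Before the proof, here is a quick numerical sanity check of the claim (a sketch, not part of the proof; the helper names `rotate` and `inner` are our own):

```python
import math
import random

def rotate(theta, v):
    """Apply the rotation matrix [T] to a 2-vector v = (v1, v2)."""
    c, s = math.cos(theta), math.sin(theta)
    return (c * v[0] - s * v[1], s * v[0] + c * v[1])

def inner(v, w):
    """The inner product <v, w> = v1*w1 + v2*w2."""
    return v[0] * w[0] + v[1] * w[1]

random.seed(0)
for _ in range(5):
    theta = random.uniform(0, 2 * math.pi)
    v = (random.gauss(0, 1), random.gauss(0, 1))
    w = (random.gauss(0, 1), random.gauss(0, 1))
    # <T(v), T(w)> should equal <v, w> up to floating-point error
    assert math.isclose(inner(rotate(theta, v), rotate(theta, w)), inner(v, w))
print("inner products preserved")
```

A check like this cannot prove the identity for all angles and vectors, but it catches sign errors before one commits to the algebra below.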


Solution.

Let $\mathbf{v} = \begin{bmatrix} v_1 \\ v_2 \end{bmatrix}$ and $\mathbf{w} = \begin{bmatrix} w_1 \\ w_2 \end{bmatrix}$ be arbitrary vectors in $\R^2$. Then,
\[T(\mathbf{v}) = \begin{bmatrix} \cos (\theta) & - \sin ( \theta ) \\ \sin ( \theta ) & \cos ( \theta ) \end{bmatrix} \begin{bmatrix} v_1 \\ v_2 \end{bmatrix} = \begin{bmatrix} \cos(\theta) v_1 - \sin (\theta) v_2 \\ \sin(\theta) v_1 + \cos (\theta) v_2 \end{bmatrix},\] and
\[ T(\mathbf{w}) = \begin{bmatrix} \cos (\theta) & - \sin ( \theta ) \\ \sin ( \theta ) & \cos ( \theta ) \end{bmatrix} \begin{bmatrix} w_1 \\ w_2 \end{bmatrix} = \begin{bmatrix} \cos(\theta) w_1 - \sin (\theta) w_2 \\ \sin(\theta) w_1 + \cos (\theta) w_2 \end{bmatrix}.\]


We now compute the inner product of these two vectors:
\begin{align*}
&\langle T(\mathbf{v} ) , T( \mathbf{w} ) \rangle \\
&= \begin{bmatrix} \cos(\theta) v_1 - \sin (\theta) v_2 & \sin(\theta) v_1 + \cos (\theta) v_2 \end{bmatrix} \begin{bmatrix} \cos(\theta) w_1 - \sin (\theta) w_2 \\ \sin(\theta) w_1 + \cos (\theta) w_2 \end{bmatrix} \\[6pt]
&= \biggl( \cos(\theta) v_1 - \sin(\theta) v_2 \biggr) \biggl( \cos(\theta) w_1 - \sin ( \theta) w_2 \biggr) \\[6pt]
&\qquad + \biggl( \sin (\theta) v_1 + \cos (\theta) v_2 \biggr) \biggl( \sin (\theta) w_1 + \cos(\theta) w_2 \biggr) \\[6pt]
&= \cos^2(\theta) ( v_1 w_1 + v_2 w_2 ) + \sin(\theta) \cos(\theta) ( - v_1 w_2 - v_2 w_1 + v_1 w_2 + v_2 w_1 ) \\
&\qquad + \sin^2 (\theta) ( v_1 w_1 + v_2 w_2 ) \\[6pt]
&= \left( \cos^2 ( \theta) + \sin^2 ( \theta ) \right) ( v_1 w_1 + v_2 w_2 ) \\
&= v_1 w_1 + v_2 w_2 \\
&= \langle \mathbf{v} , \mathbf{w} \rangle .
\end{align*}


Here the second-to-last equality uses the Pythagorean identity $\sin^2 ( \theta ) + \cos^2 ( \theta ) = 1$. Since $\mathbf{v}$ and $\mathbf{w}$ were arbitrary, this proves that $T$ is an orthogonal transformation.
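Remark. The componentwise computation above can also be packaged more compactly. Since the inner product is $\langle \mathbf{v}, \mathbf{w} \rangle = \mathbf{v}^{\trans} \mathbf{w}$, we have
\[\langle T(\mathbf{v}), T(\mathbf{w}) \rangle = ([T]\mathbf{v})^{\trans} [T] \mathbf{w} = \mathbf{v}^{\trans} [T]^{\trans} [T] \mathbf{w},\] so it suffices to check that $[T]^{\trans} [T] = I$, that is, that $[T]$ is an orthogonal matrix:
\[ [T]^{\trans} [T] = \begin{bmatrix} \cos (\theta) & \sin ( \theta ) \\ - \sin ( \theta ) & \cos ( \theta ) \end{bmatrix} \begin{bmatrix} \cos (\theta) & - \sin ( \theta ) \\ \sin ( \theta ) & \cos ( \theta ) \end{bmatrix} = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix},\] again by the Pythagorean identity.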



