Construction of a Symmetric Matrix whose Inverse Matrix is Itself

Problem 556

Let $\mathbf{v}$ be a nonzero vector in $\R^n$.
Then the dot product $\mathbf{v}\cdot \mathbf{v}=\mathbf{v}^{\trans}\mathbf{v}\neq 0$.
Set $a:=\frac{2}{\mathbf{v}^{\trans}\mathbf{v}}$ and define the $n\times n$ matrix $A$ by
\[A=I-a\mathbf{v}\mathbf{v}^{\trans},\] where $I$ is the $n\times n$ identity matrix.

Prove that $A$ is a symmetric matrix and $AA=I$.
Conclude that the inverse matrix is $A^{-1}=A$.

 

Proof.

$A$ is symmetric

We first show that the matrix $A$ is symmetric.
We compute, using properties of the transpose:
\begin{align*}
A^{\trans}&=(I-a\mathbf{v}\mathbf{v}^{\trans})^{\trans} && \text{by definition of $A$}\\
&=I^{\trans}-(a\mathbf{v}\mathbf{v}^{\trans})^{\trans}\\
&=I-a(\mathbf{v}^{\trans})^{\trans}\mathbf{v}^{\trans} && \text{since $(BC)^{\trans}=C^{\trans}B^{\trans}$}\\
&=I-a\mathbf{v}\mathbf{v}^{\trans}\\
&=A && \text{by definition of $A$}.
\end{align*}
Hence we have $A^{\trans}=A$, and thus $A$ is symmetric.
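As a quick numerical sanity check (not part of the proof), the symmetry claim can be verified with a short NumPy sketch; the vector $\mathbf{v}$ below is an arbitrary choice, and any nonzero vector works.

```python
import numpy as np

# Arbitrary nonzero column vector v in R^3 (any nonzero choice works).
v = np.array([[1.0], [2.0], [3.0]])

a = 2.0 / (v.T @ v).item()     # a = 2 / (v^T v)
A = np.eye(3) - a * (v @ v.T)  # A = I - a v v^T

# A^T should equal A up to floating-point error.
print(np.allclose(A.T, A))     # True
```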

$AA=I$ and $A^{-1}=A$

Next, we prove that $AA=I$.

We compute
\begin{align*}
AA&=(I-a\mathbf{v}\mathbf{v}^{\trans})(I-a\mathbf{v}\mathbf{v}^{\trans})\\
&=I(I-a\mathbf{v}\mathbf{v}^{\trans})-a\mathbf{v}\mathbf{v}^{\trans}(I-a\mathbf{v}\mathbf{v}^{\trans})\\
&=I-a\mathbf{v}\mathbf{v}^{\trans}-a\mathbf{v}\mathbf{v}^{\trans}+a^2\mathbf{v}\mathbf{v}^{\trans}\mathbf{v}\mathbf{v}^{\trans}\\
&=I-2a\mathbf{v}\mathbf{v}^{\trans}+a^2\mathbf{v}\mathbf{v}^{\trans}\mathbf{v}\mathbf{v}^{\trans}. \tag{*}
\end{align*}

Note that we have
\begin{align*}
\mathbf{v}\mathbf{v}^{\trans}\mathbf{v}\mathbf{v}^{\trans}&=\mathbf{v}(\mathbf{v}^{\trans}\mathbf{v})\mathbf{v}^{\trans}\\
&=\mathbf{v}\left(\, \frac{2}{a} \,\right)\mathbf{v}^{\trans} &&\text{since $a=\frac{2}{\mathbf{v}^{\trans}\mathbf{v}}$, i.e., $\mathbf{v}^{\trans}\mathbf{v}=\frac{2}{a}$}\\
&=\frac{2}{a}\mathbf{v}\mathbf{v}^{\trans}.
\end{align*}

Plugging this relation into (*), we obtain
\begin{align*}
AA&=I-2a\mathbf{v}\mathbf{v}^{\trans}+a^2\cdot\frac{2}{a}\mathbf{v}\mathbf{v}^{\trans}=I-2a\mathbf{v}\mathbf{v}^{\trans}+2a\mathbf{v}\mathbf{v}^{\trans}=I.
\end{align*}
Thus $AA=I$, so $A$ is invertible and its inverse is $A$ itself: $A^{-1}=A$.
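The conclusion also admits a quick numerical check; this is a sketch in the same spirit as the one above, with another arbitrarily chosen nonzero vector, confirming that $AA=I$ and that the computed inverse of $A$ agrees with $A$.

```python
import numpy as np

# Same construction as above, with another arbitrary nonzero vector.
v = np.array([[2.0], [-1.0], [0.5], [3.0]])
n = v.shape[0]

a = 2.0 / (v.T @ v).item()     # a = 2 / (v^T v)
A = np.eye(n) - a * (v @ v.T)  # A = I - a v v^T

print(np.allclose(A @ A, np.eye(n)))     # True: AA = I
print(np.allclose(np.linalg.inv(A), A))  # True: A^{-1} = A
```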

