Sherman-Morrison Formula for the Inverse Matrix


Problem 250

Let $\mathbf{u}$ and $\mathbf{v}$ be vectors in $\R^n$, and let $I$ be the $n \times n$ identity matrix. Suppose that the inner product of $\mathbf{u}$ and $\mathbf{v}$ satisfies
\[\mathbf{v}^{\trans}\mathbf{u}\neq -1.\] Define the matrix
\[A=I+\mathbf{u}\mathbf{v}^{\trans}.\]

Prove that $A$ is invertible and that its inverse is given by the formula
\[A^{-1}=I-a\mathbf{u}\mathbf{v}^{\trans},\] where
\[a=\frac{1}{1+\mathbf{v}^{\trans}\mathbf{u}}.\] This formula is known as the Sherman-Morrison formula.
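Before turning to the proof, here is a quick numerical sanity check, a minimal NumPy sketch (the dimension, random vectors, and seed are arbitrary choices for illustration, not part of the problem):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
u = rng.standard_normal(n)
v = rng.standard_normal(n)
assert not np.isclose(v @ u, -1.0)       # the hypothesis v^T u != -1

A = np.eye(n) + np.outer(u, v)           # A = I + u v^T
a = 1.0 / (1.0 + v @ u)                  # a = 1 / (1 + v^T u)
B = np.eye(n) - a * np.outer(u, v)       # the claimed inverse

print(np.allclose(A @ B, np.eye(n)))     # True
print(np.allclose(B @ A, np.eye(n)))     # True
```

Note that forming $B$ costs only an outer product, $O(n^2)$ operations, whereas a general matrix inversion costs $O(n^3)$; this is the practical appeal of the formula.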

 

Proof.

Let us put
\[B=I-a\mathbf{u}\mathbf{v}^{\trans},\] the matrix given by the Sherman-Morrison formula.
We compute $AB$ and $BA$ and show that they are equal to the identity matrix $I$.


Let us first compute the matrix product $AB$. We have
\begin{align*}
AB&=(I+\mathbf{u}\mathbf{v}^{\trans})(I-a\mathbf{u}\mathbf{v}^{\trans})\\
&=I-a\mathbf{u}\mathbf{v}^{\trans}+\mathbf{u}\mathbf{v}^{\trans}-a\mathbf{u}\mathbf{v}^{\trans}\mathbf{u}\mathbf{v}^{\trans}\\
&=I+(1-a)\mathbf{u}\mathbf{v}^{\trans}-a\mathbf{u}\mathbf{v}^{\trans}\mathbf{u}\mathbf{v}^{\trans}. \tag{*}
\end{align*}
Using the definition $a=\frac{1}{1+\mathbf{v}^{\trans}\mathbf{u}}$, we have
\[a(1+\mathbf{v}^{\trans}\mathbf{u})=1,\] and thus
\[1-a=a\mathbf{v}^{\trans}\mathbf{u}. \tag{**}\]


Note that the third term in (*), $-a\mathbf{u}\mathbf{v}^{\trans}\mathbf{u}\mathbf{v}^{\trans}$, contains $\mathbf{v}^{\trans}\mathbf{u}$ in the middle, and $\mathbf{v}^{\trans}\mathbf{u}$ is just a number. Thus we can factor out this scalar and obtain
\begin{align*}
-a\mathbf{u}\mathbf{v}^{\trans}\mathbf{u}\mathbf{v}^{\trans}&=-a\mathbf{u}(\mathbf{v}^{\trans}\mathbf{u})\mathbf{v}^{\trans}\\
&=-a(\mathbf{v}^{\trans}\mathbf{u})\mathbf{u}\mathbf{v}^{\trans}. \tag{***}
\end{align*}
Substituting (**) and (***) into (*), we obtain
\begin{align*}
AB&=I+(a\mathbf{v}^{\trans}\mathbf{u})\mathbf{u}\mathbf{v}^{\trans}-a(\mathbf{v}^{\trans}\mathbf{u})\mathbf{u}\mathbf{v}^{\trans}\\
&=I.
\end{align*}
Thus we have proved $AB=I$.


Now we compute $BA$. We have
\begin{align*}
BA&=(I-a\mathbf{u}\mathbf{v}^{\trans})(I+\mathbf{u}\mathbf{v}^{\trans})\\
&=I+\mathbf{u}\mathbf{v}^{\trans}-a\mathbf{u}\mathbf{v}^{\trans}-a\mathbf{u}\mathbf{v}^{\trans}\mathbf{u}\mathbf{v}^{\trans}\\
&=I+(1-a)\mathbf{u}\mathbf{v}^{\trans}-a\mathbf{u}\mathbf{v}^{\trans}\mathbf{u}\mathbf{v}^{\trans},
\end{align*}
which is exactly the expression (*). Hence $BA=AB=I$.


Therefore, we conclude that the matrix $A$ is invertible and its inverse is $B$. Hence
\[A^{-1}=I-a\mathbf{u}\mathbf{v}^{\trans},\] and we have proved the Sherman-Morrison formula.
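The cancellation carried out above can also be delegated to a computer algebra system. The following SymPy sketch (our illustration, for the fixed size $n=3$ with fully symbolic entries) confirms both products symbolically:

```python
import sympy as sp

n = 3
u = sp.Matrix(sp.symbols(f'u1:{n + 1}'))   # symbolic column vector (u1, u2, u3)
v = sp.Matrix(sp.symbols(f'v1:{n + 1}'))
a = 1 / (1 + (v.T * u)[0])                 # a = 1 / (1 + v^T u)

A = sp.eye(n) + u * v.T                    # A = I + u v^T
B = sp.eye(n) - a * u * v.T                # the claimed inverse

print(sp.simplify(A * B - sp.eye(n)))      # zero matrix
print(sp.simplify(B * A - sp.eye(n)))      # zero matrix
```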

Comment.

The invertible matrix theorem says that for a square matrix, once we have $AB=I$ we automatically have $BA=I$, and the inverse matrix of $A$ is $B$; that is, $A^{-1}=B$.
So in the above proof, after proving $AB=I$, you may immediately conclude that $A$ is invertible and $A^{-1}=B$.

Related Question.

Problem.
Let $A$ be a singular $2\times 2$ matrix such that $\tr(A)\neq -1$ and let $I$ be the $2\times 2$ identity matrix.
Prove that the inverse of the matrix $I+A$ is given by the following formula:
\[(I+A)^{-1}=I-\frac{1}{1+\tr(A)}A.\]

See the post "The Formula for the Inverse Matrix of $I+A$ for a $2\times 2$ Singular Matrix $A$" for a proof of this problem.
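This is in fact a special case of the formula above: a singular $2\times 2$ matrix has rank at most one, so it can be written as $A=\mathbf{u}\mathbf{v}^{\trans}$, and then $\tr(A)=\mathbf{v}^{\trans}\mathbf{u}$. As a quick check, here is a NumPy sketch with one concrete singular matrix (the matrix is an arbitrary example of ours):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])                 # rank one, hence singular
assert np.isclose(np.linalg.det(A), 0.0)   # det(A) = 0
t = np.trace(A)                            # tr(A) = 5, which is != -1
I = np.eye(2)

lhs = np.linalg.inv(I + A)
rhs = I - A / (1.0 + t)
print(np.allclose(lhs, rhs))               # True
```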

