Prove that the Dot Product is Commutative: $\mathbf{v}\cdot \mathbf{w}= \mathbf{w} \cdot \mathbf{v}$

Problems and solutions in Linear Algebra

Problem 637

Let $\mathbf{v}$ and $\mathbf{w}$ be two $n \times 1$ column vectors.

(a) Prove that $\mathbf{v}^\trans \mathbf{w} = \mathbf{w}^\trans \mathbf{v}$.

(b) Provide an example to show that $\mathbf{v} \mathbf{w}^\trans$ is not always equal to $\mathbf{w} \mathbf{v}^\trans$.

 

Solution.

(a) Prove that $\mathbf{v}^\trans \mathbf{w} = \mathbf{w}^\trans \mathbf{v}$.

Suppose the vectors have components
\[\mathbf{v} = \begin{bmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{bmatrix} \, \mbox{ and } \mathbf{w} = \begin{bmatrix} w_1 \\ w_2 \\ \vdots \\ w_n \end{bmatrix}.\] Then,
\[\mathbf{v}^\trans \mathbf{w} = \begin{bmatrix} v_1 & v_2 & \cdots & v_n \end{bmatrix} \begin{bmatrix} w_1 \\ w_2 \\ \vdots \\ w_n \end{bmatrix} = \sum_{i=1}^n v_i w_i,\] while
\[\mathbf{w}^\trans \mathbf{v} = \begin{bmatrix} w_1 & w_2 & \cdots & w_n \end{bmatrix} \begin{bmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{bmatrix} = \sum_{i=1}^n w_i v_i.\] The two sums are equal because multiplication of real numbers is commutative: $v_i w_i = w_i v_i$ for each $i$.
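As a quick numerical sanity check (separate from the proof itself), the identity can be verified with NumPy; the vectors below are arbitrary sample values.

```python
import numpy as np

# Arbitrary 3x1 column vectors (any n and any entries would do).
v = np.array([[1.0], [2.0], [3.0]])
w = np.array([[4.0], [-5.0], [6.0]])

# v^T w and w^T v are both 1x1 matrices holding the same scalar,
# namely the dot product 1*4 + 2*(-5) + 3*6 = 12.
print(v.T @ w)  # [[12.]]
print(w.T @ v)  # [[12.]]  equal, as part (a) proves
```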

(b) Provide an example to show that $\mathbf{v} \mathbf{w}^\trans$ is not always equal to $\mathbf{w} \mathbf{v}^\trans$.

For the counterexample, let $\mathbf{v} = \begin{bmatrix} 1 \\ 0 \end{bmatrix}$ and $\mathbf{w} = \begin{bmatrix} 0 \\ 1 \end{bmatrix}$. Then
\[\mathbf{v} \mathbf{w}^\trans = \begin{bmatrix} 1 \\ 0 \end{bmatrix} \begin{bmatrix} 0 & 1 \end{bmatrix} = \begin{bmatrix} 0 & 1 \\ 0 & 0 \end{bmatrix}\] while
\[\mathbf{w} \mathbf{v}^\trans = \begin{bmatrix} 0 \\ 1 \end{bmatrix} \begin{bmatrix} 1 & 0 \end{bmatrix} = \begin{bmatrix} 0 & 0 \\ 1 & 0 \end{bmatrix}.\] Since these two $2 \times 2$ matrices are different, $\mathbf{v} \mathbf{w}^\trans \neq \mathbf{w} \mathbf{v}^\trans$ for this choice of vectors.
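A minimal NumPy sketch of this counterexample, using exactly the vectors chosen above:

```python
import numpy as np

# The counterexample from part (b): the standard basis vectors of R^2.
v = np.array([[1], [0]])
w = np.array([[0], [1]])

# The two outer products are different 2x2 matrices.
print(v @ w.T)
# [[0 1]
#  [0 0]]
print(w @ v.T)
# [[0 0]
#  [1 0]]
```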

Comment.

Recall that for two vectors $\mathbf{v}, \mathbf{w} \in \R^n$, the dot product (or inner product) of $\mathbf{v}, \mathbf{w}$ is defined to be
\[\mathbf{v}\cdot \mathbf{w}:=\mathbf{v}^{\trans} \mathbf{w}.\]
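For example, in $\R^2$,
\[\begin{bmatrix} 1 \\ 2 \end{bmatrix} \cdot \begin{bmatrix} 3 \\ 4 \end{bmatrix} = \begin{bmatrix} 1 & 2 \end{bmatrix} \begin{bmatrix} 3 \\ 4 \end{bmatrix} = 1 \cdot 3 + 2 \cdot 4 = 11.\]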

Part (a) of the problem shows that the dot product is commutative. This means that we have
\[\mathbf{v}\cdot \mathbf{w}= \mathbf{w} \cdot \mathbf{v}.\]

In fact, we have
\begin{align*}
\mathbf{v}\cdot \mathbf{w}= \mathbf{v}^\trans \mathbf{w} \stackrel{\text{(a)}}{=} \mathbf{w}^\trans \mathbf{v} = \mathbf{w} \cdot \mathbf{v}.
\end{align*}


Also, notice that while $\mathbf{v} \mathbf{w}^\trans$ is not always equal to $\mathbf{w} \mathbf{v}^\trans$, the two products are always transposes of each other: $(\mathbf{v} \mathbf{w}^\trans)^\trans = (\mathbf{w}^\trans)^\trans \mathbf{v}^\trans = \mathbf{w} \mathbf{v}^\trans$, using the rule $(AB)^\trans = B^\trans A^\trans$.
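A short sketch confirming this transpose relation on the counterexample vectors from part (b):

```python
import numpy as np

v = np.array([[1], [0]])
w = np.array([[0], [1]])

# (v w^T)^T equals w v^T entrywise, even though v w^T != w v^T.
print(np.array_equal((v @ w.T).T, w @ v.T))  # True
```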

