A Relation between the Dot Product and the Trace


Problem 638

Let $\mathbf{v}$ and $\mathbf{w}$ be two $n \times 1$ column vectors.

Prove that $\tr ( \mathbf{v} \mathbf{w}^\trans ) = \mathbf{v}^\trans \mathbf{w}$.

 

Solution.

Suppose the vectors have components
\[\mathbf{v} = \begin{bmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{bmatrix} \, \mbox{ and } \mathbf{w} = \begin{bmatrix} w_1 \\ w_2 \\ \vdots \\ w_n \end{bmatrix}.\] Then,
\begin{align*}
\mathbf{v} \mathbf{w}^\trans &= \begin{bmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{bmatrix} \begin{bmatrix} w_1 & w_2 & \cdots & w_n \end{bmatrix}\\[6pt] &= \begin{bmatrix} v_1 w_1 & v_1 w_2 & \cdots & v_1 w_n \\ v_2 w_1 & v_2 w_2 & \cdots & v_2 w_n \\ \vdots & \vdots & \ddots & \vdots \\ v_n w_1 & v_n w_2 & \cdots & v_n w_n \end{bmatrix}.
\end{align*}

Since the trace of a square matrix is the sum of its diagonal entries, and the $(i,i)$ entry of $\mathbf{v} \mathbf{w}^\trans$ is $v_i w_i$, we now see that
\[\tr( \mathbf{v} \mathbf{w}^\trans) = \sum_{i=1}^n v_i w_i = \mathbf{v}^\trans \mathbf{w}.\]
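
As a quick sanity check outside the written solution, the identity is easy to verify numerically. The sketch below uses Python with NumPy; the vector length $n = 5$ and the random seed are arbitrary choices made for illustration.

```python
import numpy as np

# Numerical check of tr(v w^T) = v^T w for random column vectors.
# The size n = 5 and the seed 0 are arbitrary illustrative choices.
rng = np.random.default_rng(0)
n = 5
v = rng.standard_normal((n, 1))  # n x 1 column vector
w = rng.standard_normal((n, 1))  # n x 1 column vector

lhs = np.trace(v @ w.T)      # trace of the n x n matrix v w^T
rhs = (v.T @ w).item()       # v^T w is a 1 x 1 matrix; extract the scalar

print(lhs, rhs)              # the two values agree
assert np.isclose(lhs, rhs)
```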

Comment.

Recall that $\mathbf{v}^\trans \mathbf{w}$ is, by definition, the dot product of the vectors $\mathbf{v}$ and $\mathbf{w}$.

So, the dot product of the vectors $\mathbf{v}$ and $\mathbf{w}$ is equal to the trace of the $n \times n$ matrix $\mathbf{v} \mathbf{w}^\trans$.
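
For instance, with the concrete vectors $\mathbf{v} = \begin{bmatrix} 1 \\ 2 \end{bmatrix}$ and $\mathbf{w} = \begin{bmatrix} 3 \\ 4 \end{bmatrix}$ (an illustrative example, not part of the original problem), we have
\[\mathbf{v} \mathbf{w}^\trans = \begin{bmatrix} 3 & 4 \\ 6 & 8 \end{bmatrix}, \qquad \tr(\mathbf{v} \mathbf{w}^\trans) = 3 + 8 = 11 = 1 \cdot 3 + 2 \cdot 4 = \mathbf{v} \cdot \mathbf{w}.\]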


