Questions About the Trace of a Matrix


Problem 19

Let $A=(a_{ij})$ and $B=(b_{ij})$ be $n\times n$ real matrices for some $n \in \N$. Answer the following questions about the trace of a matrix.

(a) Express $\tr(AB^{\trans})$ in terms of the entries of the matrices $A$ and $B$. Here $B^{\trans}$ is the transpose matrix of $B$.

(b) Show that $\tr(AA^{\trans})$ is the sum of the squares of the entries of $A$.

(c) Show that if $A$ is a nonzero symmetric matrix, then $\tr(A^2)>0$.

Hint.

Review

  1. the definition of the transpose of a matrix
  2. the definition of matrix multiplication
  3. the definition of a symmetric matrix

The proofs of these statements are then straightforward computations.

Proof.

(a) Express $\tr(AB^{\trans})$ in terms of the entries of $A$ and $B$.

Here we use the following notation for an entry of a matrix: the $(i,j)$-entry of a matrix $C$ is denoted by $(C)_{ij}$.

Then the $(i,j)$-entry of $AB^{\trans}$ is $(AB^{\trans})_{ij}=\sum_{k=1}^n a_{ik}(B^{\trans})_{kj}=\sum_{k=1}^n a_{ik}b_{jk}$, since $(B^{\trans})_{kj}=b_{jk}$.
Thus we have
\[\tr(AB^{\trans})=\sum_{l=1}^n (AB^{\trans})_{ll}=\sum_{l=1}^n \sum_{k=1}^n a_{lk}b_{lk}.\]
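As a quick numerical sanity check of this formula (an illustration only, not part of the proof), one can compare both sides with NumPy; the matrices below are arbitrary random examples.

```python
import numpy as np

# Numerical sanity check of tr(A B^T) = sum_{l,k} a_{lk} b_{lk}
# using arbitrary random matrices (an illustration, not a proof).
rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))

lhs = np.trace(A @ B.T)  # tr(A B^T)
rhs = np.sum(A * B)      # entrywise products summed over all l, k
assert np.isclose(lhs, rhs)
```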

(b) Show that $\tr(AA^{\trans})$ is the sum of the squares of the entries of $A$.

Applying the formula obtained in part (a) with $B=A$, we have
\[ \tr(AA^{\trans})=\sum_{l=1}^n \sum_{k=1}^n a_{lk}^2.\] This is the sum of the squares of the entries of $A$.
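For instance, for the $2\times 2$ matrix
\[A=\begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix},\] the formula gives $\tr(AA^{\trans})=1^2+2^2+3^2+4^2=30$; indeed $AA^{\trans}=\begin{pmatrix} 5 & 11 \\ 11 & 25 \end{pmatrix}$ has trace $5+25=30$.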

(c) Show that if $A$ is a nonzero symmetric matrix, then $\tr(A^2)>0$.

Since $A$ is a symmetric matrix, we have $A^{\trans}=A$.
Thus by the result of part (b), we have

\[ \tr(A^2)=\tr(AA^{\trans})=\sum_{l=1}^n \sum_{k=1}^n a_{lk}^2>0.\] The last sum is strictly positive: since $A$ is not the zero matrix, it has at least one nonzero entry, whose square is positive, while the squares of all other entries are nonnegative.
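A minimal numerical illustration of this inequality (assuming NumPy is available; the symmetric matrix below is an arbitrary example):

```python
import numpy as np

# Build an arbitrary nonzero real symmetric matrix A = M + M^T and check
# that tr(A^2) equals the sum of squared entries and is strictly positive.
rng = np.random.default_rng(1)
M = rng.standard_normal((3, 3))
A = M + M.T

assert np.allclose(A, A.T)                        # A is symmetric
assert np.isclose(np.trace(A @ A), np.sum(A**2))  # tr(A^2) = sum of squares
assert np.trace(A @ A) > 0                        # strictly positive
```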

Comment.

The results we proved in this article can be extended to complex matrices, that is, matrices with complex entries.
In this case, the condition in (c) that $A$ is symmetric is replaced by the condition that $A$ is a Hermitian matrix.
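Concretely, writing $A^{*}=\bar{A}^{\trans}$ for the conjugate transpose, the same computation as in parts (a) and (b) gives
\[\tr(AA^{*})=\sum_{l=1}^n \sum_{k=1}^n |a_{lk}|^2,\] so for a nonzero Hermitian matrix $A$ (that is, $A^{*}=A$) we again obtain $\tr(A^2)=\tr(AA^{*})>0$.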

