Determinant/Trace and Eigenvalues of a Matrix
Problem 9
Let $A$ be an $n\times n$ matrix and let $\lambda_1, \dots, \lambda_n$ be its eigenvalues.
Show that
(1) $$\det(A)=\prod_{i=1}^n \lambda_i$$
(2) $$\tr(A)=\sum_{i=1}^n \lambda_i$$
Here $\det(A)$ is the determinant of the matrix $A$ and $\tr(A)$ is the trace of the matrix $A$.
Namely, prove that (1) the determinant of $A$ is the product of its eigenvalues, and (2) the trace of $A$ is the sum of the eigenvalues.
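For a concrete illustration (not part of the problem itself), take $A=\begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix}$. Its characteristic polynomial is $\lambda^2-5\lambda-2$, so its eigenvalues $\lambda_1, \lambda_2$ satisfy $\lambda_1\lambda_2=-2=\det(A)$ and $\lambda_1+\lambda_2=5=\tr(A)$.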
Plan 1.
- Use the definition of eigenvalues (the characteristic polynomial).
- Compare coefficients.
Plan 2.
- Bring $A$ to an upper triangular form, for example via the Jordan normal/canonical form.
- Use the properties of determinants and traces.
Proof. [Method 1]
(1) Recall that eigenvalues are roots of the characteristic polynomial $p(\lambda)=\det(A-\lambda I_n)$.
It follows that we have
\begin{align*}
&\det(A-\lambda I_n) \\
&=\begin{vmatrix}
a_{11}-\lambda & a_{12} & \cdots & a_{1n} \\
a_{21} & a_{22}-\lambda & \cdots & a_{2n} \\
\vdots & \vdots & \ddots & \vdots \\
a_{n1} & a_{n2} & \cdots & a_{nn}-\lambda
\end{vmatrix} =\prod_{i=1}^n (\lambda_i-\lambda). \tag{*}
\end{align*}
Setting $\lambda=0$, we see that $\det(A)=\prod_{i=1}^n \lambda_i$, and this completes the proof of part (1).
(2) Compare the coefficients of $\lambda^{n-1}$ on both sides of (*).
On the left side, the only term of the determinant expansion that contributes to $\lambda^{n-1}$ is the product of the diagonal entries $\prod_{i=1}^n (a_{ii}-\lambda)$, since every other term omits at least two diagonal entries and hence has degree at most $n-2$ in $\lambda$. Thus the coefficient of $\lambda^{n-1}$ on the left side of (*) is
$$(-1)^{n-1}(a_{11}+a_{22}+\cdots+a_{nn})=(-1)^{n-1}\tr(A).$$
The coefficient of $\lambda^{n-1}$ on the right side of (*) is
$$(-1)^{n-1}\sum_{i=1}^n \lambda_i.$$
Equating these coefficients, we obtain $\tr(A)=\sum_{i=1}^n \lambda_i$.
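Both identities are easy to sanity-check numerically. The following minimal sketch (an illustration only, not part of the proof) uses NumPy to compare $\det(A)$ and $\tr(A)$ with the product and sum of the eigenvalues of a random matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))  # a random real 4x4 matrix

eigvals = np.linalg.eigvals(A)   # eigenvalues, possibly complex

# (1) det(A) equals the product of the eigenvalues.
print(np.isclose(np.linalg.det(A), np.prod(eigvals).real))  # True

# (2) tr(A) equals the sum of the eigenvalues.
print(np.isclose(np.trace(A), np.sum(eigvals).real))        # True
```

Since $A$ is real, the complex eigenvalues come in conjugate pairs, so the product and sum are real up to rounding error; taking the real part discards that error.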
Proof. [Method 2]
Observe that there exists an $n \times n$ invertible matrix $P$ such that
\[P^{-1} A P= \begin{bmatrix}
\lambda_1 & * & \cdots & * \\
0 & \lambda_2 & \cdots & * \\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \cdots & \lambda_n
\end{bmatrix}. \tag{**}
\]
This is an upper triangular matrix whose diagonal entries are the eigenvalues of $A$.
(If this is not familiar to you, then study a “triangularizable matrix” or “Jordan normal/canonical form”.)
(1) Since the determinant of an upper triangular matrix is the product of diagonal entries, we have
\begin{align*}
\prod_{i=1}^n \lambda_i & =\det(P^{-1} A P)=\det(P^{-1}) \det(A) \det(P) \\
&= \det(P)^{-1}\det(A) \det(P)=\det(A),
\end{align*}
where we used the multiplicative property of the determinant.
(2) We take the trace of both sides of (**) and get
\begin{align*}
\sum_{i=1}^n \lambda_i =\tr(P^{-1}AP) =\tr(A).
\end{align*}
(Here for the last equality we used the property of the trace that $\tr(AB)=\tr(BA)$ for any $n\times n$ matrices $A$ and $B$.)
Thus we obtained the result $\tr(A)=\sum_{i=1}^n \lambda_i$.
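The similarity invariance used above can also be checked numerically. This sketch (again an illustration, not part of the proof) verifies with NumPy that $\det(P^{-1}AP)=\det(A)$ and $\tr(P^{-1}AP)=\tr(A)$ for a random invertible $P$:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
P = rng.standard_normal((3, 3))  # a random matrix is generically invertible

B = np.linalg.inv(P) @ A @ P     # B is similar to A

# Similar matrices have the same determinant and the same trace,
# hence the same product and sum of eigenvalues.
print(np.isclose(np.linalg.det(B), np.linalg.det(A)))  # True
print(np.isclose(np.trace(B), np.trace(A)))            # True
```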
Comment.
The proof of (1) in the first method is simple, but that of (2) requires a bit of observation, especially when finding the coefficient of $\lambda^{n-1}$ on the left-hand side.
The proof of (2) in the second method is simpler, although you need to know about the Jordan normal/canonical form.
These two formulas relate the determinant, the trace, and the eigenvalues of a matrix in a very simple way.