Determinant of a General Circulant Matrix

Linear algebra problems and solutions

Problem 374

Let \[A=\begin{bmatrix}
a_0 & a_1 & \dots & a_{n-2} &a_{n-1} \\
a_{n-1} & a_0 & \dots & a_{n-3} & a_{n-2} \\
a_{n-2} & a_{n-1} & \dots & a_{n-4} & a_{n-3} \\
\vdots & \vdots & \dots & \vdots & \vdots \\
a_{2} & a_3 & \dots & a_{0} & a_{1}\\
a_{1} & a_2 & \dots & a_{n-1} & a_{0}
\end{bmatrix}\] be a complex $n \times n$ matrix.
Such a matrix is called a circulant matrix.
Then prove that the determinant of the circulant matrix $A$ is given by
\[\det(A)=\prod_{k=0}^{n-1}(a_0+a_1\zeta^k+a_2 \zeta^{2k}+\cdots+a_{n-1}\zeta^{k(n-1)}),\] where $\zeta=e^{2 \pi i/n}$ is a primitive $n$-th root of unity.
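For example, when $n=2$ we have $\zeta=e^{\pi i}=-1$, and the formula reads
\[\det\begin{bmatrix}
a_0 & a_1 \\
a_1 & a_0
\end{bmatrix}=(a_0+a_1\zeta^0)(a_0+a_1\zeta^1)=(a_0+a_1)(a_0-a_1)=a_0^2-a_1^2,\] which agrees with the direct expansion of the $2\times 2$ determinant.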

 


Proof.

Let $\omega$ be any $n$-th root of unity.
Consider the vector
\[\mathbf{v}=\begin{bmatrix}
1 \\
\omega \\
\omega^2 \\
\vdots \\
\omega^{n-1}
\end{bmatrix}.\] We show that the vector $\mathbf{v}$ is an eigenvector of $A$.
We compute
\begin{align*}
A\mathbf{v}=
\begin{bmatrix}
a_0 & a_1 & \dots & a_{n-2} &a_{n-1} \\
a_{n-1} & a_0 & \dots & a_{n-3} & a_{n-2} \\
a_{n-2} & a_{n-1} & \dots & a_{n-4} & a_{n-3} \\
\vdots & \vdots & \dots & \vdots & \vdots \\
a_{2} & a_3 & \dots & a_{0} & a_{1}\\
a_{1} & a_2 & \dots & a_{n-1} & a_{0}
\end{bmatrix}
\begin{bmatrix}
1 \\
\omega \\
\omega^2 \\
\vdots \\
\omega^{n-1}\\
\end{bmatrix}.
\end{align*}

The first entry of the vector $A\mathbf{v}$ is
\[a_0+a_1\omega+a_2\omega^2+\cdots+a_{n-2}\omega^{n-2}+a_{n-1}\omega^{n-1}=:\lambda;\] we define $\lambda$ to be this number.


The second entry is
\begin{align*}
&a_{n-1}+a_0\omega+\cdots+a_{n-3}\omega^{n-2}+a_{n-2}\omega^{n-1}\\
&=(a_{n-1}\omega^{n-1}+a_0+\cdots+a_{n-3}\omega^{n-3}+a_{n-2}\omega^{n-2})\omega \\
&=(a_0+a_1\omega+\cdots+a_{n-2}\omega^{n-2}+a_{n-1}\omega^{n-1})\omega \\
&=\lambda \omega.
\end{align*}
Here the first equality uses $\omega^n=1$: factoring $\omega$ out of the leading term $a_{n-1}$ leaves $a_{n-1}\omega^{-1}=a_{n-1}\omega^{n-1}$.


Similarly, the $i$-th entry of the vector $A\mathbf{v}$ is
\begin{align*}
&a_{n-i+1}+a_{n-i+2}\omega +\cdots+ a_{n-i}\omega^{n-1}\\
&= (a_{n-i+1}\omega^{n-i+1}+a_{n-i+2}\omega^{n-i+2} +\cdots+ a_{n-i}\omega^{n-i})\omega^{i-1}\\
&=\lambda \omega^{i-1},
\end{align*}
where the subscripts are read modulo $n$, we again used $\omega^n=1$, and the sum in parentheses is a rearrangement of $\lambda$.
Therefore we obtain
\[A\mathbf{v}=\begin{bmatrix}
\lambda \\
\lambda\omega \\
\lambda \omega^2 \\
\vdots \\
\lambda\omega^{n-1}
\end{bmatrix}=\lambda \mathbf{v}.\]


Since $\mathbf{v}$ is a nonzero vector, it follows that $\lambda$ is an eigenvalue of $A$ and $\mathbf{v}$ is an eigenvector corresponding to $\lambda$.
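This eigenvector relation is easy to check numerically. Here is a minimal numpy sketch (the size $n=5$ and the random complex coefficients are arbitrary test choices, not part of the problem):

```python
import numpy as np

n = 5
rng = np.random.default_rng(0)
a = rng.standard_normal(n) + 1j * rng.standard_normal(n)  # a_0, ..., a_{n-1}

# Circulant matrix with A[i, j] = a_{(j-i) mod n}: each row is the
# previous row shifted one step to the right, as in the problem.
A = a[(np.arange(n)[None, :] - np.arange(n)[:, None]) % n]

omega = np.exp(2j * np.pi / n)      # an n-th root of unity
v = omega ** np.arange(n)           # v = (1, omega, omega^2, ..., omega^{n-1})
lam = np.sum(a * v)                 # lambda = a_0 + a_1*omega + ... + a_{n-1}*omega^{n-1}

assert np.allclose(A @ v, lam * v)  # A v = lambda v
```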


The above argument holds for any $n$-th root of unity $\omega$.
We take $\omega=\zeta^k$, where $k$ runs from $0$ to $n-1$.
It follows that the vector
\[\mathbf{v}_k:=\begin{bmatrix}
1 \\
\zeta^k \\
\zeta^{2k} \\
\vdots \\
\zeta^{k(n-1)}
\end{bmatrix}\] is an eigenvector corresponding to the eigenvalue
\[\lambda_k:=a_0+a_1\zeta^k+a_2\zeta^{2k}+\cdots+a_{n-2}\zeta^{k(n-2)}+a_{n-1}\zeta^{k(n-1)}\] for each $k=0,1, \dots, n-1$.
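In other words, the eigenvalues $\lambda_k$ are the values of the polynomial $p(x)=a_0+a_1x+\cdots+a_{n-1}x^{n-1}$ at the $n$-th roots of unity. Up to the sign convention in the exponent and a factor of $n$, this is a discrete Fourier transform of the coefficient vector. A minimal numpy sketch of this observation (the random coefficients are an arbitrary test choice):

```python
import numpy as np

n = 5
rng = np.random.default_rng(1)
a = rng.standard_normal(n) + 1j * rng.standard_normal(n)

zeta = np.exp(2j * np.pi / n)
k = np.arange(n)
lams = (zeta ** np.outer(k, k)) @ a  # lambda_k = sum_j a_j * zeta^(j*k)

# numpy's inverse FFT uses the kernel exp(+2*pi*1j*j*k/n)/n,
# so n * ifft(a) reproduces the eigenvalues lambda_k.
assert np.allclose(lams, n * np.fft.ifft(a))
```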


We claim that the vectors $\mathbf{v}_k$ are linearly independent.

To see this, form a matrix whose column vectors are these vectors. That is, we consider
\[B=\begin{bmatrix}
1& 1 & 1 & \dots & 1 &1 \\
1&\zeta & \zeta^{2} & \dots & \zeta^{n-2} & \zeta^{n-1} \\
1&\zeta^{2} & \zeta^{4} & \dots & \zeta^{2(n-2)} & \zeta^{2(n-1)} \\
1& \zeta^{3} & \zeta^{6} & \dots & \zeta^{3(n-2)} & \zeta^{3(n-1)} \\
\vdots& \vdots & \vdots & \dots & \vdots & \vdots \\
1& \zeta^{n-1} & \zeta^{2(n-1)} & \dots & \zeta^{(n-1)(n-2)} &\zeta^{(n-1)(n-1)}
\end{bmatrix}.\]


This is a Vandermonde matrix, and its determinant is
\[\det(B)=\prod_{0\leq i < j \leq n-1}(\zeta^j-\zeta^i)\neq 0,\] which is nonzero because the powers $\zeta^0, \zeta^1, \dots, \zeta^{n-1}$ are distinct. Thus, the matrix $B$ is nonsingular, and hence its column vectors $\mathbf{v}_0, \mathbf{v}_1, \dots, \mathbf{v}_{n-1}$ are linearly independent.

Since the $n\times n$ matrix $A$ has $n$ linearly independent eigenvectors, it is diagonalizable, and $\lambda_k$, $k=0, 1, \dots, n-1$, are all the eigenvalues of $A$, counted with multiplicity. Since the determinant of $A$ is the product of its eigenvalues, we have
\begin{align*}
\det(A)&=\prod_{k=0}^{n-1}\lambda_k\\
&=\prod_{k=0}^{n-1}(a_0+a_1\zeta^k+a_2 \zeta^{2k}+\cdots+a_{n-1}\zeta^{k(n-1)}),
\end{align*}
as required. This completes the proof.
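The determinant identity itself can also be sanity-checked numerically; again a minimal numpy sketch with arbitrary random coefficients:

```python
import numpy as np

n = 6
rng = np.random.default_rng(2)
a = rng.standard_normal(n) + 1j * rng.standard_normal(n)

# The circulant matrix A with A[i, j] = a_{(j-i) mod n}, as in the problem.
A = a[(np.arange(n)[None, :] - np.arange(n)[:, None]) % n]

zeta = np.exp(2j * np.pi / n)
k = np.arange(n)
lams = (zeta ** np.outer(k, k)) @ a  # eigenvalues lambda_0, ..., lambda_{n-1}

assert np.allclose(np.linalg.det(A), np.prod(lams))
```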



6 Responses

  1. Parthasarathi Mukhopadhyay says:

Kindly clarify why the matrix $B$ is taken as $(n-1)\times(n-1)$? It seems that a column of $1$s at the beginning and the actual second row are missing.

  2. Zubair says:

    Respected Doctor,
I don't think the second entry you computed for the vector $A\mathbf{v}$ is right, because when you factor out $\omega$, the $a_{n-1}$ term should be multiplied by just $\omega$ instead of $\omega^{n-1}$.

    • Yu says:

      Dear Zubair,

      Thank you for the comment. I’m not exactly sure what you suggested.

Are you asking why we have $\omega^{n-1}$ in front of $a_{n-1}$ in the following computation?

      ====================================================

\begin{align*}
&a_{n-1}+a_0\omega+\cdots+a_{n-3}\omega^{n-2}+a_{n-2}\omega^{n-1}\\
&=(a_{n-1}\omega^{n-1}+a_0+\cdots+a_{n-3}\omega^{n-3}+a_{n-2}\omega^{n-2})\omega \\
&=(a_0+\cdots+a_{n-3}\omega^{n-3}+a_{n-2}\omega^{n-2}+a_{n-1}\omega^{n-1})\omega \\
&=\lambda \omega
\end{align*}

      ====================================================

      This is because we have $\omega^n=1$ since $\omega$ is an $n$-th root of unity.

  3. Ragib Zaman says:

    Let $p(x) = \sum_{i=0}^{n-1} a_i x^i,$ and note that $A=p(X),$ where $X$ is (the transpose of) the companion matrix of $t^n-1.$ The result then follows from the fact that if $\lambda$ is an eigenvalue of $M,$ then $q(\lambda)$ is an eigenvalue of $q(M)$ for polynomial (in fact, analytic) functions $q.$
