Orthogonality of Eigenvectors of a Symmetric Matrix Corresponding to Distinct Eigenvalues

Problem 235

Suppose that a real symmetric matrix $A$ has two distinct eigenvalues $\alpha$ and $\beta$.
Show that any eigenvector corresponding to $\alpha$ is orthogonal to any eigenvector corresponding to $\beta$.

(Nagoya University, Linear Algebra Final Exam Problem)
 

Hint.

Two vectors $\mathbf{u}$ and $\mathbf{v}$ are orthogonal if their inner (dot) product $\mathbf{u}\cdot \mathbf{v}:=\mathbf{u}^{\trans}\mathbf{v}$ is zero.

Here $\mathbf{u}^{\trans}$ is the transpose of $\mathbf{u}$.

A fact that we will use below is that for any matrices $A$ and $B$ for which the product $AB$ is defined, we have $(AB)^{\trans}=B^{\trans}A^{\trans}$.
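
As a quick sanity check of these two facts, here is a minimal numerical illustration in Python with NumPy. The matrices $A$, $B$ and the vectors $\mathbf{u}$, $\mathbf{v}$ below are arbitrary choices made only for illustration; they are not part of the problem.

import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[0.0, 1.0], [5.0, 2.0]])

# The transpose of a product reverses the order: (AB)^T = B^T A^T.
print(np.allclose((A @ B).T, B.T @ A.T))   # True

# The dot product u . v is the matrix product u^T v.
u = np.array([1.0, 2.0])
v = np.array([3.0, -1.0])
print(np.dot(u, v))   # 1.0, the same value as u^T v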

Proof.

Let $\mathbf{u}, \mathbf{v}$ be eigenvectors corresponding to $\alpha, \beta$, respectively.
Namely, we have
\[A\mathbf{u}=\alpha \mathbf{u} \text{ and } A\mathbf{v}=\beta \mathbf{v}. \tag{*}\]

To prove that $\mathbf{u}$ and $\mathbf{v}$ are orthogonal, we show that the inner product $\mathbf{u} \cdot \mathbf{v}=0$.
Keeping this in mind, we compute
\begin{align*}
\alpha (\mathbf{u} \cdot \mathbf{v}) &=(\alpha \mathbf{u}) \cdot \mathbf{v} \stackrel{(*)}{=} (A\mathbf{u})\cdot \mathbf{v} =(A\mathbf{u})^{\trans} \mathbf{v}\\
&=\mathbf{u}^{\trans}A^{\trans}\mathbf{v} \text{ (by the fact $(AB)^{\trans}=B^{\trans}A^{\trans}$ mentioned in the hint above)} \\
&=\mathbf{u}^{\trans}A\mathbf{v} \text{ (since $A$ is symmetric)}\\
& \stackrel{(*)}{=} \mathbf{u}^{\trans}(\beta \mathbf{v})=\beta (\mathbf{u}^{\trans} \mathbf{v})=\beta (\mathbf{u}\cdot \mathbf{v}).
\end{align*}

Therefore we obtain
\[\alpha (\mathbf{u} \cdot \mathbf{v})=\beta (\mathbf{u} \cdot \mathbf{v}),\] and thus
\[(\alpha-\beta)(\mathbf{u} \cdot \mathbf{v})=0.\]

Since $\alpha$ and $\beta$ are distinct, $\alpha-\beta \neq 0$.
Hence we must have
\[\mathbf{u} \cdot \mathbf{v}=0,\] and the eigenvectors $\mathbf{u}, \mathbf{v}$ are orthogonal.
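
For readers who want to see the statement in action, here is a short numerical check in Python with NumPy. The symmetric matrix $A$ below is just an illustrative choice, not part of the exam problem.

import numpy as np

# A concrete real symmetric matrix with distinct eigenvalues 1 and 3.
A = np.array([[2.0, 1.0], [1.0, 2.0]])

# eigh is NumPy's eigensolver for symmetric matrices; eigenvalues are returned in ascending order.
eigenvalues, eigenvectors = np.linalg.eigh(A)
u = eigenvectors[:, 0]   # eigenvector for the eigenvalue 1
v = eigenvectors[:, 1]   # eigenvector for the eigenvalue 3

print(eigenvalues)    # [1. 3.]
print(np.dot(u, v))   # 0.0 up to floating-point error, so u and v are orthogonal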

