A Relation of Nonzero Row Vectors and Column Vectors


Problem 406

Let $A$ be an $n\times n$ matrix. Suppose that $\mathbf{y}$ is a nonzero row vector such that
\[\mathbf{y}A=\mathbf{y}.\] (Here a row vector means a $1\times n$ matrix.)
Prove that there is a nonzero column vector $\mathbf{x}$ such that
\[A\mathbf{x}=\mathbf{x}.\] (Here a column vector means an $n \times 1$ matrix.)

 
We give two proofs. The first proof does not use the theory of eigenvalues and the second one uses it.

Proof 1. (Without the theory of eigenvalues)

Let $I$ be the $n\times n$ identity matrix. Then we have
\begin{align*}
\mathbf{0}_{1\times n}=\mathbf{y}A-\mathbf{y}=\mathbf{y}(A-I),
\end{align*}
where $\mathbf{0}_{1\times n}$ is the row zero vector.

Taking the transpose, we have
\begin{align*}
\mathbf{0}_{n\times 1}&=\mathbf{0}_{1\times n}^{\trans}=\left(\,\mathbf{y}(A-I) \,\right)^{\trans}\\
&=(A-I)^{\trans}\mathbf{y}^{\trans}.
\end{align*}

Since the vector $\mathbf{y}$ is nonzero, the transpose $\mathbf{y}^{\trans}$ is a nonzero column vector.
Thus the above equality shows that the homogeneous system $(A-I)^{\trans}\mathbf{v}=\mathbf{0}_{n\times 1}$ has a nontrivial solution, and hence the matrix $(A-I)^{\trans}$ is singular.
Since a matrix and its transpose have the same determinant, the matrix $A-I$ is singular as well.
Hence there exists a nonzero column vector $\mathbf{x}$ such that
\[(A-I)\mathbf{x}=\mathbf{0}_{n\times 1},\] and consequently we have
\[A\mathbf{x}=\mathbf{x}\] for this nonzero column vector $\mathbf{x}$.
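
The argument in Proof 1 can also be checked numerically. The following is a minimal sketch (not part of the original proof) assuming NumPy; the trick of building a matrix $A$ with a prescribed left fixed vector via a projection is our own illustrative choice.

```python
# Numerical sketch of Proof 1 (illustrative; assumes NumPy).
# Build a matrix A with a prescribed left fixed vector y (y A = y),
# then recover a right fixed vector x with A x = x from the null
# space of A - I.
import numpy as np

rng = np.random.default_rng(0)
n = 4

y = rng.standard_normal(n)                 # nonzero row vector (1D array)
P = np.eye(n) - np.outer(y, y) / (y @ y)   # projection satisfying y P = 0
A = np.eye(n) + P @ rng.standard_normal((n, n))  # hence y A = y

assert np.allclose(y @ A, y)               # y is a left fixed vector of A

# A - I is singular, so its null space contains a nonzero vector x.
# The right singular vector for the smallest singular value spans it.
_, s, Vt = np.linalg.svd(A - np.eye(n))
x = Vt[-1]

print("smallest singular value of A - I:", s[-1])  # ~ 0
print("A x = x holds:", np.allclose(A @ x, x))     # True
```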
 

Proof 2. (Using the theory of eigenvalues)

Taking the transpose of both sides of the identity $\mathbf{y}A=\mathbf{y}$, we obtain
\[A^{\trans}\mathbf{y}^{\trans}=\mathbf{y}^{\trans}.\] Since $\mathbf{y}$ is a nonzero row vector, $\mathbf{y}^{\trans}$ is a nonzero column vector.
It follows that $1$ is an eigenvalue of the matrix $A^{\trans}$ and $\mathbf{y}^{\trans}$ is a corresponding eigenvector.

Since the matrices $A$ and $A^{\trans}$ have the same eigenvalues, we deduce that the matrix $A$ has $1$ as an eigenvalue.
(See part (b) of the post “Transpose of a matrix and eigenvalues and related questions.”)
Let $\mathbf{x}$ be an eigenvector corresponding to the eigenvalue $1$ (by definition $\mathbf{x}$ is nonzero). Then we have
\[A\mathbf{x}=\mathbf{x}\] as required.
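
Proof 2 can likewise be illustrated numerically. In the sketch below (again not part of the original proof, and reusing the projection construction from the previous sketch), we confirm that $A$ and $A^{\trans}$ have the same spectrum and extract an eigenvector of $A$ for the eigenvalue $1$.

```python
# Numerical sketch of Proof 2 (illustrative; assumes NumPy).
import numpy as np

rng = np.random.default_rng(1)
n = 4

y = rng.standard_normal(n)
P = np.eye(n) - np.outer(y, y) / (y @ y)
A = np.eye(n) + P @ rng.standard_normal((n, n))  # y A = y as before

# A and A^T share the same characteristic polynomial, hence spectra.
evals_A  = np.sort_complex(np.linalg.eigvals(A))
evals_At = np.sort_complex(np.linalg.eigvals(A.T))
print("same spectra:", np.allclose(evals_A, evals_At))  # True

# Extract an eigenvector of A for the eigenvalue closest to 1.
w, V = np.linalg.eig(A)
x = V[:, np.argmin(np.abs(w - 1))]
print("A x = x holds:", np.allclose(A @ x, x))          # True
```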

