# A Relation of Nonzero Row Vectors and Column Vectors

## Problem 406

Let $A$ be an $n\times n$ matrix. Suppose that $\mathbf{y}$ is a nonzero row vector such that
$\mathbf{y}A=\mathbf{y}.$ (Here a row vector means a $1\times n$ matrix.)
Prove that there is a nonzero column vector $\mathbf{x}$ such that
$A\mathbf{x}=\mathbf{x}.$ (Here a column vector means an $n \times 1$ matrix.)

We give two proofs. The first proof does not use the theory of eigenvalues and the second one uses it.

## Proof 1. (Without the theory of eigenvalues)

Let $I$ be the $n\times n$ identity matrix. Then we have
\begin{align*}
\mathbf{0}_{1\times n}=\mathbf{y}A-\mathbf{y}=\mathbf{y}(A-I),
\end{align*}
where $\mathbf{0}_{1\times n}$ is the row zero vector.

Taking the transpose, we have
\begin{align*}
\mathbf{0}_{n\times 1}&=\mathbf{0}_{1\times n}^{\trans}=\left(\,\mathbf{y}(A-I) \,\right)^{\trans}\\
&=(A-I)^{\trans}\mathbf{y}^{\trans}.
\end{align*}

Since the vector $\mathbf{y}$ is nonzero, the transpose $\mathbf{y}^{\trans}$ is a nonzero column vector.
Thus, the above equality shows that the matrix $(A-I)^{\trans}$ has a nonzero vector in its null space, hence it is singular.
Since a matrix and its transpose have the same determinant, the matrix $A-I$ is singular as well.
Hence there exists a nonzero column vector $\mathbf{x}$ such that
$(A-I)\mathbf{x}=\mathbf{0}_{n\times 1},$ and consequently we have
$A\mathbf{x}=\mathbf{x}$ for a nonzero column vector $\mathbf{x}$.
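The argument can be checked numerically. The following is a minimal sketch (not a proof), using a hypothetical $2\times 2$ matrix $A$ chosen so that a nonzero row vector $\mathbf{y}$ satisfies $\mathbf{y}A=\mathbf{y}$; a nonzero $\mathbf{x}$ with $A\mathbf{x}=\mathbf{x}$ is then recovered from the null space of $A-I$.

```python
import numpy as np

# Example matrix with a fixed row vector: y A = y for y = (1, 1).
A = np.array([[2.0, 1.0],
              [-1.0, 0.0]])
y = np.array([[1.0, 1.0]])           # row vector (1 x n)
assert np.allclose(y @ A, y)          # y is fixed under right multiplication

# Since (A - I)^T y^T = 0, the matrix A - I is singular.
# Recover x in the null space of A - I via the SVD: the last
# right-singular vector corresponds to the zero singular value.
I = np.eye(2)
U, s, Vt = np.linalg.svd(A - I)
x = Vt[-1].reshape(-1, 1)             # column vector (n x 1), unit norm
assert np.allclose(A @ x, x)          # A x = x with x nonzero
```

Here the SVD is used only as a convenient way to extract a null-space vector of the singular matrix $A-I$.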

## Proof 2. (Using the theory of eigenvalues)

Taking the transpose of both sides of the identity $\mathbf{y}A=\mathbf{y}$, we obtain
$A^{\trans}\mathbf{y}^{\trans}=\mathbf{y}^{\trans}.$ Since $\mathbf{y}$ is a nonzero row vector, $\mathbf{y}^{\trans}$ is a nonzero column vector.
It follows that $1$ is an eigenvalue of the matrix $A^{\trans}$ and $\mathbf{y}^{\trans}$ is a corresponding eigenvector.

Since the matrices $A$ and $A^{\trans}$ have the same eigenvalues, we deduce that the matrix $A$ has the eigenvalue $1$.
(See part (b) of the post "Transpose of a matrix and eigenvalues and related questions.")
Let $\mathbf{x}$ be an eigenvector corresponding to the eigenvalue $1$ (by definition $\mathbf{x}$ is nonzero). Then we have
$A\mathbf{x}=\mathbf{x}$ as required.
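The key fact used in this proof, that $A$ and $A^{\trans}$ share the same eigenvalues, can also be checked numerically. The sketch below reuses a hypothetical matrix $A$ (chosen so that $1$ is an eigenvalue) and compares the spectra of $A$ and $A^{\trans}$.

```python
import numpy as np

# Example matrix; its characteristic polynomial is (lambda - 1)^2,
# so 1 is an eigenvalue (in fact a double one).
A = np.array([[2.0, 1.0],
              [-1.0, 0.0]])

# A and A^T have the same characteristic polynomial,
# hence the same eigenvalues (compared here after sorting).
evals_A = np.sort_complex(np.linalg.eigvals(A))
evals_At = np.sort_complex(np.linalg.eigvals(A.T))
assert np.allclose(evals_A, evals_At)

# 1 is among the eigenvalues of A, so some nonzero x satisfies A x = x.
assert np.any(np.isclose(evals_A, 1.0))
```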
