
I am studying linear algebra from the book of the same name by Larry Smith. I am confused by the way Smith proves that the row rank of a matrix equals its column rank. Here are the results he uses to prove it:


Proposition. Let $T: V \to V$ be an endomorphism of the finite dimensional vector space $V$. Then there exist bases $\{ a_1 , \ldots , a_n \}$ and $\{ b_1 , \ldots , b_n \}$ for $V$ such that the matrix of $T$ with respect to these bases is $$\begin{bmatrix}  I_{k} & 0 \\ 0 & 0  \end{bmatrix} _{n\times n}$$ where $I_{k}$ is the $k \times k$ identity matrix for some integer $k$. The integer $k$ is called the rank of the linear transformation $T$. In fact, $k= \dim \text{Im } T$.


Definition. For a matrix $M$ of size $m \times n$, we define the row rank to be the maximal number of linearly independent rows of the matrix. Likewise, the column rank is the maximal number of linearly independent columns.
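To make the definitions concrete to myself, here is a quick check with SymPy on a small example matrix of my own (not one from the book), whose second row is twice the first:

```python
from sympy import Matrix

# A made-up example matrix: the second row is 2 * (first row),
# and the third column is the sum of the first two columns.
M = Matrix([[1, 2, 3],
            [2, 4, 6],
            [0, 1, 1]])

# Row rank: size of a basis of the row space.
row_rank = len(M.rowspace())
# Column rank: size of a basis of the column space.
col_rank = len(M.columnspace())

print(row_rank, col_rank)  # both are 2
```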


Here's the way the author completes the proof.

Corollary. Let $M$ be a matrix of size $m \times n$. Then the row rank of $M$ equals the column rank of $M$.

Proof. Let $T : \mathbb{R}^{n} \to \mathbb{R}^{m}$ be the linear transformation with matrix $M$ with respect to the standard bases. The columns of $M$ span $\text{Im } T$, so the number of linearly independent columns is the rank of the linear transformation $T$. The proposition implies that the rank of $T$ is also the number of linearly independent rows of $M$. $\blacksquare$
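As a sanity check of the key step in that proof (that the columns of $M$ span $\text{Im } T$): applying $T$ to the standard basis vectors just reads off the columns of $M$. A small SymPy sketch with the same made-up matrix as above:

```python
from sympy import Matrix, eye

M = Matrix([[1, 2, 3],
            [2, 4, 6],
            [0, 1, 1]])

n = M.cols
E = eye(n)  # columns are the standard basis vectors e_1, ..., e_n

# T(e_j) = M * e_j is exactly the j-th column of M,
# so Im T is spanned by the columns of M.
for j in range(n):
    assert M * E.col(j) == M.col(j)

# Hence dim Im T (the rank of T) equals the column rank of M.
assert M.rank() == len(M.columnspace())
```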

Here are a few questions that I have:

How does the proposition imply that the rank of $T$ equals the number of linearly independent rows of $M$? Is there anything I'm failing to notice? In fact, the proposition is stated for endomorphisms, and $T$ is not even an endomorphism. However, the proposition can apparently be generalized easily to a linear map between any two vector spaces $V$ and $W$.

What follows is my attempt at proving it. I have tried using the change of basis theorem.

I have generalized the above proposition and state it below, but I do not attempt to prove it here.


Proposition 2. Let $T: V \to W$ be a linear transformation from the finite dimensional vector space $V$ into the finite dimensional vector space $W$. Then there exist bases $\{ a_1 , \ldots , a_n \}$ for $V$ and $\{ b_1 , \ldots , b_m \}$ for $W$ such that the matrix of $T$ with respect to these bases is $$\begin{bmatrix}  I_{k} & 0 \\ 0 & 0  \end{bmatrix} _{m\times n}$$ where $I_{k}$ is the $k \times k$ identity matrix for some integer $k$. In fact, $k= \dim \text{Im } T$.


Here's my incomplete attempt at proving that row rank equals column rank:

Let $T : \mathbb{R}^{n} \to \mathbb{R}^{m}$ be the linear transformation with matrix $M$ with respect to the standard bases. By Proposition 2, there are bases $\{ a_{1} , \ldots , a_{n} \}$ and $\{ b_{1} , \ldots , b_{m} \}$ such that the matrix of $T$ is $$J= \begin{bmatrix}  I_{k} & 0 \\ 0 & 0  \end{bmatrix} _{m\times n}$$ Since $M$ and $J$ represent the same linear transformation, namely $T$, there are nonsingular matrices $P_{m \times m}$ and $Q_{n \times n}$ such that $M=P^{-1}JQ$, or equivalently $PM=JQ$. I know that the row rank of $M$ equals the row rank of $PM$ and that the column rank of $J$ equals the column rank of $JQ$, but how do I show that the row rank of $M$ = the column rank of $M$ = $k$?
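Before trying to prove it, I checked the claim numerically with SymPy on a made-up example (arbitrary sizes $m=3$, $n=4$, $k=2$ and hand-picked invertible $P$, $Q$; none of this comes from the book): building $M = P^{-1} J Q$ and confirming that its row rank and column rank both come out equal to $k$.

```python
from sympy import Matrix, zeros, eye

# Made-up sizes and rank (an assumption for illustration only).
m, n, k = 3, 4, 2

# The normal form J of Proposition 2: I_k in the top-left corner.
J = zeros(m, n)
J[:k, :k] = eye(k)

# Hand-picked invertible P (m x m) and Q (n x n).
P = Matrix([[1, 1, 0],
            [0, 1, 2],
            [1, 0, 1]])
Q = Matrix([[1, 0, 1, 0],
            [2, 1, 0, 0],
            [0, 0, 1, 1],
            [0, 1, 0, 1]])
assert P.det() != 0 and Q.det() != 0

# M represents the same linear transformation as J: M = P^{-1} J Q.
M = P.inv() * J * Q

# Both the row rank and the column rank of M equal k.
assert len(M.rowspace()) == k
assert len(M.columnspace()) == k
```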

ashK
  • $PMQ^{-1}=J$, so the row rank of $M$ = row rank of $J = k$, and similarly the column rank of $M$ = column rank of $J = k$. Check the following; it might help: https://math.stackexchange.com/questions/847329/prove-rankap-ranka-if-p-is-an-invertible-n-%c3%97-n-matrix-and-a-is-any-m-%c3%97-n-m – Fareed Abi Farraj Apr 10 '19 at 15:32
  • @FareedAF the first answer assumes what I'm trying to prove. Is there any other way around it? I'm quite stuck here – ashK Apr 10 '19 at 15:51
  • The number of independent rows of the matrix $J$ is clearly $k$; is this what you're trying to prove? The number of independent columns of $J$ (i.e., here the number of nonzero columns) is also $k$. – Fareed Abi Farraj Apr 10 '19 at 21:02
  • @FareedAF Oh no. That the row rank of $J$ = column rank of $J$ = $k$ is evident from the matrix. I'm trying to prove that the row rank of $M$ = row rank of $J$ and the column rank of $M$ = column rank of $J$. That would clearly complete the proof. – ashK Apr 11 '19 at 01:32
  • In the link that I sent in the first comment they prove that the rank of $PM$ equals the rank of $M$, and you can apply this again to get that the rank of $PMQ^{-1}$ equals the rank of $PM$. Hence $PMQ^{-1}$, which is $J$, has the same rank as $M$. Knowing that the row rank of $J$ = column rank of $J$ = rank of $J$, and that this equals the row rank of $M$ = column rank of $M$ = rank of $M$, we get that the row rank of $M$ = row rank of $J$, and the same for the columns. And if you still want a proof of why the row rank of $M$ = column rank of $M$, you can find one by searching Google or maybe even MSE. – Fareed Abi Farraj Apr 11 '19 at 03:51
