I am studying linear algebra from Larry Smith's book *Linear Algebra*. I am confused by the way Smith proves that the row rank of a matrix equals its column rank. Here is what he uses to prove it:
Proposition. Let $T: V \to V$ be an endomorphism of the finite dimensional vector space $V$. Then there exist bases $\{ a_1 , \ldots , a_n \}$ and $\{ b_1 , \ldots , b_n \}$ for $V$ such that the matrix of $T$ with respect to these bases is $$\begin{bmatrix} I_{k} & 0 \\ 0 & 0 \end{bmatrix} _{n\times n}$$ where $I_{k}$ is the $k \times k$ identity matrix for some integer $k$. The integer $k$ is called the rank of the linear transformation $T$. In fact, $k= \dim \text{Im } T$.
Definition. For a matrix $M$ of size $m \times n$ , we define the row rank to be the maximal number of linearly independent rows of the matrix. Likewise, the column rank is the maximal number of linearly independent columns.
Here's the way the author completes the proof.
Corollary. Let $M$ be a matrix of size $m \times n$. Then the row rank of $M$ equals the column rank of $M$.
Proof. Let $T : \mathbb{R}^{n} \to \mathbb{R}^{m}$ be the linear transformation with matrix $M$ with respect to the standard bases. The columns of $M$ span $\text{Im } T$, so the number of linearly independent columns is the rank of the linear transformation $T$. The proposition implies that the rank of $T$ is also the number of linearly independent rows of $M$. $\blacksquare$
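As a quick numerical sanity check of the corollary (illustrative only, not a proof, with a matrix I made up), the column rank of $M$ is the rank of its column space and the row rank of $M$ is the rank of the column space of $M^{T}$, so both can be computed with numpy's `matrix_rank`:

```python
import numpy as np

# Row 2 is twice row 1, so only two rows are linearly independent.
M = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],
              [1.0, 0.0, 1.0]])

column_rank = np.linalg.matrix_rank(M)    # dimension of the column space
row_rank = np.linalg.matrix_rank(M.T)     # dimension of the row space

print(column_rank, row_rank)  # 2 2
```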
Here are the few questions that I have:
How does the proposition imply that the rank of $T$ equals the number of linearly independent rows of $M$? Is there anything I'm failing to notice? In fact, the proposition is stated for endomorphisms, and $T$ is not even an endomorphism. However, the proposition apparently generalizes easily to a map between any two vector spaces $V$ and $W$.
What follows is my attempt at proving it using the change of basis theorem. I have generalized the proposition above; I state the generalization here without proof.
Proposition 2. Let $T: V \to W$ be a linear transformation between the finite dimensional vector spaces $V$ and $W$. Then there exist bases $\{ a_1 , \ldots , a_n \}$ and $\{ b_1 , \ldots , b_m \}$ for $V$ and $W$, respectively, such that the matrix of $T$ is $$\begin{bmatrix} I_{k} & 0 \\ 0 & 0 \end{bmatrix} _{m\times n}$$ where $I_{k}$ is the $k \times k$ identity matrix for some integer $k$. In fact, $k= \dim \text{Im } T$.
Here's my incomplete attempt at proving that row rank equals column rank:
Let $T : \mathbb{R}^{n} \to \mathbb{R}^{m}$ be the linear transformation with matrix $M$ with respect to the standard bases. By Proposition 2, there are bases $\{ a_{1} , \ldots , a_{n} \}$ and $\{ b_{1} , \ldots , b_{m} \}$ such that the matrix of $T$ is $$J= \begin{bmatrix} I_{k} & 0 \\ 0 & 0 \end{bmatrix} _{m\times n}$$ Since $M$ and $J$ represent the same linear transformation, namely $T$, there are nonsingular matrices $P_{m \times m}$ and $Q_{n \times n}$ such that $M=P^{-1}JQ$, i.e. $PM=JQ$. I know that the row rank of $M$ equals the row rank of $PM$, and the column rank of $J$ equals the column rank of $JQ$, but how do I show that the row rank of $M$ $=$ the column rank of $M$ $= k$?
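For what it's worth, here is a small numerical experiment (illustrative only, with arbitrary choices of $m$, $n$, $k$ and random nonsingular $P$, $Q$) suggesting that a matrix of the form $M = P^{-1}JQ$ does have row rank and column rank both equal to $k$:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, k = 4, 5, 2

# J is the m x n block matrix with I_k in the top-left corner.
J = np.zeros((m, n))
J[:k, :k] = np.eye(k)

# Random P (m x m) and Q (n x n); a random Gaussian matrix is
# nonsingular with probability 1, but we check to be safe.
P = rng.standard_normal((m, m))
Q = rng.standard_normal((n, n))
assert np.linalg.matrix_rank(P) == m and np.linalg.matrix_rank(Q) == n

M = np.linalg.inv(P) @ J @ Q

# Both ranks of M come out equal to k, consistent with the claim.
print(np.linalg.matrix_rank(M), np.linalg.matrix_rank(M.T))
```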