Statistical Odds & Ends

Let’s say you have two matrices A and B which you can multiply together to get AB. Knowing the ranks of A and B does not in general give you the rank of AB. However, the ranks of these three matrices are governed by the following inequality:

\text{rank} (AB) \leq \min \{ \text{rank}(A), \text{rank}(B) \}.

The proof is pretty easy. The rank of a matrix is the dimension of its column space. For any vector v in the column space of AB, there is some vector x such that v = (AB)x = A(Bx). The last equality implies that v is also in the column space of A. Thus, the column space of AB lies in the column space of A, and so \text{rank}(AB) \leq \text{rank} (A). Repeating the same argument with (AB)^T = B^T A^T and noting that a matrix and its transpose have the same rank, we have \text{rank}(AB) \leq \text{rank} (B).
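As a quick numerical sanity check (a minimal sketch using numpy, not part of the original post), we can verify the inequality on a couple of examples. The second example also shows the inequality can be strict: A and B each have rank 1, but AB = 0.

```python
import numpy as np
from numpy.linalg import matrix_rank

rng = np.random.default_rng(0)

# Random 5x3 and 3x5 matrices: rank(A) = rank(B) = 3 almost surely,
# so rank(AB) can be at most min(rank(A), rank(B)) = 3.
A = rng.standard_normal((5, 3))
B = rng.standard_normal((3, 5))
print(matrix_rank(A), matrix_rank(B), matrix_rank(A @ B))  # 3 3 3

# The inequality can be strict: here rank(A) = rank(B) = 1 but AB = 0.
A = np.array([[1.0, 0.0], [0.0, 0.0]])
B = np.array([[0.0, 0.0], [1.0, 0.0]])
print(matrix_rank(A), matrix_rank(B), matrix_rank(A @ B))  # 1 1 0
```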

In the special case where B is an invertible matrix, we can say a lot more:

\text{rank}(AB) = \text{rank}(A).

The inequality above already shows that the LHS is less than or equal to the RHS; we just need to show that \text{rank}(AB) \geq \text{rank}(A). To do so, note that

\text{rank}(A) = \text{rank}(AB B^{-1}) \leq \min \{\text{rank}(AB),\text{rank}(B^{-1})\} \leq\text{rank}(AB).
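Again as an illustrative sketch in numpy (my addition, not from the original post): multiplying a rank-2 matrix by a random square matrix, which is invertible with probability one, leaves the rank unchanged.

```python
import numpy as np
from numpy.linalg import matrix_rank

rng = np.random.default_rng(1)

# A 4x3 matrix of rank 2 (product of full-rank 4x2 and 2x3 factors).
A = rng.standard_normal((4, 2)) @ rng.standard_normal((2, 3))
# A random 3x3 matrix is invertible almost surely.
B = rng.standard_normal((3, 3))

print(matrix_rank(A), matrix_rank(A @ B))  # 2 2
```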
