LA11


Let $A$ be an $n \times n$ matrix such that $A^2 = A$, and let $I$ be the $n \times n$ identity matrix. Prove that $\mathrm{rank}(A) + \mathrm{rank}(A - I) = n$.
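As a quick sanity check (a minimal NumPy sketch, not part of the original problem), the diagonal projection $A = \mathrm{diag}(1, 1, 0)$ is idempotent, and the two ranks sum to $n = 3$:

```python
import numpy as np

# A concrete idempotent matrix: projection onto the first two coordinates
A = np.diag([1.0, 1.0, 0.0])
I = np.eye(3)

assert np.allclose(A @ A, A)  # A^2 = A, so A is idempotent

rank_A = np.linalg.matrix_rank(A)       # rank(A) = 2
rank_AI = np.linalg.matrix_rank(A - I)  # rank(A - I) = 1
print(rank_A + rank_AI)                 # 3, which equals n
```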


Let $x \in \mathrm{Col}(A)$, so there exists a vector $y \in \mathbb{R}^n$ such that $Ay = x$. Then $Ax = A^2 y = Ay = x$ (using $A^2 = A$), so $(A - I)x = 0$; that is, $\mathrm{Col}(A) \subseteq \mathrm{Null}(A - I)$. Conversely, if $x \in \mathrm{Null}(A - I)$, then $(A - I)x = 0$ implies $Ax = x \in \mathrm{Col}(A)$, so $\mathrm{Col}(A) = \mathrm{Null}(A - I)$. Then, by the Rank Theorem,
$$n = \mathrm{rank}(A - I) + \dim \mathrm{Null}(A - I) = \mathrm{rank}(A - I) + \dim \mathrm{Col}(A) = \mathrm{rank}(A) + \mathrm{rank}(A - I).$$
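The identity can also be spot-checked for a non-diagonal idempotent matrix. The sketch below (again assuming NumPy; the matrix $B$ and the sizes $n = 6$, $k = 2$ are arbitrary choices for illustration) builds an orthogonal projection $P = B(B^T B)^{-1} B^T$, which satisfies $P^2 = P$, and confirms $\mathrm{rank}(P) + \mathrm{rank}(P - I) = n$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 6, 2

# Random n x k matrix; its columns are linearly independent with probability 1
B = rng.standard_normal((n, k))

# Orthogonal projection onto Col(B): P = B (B^T B)^{-1} B^T, so P^2 = P
P = B @ np.linalg.inv(B.T @ B) @ B.T
assert np.allclose(P @ P, P)

I = np.eye(n)
print(np.linalg.matrix_rank(P) + np.linalg.matrix_rank(P - I))  # prints 6 = n
```

Here $\mathrm{rank}(P) = k = 2$ and $\mathrm{rank}(P - I) = n - k = 4$, matching the proof: $\mathrm{Null}(P - I) = \mathrm{Col}(P)$ has dimension $k$.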