Question #211221

Show that the inverse of a square matrix $A$ exists if and only if the eigenvalues $\lambda_1,\lambda_2,\dots,\lambda_n$ of $A$ are all different from zero. If $A^{-1}$ exists, show that its eigenvalues are $1/\lambda_1, 1/\lambda_2, \dots, 1/\lambda_n$.

Expert's answer
2021-06-29T16:48:37-0400

If $\lambda=0$ is an eigenvalue of the matrix $A$, and $X\ne 0$ is a corresponding column eigenvector, then the inverse matrix $A^{-1}$ cannot exist: if it did, we would have $X=IX=(A^{-1}A)X=A^{-1}(AX)=A^{-1}0=0$, contradicting $X\ne 0$.


Conversely, if $\lambda=0$ is not an eigenvalue of the matrix $A$, then $\lambda=0$ is not a root of the characteristic polynomial $p_A(\lambda)=\det(A-\lambda I)$ of $A$. Since $p_A(0)=\det A$, this gives $\det A\ne 0$, and the matrix $A$ is invertible.
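As an illustrative sketch (not part of the proof), the equivalence "invertible $\Leftrightarrow$ no zero eigenvalue" can be checked numerically with numpy; the matrices and the helper function below are hypothetical examples chosen for this check:

```python
import numpy as np

def invertible_iff_nonzero_eigs(A, tol=1e-10):
    """Return (no_zero_eigenvalue, invertible) for a square matrix A.

    By the result being proved, the two booleans should always agree
    (up to numerical tolerance).
    """
    eigs = np.linalg.eigvals(A)
    no_zero_eig = bool(np.all(np.abs(eigs) > tol))
    invertible = bool(abs(np.linalg.det(A)) > tol)
    return no_zero_eig, invertible

# Singular example: second row is twice the first, so 0 is an eigenvalue.
S = np.array([[1.0, 2.0],
              [2.0, 4.0]])

# Invertible example: symmetric matrix with eigenvalues 1 and 3.
M = np.array([[2.0, 1.0],
              [1.0, 2.0]])

print(invertible_iff_nonzero_eigs(S))  # (False, False)
print(invertible_iff_nonzero_eigs(M))  # (True, True)
```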


Now assume $A$ is invertible, and consider the characteristic polynomial $p_A(\lambda)=\det(A-\lambda I)$ of $A$:

$$p_A(\lambda)=\det(A-\lambda I)=\det A\,\det(I-\lambda A^{-1})=(-\lambda)^n\det A\,\det(A^{-1}-\lambda^{-1}I)=(-\lambda)^n\det A\cdot p_{A^{-1}}(1/\lambda)$$

From this we see that a nonzero $\lambda$ is an eigenvalue of $A$ if and only if $1/\lambda$ is an eigenvalue of $A^{-1}$. Moreover, the multiplicity of the eigenvalue $\lambda$ of $A$ equals the multiplicity of the eigenvalue $1/\lambda$ of $A^{-1}$.

So, if $\lambda_1,\dots,\lambda_n$ are the eigenvalues of $A$ (counted with multiplicity), then $1/\lambda_1,\dots,1/\lambda_n$ are the eigenvalues of $A^{-1}$.

The assertion is proved.
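The reciprocal relation between the spectra of $A$ and $A^{-1}$ can also be checked numerically; the triangular matrix below is a hypothetical example whose eigenvalues can be read off its diagonal:

```python
import numpy as np

# Upper-triangular matrix: its eigenvalues are the diagonal entries 2 and 3.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

eigs_A = np.sort(np.linalg.eigvals(A).real)                 # [2., 3.]
eigs_Ainv = np.sort(np.linalg.eigvals(np.linalg.inv(A)).real)

# The eigenvalues of A^{-1} should be 1/2 and 1/3, i.e. the
# reciprocals of the eigenvalues of A.
print(np.allclose(np.sort(1.0 / eigs_A), eigs_Ainv))  # True
```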

