A square matrix "A" is said to be orthogonal if "AA'=A'A=I" ,
where "A'" is the transpose of "A" and "I" is the identity matrix.
Suppose "\\lambda \\in R" is an eigenvalue of "A" .
Then there exists a nonzero eigenvector "X" such that
"AX=\\lambda X" "......(1)"
Taking transpose of both sides of the above equality, we get
"(AX)'=(\\lambda X)'"
"\\implies" "X'A'=\\lambda X'" [since "\\lambda" is a scalar, "(\\lambda X)'=\\lambda X'"]
Multiplying both sides on the right by "AX" , we get
"X'A'AX=\\lambda X'AX"
"\\implies" "X'X=\\lambda X' \\lambda X" [using "A'A=I" on the left and equation (1) on the right]
"\\implies X'X={ \\lambda }^2 X'X"
"\\implies (1- { \\lambda}^2)X'X=0"
Since "X \\neq 0" , we have "X'X \\neq 0" .
Therefore "1-\\lambda^2=0"
"\\implies \\lambda^2=1"
Hence "\\lambda =1" or "\\lambda=-1" .
"(Proved)" .
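As a quick numerical sanity check of the result, the sketch below (pure Python, no libraries; the helper names `matmul2`, `transpose2`, and `eigenvalues2` are my own, not from the original) verifies that the reflection matrix "\\begin{pmatrix} 0&1\\\\ 1&0 \\end{pmatrix}" satisfies "AA'=I" and that its real eigenvalues are exactly "1" and "-1" :

```python
import math

def matmul2(A, B):
    # product of two 2x2 matrices
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def transpose2(A):
    # transpose of a 2x2 matrix
    return [[A[j][i] for j in range(2)] for i in range(2)]

def eigenvalues2(A):
    # eigenvalues of a 2x2 matrix from its characteristic polynomial
    # lambda^2 - tr(A)*lambda + det(A) = 0, via the quadratic formula
    tr = A[0][0] + A[1][1]
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    r = math.sqrt(tr * tr - 4 * det)  # assumes real eigenvalues (discriminant >= 0)
    return sorted([(tr - r) / 2, (tr + r) / 2])

A = [[0, 1], [1, 0]]                  # a reflection matrix; here A = A'
print(matmul2(A, transpose2(A)))      # [[1, 0], [0, 1]]  ->  AA' = I
print(eigenvalues2(A))                # [-1.0, 1.0]       ->  eigenvalues are -1 and 1
```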
Let "B=" "\\begin{pmatrix}\n 1&0\\\\\n 0&1\n\\end{pmatrix}" which is a diagonal matrix and "BB'=B'B=I"
Hence "B" is an orthogonal matrix, whose eigenvalues are "+1, +1."
Let "C=\\begin{pmatrix}\n -1 & 0 \\\\\n 0 & -1\n\\end{pmatrix}" which is a diagonal matrix and "CC'=C'C=I"
Hence "C" is an orthogonal matrix, whose eigenvalues are "-1,-1."
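For a diagonal matrix "D=\\begin{pmatrix} d_1&0\\\\ 0&d_2 \\end{pmatrix}" we have "DD'=\\begin{pmatrix} d_1^2&0\\\\ 0&d_2^2 \\end{pmatrix}" , so "D" is orthogonal exactly when "d_1^2=d_2^2=1" , and its eigenvalues are the diagonal entries themselves. A minimal check of this (the helper name `is_orthogonal_diag` is my own, for illustration only):

```python
def is_orthogonal_diag(d1, d2):
    # a diagonal matrix diag(d1, d2) is orthogonal iff DD' = diag(d1^2, d2^2) = I,
    # i.e. iff d1^2 = d2^2 = 1; its eigenvalues are d1 and d2
    return d1 * d1 == 1 and d2 * d2 == 1

print(is_orthogonal_diag(1, 1))    # True  -> B = I, eigenvalues +1, +1
print(is_orthogonal_diag(-1, -1))  # True  -> C = -I, eigenvalues -1, -1
print(is_orthogonal_diag(2, 1))    # False -> diag(2, 1) is not orthogonal
```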