(a) Any square matrix with real entries is either symmetric or skew-symmetric or a linear combination of such matrices.
True
Let A be any square matrix with real entries. Set P = (1/2)(A + A^T), a candidate symmetric matrix, and Q = (1/2)(A - A^T), a candidate skew matrix.
Then P^T = ((1/2)(A + A^T))^T = (1/2)(A^T + A) = P.
Therefore P is a symmetric matrix.
Now Q^T = ((1/2)(A - A^T))^T = (1/2)(A^T - A) = -Q.
Hence, Q is a skew-symmetric matrix.
If A is a square matrix in Mn(R), then A = P + Q, a sum (and in particular a linear combination) of a symmetric and a skew-symmetric matrix.
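As a quick numerical check, here is a minimal numpy sketch of the decomposition (the matrix A below is an arbitrary example, not from the original text):

    import numpy as np

    # An arbitrary 3x3 real matrix used only for illustration.
    A = np.array([[1.0, 2.0, 3.0],
                  [4.0, 5.0, 6.0],
                  [7.0, 8.0, 10.0]])

    P = (A + A.T) / 2   # symmetric part
    Q = (A - A.T) / 2   # skew-symmetric part

    assert np.allclose(P, P.T)    # P is symmetric
    assert np.allclose(Q, -Q.T)   # Q is skew-symmetric
    assert np.allclose(A, P + Q)  # A = P + Q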
(b) If a linear operator has an eigenvalue 0, then it cannot be one-one.
True
Suppose the linear operator T : V --> V has eigenvalue 0. By definition of an eigenvalue, there exists a non-zero vector x in V such that Tx = 0x = 0. Since T is linear, we also have T(0) = 0, so the two distinct vectors x and 0 have the same image under T.
This shows that T is not one-one. (Equivalently: 0 is an eigenvalue of T exactly when ker T contains a non-zero vector, which is exactly when T fails to be one-one.)
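A small numerical illustration (a sketch using numpy; the singular matrix T below is an arbitrary example): the eigenvector for the eigenvalue 0 is a non-zero vector with the same image as 0.

    import numpy as np

    # A singular 2x2 matrix: the second row is twice the first.
    T = np.array([[1.0, 2.0],
                  [2.0, 4.0]])

    eigvals, eigvecs = np.linalg.eig(T)
    assert np.isclose(eigvals, 0).any()          # 0 is an eigenvalue

    v = eigvecs[:, np.argmin(np.abs(eigvals))]   # eigenvector for eigenvalue 0
    assert not np.allclose(v, 0)                 # v is non-zero ...
    assert np.allclose(T @ v, 0)                 # ... yet Tv = 0 = T0, so T is not one-one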
(c) If f : V --> K is a non-zero linear functional and V a vector space of dimension n, then there are n - 1 linearly independent vectors v belongs to V such that f(v) = 0.
True
By definition, ker f = {v in V : f(v) = 0}.
Since f is a non-zero linear functional, its image is a non-zero subspace of the one-dimensional space K, so dim im f = 1. By the rank-nullity theorem, dim ker f = n - dim im f = n - 1.
Any basis of ker f therefore gives n - 1 linearly independent vectors v in V with f(v) = 0, so the statement holds.
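The dimension count can be checked numerically; the sketch below (numpy, with an arbitrary choice of functional f(v) = a . v) extracts a basis of ker f from the SVD of the 1 x n matrix [a]:

    import numpy as np

    n = 4
    a = np.array([1.0, -2.0, 0.5, 3.0])  # arbitrary non-zero functional f(v) = a . v

    # Right singular vectors beyond the first span the null space of [a].
    _, s, Vt = np.linalg.svd(a.reshape(1, n))
    kernel_basis = Vt[1:]                # n - 1 rows: a basis of ker f

    assert kernel_basis.shape[0] == n - 1                # n - 1 vectors
    assert np.allclose(kernel_basis @ a, 0)              # each satisfies f(v) = 0
    assert np.linalg.matrix_rank(kernel_basis) == n - 1  # linearly independent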
(d) Every binary operation on Rn is commutative, for all n in N.
False
Let ∗ be the binary operation on Rn given by componentwise subtraction: m ∗ p = m - p. This is a binary operation on Rn for every n, but it is not commutative, since m ∗ p = m - p differs from p ∗ m = p - m whenever m ≠ p. (Comparing (m∗n)∗p with m∗(n∗p) would address associativity, not commutativity; an arbitrary binary operation is guaranteed neither property.)
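The counterexample is easy to verify in code (a numpy sketch with arbitrary sample vectors):

    import numpy as np

    # Componentwise subtraction is a binary operation on R^n ...
    def star(m, p):
        return m - p

    m = np.array([1.0, 2.0, 3.0])
    p = np.array([4.0, 5.0, 6.0])

    # ... but it is not commutative: m * p differs from p * m.
    assert not np.allclose(star(m, p), star(p, m))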
(e) |Adj(A)| = |A| for all A belonging to Mn(R).
False
Every invertible matrix is a product of elementary matrices; this is seen by applying a series of elementary row operations to the matrix to get the identity matrix I. When the matrix A is invertible, the inverse can be found from the adjoint by the formula A^{-1} = (1/|A|) Adj(A), i.e. Adj(A) = |A| A^{-1}.
It is easy to see that |kA| = k^n |A|, where k is a constant and n is the order of the square matrix A.
We know that A Adj(A) = |A| I. Since |A Adj(A)| = |A| |Adj(A)| and ||A| I| = |A|^n, we get |A| |Adj(A)| = |A|^n, which implies |Adj(A)| = |A|^{n-1} when |A| ≠ 0.
So |Adj(A)| = |A|^{n-1}, which is not |A| in general: for example, a 3x3 matrix with |A| = 2 has |Adj(A)| = 4.
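The identity |Adj(A)| = |A|^{n-1} can be checked numerically (a numpy sketch; the invertible matrix A below is an arbitrary example, and Adj(A) is recovered as |A| A^{-1}):

    import numpy as np

    # An arbitrary invertible 3x3 matrix with |A| = 7.
    A = np.array([[2.0, 0.0, 1.0],
                  [1.0, 3.0, 0.0],
                  [0.0, 1.0, 1.0]])
    n = A.shape[0]
    detA = np.linalg.det(A)

    # For invertible A, Adj(A) = |A| * A^{-1}.
    adjA = detA * np.linalg.inv(A)

    assert np.isclose(np.linalg.det(adjA), detA ** (n - 1))  # |Adj(A)| = |A|^{n-1}
    assert not np.isclose(np.linalg.det(adjA), detA)         # here 49 != 7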