Answer to Question #132434 in Algebra for Evans Malepe

Question #132434
Suppose u, v and w are vectors in 3–space. Which of the following is/are defined? Explain
A. (u x v).w
B. u.(v x w)

Suppose u and v are vectors in 3–space where u = (u1;u2;u3) and v = (v1;v2;v3). Evaluate u x v x u
and v x u x u.


Suppose A and B are 3 x 3 matrices. Prove that
(i) (A + B)^T = B^T + A^T
(ii) det((AB)^T) = det(A) det(B)
Expert's answer
2020-09-14T11:29:19-0400

(u x v).w

Here (u x v) is a vector, since the cross product of two vectors in 3-space is a vector.

Let u x v = p, where p is a vector.

Then (u x v).w = p.w, which is a scalar, since the dot product of two vectors is a scalar.

Thus (u x v).w is defined.


u.(v x w)

The cross product of two vectors is a vector, hence (v x w) is a vector.

Let v x w = q, where q is a vector.

Then u.(v x w) = u.q, which is a scalar.

Thus u.(v x w) is also defined.
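Both expressions can be checked numerically. A minimal pure-Python sketch (the sample vectors are arbitrary, chosen only for illustration):

```python
# Dot and cross products for 3-component vectors stored as plain tuples.
def dot(a, b):
    return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

u, v, w = (1, 2, 3), (0, 1, 4), (2, 0, 1)  # arbitrary sample vectors

s1 = dot(cross(u, v), w)   # (u x v).w -- a scalar
s2 = dot(u, cross(v, w))   # u.(v x w) -- a scalar
print(s1, s2)              # both are defined scalars, and in fact equal
```

Both expressions evaluate to the same number here, which reflects the familiar scalar triple product identity (u x v).w = u.(v x w).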







u = (u1;u2;u3) and v = (v1;v2;v3)

The vector triple product of three vectors A, B, C is given by

A x (B x C) = (A.C)B - (A.B)C

Reading u x v x u as u x (v x u):

u x (v x u) = (u.u)v - (u.v)u .......... (u.u = |u||u| cos 0 = |u|^2, since the angle between u and itself is 0 and cos 0 = 1)

= |u|^2 v - (u.v)u



Now

v x u x u = v x (u x u) = v x 0 = 0 .......... (|u x u| = |u||u| sin 0 = 0, since the angle between u and itself is 0 and sin 0 = 0)
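Both triple-product results above can be verified for concrete vectors; a small pure-Python check (the sample values are arbitrary):

```python
def dot(a, b):
    return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

u, v = (1, 2, 3), (0, 1, 4)  # arbitrary sample vectors

# u x (v x u) should equal |u|^2 v - (u.v) u
lhs = cross(u, cross(v, u))
uu, uv = dot(u, u), dot(u, v)
rhs = tuple(uu*vi - uv*ui for vi, ui in zip(v, u))
assert lhs == rhs

# v x (u x u) should be the zero vector, since u x u = 0
assert cross(u, u) == (0, 0, 0)
assert cross(v, cross(u, u)) == (0, 0, 0)
```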




Prove (A + B)^T = B^T + A^T


Write A = [aij] and B = [bij].

A + B = [aij + bij]

(A + B)^T = [aji + bji] .......... (1)

B^T = [bji]

A^T = [aji]

B^T + A^T = [bji + aji] = [aji + bji] .......... (2)

The right-hand sides of (1) and (2) are equal entry by entry, so the left-hand sides are also equal:

(A + B)^T = B^T + A^T
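The entrywise argument can be spot-checked on concrete 3 x 3 matrices; a minimal pure-Python sketch (the integer matrices are arbitrary samples):

```python
# 3x3 matrices represented as lists of rows.
def transpose(M):
    return [[M[j][i] for j in range(3)] for i in range(3)]

def add(M, N):
    return [[M[i][j] + N[i][j] for j in range(3)] for i in range(3)]

A = [[1, 2, 0], [0, 1, 3], [4, 0, 1]]  # arbitrary sample matrices
B = [[2, 1, 0], [1, 0, 2], [0, 3, 1]]

# Entry (i, j) of (A + B)^T is a_ji + b_ji, exactly as in steps (1) and (2).
assert transpose(add(A, B)) == add(transpose(B), transpose(A))
```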




We know that the determinant of a matrix equals the determinant of its transpose, so

det((AB)^T) = det(AB)

It therefore suffices to prove det(AB) = det(A) det(B).


Case 1: A is not invertible

Then det(A) = 0, so det(A) det(B) = 0.

Since A is not invertible, the rank of AB is less than 3, so AB is not invertible and det(AB) = 0.

Hence det(AB) = det(A) det(B), and the theorem holds in this case.


Case 2: A is invertible

Then there exist elementary matrices Ek, . . . , E1 such that A = Ek · · · E1; that is, A can be written as a product of elementary matrices.

Then

det(AB) = det(Ek · · · E1 B)   (using the property det(EM) = det(E) det(M), where E is an elementary matrix)

= det(Ek) det(Ek−1 · · · E1 B)

= det(Ek) · · · det(E1) det(B)

= det(Ek · · · E1) det(B)

= det(A) det(B)


Hence

det((AB)^T) = det(AB) = det(A) det(B)
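The two facts used in the proof, det(M^T) = det(M) and det(AB) = det(A) det(B), can be spot-checked with a small pure-Python sketch (the integer matrices are arbitrary samples):

```python
def det3(M):
    # Cofactor expansion along the first row of a 3x3 matrix.
    return (M[0][0]*(M[1][1]*M[2][2] - M[1][2]*M[2][1])
          - M[0][1]*(M[1][0]*M[2][2] - M[1][2]*M[2][0])
          + M[0][2]*(M[1][0]*M[2][1] - M[1][1]*M[2][0]))

def matmul(M, N):
    return [[sum(M[i][k]*N[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def transpose(M):
    return [[M[j][i] for j in range(3)] for i in range(3)]

A = [[1, 2, 0], [0, 1, 3], [4, 0, 1]]  # arbitrary sample matrices
B = [[2, 1, 0], [1, 0, 2], [0, 3, 1]]
AB = matmul(A, B)

assert det3(transpose(AB)) == det3(AB)   # det(M^T) = det(M)
assert det3(AB) == det3(A) * det3(B)     # determinant is multiplicative
```

Integer matrices keep the determinants exact, so the equality checks avoid any floating-point comparison issues.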


