Let the row space of the matrix $A$ be spanned by $\{u_1, u_2, u_3\}$, where
$u_1 = (1, 1, 0, 0)$, $u_2 = (-1, 3, 0, 1)$, $u_3 = (-3, 1, -2, 1)$.
Applying the Gram-Schmidt orthogonalization process, we construct an orthogonal basis $\{w_1, w_2, w_3\}$ of the row space; normalizing these vectors then yields an orthonormal basis.
$w_1 = u_1 = (1, 1, 0, 0)$
$w_2 = u_2 - \dfrac{\langle u_2, w_1 \rangle}{\langle w_1, w_1 \rangle} w_1 = u_2 - \dfrac{2}{2} w_1 = (-2, 2, 0, 1)$
$w_3 = u_3 - \dfrac{\langle u_3, w_1 \rangle}{\langle w_1, w_1 \rangle} w_1 - \dfrac{\langle u_3, w_2 \rangle}{\langle w_2, w_2 \rangle} w_2 = u_3 - \dfrac{-2}{2} w_1 - \dfrac{9}{9} w_2$
$= (-3, 1, -2, 1) + (1, 1, 0, 0) - (-2, 2, 0, 1)$
$= (0, 0, -2, 0)$
Since $\|w_1\| = \sqrt{2}$, $\|w_2\| = 3$, and $\|w_3\| = 2$, normalizing gives the orthonormal basis
$\left\{ \tfrac{1}{\sqrt{2}}(1, 1, 0, 0),\ \tfrac{1}{3}(-2, 2, 0, 1),\ (0, 0, -1, 0) \right\}$.
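As a quick numerical check, here is a minimal NumPy sketch of the same Gram-Schmidt computation; the vectors are copied from the problem above, and the variable names (`u`, `w`, `v`) are chosen only for illustration:

```python
import numpy as np

# Row vectors u1, u2, u3 from the problem statement.
u = [np.array([1.0, 1.0, 0.0, 0.0]),
     np.array([-1.0, 3.0, 0.0, 1.0]),
     np.array([-3.0, 1.0, -2.0, 1.0])]

# Classical Gram-Schmidt: subtract from each u_i its projections
# onto the previously computed orthogonal vectors w_1, ..., w_{i-1}.
w = []
for ui in u:
    wi = ui.copy()
    for wj in w:
        wi -= (ui @ wj) / (wj @ wj) * wj
    w.append(wi)

print(w)  # [(1,1,0,0), (-2,2,0,1), (0,0,-2,0)], matching the hand computation

# Normalize each w_i to obtain the orthonormal basis.
v = [wi / np.linalg.norm(wi) for wi in w]

# Sanity check: the Gram matrix of an orthonormal set is the identity.
V = np.array(v)
print(np.allclose(V @ V.T, np.eye(3)))  # True
```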