Question #133849
A ball is launched from ground level with an initial upward velocity of 20 m/s and an initial horizontal velocity of 30 m/s. How far from its starting position does the ball land, assuming the ground is level?
Expert's answer
2020-09-21T08:31:11-0400

Let the initial vertical velocity be $u = 20\text{ m/s}$ and the initial horizontal velocity be $v = 30\text{ m/s}$.

The time for the ball to go up and down:


$t = \dfrac{2u}{g}$

The distance the ball will travel horizontally is


$R = t\cdot v = \dfrac{2uv}{g} = \dfrac{2\cdot 20\cdot 30}{9.8} \approx 122\text{ m}.$
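As a quick cross-check, here is a minimal Python sketch of the same calculation, assuming $g \approx 9.8\text{ m/s}^2$ and no air resistance:

```python
# Projectile range for a launch from level ground (no air resistance).
g = 9.8        # gravitational acceleration, m/s^2 (assumed value)
u = 20.0       # initial vertical velocity, m/s
v = 30.0       # initial horizontal velocity, m/s

t = 2 * u / g  # total flight time: time up plus time back down
R = v * t      # horizontal range

print(f"Flight time t = {t:.2f} s")  # ~4.08 s
print(f"Range R = {R:.1f} m")        # ~122.4 m
```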
