One of the fastest recorded pitches in major-league baseball, thrown by Billy Wagner in 2003, was clocked at 101.0 mi/h. If a pitch were thrown horizontally with this velocity, how far would the ball fall vertically by the time it reached home plate, 60.5 ft away?
First, convert the given values to SI units:
101.0 mi/h is 45.15 m/s (divide by 2.237, since 1 m/s = 2.237 mi/h).
60.5 ft is 18.44 m (1 ft = 0.3048 m).
Calculate the time the ball travels horizontally, assuming zero air friction:

t = x / v = (18.44 m) / (45.15 m/s) ≈ 0.408 s
This is also the time the ball falls vertically. Thus, we can find the vertical drop of the pitch:

y = (1/2)gt² = (1/2)(9.80 m/s²)(0.408 s)² ≈ 0.817 m

So the ball falls about 0.82 m (roughly 2.7 ft) on its way to the plate.
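For readers who want to check the numbers, here is a minimal Python sketch of the same calculation; the constant names and the value g = 9.80 m/s² are my own choices, not part of the original problem.

# How far a 101.0 mi/h pitch falls over 60.5 ft, ignoring air resistance.
MPH_PER_MPS = 2.237    # 1 m/s = 2.237 mi/h
M_PER_FT = 0.3048      # 1 ft = 0.3048 m
G = 9.80               # gravitational acceleration, m/s^2 (assumed)

v = 101.0 / MPH_PER_MPS    # horizontal speed, ~45.15 m/s
x = 60.5 * M_PER_FT        # distance to home plate, ~18.44 m

t = x / v                  # flight time, ~0.408 s
y = 0.5 * G * t**2         # vertical drop, ~0.82 m

print(f"flight time: {t:.3f} s")
print(f"vertical drop: {y:.2f} m ({y / M_PER_FT:.2f} ft)")

Running it reproduces the ~0.82 m (≈2.7 ft) drop found above.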