Question #93701
One of the fastest pitches ever thrown in Major League Baseball was by Aroldis Chapman and had a velocity of 105.1 miles/hour. How many seconds did it take this pitch to travel the 60 feet and 6 inches from the pitcher's mound to home plate? (1 mile = 5280 feet)
Expert's answer
2019-09-03T07:20:23-0400

Speed of ball $= 105.1\ \text{miles/hr} = \frac{105.1 \times 5280}{3600}\ \text{ft/s} = 154.15\ \text{ft/s}$

Distance traveled $= 60\ \text{feet} + 6\ \text{inches} = 60.5\ \text{feet}$ (since $6\ \text{inches} = 0.5\ \text{feet}$)

Time taken $= \frac{\text{distance}}{\text{speed}} = \frac{60.5}{154.15} = 0.392\ \text{sec}$
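The conversion and division above can be checked with a short Python sketch (variable names are illustrative, not from the original answer):

```python
# Convert the pitch speed from miles/hour to feet/second,
# then divide the pitching distance by the speed to get the time.
MPH_TO_FTPS = 5280 / 3600  # 1 mile = 5280 ft, 1 hour = 3600 s

speed_ftps = 105.1 * MPH_TO_FTPS   # about 154.15 ft/s
distance_ft = 60 + 6 / 12          # 60 ft 6 in = 60.5 ft
time_s = distance_ft / speed_ftps  # about 0.392 s

print(round(time_s, 3))  # → 0.392
```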


