If I throw a ball up in the air it will eventually reach zero velocity before descending. My question is this: how long will the ball be at zero velocity? It seems to me that this time interval should be infinitesimally small. What am I missing?
You are missing nothing: in the idealized model the velocity is exactly zero only at a single instant, an interval of zero duration. In practice, the ball will be measured to have a speed equal to zero during the time

t = 2Δt,

where Δt is the absolute error of the time measurement. A clock with resolution Δt cannot distinguish any instant within Δt of the apex from the apex itself, so the "zero velocity" reading persists for 2Δt.
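A minimal sketch of this idea, assuming a launch speed `v0` and a timing error `dt` (both values are illustrative, not from the answer): the apex occurs at t* = v0/g, and the measured zero-speed window is [t* − Δt, t* + Δt].

```python
G = 9.81  # gravitational acceleration, m/s^2

def zero_speed_window(v0, dt):
    """Return (start, end) of the interval around the apex t* = v0/G
    during which a clock with absolute error dt cannot distinguish
    the measured instant from the apex, so the speed reads as zero."""
    t_apex = v0 / G
    return t_apex - dt, t_apex + dt

start, end = zero_speed_window(v0=10.0, dt=0.01)
print(end - start)  # window length is 2*dt, here about 0.02 s
```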