Question #269185

The dean of a university wants to use the mean of a random sample to estimate the average amount of time students take to get from one class to the next. She wants to assert with probability 0.95 that her error will be at most 0.25 minutes. If she knows from past studies that the population standard deviation is 15 minutes, how large a sample will she need?


Expert's answer
2021-11-22T19:38:50-0500

If $\bar{x}$ is used as an estimate of $\mu$, we can be $100(1-\alpha)\%$ confident that the error $|\bar{x}-\mu|$ will not exceed a specified amount $E$ when the sample size is

$n \geq \left(\dfrac{z_{\alpha/2}\,\sigma}{E}\right)^2$

For a 95% confidence level, $z_{\alpha/2} = 1.96$.

$E = 0.25\ \text{min}, \quad \sigma = 15\ \text{min}$

$n \geq \left(\dfrac{1.96(15)}{0.25}\right)^2 = 13829.76$

Rounding up to the next whole student,

$n = 13830$
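The calculation above can be sketched in Python. The helper name `sample_size` is illustrative, not from the source; the z-value is taken from the standard normal quantile rather than hard-coded, which gives the same rounded-up answer.

```python
import math
from statistics import NormalDist

def sample_size(z, sigma, error):
    """Minimum n so that |x-bar - mu| <= error with the given confidence.

    Applies n >= (z * sigma / error)**2, rounded up to the next integer,
    since a sample size must be a whole number.
    """
    return math.ceil((z * sigma / error) ** 2)

# z for 95% confidence: alpha = 0.05, so we need the 0.975 quantile
z = NormalDist().inv_cdf(0.975)  # approximately 1.96

n = sample_size(z, sigma=15, error=0.25)
print(n)  # 13830
```

Note that rounding is always upward (`math.ceil`): rounding 13829.76 down to 13829 would leave the error bound unmet.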


