According to Nielsen Media Research, the (population) average number of hours of TV viewing per household per week in the United States is 50.4 hours. Suppose the (population) standard deviation is known to be 11.8 hours. A random sample of 42 U.S. households is taken. What is the probability that the sample mean is less than 40 hours but greater than 35 hours?
"\\mu=50.4,s=11.8\n\n,n=42"
Standard error of the sample mean: \sigma_{\bar{X}} = \dfrac{\sigma}{\sqrt{n}} = \dfrac{11.8}{\sqrt{42}} \approx 1.82
Let \bar{X} denote the sample mean. Since n = 42 \geq 30, the Central Limit Theorem gives \bar{X} an approximately normal distribution with mean \mu = 50.4 and standard deviation 1.82.
"P(35<X<40)=P(\\dfrac{35-50.4}{1.82}<\\dfrac{x-\\mu}{\\sigma}<\\dfrac{40-50.4}{1.82})\\\\"
"=P(-8.46<z<-5.71)=0.0001"