It is reported that children between 2 and 6 years old watch an average of 20 hours of TV per week. Assume the variable is normally distributed and the standard deviation is 3 hours. If 20 children between the ages of 2 and 6 are randomly selected, find the probability that the mean of the number of hours they watch TV will be greater than 21.3 hours.
Solution:
Given: $\mu = 20$, $\sigma = 3$, $n = 20$.
Since we are asked about the sample mean, we use the sampling distribution $\bar{X} \sim N(\mu, \sigma/\sqrt{n})$:

$$
\begin{aligned}
P(\bar{X} > 21.3) &= 1 - P(\bar{X} \le 21.3) \\
&= 1 - P\!\left(z \le \frac{21.3 - 20}{3/\sqrt{20}}\right) \\
&= 1 - P(z \le 1.9379) \\
&= 1 - 0.97381 \\
&= 0.02619
\end{aligned}
$$
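As a quick sanity check, the same calculation can be sketched in Python using only the standard library (the standard normal CDF is expressed through `math.erf`; the small difference from 0.02619 comes from the z-table rounding $z$ to 1.94):

```python
import math

# Parameters from the problem statement.
mu, sigma, n = 20, 3, 20
x_bar = 21.3

# Standard error of the sample mean and the z-score.
se = sigma / math.sqrt(n)
z = (x_bar - mu) / se  # approximately 1.9379

# Standard normal CDF via the error function.
phi = 0.5 * (1 + math.erf(z / math.sqrt(2)))

p = 1 - phi  # approximately 0.026
print(f"z = {z:.4f}, P(mean > 21.3) = {p:.5f}")
```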