The Nielsen Company reported that children between the ages of 2 and 5 watch an average of 25 hours of television per week. Assume the variable is normally distributed with a standard deviation of 3 hours. If a group of 20 children between the ages of 2 and 5 is randomly selected, find the probability that the mean number of hours they watch television is greater than 26.3 hours.
Since we are working with a sample mean, the z-score uses the standard error $\sigma/\sqrt{n}$:

$$z=\frac{\bar{x}-\mu}{\sigma/\sqrt{n}}=\frac{26.3-25}{3/\sqrt{20}}\approx 1.94$$

$$P(\bar{x}>26.3)=1-P(z<1.94)=1-0.9738=0.0262$$
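The calculation above can be checked numerically. A minimal sketch in Python, using only the standard library (`math.erf` gives the standard normal CDF; the variable names are illustrative):

```python
import math

def normal_cdf(z):
    """Standard normal CDF, computed via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

mu, sigma, n, xbar = 25.0, 3.0, 20, 26.3

# z-score for the sample mean; the standard error is sigma / sqrt(n)
z = (xbar - mu) / (sigma / math.sqrt(n))

# Right-tail probability P(sample mean > 26.3)
p = 1.0 - normal_cdf(z)

print(f"z = {z:.2f}")   # about 1.94
print(f"P = {p:.4f}")   # about 0.026
```

The unrounded answer is slightly above 0.0262 because the table lookup rounds z to two decimal places before finding the tail area.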