Answer to Question #280381 in Statistics and Probability for gunjan

Question #280381

An educator claims that the average IQ of American college students is at most 110, and that in a study made to test this claim, 150 American college students selected at random had an average IQ of 111.2 with a standard deviation of 7.2. Use a level of significance of 0.01 to test the claim of the educator.


Expert's answer
2021-12-17T10:32:16-0500

"n=150,\\space s=7.2,\\bar{x}=111.2"

The hypotheses tested are,

"H_0:\\mu\\leq110\\space vs\\space H_1:\\mu\\gt110"

The test statistic is given by,

"t={(\\bar{x}-\\mu)\\over ({s\\over\\sqrt{n}})}={(111.2-110)\\over({7.2\\over\\sqrt{150}})}=2.0412"

"t" is compared with the t distribution table value at "\\alpha=0.01" with "n-1=150-1=149" degrees of freedom given as,

"t_{{0.01},149}=t_{0.01,149}= 2.351635"

The null hypothesis is rejected if $t>t_{0.01,149}$.

Since "t=2.0412\\lt t_{0.01,149}=2.351635," we fail to reject the null hypothesis and conclude that there is sufficient evidence to support the educator's claim that the average IQ of American college student is at most 110 at 1% level of significance.

