Show that the adjusted sample variance is a consistent estimator of the true variance
Solution:
Let "X_{1}, X_{2} \\ldots \\ldots \\ldots X_{n}" be the independent observations from a population with mean "\\mu" and variance "\\sigma^{2}"
"E\\left(X_{i}\\right)=\\mu, \\operatorname{Var}\\left(X_{i}\\right)=\\sigma^{2}\n\n\\\\E\\left(X^{2}\\right)=\\sigma^{2}+\\mu^{2}\n\n\\\\\\operatorname{Var}(X)=E\\left(X^{2}\\right)-[E(X)]^{2}\n\n\\\\E\\left(\\bar{X}^{2}\\right)=\\frac{\\sigma^{2}}{n}+\\mu^{2}"
Now, we show that
"{E}\\left(s^{2}\\right)={E}\\left(\\frac{\\sum_{i=1}^{n}\\left(X_{i}-\\bar{X}\\right)^{2}}{n-1}\\right)=\\sigma^{2}"
"\\sum" is going from 1 to n
"{E}\\left(\\sum\\left(X_{i}-\\bar{X}\\right)^{2}\\right)={E}\\left(\\sum X_{i}^{2}-2 \\bar{X} \\sum X_{i}+n \\bar{X}^{2}\\right)=\\sum {E}\\left(X_{i}^{2}\\right)-{E}\\left(n \\bar{X}^{2}\\right)\n\n\\\\\\sum {E}\\left(X_{i}^{2}\\right)-{E}\\left(n \\bar{X}^{2}\\right)=\\sum {E}\\left(X_{i}^{2}\\right)-n {E}\\left(\\bar{X}^{2}\\right)=n \\sigma^{2}+n \\mu^{2}-\\sigma^{2}-n \\mu^{2}"
which simplifies to \((n-1)\sigma^{2}\).
Therefore,
"{E}\\left(\\sum\\left(X_{i}-\\bar{X}\\right)^{2}\\right)=(n-1) \\sigma^{2}\n\\\\{E}\\left(s^{2}\\right)={E}\\left(\\frac{\\Sigma\\left(X_{i}-\\bar{X}\\right)^{2}}{n-1}\\right)=\\frac{1}{n-1} {E}\\left(\\sum\\left(X_{i}-\\bar{X}\\right)^{2}\\right)"
"{E}(s^2)=\\dfrac{(n-1) \\sigma^{2}}{n-1}=\\sigma^{2}"
Hence, it is proved that the adjusted sample variance is an unbiased estimator of the population variance. Replacing \(X\) with \(Y\) gives the same result:
"E[\\dfrac1{n-1}\\sum_{i=1}^{n}\\left(Y_{i}-\\bar{Y}\\right)]={\\sigma}^2"
Hence, proved. Since \(s^{2}\) is unbiased and \(\operatorname{Var}(s^{2})\to 0\) as \(n\to\infty\) (whenever the population has a finite fourth moment), \(s^{2}\) is also a consistent estimator of the true variance \(\sigma^{2}\), as required.
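To illustrate the consistency claim informally, here is a minimal sketch, again assuming normal data with the hypothetical \(\mu=2\), \(\sigma=3\): computing \(s^{2}\) on ever-larger prefixes of one simulated data stream shows it settling near \(\sigma^{2}=9\).

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma = 2.0, 3.0                 # hypothetical parameters for illustration

# For one growing data stream, s^2 computed on the first n observations
# should approach sigma^2 = 9 as n increases (consistency).
data = rng.normal(mu, sigma, size=1_000_000)
for n in (10, 100, 1_000, 10_000, 100_000, 1_000_000):
    s2 = data[:n].var(ddof=1)
    print(f"n = {n:>9,d}   s^2 = {s2:.4f}")
```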