Answer 1)
Given that $X_1, X_2, \ldots, X_n$ are independently selected random variables from a population with mean $\mu$ and variance $\sigma^2$, the expected value of the sample mean is given by
$$E[\bar X] = E\left[\frac{X_1 + X_2 + \cdots + X_n}{n}\right] = E\left[\frac{1}{n}(X_1 + X_2 + \cdots + X_n)\right]$$
$$E[\bar X] = \frac{1}{n}\, E[X_1 + X_2 + \cdots + X_n]$$
$$E[\bar X] = \frac{1}{n}\left(E[X_1] + E[X_2] + \cdots + E[X_n]\right)$$
But since the expected value of each of these random variables is $\mu$, we can write
$$E[\bar X] = \frac{1}{n}(\mu + \mu + \cdots + \mu) = \frac{1}{n}(n\mu) = \mu$$
This implies that,
If $X_1, X_2, \ldots, X_n$ is a random sample, then the sample mean is an unbiased estimator of the population mean = TRUE
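As a quick numerical illustration of this result, the following sketch (assuming a normal population with illustrative values $\mu = 5$, $\sigma = 2$, $n = 30$; the variable names are placeholders, not part of the proof) averages the sample mean over many simulated samples and checks that the average is close to $\mu$:

```python
import numpy as np

# Hypothetical sanity check: draw many samples of size n from a population
# with known mean mu and verify that the average of the sample means is
# close to mu, i.e. E[X_bar] = mu.
rng = np.random.default_rng(0)
mu, sigma, n, n_trials = 5.0, 2.0, 30, 100_000

samples = rng.normal(mu, sigma, size=(n_trials, n))  # each row is one sample
sample_means = samples.mean(axis=1)                  # X_bar for each sample

print(sample_means.mean())  # approximately 5.0, consistent with E[X_bar] = mu
```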
Answer 2)
Let us check whether the sample variance $s^2$, which is the maximum likelihood estimator of the population variance, is unbiased:
$$s^{2} = \frac{1}{N} \sum_{i=1}^{N}\left(x_{i}-\bar{x}\right)^{2}$$
In statistics, we evaluate the "goodness" of an estimator by checking whether it is "unbiased". By "unbiased", we mean that the expectation of the estimator equals the true value, e.g. if $\mathbb{E}[\bar{x}] = \mu$ then the mean estimator is unbiased. We first show that this actually holds for the mean estimator.
"\\begin{aligned}\n\\mathbb{E}[\\bar{x}] &=\\mathbb{E}\\left[\\frac{1}{N} \\sum_{i=1}^{N} x_{i}\\right]=\\frac{1}{N} \\sum_{i=1}^{N} \\mathbb{E}[x] \\\\\n&=\\frac{1}{N} \\cdot N \\cdot \\mathbb{E}[x] \\\\\n&=\\mathbb{E}[x]=\\mu\n\\end{aligned}"
The first line makes use of the assumption that the samples are drawn i.i.d. from the true distribution, so $\mathbb{E}[x_i]$ is actually $\mathbb{E}[x]$. The proof above shows that the mean estimator is unbiased.
Now we move on to the variance estimator. At first glance, it might seem that the variance estimator $s^{2} = \frac{1}{N} \sum_{i=1}^{N}\left(x_{i}-\bar{x}\right)^{2}$ is also unbiased, because the mean estimator $\bar{x}$ is unbiased. However, this is not the case:
"\\begin{aligned}\n\\mathbb{E}\\left[s^{2}\\right] &=\\mathbb{E}\\left[\\frac{1}{N} \\sum_{i=1}^{N}\\left(x_{i}-\\bar{x}\\right)^{2}\\right] \\\\\n&=\\frac{1}{N} \\mathbb{E}\\left[\\sum_{i=1}^{N} x_{i}^{2}-2 \\sum_{i=1}^{N} x_{i} \\bar{x}+\\sum_{i=1}^{N} \\bar{x}^{2}\\right]\n\\end{aligned}"
"\\text { We know } \\sum_{i=1}^{N} x_{i}=N \\cdot \\bar{x} \\text { and } \\sum_{i=1}^{N} \\bar{x}^{2}=N \\cdot \\bar{x}^{2} . \\text { Plug these into the derivation: }"
"\\begin{aligned}\n\\mathbb{E}\\left[s^{2}\\right] &=\\frac{1}{N} \\mathbb{E}\\left[\\sum_{i=1}^{N} x_{i}^{2}-2 N \\cdot \\bar{x}^{2}+N \\cdot \\bar{x}^{2}\\right] \\\\\n&=\\frac{1}{N} \\mathbb{E}\\left[\\sum_{i=1}^{N} x_{i}^{2}-N \\cdot \\bar{x}^{2}\\right] \\\\\n&=\\frac{1}{N} \\mathbb{E}\\left[\\sum_{i=1}^{N} x_{i}^{2}\\right]-\\mathbb{E}\\left[\\bar{x}^{2}\\right] \\\\\n&=\\mathbb{E}\\left[x^{2}\\right]-\\mathbb{E}\\left[\\bar{x}^{2}\\right]\n\\end{aligned}"
"\\begin{aligned}\n\\mathbb{E}\\left[s^{2}\\right] &=\\left(\\sigma_{x}^{2}+\\mu^{2}\\right)-\\left(\\sigma_{\\bar{x}}^{2}+\\mu^{2}\\right) \\\\\n&=\\sigma_{x}^{2}-\\sigma_{\\bar{x}}^{2} \\\\\n\\sigma_{\\bar{x}}^{2}=\\operatorname{VAR}[\\bar{x}]=& \\operatorname{VAR}\\left[\\frac{1}{N} \\sum_{i=1}^{N} x_{i}\\right]=\\frac{1}{N^{2}} \\operatorname{VAR}\\left[\\sum_{i=1}^{N} x_{i}\\right]\n\\end{aligned}"
"\\operatorname{VAR}\\left[\\sum_{i=1}^{N} x_{i}\\right]=\\sum_{i=1}^{N} \\operatorname{VAR}[x]=N \\cdot \\operatorname{VAR}[x]"
"\\sigma_{\\bar{x}}^{2}=\\frac{1}{N} \\operatorname{VAR}[x]=\\frac{1}{N} \\sigma_{x}^{2}"
"\\mathbb{E}\\left[s^{2}\\right]=\\frac{N-1}{N} \\sigma_{x}^{2}"
"\\mathbb{E}\\left[s^{2}\\right] \\neq \\sigma_{x}^{2}"
Since $s^{2}$ is a maximum likelihood estimator of the population variance and yet is biased, we can say that the following statement,
A maximum likelihood estimator is always unbiased = FALSE
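To see the bias numerically, the following sketch (assuming a normal population; $\sigma = 3$, $N = 10$, and the trial count are illustrative choices) compares the average of $s^{2}$ computed with the $1/N$ divisor against $\sigma_{x}^{2}$ and against $\frac{N-1}{N}\sigma_{x}^{2}$:

```python
import numpy as np

# Hypothetical demonstration of the bias derived above (illustrative values).
rng = np.random.default_rng(2)
sigma, N, n_trials = 3.0, 10, 200_000

samples = rng.normal(0.0, sigma, size=(n_trials, N))
s2_mle = samples.var(axis=1, ddof=0)        # 1/N divisor: the biased form
s2_corrected = samples.var(axis=1, ddof=1)  # 1/(N-1) divisor: Bessel's correction

print(s2_mle.mean())           # roughly (N-1)/N * sigma^2 = 8.1, not 9.0
print((N - 1) / N * sigma**2)  # 8.1
print(s2_corrected.mean())     # roughly sigma^2 = 9.0
```

Dividing by $N-1$ instead of $N$ (Bessel's correction) removes the bias, which is exactly the factor $\frac{N-1}{N}$ found above.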