Answer to Question #131889 in Statistics and Probability for abhi

Question #131889
True or false ?
1. If X1, X2, ..., Xn is a random sample, then
the sample mean is an unbiased estimator of
the population mean.

2. A maximum likelihood estimator is always
unbiased.
Expert's answer
2020-09-10T17:36:36-0400

Answer 1)


Given that X_1, X_2, X_3, \dots, X_n are independently selected random variables from a population with mean \mu and variance \sigma^2.


Then the expected value of the sample mean is given by,


E[\bar X]=E\left[\frac{X_1+X_2+X_3+\dots+X_n}{n}\right]=E\left[\frac{1}{n}(X_1+X_2+X_3+\dots+X_n)\right]


E[\bar X]=\frac{1}{n}E[X_1+X_2+X_3+\dots+X_n]


E[\bar X]=\frac{1}{n}\left(E[X_1]+E[X_2]+E[X_3]+\dots+E[X_n]\right)


But since the expected value of each of these random variables is \mu, we can write


E[\bar X]=\frac{1}{n}(\mu+\mu+\mu+\dots+\mu)=\frac{1}{n}(n\mu)=\mu


This implies that,


If X_1, X_2, \dots, X_n is a random sample, then the sample mean is an unbiased estimator of

the population mean = TRUE
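As a quick numerical sanity check (a sketch not part of the original answer; the population parameters mu = 5, sigma = 2 and sample size n = 30 are arbitrary choices), we can draw many samples and verify that the sample means average out to the population mean:

```python
import random

random.seed(0)

# Assumed illustration parameters: population mean, std, sample size, trials
mu, sigma, n, trials = 5.0, 2.0, 30, 20_000

sample_means = []
for _ in range(trials):
    # Draw one random sample of size n from N(mu, sigma^2)
    sample = [random.gauss(mu, sigma) for _ in range(n)]
    sample_means.append(sum(sample) / n)

# The average of the sample means should be close to mu
avg_of_means = sum(sample_means) / trials
print(round(avg_of_means, 2))  # close to mu = 5.0
```

Each individual sample mean fluctuates, but averaging them over many trials recovers \mu, which is exactly what unbiasedness (E[\bar X]=\mu) predicts.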


Answer 2)


Let us check whether the maximum likelihood estimator of the population variance, the sample variance s^2, is unbiased:


s^{2}=\frac{1}{N}\sum_{i=1}^{N}\left(x_{i}-\bar{x}\right)^{2}


In statistics, we evaluate the "goodness" of an estimator by checking whether it is "unbiased". An estimator is unbiased when its expectation equals the true value of the parameter, e.g. if E[\bar{x}] = \mu then the mean estimator is unbiased. We first show that this holds for the mean estimator.

\mathbb{E}[\bar{x}]=\mathbb{E}\left[\frac{1}{N}\sum_{i=1}^{N}x_{i}\right]=\frac{1}{N}\sum_{i=1}^{N}\mathbb{E}[x]=\frac{1}{N}\cdot N\cdot\mathbb{E}[x]=\mathbb{E}[x]=\mu

The first equality uses the assumption that the samples are drawn i.i.d. from the true distribution, so each \mathbb{E}[x_i] is in fact \mathbb{E}[x]. This shows that the mean estimator is unbiased.


Now we move to the variance estimator. At first glance, unbiasedness of the variance estimator s^{2}=\frac{1}{N}\sum_{i=1}^{N}\left(x_{i}-\bar{x}\right)^{2} might seem to follow because the mean estimator \bar{x} is unbiased. However, this is not the case:


\mathbb{E}\left[s^{2}\right]=\mathbb{E}\left[\frac{1}{N}\sum_{i=1}^{N}\left(x_{i}-\bar{x}\right)^{2}\right]=\frac{1}{N}\mathbb{E}\left[\sum_{i=1}^{N}x_{i}^{2}-2\sum_{i=1}^{N}x_{i}\bar{x}+\sum_{i=1}^{N}\bar{x}^{2}\right]


We know \sum_{i=1}^{N}x_{i}=N\bar{x} and \sum_{i=1}^{N}\bar{x}^{2}=N\bar{x}^{2}. Plugging these into the derivation:

\mathbb{E}\left[s^{2}\right]=\frac{1}{N}\mathbb{E}\left[\sum_{i=1}^{N}x_{i}^{2}-2N\bar{x}^{2}+N\bar{x}^{2}\right]=\frac{1}{N}\mathbb{E}\left[\sum_{i=1}^{N}x_{i}^{2}-N\bar{x}^{2}\right]=\frac{1}{N}\mathbb{E}\left[\sum_{i=1}^{N}x_{i}^{2}\right]-\mathbb{E}\left[\bar{x}^{2}\right]=\mathbb{E}\left[x^{2}\right]-\mathbb{E}\left[\bar{x}^{2}\right]

Using \mathbb{E}\left[x^{2}\right]=\sigma_{x}^{2}+\mu^{2} and \mathbb{E}\left[\bar{x}^{2}\right]=\sigma_{\bar{x}}^{2}+\mu^{2},

\mathbb{E}\left[s^{2}\right]=\left(\sigma_{x}^{2}+\mu^{2}\right)-\left(\sigma_{\bar{x}}^{2}+\mu^{2}\right)=\sigma_{x}^{2}-\sigma_{\bar{x}}^{2}

where

\sigma_{\bar{x}}^{2}=\operatorname{VAR}[\bar{x}]=\operatorname{VAR}\left[\frac{1}{N}\sum_{i=1}^{N}x_{i}\right]=\frac{1}{N^{2}}\operatorname{VAR}\left[\sum_{i=1}^{N}x_{i}\right]


\operatorname{VAR}\left[\sum_{i=1}^{N}x_{i}\right]=\sum_{i=1}^{N}\operatorname{VAR}[x]=N\cdot\operatorname{VAR}[x]


\sigma_{\bar{x}}^{2}=\frac{1}{N}\operatorname{VAR}[x]=\frac{1}{N}\sigma_{x}^{2}


\mathbb{E}\left[s^{2}\right]=\sigma_{x}^{2}-\frac{1}{N}\sigma_{x}^{2}=\frac{N-1}{N}\sigma_{x}^{2}


\mathbb{E}\left[s^{2}\right]\neq\sigma_{x}^{2}
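The bias can also be seen empirically. The following sketch (not part of the original answer; sigma = 2 and N = 10 are arbitrary illustration choices) averages the maximum likelihood variance estimator s^2 over many samples and compares it with \frac{N-1}{N}\sigma^2:

```python
import random

random.seed(1)

# Assumed illustration parameters: population mean, std, sample size, trials
mu, sigma, N, trials = 0.0, 2.0, 10, 20_000

s2_values = []
for _ in range(trials):
    sample = [random.gauss(mu, sigma) for _ in range(N)]
    xbar = sum(sample) / N
    # MLE variance estimator: divides by N, not N - 1
    s2_values.append(sum((x - xbar) ** 2 for x in sample) / N)

avg_s2 = sum(s2_values) / trials
print(round(avg_s2, 2))                  # close to (N-1)/N * sigma^2 = 3.6
print(round((N - 1) / N * sigma**2, 2))  # 3.6, not sigma^2 = 4.0
```

The average of s^2 settles near \frac{N-1}{N}\sigma^2 rather than \sigma^2, confirming that this maximum likelihood estimator is biased (which is why the unbiased sample variance divides by N-1 instead of N).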


Hence the following statement,


A maximum likelihood estimator is always unbiased = FALSE

