Answer 1)
Given that $X_1, X_2, \ldots, X_n$ are independent and identically distributed random variables drawn from a population with mean $\mu$ and variance $\sigma^2$.
Then the expected value of the sample mean is given by

$$E[\bar{X}] = E\left[\frac{X_1 + X_2 + \cdots + X_n}{n}\right] = \frac{1}{n}E[X_1 + X_2 + \cdots + X_n] = \frac{1}{n}\left(E[X_1] + E[X_2] + \cdots + E[X_n]\right)$$

But since the expected value of each of the above random samples is $\mu$, we can write

$$E[\bar{X}] = \frac{1}{n}(\mu + \mu + \cdots + \mu) = \frac{1}{n}(n\mu) = \mu$$
This implies that,
If $X_1, X_2, \ldots, X_n$ is a random sample, then the sample mean is an unbiased estimator of
the population mean = TRUE
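The derivation above can be checked empirically with a short simulation. This is a hypothetical sketch (not part of the original answer); the population parameters $\mu = 5$, $\sigma = 2$ and the sample size $n = 10$ are arbitrary choices. Averaging the sample mean over many independent samples should land very close to $\mu$.

```python
# Hypothetical simulation: estimate E[X_bar] by averaging the sample
# mean over many random samples; it should come out close to mu.
import random

random.seed(0)
mu, sigma = 5.0, 2.0   # assumed population parameters
n = 10                 # sample size
trials = 200_000

total = 0.0
for _ in range(trials):
    sample = [random.gauss(mu, sigma) for _ in range(n)]
    total += sum(sample) / n          # sample mean of one draw

avg_sample_mean = total / trials
print(avg_sample_mean)  # close to mu = 5.0
```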
Answer 2)
Let us check whether the maximum likelihood estimator of the population variance,

$$s^2 = \frac{1}{N}\sum_{i=1}^{N}(x_i - \bar{x})^2,$$

is an unbiased estimator.
In statistics, we evaluate the "goodness" of an estimator by checking whether it is "unbiased". By "unbiased", we mean that the expectation of the estimator equals the true value, e.g. if $E[\bar{x}] = \mu$ then the mean estimator is unbiased. We first show that this actually holds for the mean estimator.
$$E[\bar{x}] = E\left[\frac{1}{N}\sum_{i=1}^{N} x_i\right] = \frac{1}{N}\sum_{i=1}^{N} E[x_i] = \frac{1}{N} \cdot N \cdot E[x] = E[x] = \mu$$
The second equality makes use of the assumption that the samples are drawn i.i.d. from the true distribution, so $E[x_i]$ is simply $E[x]$. From the proof above, the mean estimator is shown to be unbiased.
Now we move to the variance estimator. At first glance, the variance estimator $s^2 = \frac{1}{N}\sum_{i=1}^{N}(x_i - \bar{x})^2$ might seem unbiased as well, because the mean estimator $\bar{x}$ is unbiased. However, this is not the case:
$$E[s^2] = E\left[\frac{1}{N}\sum_{i=1}^{N}(x_i - \bar{x})^2\right] = \frac{1}{N}E\left[\sum_{i=1}^{N} x_i^2 - 2\sum_{i=1}^{N} x_i\bar{x} + \sum_{i=1}^{N} \bar{x}^2\right]$$

We know $\sum_{i=1}^{N} x_i = N \cdot \bar{x}$ and $\sum_{i=1}^{N} \bar{x}^2 = N \cdot \bar{x}^2$. Plug these into the derivation:

$$E[s^2] = \frac{1}{N}E\left[\sum_{i=1}^{N} x_i^2 - 2N\bar{x}^2 + N\bar{x}^2\right] = \frac{1}{N}E\left[\sum_{i=1}^{N} x_i^2 - N\bar{x}^2\right] = \frac{1}{N}E\left[\sum_{i=1}^{N} x_i^2\right] - E[\bar{x}^2] = E[x^2] - E[\bar{x}^2]$$

Using $E[x^2] = \sigma_x^2 + \mu^2$ and $E[\bar{x}^2] = \sigma_{\bar{x}}^2 + \mu^2$:

$$E[s^2] = (\sigma_x^2 + \mu^2) - (\sigma_{\bar{x}}^2 + \mu^2) = \sigma_x^2 - \sigma_{\bar{x}}^2$$

Here $\sigma_{\bar{x}}^2$ is the variance of the sample mean:

$$\sigma_{\bar{x}}^2 = \mathrm{VAR}[\bar{x}] = \mathrm{VAR}\left[\frac{1}{N}\sum_{i=1}^{N} x_i\right] = \frac{1}{N^2}\mathrm{VAR}\left[\sum_{i=1}^{N} x_i\right]$$

Since the samples are independent,

$$\mathrm{VAR}\left[\sum_{i=1}^{N} x_i\right] = \sum_{i=1}^{N}\mathrm{VAR}[x] = N \cdot \mathrm{VAR}[x]$$

so

$$\sigma_{\bar{x}}^2 = \frac{1}{N}\mathrm{VAR}[x] = \frac{1}{N}\sigma_x^2$$

Therefore

$$E[s^2] = \sigma_x^2 - \frac{1}{N}\sigma_x^2 = \frac{N-1}{N}\sigma_x^2 \neq \sigma_x^2$$
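The bias factor $\frac{N-1}{N}$ can be verified numerically. The sketch below is hypothetical (not from the original answer); the choices $\mu = 0$, $\sigma = 3$, $N = 5$ are arbitrary. Averaging $s^2$ over many samples should approach $\frac{N-1}{N}\sigma^2$, not $\sigma^2$.

```python
# Hypothetical check: average the MLE variance estimator
# s^2 = (1/N) * sum((x_i - x_bar)^2) over many samples and
# compare with (N-1)/N * sigma^2.
import random

random.seed(1)
mu, sigma = 0.0, 3.0   # assumed population parameters
N = 5
trials = 200_000

total = 0.0
for _ in range(trials):
    xs = [random.gauss(mu, sigma) for _ in range(N)]
    x_bar = sum(xs) / N
    total += sum((x - x_bar) ** 2 for x in xs) / N   # MLE variance s^2

avg_s2 = total / trials
expected = (N - 1) / N * sigma ** 2   # = 7.2 here, not sigma^2 = 9
print(avg_s2, expected)
```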
Hence, since the maximum likelihood estimator of the variance is biased, we can evaluate the following statement:
A maximum likelihood estimator is always
unbiased = FALSE
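As a follow-up, the bias can be removed by dividing by $N-1$ instead of $N$ (Bessel's correction). The sketch below is hypothetical and reuses the same arbitrary parameters as above to compare the two estimators side by side.

```python
# Hypothetical illustration of Bessel's correction: dividing the sum of
# squared deviations by N-1 instead of N yields an unbiased estimator.
import random

random.seed(2)
mu, sigma = 0.0, 3.0   # assumed population parameters
N = 5
trials = 200_000

biased_total = 0.0
unbiased_total = 0.0
for _ in range(trials):
    xs = [random.gauss(mu, sigma) for _ in range(N)]
    x_bar = sum(xs) / N
    ss = sum((x - x_bar) ** 2 for x in xs)
    biased_total += ss / N          # MLE: expectation (N-1)/N * sigma^2
    unbiased_total += ss / (N - 1)  # Bessel-corrected: expectation sigma^2

print(biased_total / trials, unbiased_total / trials)
```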