Method of Moments
(i) Equate the first sample moment about the origin, $M_1 = \frac{\sum_{i=1}^{n} x_i}{n} = \bar{x}$, to the first theoretical moment $E(X)$.
(ii) Equate the second sample moment about the mean, $M_2 = \frac{\sum_{i=1}^{n} (x_i - \bar{x})^2}{n}$, to the second theoretical moment about the mean, $E[(X - \mu)^2]$.
(iii) Continue equating sample moments about the mean $M_k$ with the corresponding theoretical moments about the mean $E[(X - \mu)^k]$, $k = 3, 4, \ldots$, until you have as many equations as unknown parameters.
(iv) Solve the resulting equations for the parameters; the solutions are the method of moments estimators. A short sketch of the procedure follows this list.
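As a minimal sketch (not part of the original notes), consider a gamma distribution with shape $\alpha$ and scale $\theta$, for which $E(X) = \alpha\theta$ and $E[(X - \mu)^2] = \alpha\theta^2$. Equating these to $M_1$ and $M_2$ and solving the two equations gives $\hat{\theta} = M_2 / M_1$ and $\hat{\alpha} = M_1^2 / M_2$. The function name below is hypothetical.

```python
import numpy as np

def gamma_method_of_moments(x):
    """Method of moments estimates for a gamma(alpha, theta) sample.

    Moment equations: alpha * theta = M1 and alpha * theta^2 = M2,
    so alpha_hat = M1^2 / M2 and theta_hat = M2 / M1.
    """
    x = np.asarray(x, dtype=float)
    m1 = x.mean()                 # first sample moment about the origin
    m2 = np.mean((x - m1) ** 2)   # second sample moment about the mean
    return m1 ** 2 / m2, m2 / m1  # (alpha_hat, theta_hat)

# Usage on simulated data: the estimates should be near (2.0, 3.0).
rng = np.random.default_rng(0)
sample = rng.gamma(shape=2.0, scale=3.0, size=10_000)
print(gamma_method_of_moments(sample))
```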
Method of Maximum Likelihood Estimation
Suppose we have a random sample $X_1, X_2, \ldots, X_n$ whose probability distribution is assumed to depend on some unknown parameter $\theta$.
We seek a point estimator $u(X_1, X_2, \ldots, X_n)$ whose observed value $u(x_1, x_2, \ldots, x_n)$ is a good point estimate of the unknown parameter.
Let $X_1, X_2, \ldots, X_n$ be a random sample from a distribution depending on one or more unknown parameters $\theta_1, \theta_2, \ldots, \theta_m$, with probability density or mass function $f(x_i; \theta_1, \theta_2, \ldots, \theta_m)$. Suppose $(\theta_1, \theta_2, \ldots, \theta_m)$ is restricted to the parameter space $\Omega$. Then:
(i) When regarded as a function of $\theta_1, \theta_2, \ldots, \theta_m$, the joint probability density or mass function of $(x_1, x_2, \ldots, x_n)$,
$$L(\theta_1, \theta_2, \ldots, \theta_m) = \prod_{i=1}^{n} f(x_i; \theta_1, \theta_2, \ldots, \theta_m), \quad (\theta_1, \theta_2, \ldots, \theta_m) \in \Omega,$$
is the likelihood function.
(ii) If
$$[u_1(x_1, x_2, \ldots, x_n), u_2(x_1, x_2, \ldots, x_n), \ldots, u_m(x_1, x_2, \ldots, x_n)]$$
is the $m$-tuple that maximizes the likelihood function, then
"\\hat \\theta=u_i(x_1, x_2,....,x_n)"
is the maximum likelihood estimator of "\\theta_i" for i=1,2,..,m
(iii) The corresponding observed values $u_1(x_1, x_2, \ldots, x_n), u_2(x_1, x_2, \ldots, x_n), \ldots, u_m(x_1, x_2, \ldots, x_n)$ are the maximum likelihood estimates of $\theta_i$, for $i = 1, 2, \ldots, m$.
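To make the definitions concrete, here is a minimal sketch (not from the original notes) for a one-parameter exponential model with rate $\lambda$, where $L(\lambda) = \prod_{i=1}^{n} \lambda e^{-\lambda x_i}$. Setting $\frac{d}{d\lambda} \log L(\lambda) = n/\lambda - \sum x_i = 0$ gives the closed-form maximum likelihood estimator $\hat{\lambda} = 1/\bar{x}$; the numerical maximization below should reproduce it. The helper names are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def exponential_log_likelihood(lam, x):
    """log L(lam) = n*log(lam) - lam*sum(x) for an exponential(rate=lam) sample."""
    return len(x) * np.log(lam) - lam * np.sum(x)

def exponential_mle(x):
    """Numerically maximize the likelihood; the maximizer is the MLE."""
    result = minimize_scalar(
        lambda lam: -exponential_log_likelihood(lam, x),  # minimize the negative
        bounds=(1e-9, 1e6),
        method="bounded",
    )
    return result.x

# Usage: the numerical answer should agree with the closed form 1/xbar.
rng = np.random.default_rng(1)
sample = rng.exponential(scale=0.5, size=5_000)  # true rate = 1/scale = 2.0
print(exponential_mle(sample), 1.0 / sample.mean())
```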