Lecture 9: Properties of Point Estimators and Methods of Estimation

Relative efficiency: If we have two unbiased estimators θ̂1 and θ̂2 of a parameter θ, we say that θ̂1 is relatively more efficient than θ̂2 if V(θ̂1) < V(θ̂2).

Definition: Given two unbiased estimators θ̂1 and θ̂2 of θ, the efficiency of θ̂1 relative to θ̂2, denoted eff(θ̂1, θ̂2), is given by eff(θ̂1, θ̂2) = V(θ̂2) / V(θ̂1).

Example: Let Y1, ..., Yn be a random sample of size n from a population with mean µ and variance σ². Consider three unbiased estimators θ̂1, θ̂2, θ̂3 of µ. Find eff(θ̂1, θ̂3) and eff(θ̂2, θ̂3).
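The efficiency ratio can be checked by simulation. The sketch below uses two illustrative unbiased estimators of µ (not necessarily the ones from the example): the first observation Y1 and the sample mean Ȳ. Since V(Y1) = σ² and V(Ȳ) = σ²/n, the relative efficiency eff(Ȳ, Y1) = V(Y1)/V(Ȳ) should be close to n.

```python
import random

# Illustrative simulation (estimators chosen for this sketch, not from the
# example): compare Y1 and Ybar as unbiased estimators of mu.
random.seed(0)
n, reps, mu, sigma = 10, 20000, 5.0, 2.0

est1, est2 = [], []
for _ in range(reps):
    sample = [random.gauss(mu, sigma) for _ in range(n)]
    est1.append(sample[0])        # theta_hat = Y1, variance sigma^2
    est2.append(sum(sample) / n)  # theta_hat = Ybar, variance sigma^2 / n

def var(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

# eff(Ybar, Y1) = V(Y1) / V(Ybar), which should be close to n = 10
print(var(est1) / var(est2))
```

Both estimators are unbiased, but Ȳ uses all n observations, which is exactly what the efficiency ratio quantifies.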
Consistency: We toss a coin n times. The probability of heads on each toss is p, and the tosses are independent. Let Y = # of heads. Intuitively, the sample proportion Y/n should get close to p as n grows; consistency makes this idea precise.
Definition: An estimator θ̂n is a consistent estimator of θ if, for every ε > 0, lim(n→∞) P(|θ̂n − θ| ≤ ε) = 1, i.e., if θ̂n converges in probability to θ.

Theorem: An unbiased estimator θ̂n for θ is consistent if lim(n→∞) V(θ̂n) = 0. Proof: omitted.

Example: Let Y1, ..., Yn be a random sample of size n from a population with mean µ and variance σ². Show that Ȳ is a consistent estimator of µ.
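Convergence in probability can be illustrated numerically. The sketch below (with made-up values of µ and σ) estimates P(|Ȳ − µ| ≤ 0.5) for increasing n; consistency of Ȳ says this probability tends to 1.

```python
import random

# Sketch: the sample mean Ybar of n draws concentrates around mu as n grows,
# which is what consistency (convergence in probability) asserts.
random.seed(1)
mu, sigma, reps = 5.0, 2.0, 500
coverage = {}

for n in [10, 100, 1000]:
    close = 0
    for _ in range(reps):
        ybar = sum(random.gauss(mu, sigma) for _ in range(n)) / n
        if abs(ybar - mu) <= 0.5:
            close += 1
    coverage[n] = close / reps
    print(n, coverage[n])  # estimated P(|Ybar - mu| <= 0.5) rises toward 1
```

This matches the theorem above: Ȳ is unbiased and V(Ȳ) = σ²/n → 0, so Ȳ is consistent for µ.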
Sufficiency: Example: Consider the outcomes X1, ..., Xn of n trials of a binomial experiment, where Xi = 1 if trial i is a success and Xi = 0 otherwise. Intuitively, the total number of successes ΣXi should carry all the information the sample contains about p.
Definition: Let Y1, ..., Yn denote a random sample from a probability distribution with unknown parameter θ. Then the statistic U = g(Y1, ..., Yn) is sufficient for θ if the conditional distribution of Y1, ..., Yn, given U, does not depend on θ.

How do we find such a statistic?

Definition: Let y1, ..., yn be sample observations taken on corresponding random variables Y1, ..., Yn whose distribution depends on θ. Then, if Y1, ..., Yn are discrete (continuous) random variables, the likelihood of the sample, L(y1, ..., yn | θ), is defined to be the joint probability (density) function of Y1, ..., Yn evaluated at y1, ..., yn.
Theorem (Factorization Criterion): Let U be a statistic based on the random sample Y1, ..., Yn. Then U is a sufficient statistic for the estimation of θ if and only if the likelihood can be factored as L(y1, ..., yn | θ) = g(u, θ) × h(y1, ..., yn), where g depends on θ only through u, and h does not depend on θ. Proof: omitted.

Example: Let Y1, ..., Yn be a random sample from a distribution with density f(y | θ). Use the factorization criterion to show that a suitable statistic U is sufficient for θ.
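The defining property of sufficiency can be verified directly in the Bernoulli case. The sketch below (my own illustration, with n = 4 chosen arbitrarily) computes the conditional probability of a particular sequence given the total number of successes, and shows it is 1/C(n, u) for every p.

```python
from math import comb

# Sketch of the sufficiency idea for Bernoulli trials: conditioned on the total
# number of successes U = sum(x_i) = u, every sequence with u successes has
# probability 1 / C(n, u) -- a value free of p -- so U is sufficient for p.
n, u = 4, 2

def cond_prob(seq, p):
    # P(seq | U = u) = [p^u (1-p)^(n-u)] / [C(n, u) p^u (1-p)^(n-u)] = 1 / C(n, u)
    joint = p ** sum(seq) * (1 - p) ** (n - sum(seq))
    marginal = comb(n, u) * p ** u * (1 - p) ** (n - u)
    return joint / marginal

seq = (1, 0, 1, 0)  # one particular sequence with u = 2 successes
for p in [0.2, 0.5, 0.9]:
    print(p, cond_prob(seq, p))  # 1/6 for every p
```

Equivalently, the likelihood p^u (1 − p)^(n−u) factors with g(u, p) = p^u (1 − p)^(n−u) and h ≡ 1, so the factorization criterion gives the same conclusion.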
Example (#9.49): Let Y1, ..., Yn be a random sample from the uniform distribution on (0, θ). Show that Y(n) = max(Y1, ..., Yn) is sufficient for θ.
How do we find estimators? There are two main methods: 1) the method of moments, and 2) the method of maximum likelihood.

Method of Moments (MoM): The method of moments is a very simple procedure for finding an estimator for one or more parameters of a statistical model. It is one of the oldest methods for deriving point estimators.

Recall: the k-th moment of a random variable Y is µ'k = E(Y^k). The corresponding k-th sample moment is m'k = (1/n) Σ Yi^k. The method-of-moments estimators are the solutions of the equations µ'k = m'k, for k = 1, ..., t, where t is the number of parameters.
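The moment-matching recipe can be sketched concretely. Below, for a generic two-parameter model with mean µ and variance σ² (data made up for illustration), the first two moment equations E(Y) = µ and E(Y²) = σ² + µ² are solved for the two parameters.

```python
# Sketch of the method of moments for a two-parameter model: match the first
# two population moments E(Y) = mu and E(Y^2) = sigma^2 + mu^2 to the sample
# moments m1 and m2, then solve. The data below is made up.
data = [4.1, 5.3, 3.8, 6.0, 5.2, 4.7, 5.5, 4.9]

n = len(data)
m1 = sum(y for y in data) / n           # first sample moment
m2 = sum(y ** 2 for y in data) / n      # second sample moment

mu_hat = m1                  # from E(Y) = mu
sigma2_hat = m2 - m1 ** 2    # from E(Y^2) - E(Y)^2 = sigma^2
print(mu_hat, sigma2_hat)
```

One equation per unknown parameter: with t parameters, the first t moment equations are solved simultaneously.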
Example: Let Y1, ..., Yn be a random sample from a distribution with one unknown parameter θ. Use MoM to estimate θ. Example: Let Y1, ..., Yn be a random sample from a distribution with two unknown parameters. Find the MoM estimators of both parameters.
Maximum Likelihood Estimators (MLEs): Suppose the likelihood function L(θ1, ..., θk) depends on k parameters θ1, ..., θk. Choose as estimates those values of the parameters that maximize the likelihood. l(θ) = ln L(θ) is the log-likelihood function. Because ln is strictly increasing, the likelihood function and the log-likelihood function attain their maxima at the same value of θ, and it is often easier to maximize l(θ).

Example: A binomial experiment consisting of n trials resulted in observations y1, ..., yn, where yi = 1 if the i-th trial was a success and yi = 0 otherwise. Find the MLE of p, the probability of a success.
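The closed-form answer p̂ = Σyi / n can be checked numerically. The sketch below (with made-up data) maximizes the log-likelihood l(p) = (Σy) ln p + (n − Σy) ln(1 − p) over a grid and recovers the sample proportion.

```python
from math import log

# Sketch: grid-search the Bernoulli log likelihood
# l(p) = (sum y) log p + (n - sum y) log(1 - p); the argmax should be
# the closed-form MLE p_hat = (sum y) / n. Data is made up.
ys = [1, 0, 1, 1, 0, 1, 0, 1, 1, 1]   # 7 successes in n = 10 trials
n, s = len(ys), sum(ys)

def loglik(p):
    return s * log(p) + (n - s) * log(1 - p)

grid = [i / 1000 for i in range(1, 1000)]   # p in (0, 1)
p_hat = max(grid, key=loglik)
print(p_hat)   # close to s / n = 0.7
```

The same grid-search pattern works for any one-parameter likelihood and is a useful sanity check on calculus-based derivations.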
Example: Let Y1, ..., Yn be a random sample from a distribution with two unknown parameters. Find the MLEs of both parameters.
More Examples...

Example 1: Suppose that X is a discrete random variable with the following probability mass function:

x         0       1      2          3
P(X = x)  2θ/3    θ/3    2(1−θ)/3   (1−θ)/3

where 0 ≤ θ ≤ 1 is a parameter. The following 10 independent observations were taken from such a distribution: 3, 0, 2, 1, 3, 2, 1, 0, 2, 1. What is the maximum likelihood estimate of θ?
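A grid search confirms the answer for this data set, using the pmf as reconstructed in the table above. The observed counts give a likelihood proportional to θ^5 (1 − θ)^5, which is maximized at θ = 0.5.

```python
from math import log

# Grid-search check of the MLE for Example 1, assuming the pmf
# P(0) = 2t/3, P(1) = t/3, P(2) = 2(1-t)/3, P(3) = (1-t)/3, 0 <= t <= 1.
data = [3, 0, 2, 1, 3, 2, 1, 0, 2, 1]

def pmf(x, t):
    return {0: 2 * t / 3, 1: t / 3, 2: 2 * (1 - t) / 3, 3: (1 - t) / 3}[x]

def loglik(t):
    return sum(log(pmf(x, t)) for x in data)

grid = [i / 1000 for i in range(1, 1000)]
t_hat = max(grid, key=loglik)
print(t_hat)   # L(t) is proportional to t^5 (1-t)^5, maximized at 0.5
```

By hand: the values 0 and 1 occur 2 + 3 = 5 times (contributing θ^5) and the values 2 and 3 occur 3 + 2 = 5 times (contributing (1 − θ)^5), so θ̂ = 1/2.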
Example 2: The Pareto distribution has probability density function f(x) = θ α^θ / x^(θ+1), for x ≥ α, θ > 1, where α and θ are positive parameters of the distribution. Assume that α is known and that X1, ..., Xn is a random sample of size n. a) Find the method of moments estimator for θ. b) Find the maximum likelihood estimator for θ. Does this estimator differ from that found in part (a)? c) Estimate θ based on these data: 3, 5, 2, 3, 4, 1, 4, 3, 3, 3.
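Part (c) can be sketched numerically once a value of α is fixed. The known value of α was not stated above, so the sketch assumes α = 1 (consistent with the smallest observation being 1). It uses the standard Pareto results E(X) = θα/(θ − 1) for the MoM estimator and θ̂ = n / Σ ln(xi/α) for the MLE.

```python
from math import log

# Sketch for part (c), ASSUMING alpha = 1 (the known alpha was not given in
# the notes). MoM: solve xbar = theta*alpha/(theta-1). MLE: set the derivative
# of log L(theta) = n log(theta) + n*theta*log(alpha) - (theta+1)*sum(log x) to 0.
alpha = 1.0
xs = [3, 5, 2, 3, 4, 1, 4, 3, 3, 3]
n = len(xs)

xbar = sum(xs) / n
theta_mom = xbar / (xbar - alpha)                  # from E(X) = theta*alpha/(theta-1)
theta_mle = n / sum(log(x / alpha) for x in xs)    # from d/dtheta log L = 0
print(theta_mom, theta_mle)
```

The two estimators do differ: MoM matches the sample mean, while the MLE depends on the data only through Σ ln xi.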
Example 3: Suppose that X1, ..., Xn form a random sample from a uniform distribution on the interval (0, θ), where the parameter θ > 0 is unknown. Find the MLE of θ.
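This is a case where calculus fails (the likelihood is not differentiable at the maximum), so a direct look at the likelihood helps. The sketch below (made-up data) evaluates L(θ) = θ^(−n) for θ ≥ max(xi) and 0 otherwise, and the grid argmax lands at the sample maximum.

```python
# Sketch: for a sample from Uniform(0, theta), L(theta) = theta^(-n) when
# theta >= max(x_i) and 0 otherwise, so L is maximized at theta_hat = max(x_i).
xs = [2.3, 0.7, 1.9, 3.4, 2.8]   # made-up data

def lik(t):
    n = len(xs)
    return t ** (-n) if t >= max(xs) else 0.0

grid = [i / 100 for i in range(1, 1001)]   # theta in (0, 10]
theta_hat = max(grid, key=lik)
print(theta_hat, max(xs))   # grid argmax coincides with max(x_i) = 3.4
```

Intuition: θ must be at least as large as every observation, and among all such values the smallest one, max(xi), makes θ^(−n) largest.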
Example 4: Suppose that X1, ..., Xn form a random sample from a uniform distribution on an interval that depends on an unknown parameter θ. What is the MLE of θ?
Example 5: Let Y1, ..., Yn be an i.i.d. collection of Poisson(λ) random variables, λ > 0. Find the MLE of λ.
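The Poisson log-likelihood is l(λ) = −nλ + (Σy) ln λ − Σ ln(yi!), and setting l′(λ) = 0 gives λ̂ = Ȳ. The sketch below (made-up counts) confirms this by grid search; log(y!) is computed via lgamma(y + 1).

```python
from math import log, lgamma

# Sketch: for Poisson(lam) data, l(lam) = -n*lam + (sum y)*log(lam) - sum log(y!),
# which is maximized at lam_hat = ybar.
ys = [2, 4, 1, 3, 0, 2, 5, 3]   # made-up counts; sum = 20, n = 8
n, s = len(ys), sum(ys)

def loglik(lam):
    return -n * lam + s * log(lam) - sum(lgamma(y + 1) for y in ys)

grid = [i / 100 for i in range(1, 1001)]
lam_hat = max(grid, key=loglik)
print(lam_hat, s / n)   # grid argmax matches the sample mean 2.5
```

The Σ ln(yi!) term does not depend on λ, so it has no effect on the argmax; it is included only so loglik returns the actual log-likelihood.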
Example 6: Let Y1, ..., Yn be a random sample from a geometric distribution with parameter p. Find the MLE of p.
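A sketch for this example, assuming the "number of trials until the first success" parameterization P(Y = y) = (1 − p)^(y−1) p for y = 1, 2, ... (the notes do not specify which convention is used). Under that convention l(p) = n ln p + (Σy − n) ln(1 − p), and solving l′(p) = 0 gives p̂ = n / Σy = 1/Ȳ, which the grid search below recovers on made-up data.

```python
from math import log

# Sketch, assuming P(Y = y) = (1-p)^(y-1) * p, y = 1, 2, ...
# Then l(p) = n*log(p) + (sum y - n)*log(1-p) and p_hat = n / sum(y) = 1/ybar.
ys = [3, 1, 4, 2, 2, 5, 1, 2]   # made-up data; sum = 20, n = 8
n, s = len(ys), sum(ys)

def loglik(p):
    return n * log(p) + (s - n) * log(1 - p)

grid = [i / 1000 for i in range(1, 1000)]
p_hat = max(grid, key=loglik)
print(p_hat, n / s)   # both equal 0.4 for this data
```

Under the alternative "number of failures before the first success" convention the MLE becomes p̂ = n / (Σy + n); only the support shifts, not the method.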