Properties of Point Estimators and Methods of Estimation

Lecture 9: Properties of Point Estimators and Methods of Estimation

Relative efficiency: If we have two unbiased estimators θ̂₁ and θ̂₂ of a parameter θ, we say that θ̂₁ is relatively more efficient than θ̂₂ if V(θ̂₁) < V(θ̂₂).

Definition: Given two unbiased estimators θ̂₁ and θ̂₂ of θ, the efficiency of θ̂₁ relative to θ̂₂, denoted eff(θ̂₁, θ̂₂), is given by eff(θ̂₁, θ̂₂) = V(θ̂₂) / V(θ̂₁).

Example: Let Y₁, Y₂, ..., Yₙ be a random sample of size n from a population with mean µ and variance σ². Consider the unbiased estimators µ̂₁ = Y₁, µ̂₂ = (Y₁ + Y₂)/2, and µ̂₃ = Ȳ. Find eff(µ̂₁, µ̂₃) and eff(µ̂₂, µ̂₃).
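As a quick numerical check of relative efficiency, the simulation below (a sketch; the N(0, 1) population, the sample size n = 25, and the number of replications are illustrative assumptions) compares the empirical variances of the single observation Y₁ and the sample mean Ȳ as estimators of µ. Since V(Ȳ) = σ²/n, we expect eff(Y₁, Ȳ) = V(Ȳ)/V(Y₁) ≈ 1/n.

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 25, 20_000
# reps independent samples of size n from a (hypothetical) N(0, 1) population
samples = rng.normal(0.0, 1.0, size=(reps, n))

var_first = samples[:, 0].var()        # empirical variance of mu_hat_1 = Y1
var_mean = samples.mean(axis=1).var()  # empirical variance of mu_hat_3 = Ybar

# eff(Y1, Ybar) = V(Ybar) / V(Y1), which should be close to 1/n = 0.04
eff = var_mean / var_first
```

The ratio confirms that averaging n observations shrinks the estimator's variance by a factor of n relative to using a single observation.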

Consistency: We toss a coin n times; tosses are independent, and the probability of heads on each toss is p. Let Y = # of heads. Intuitively, the sample proportion Y/n should get closer and closer to p as n grows; consistency makes this idea precise.

Definition: An estimator θ̂ₙ is a consistent estimator of θ if, for every ε > 0, lim(n→∞) P(|θ̂ₙ − θ| ≤ ε) = 1, i.e., if θ̂ₙ converges in probability to θ.

Theorem: An unbiased estimator θ̂ₙ for θ is consistent if lim(n→∞) V(θ̂ₙ) = 0. Proof: omitted.

Example: Let Y₁, ..., Yₙ be a random sample of size n from a population with mean µ and variance σ². Show that Ȳ is a consistent estimator of µ.
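The definition can be illustrated by simulation. The sketch below (the N(µ, 1) population, ε = 0.1, and the two sample sizes are illustrative assumptions) estimates P(|Ȳ − µ| ≤ ε) for a small and a large n; consistency of Ȳ says this probability tends to 1 as n grows.

```python
import numpy as np

rng = np.random.default_rng(1)
mu, eps = 3.0, 0.1

def coverage(n, reps=5000):
    # Estimate P(|Ybar_n - mu| <= eps) by simulating reps samples of size n.
    ybar = rng.normal(mu, 1.0, size=(reps, n)).mean(axis=1)
    return float(np.mean(np.abs(ybar - mu) <= eps))

cov_small = coverage(10)    # well below 1 for small n
cov_large = coverage(1000)  # very close to 1 for large n
```

This agrees with the theorem: Ȳ is unbiased and V(Ȳ) = σ²/n → 0, so Ȳ is consistent for µ.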

Sufficiency: Example: Consider the outcomes of n trials of a binomial experiment, X₁, ..., Xₙ, where Xᵢ = 1 if the i-th trial is a success and Xᵢ = 0 otherwise. Do we need to keep all n individual outcomes to estimate p, or does a summary such as ΣXᵢ carry all the relevant information?

Definition: Let Y₁, ..., Yₙ denote a random sample from a probability distribution with unknown parameter θ. Then the statistic U = g(Y₁, ..., Yₙ) is sufficient for θ if the conditional distribution of Y₁, ..., Yₙ, given U, does not depend on θ.

How to find it?

Definition: Let y₁, ..., yₙ be sample observations taken on the corresponding random variables Y₁, ..., Yₙ whose distribution depends on θ. Then, if Y₁, ..., Yₙ are discrete (continuous) random variables, the likelihood of the sample, L(y₁, ..., yₙ | θ), is defined to be the joint probability (density) function of Y₁, ..., Yₙ.

Theorem (Factorization Criterion): Let U be a statistic based on the random sample Y₁, ..., Yₙ. Then U is a sufficient statistic for the estimation of θ if and only if the likelihood factors as L(y₁, ..., yₙ | θ) = g(u, θ) · h(y₁, ..., yₙ), where g depends on the data only through u and h does not depend on θ. Proof: omitted.

Example: Let Y₁, ..., Yₙ be a random sample from a Bernoulli distribution with p(y) = pʸ(1 − p)¹⁻ʸ for y ∈ {0, 1}. Show that ΣYᵢ is a sufficient statistic for p.

Example: (#9.49) Let Y₁, ..., Yₙ be a random sample from U(0, θ). Show that Y₍ₙ₎ = max(Y₁, ..., Yₙ) is sufficient for θ.

How to find estimators? There are two main methods for finding estimators: 1) the method of moments, and 2) the method of maximum likelihood.

Method of Moments (MoM): The method of moments is a very simple procedure for finding an estimator for one or more parameters of a statistical model, and it is one of the oldest methods for deriving point estimators. Recall: the k-th moment of a random variable Y is µ′ₖ = E(Yᵏ). The corresponding k-th sample moment is m′ₖ = (1/n) Σᵢ Yᵢᵏ. The estimator based on the method of moments is the solution to the equation(s) µ′ₖ = m′ₖ, k = 1, ..., t, where t is the number of unknown parameters.
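As a minimal MoM illustration (assuming, hypothetically, an Exponential(θ) population with mean E(Y) = θ), equating the first population moment to the first sample moment gives θ̂ = Ȳ:

```python
import numpy as np

rng = np.random.default_rng(2)
theta_true = 4.0
# Hypothetical Exponential(theta) data, for which E(Y) = theta.
y = rng.exponential(theta_true, size=10_000)

# MoM: set E(Y) = theta equal to the first sample moment m1 = (1/n) sum(Y_i),
# giving theta_hat = Ybar.
theta_mom = y.mean()
```

With 10,000 simulated observations, the MoM estimate lands close to the true value of 4.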

Example: Let. Use MoM to estimate. Example: Let and.. Find Mom estimators of

Maximum Likelihood Estimators (MLEs): Suppose the likelihood function L(θ₁, ..., θₖ) depends on k parameters θ₁, ..., θₖ. Choose as estimates those values of the parameters that maximize the likelihood. l(θ) = ln L(θ) is the log-likelihood function. Both the likelihood function and the log-likelihood function attain their maximum at the same value of θ, and it is often easier to maximize l(θ).

Example: A binomial experiment consisting of n trials resulted in observations y₁, ..., yₙ, where yᵢ = 1 if the i-th trial is a success and yᵢ = 0 otherwise. Find the MLE of p, the probability of a success.
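For this binomial example, setting dl/dp = 0 gives the closed-form MLE p̂ = Σyᵢ/n. The sketch below (simulated Bernoulli data with an assumed true value p = 0.3) checks the closed form against a direct grid maximization of the log-likelihood:

```python
import numpy as np

rng = np.random.default_rng(3)
n, p_true = 200, 0.3            # p_true is an assumed value for illustration
y = rng.binomial(1, p_true, size=n)

# log-likelihood: l(p) = sum(y) * log(p) + (n - sum(y)) * log(1 - p)
p_grid = np.linspace(0.001, 0.999, 9999)
loglik = y.sum() * np.log(p_grid) + (n - y.sum()) * np.log(1.0 - p_grid)

p_mle_grid = p_grid[np.argmax(loglik)]  # numerical maximizer of l(p)
p_mle_closed = y.mean()                 # analytic MLE: sum(y_i) / n
```

The numerical and analytic answers agree to within the grid spacing, which is the point of working with l(p) rather than L(p): the algebra and the optimization both become easier.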

Example: Let. Find the MLEs of and.

More Examples... Example 1: Suppose that X is a discrete random variable with the following probability mass function:

x           0       1       2            3
P(X = x)    2θ/3    θ/3     2(1 − θ)/3   (1 − θ)/3

where 0 ≤ θ ≤ 1 is a parameter. The following 10 independent observations were taken from such a distribution: 3, 0, 2, 1, 3, 2, 1, 0, 2, 1. What is the maximum likelihood estimate of θ?
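A numerical check for Example 1 (a sketch, assuming the pmf P(X = 0) = 2θ/3, P(X = 1) = θ/3, P(X = 2) = 2(1 − θ)/3, P(X = 3) = (1 − θ)/3): maximize the log-likelihood of the 10 observations over a grid of θ values. The observed counts of 0, 1, 2, 3 are 2, 3, 3, 2, so L(θ) ∝ θ⁵(1 − θ)⁵, which peaks at θ = 1/2.

```python
import numpy as np

data = [3, 0, 2, 1, 3, 2, 1, 0, 2, 1]
counts = [data.count(k) for k in range(4)]  # observed counts of 0, 1, 2, 3

def loglik(theta):
    # pmf values under the assumed model above
    p = [2*theta/3, theta/3, 2*(1 - theta)/3, (1 - theta)/3]
    return sum(c * np.log(pk) for c, pk in zip(counts, p))

grid = np.linspace(0.001, 0.999, 999)
theta_mle = grid[np.argmax([loglik(t) for t in grid])]
```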

Example 2: The Pareto distribution has probability density function f(x) = θα^θ / x^(θ+1), for x ≥ α, θ > 1, where α and θ are positive parameters of the distribution. Assume that α is known and that X₁, ..., Xₙ is a random sample of size n. a) Find the method of moments estimator for θ. b) Find the maximum likelihood estimator for θ. Does this estimator differ from that found in part (a)? c) Estimate θ based on these data: 3, 5, 2, 3, 4, 1, 4, 3, 3, 3.
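A sketch for part (c), assuming the Pareto density f(x) = θα^θ/x^(θ+1) and, since the support requires x ≥ α and the smallest observation is 1, taking α = 1 as an illustrative assumption. The MoM estimator solves E(X) = θα/(θ − 1) = x̄, and differentiating the log-likelihood gives the MLE θ̂ = n / Σ ln(xᵢ/α); the two estimators do differ.

```python
import math

x = [3, 5, 2, 3, 4, 1, 4, 3, 3, 3]
n = len(x)
alpha = 1.0  # illustrative assumption: alpha known, consistent with min(x) = 1
xbar = sum(x) / n

# a) MoM: E(X) = theta * alpha / (theta - 1) = xbar
#    =>  theta_hat = xbar / (xbar - alpha)
theta_mom = xbar / (xbar - alpha)

# b) MLE: l(theta) = n ln(theta) + n theta ln(alpha) - (theta + 1) sum(ln x_i);
#    l'(theta) = 0  =>  theta_hat = n / sum(ln(x_i / alpha))
theta_mle = n / sum(math.log(xi / alpha) for xi in x)
```

With these data, x̄ = 3.1, so the MoM estimate is 3.1/2.1 ≈ 1.48, while the MLE works out to just under 1, illustrating that the two methods can give noticeably different answers on a small sample.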

Example 3: Suppose that X₁, ..., Xₙ form a random sample from a uniform distribution on the interval (0, θ), where the parameter θ > 0 is unknown. Find the MLE of θ.

Example 4: Suppose that X₁, ..., Xₙ form a random sample from a uniform distribution on the interval (θ, θ + 1), where the value of the parameter θ is unknown. What is the MLE for θ?

Example 5: Let Y₁, ..., Yₙ be an i.i.d. collection of Poisson(λ) random variables, λ > 0. Find the MLE for λ.
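For the Poisson case, l(λ) = Σyᵢ ln λ − nλ − Σ ln(yᵢ!), and setting l′(λ) = 0 gives λ̂ = Ȳ. A quick simulated check (the true value λ = 2.5 and the sample size are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(4)
lam_true = 2.5  # assumed value for illustration
y = rng.poisson(lam_true, size=10_000)

# l(lam) = sum(y) ln(lam) - n lam - sum(ln(y_i!));
# l'(lam) = sum(y)/lam - n = 0  =>  lam_hat = Ybar
lam_mle = y.mean()
```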

Example 6: Let Y₁, ..., Yₙ be a random sample from a geometric distribution with p(y) = p(1 − p)^(y−1), y = 1, 2, 3, .... Find the MLE for p.