Master's Theory Exam, Spring 2006




This exam contains 7 questions. You should attempt them all. Each question is divided into parts to help lead you through the material. You should attempt to complete as much of each problem as possible. Put your answers for each question in a separate blue book, and be sure to put your name and the question number on the cover of the book.

1. Scalar Regression

Suppose Y_1, Y_2, ..., Y_n are independent, with Y_i drawn from a Normal(βx_i, 1) distribution. Here, β is an unknown parameter and x_1, ..., x_n are known scalars.

(a) Use the factorization theorem to obtain a sufficient statistic for β.

(b) Compute the maximum likelihood estimate of β; call it β̂.

(c) Show that β̂ is unbiased for β and find Var(β̂).

(d) Another unbiased estimator of β is β̃ = (Σ_{i=1}^n Y_i) / (Σ_{i=1}^n x_i). Compute Var(β̃).

(e) Which estimator, β̂ or β̃, is preferred? Explain.

2. A Family Affair

Let X_1, ..., X_n be a random sample from the continuous probability distribution with density

    f(x | θ) = (1/θ) e^{−x/θ}  for 0 < x < ∞,  and 0 otherwise,

where 0 < θ < ∞.

(a) Find the form of a complete minimal sufficient statistic for θ.

(b) Let c > 0 be an arbitrary positive constant and set τ_c(θ) = P_θ{X_1 ≤ c}. Derive the UMVUE for τ_c(θ).

(c) Consider the estimator τ̂_c = 1 − e^{−nc/T} for τ_c(θ), where T = Σ_{i=1}^n X_i. Argue (without actually finding the expected value of τ̂_c) that τ̂_c is not unbiased for τ_c(θ).
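The variance comparison in parts (c) through (e) of problem 1 can be checked numerically. The sketch below is not part of the exam: the x values, true β, and replication count are made up for illustration. It simulates both estimators and compares their empirical variances with the closed forms Var(β̂) = 1/Σx_i² and Var(β̃) = n/(Σx_i)²; by Cauchy–Schwarz, (Σx_i)² ≤ n Σx_i², so the MLE is never worse.

```python
import random

# Hypothetical setup (not from the exam): known scalars x_i and a true beta.
random.seed(0)
x = [0.5, 1.0, 1.5, 2.0, 3.0]
beta = 2.0
n = len(x)
reps = 20000

sum_x2 = sum(xi * xi for xi in x)
sum_x = sum(x)

mle_vals, alt_vals = [], []
for _ in range(reps):
    y = [beta * xi + random.gauss(0.0, 1.0) for xi in x]
    # MLE (least squares through the origin): sum(x_i Y_i) / sum(x_i^2)
    mle_vals.append(sum(xi * yi for xi, yi in zip(x, y)) / sum_x2)
    # Alternative unbiased estimator: sum(Y_i) / sum(x_i)
    alt_vals.append(sum(y) / sum_x)

def var(vals):
    m = sum(vals) / len(vals)
    return sum((v - m) ** 2 for v in vals) / (len(vals) - 1)

print(var(mle_vals), 1.0 / sum_x2)    # empirical vs theoretical Var(beta-hat)
print(var(alt_vals), n / sum_x ** 2)  # empirical vs theoretical Var(beta-tilde)
```

Both empirical variances should land on their closed forms, with the MLE's variance the smaller of the two.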

3. Fun with Rotations

Let X = UΛV^T be the singular value decomposition of X, a full-rank matrix (rank = k). Assume the standard linear model Y = Xβ + ɛ. Let θ = V^T β and θ̂ = V^T β̂, where β̂ is the least squares estimator for β.

(a) Show that ||θ̂ − θ||² = ||β̂ − β||².

(b) Find the covariance matrix Σ = Cov(θ̂). (Hint: it simplifies to a simple quantity.)

(c) Assume that the ɛ_i, for i = 1, ..., n, are iid Normal(0, σ²). Construct a (1 − α) confidence band for the vector θ.

4. A Query from the Good Reverend

Let X be a Bernoulli(p) random variable. (This is a single Bernoulli observation, n = 1.) Assume that we have a Uniform(0, 1) prior for p; that is, f(p) = 1_{[0,1]}(p).

(a) Suppose we observe X = 1. Find the posterior density and the posterior mean of p.

(b) Find a set A(X) = (0, a(X)) such that P{p ∈ A(X) | X} = 1 − α. (Hint: do this separately for the two cases X = 0 and X = 1.)

(c) Find the frequentist coverage probability of the set A(X) in part (b), and comment on how the coverage varies as a function of p.
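Problem 4 can be sketched numerically. The code below is not part of the exam and assumes the standard conjugacy facts: with a Uniform(0, 1) prior and a single Bernoulli observation, the posterior is Beta(2, 1) when X = 1 (density 2p) and Beta(1, 2) when X = 0 (density 2(1 − p)); the value α = 0.05 is chosen arbitrarily.

```python
import math

alpha = 0.05

# Posterior mean given X = 1, by midpoint-rule integration of p * 2p on [0, 1].
steps = 200_000
h = 1.0 / steps
post_mean = sum((i + 0.5) * h * 2 * ((i + 0.5) * h) for i in range(steps)) * h
print(post_mean)  # should be close to 2/3

# Credible sets A(X) = (0, a(X)) with posterior probability 1 - alpha:
a1 = math.sqrt(1 - alpha)   # X = 1: posterior CDF is p^2
a0 = 1 - math.sqrt(alpha)   # X = 0: posterior CDF is 1 - (1 - p)^2
print(a1, a0)

# Frequentist coverage at a fixed p: the set covers p only when the realized
# X yields an endpoint a(X) above p.
def coverage(p):
    cov = 0.0
    if p < a1:
        cov += p        # X = 1 occurs w.p. p; covers iff p < a1
    if p < a0:
        cov += 1 - p    # X = 0 occurs w.p. 1 - p; covers iff p < a0
    return cov

for p in (0.1, 0.5, 0.9, 0.99):
    print(p, coverage(p))
```

The printout illustrates the point of part (c): coverage is 1 for small p but collapses as p approaches 1, where neither endpoint reaches p.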

5. Random Walks and Pascal's Triangle

Let X_i, i = 1, 2, ... be a sequence of IID random variables, equal to +1 with probability p or to −1 with probability 1 − p = q. Let R_0 = 0 and R_n = Σ_{i=1}^n X_i for n ≥ 1. Then R_n represents the position at time n of a particle following a random walk.

(a) Prove that R_n is a Markov chain.

(b) Prove that P{R_n = r} = p P{R_{n−1} = r − 1} + q P{R_{n−1} = r + 1}.

(c) Prove that

    P{R_n = r} = C(n, (n+r)/2) p^{(n+r)/2} q^{(n−r)/2}  if |r| ≤ n and n + r is even, and = 0 otherwise,

where C(n, k) denotes the binomial coefficient. Hint: Consider Y_i = (1 + X_i)/2.

(d) Use your results from (b) and (c) to prove that C(n, r) = C(n−1, r−1) + C(n−1, r). Hint: Consider p = q = 0.5.

6. Large Deviations

The cumulant generating function of a random variable X is defined for u ∈ R by

    Λ_X(u) = log E e^{uX},

the logarithm of the moment generating function. It may take infinite values for some u.

(a) Prove the exponential Markov inequality: for any u ≥ 0,

    P{X ≥ a} ≤ e^{−ua} e^{Λ_X(u)}.

(b) Let X_i, i = 1, 2, ... be IID copies of X, and S_n = Σ_{i=1}^n X_i. Show that Λ_{S_n}(u) = n Λ_X(u).

(c) Let X̄_n = n^{−1} S_n. Prove that, for all u ≥ 0,

    (1/n) log P{X̄_n ≥ a} ≤ −ua + Λ_X(u).

(d) Show that Λ_X(0) = 0, always. Suppose that Λ_X(u) is finite for all u between −d and +d. Show that the distribution of X has exponential tails, meaning there are positive real numbers x_0, c_1, c_2 such that P{|X| ≥ x} ≤ c_1 e^{−c_2 x} if x > x_0.
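The closed form in problem 5(c) can be sanity-checked by brute force for small n. The sketch below is not part of the exam; p = 0.3 and n = 8 are arbitrary choices. It enumerates all 2^n step sequences, accumulates the exact distribution of R_n, and compares it term by term with the binomial formula; the Pascal identity of part (d) is checked directly at the end.

```python
from itertools import product
from math import comb

p, q = 0.3, 0.7
n = 8

# Brute force: enumerate all 2^n sequences of +/-1 steps.
exact = {}
for steps in product((+1, -1), repeat=n):
    r = sum(steps)
    k = steps.count(+1)  # number of +1 steps; r = 2k - n
    exact[r] = exact.get(r, 0.0) + p ** k * q ** (n - k)

# Compare with C(n, (n+r)/2) p^{(n+r)/2} q^{(n-r)/2} on the reachable states.
for r in range(-n, n + 1):
    if (n + r) % 2 == 0:
        formula = comb(n, (n + r) // 2) * p ** ((n + r) // 2) * q ** ((n - r) // 2)
        assert abs(exact.get(r, 0.0) - formula) < 1e-12

# Part (d): Pascal's rule on a sample entry of the triangle.
assert comb(8, 3) == comb(7, 2) + comb(7, 3)
print("closed form matches brute force for n =", n)
```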

7. Stuck in the Waiting Room

Consider the following discrete-time model of the waiting room at your local hospital Emergency Room. At each time n ≥ 1, a number U_n of new patients enter the waiting room. Each patient remains in the waiting room for a time that has a Geometric(p) distribution, at which point they leave to be treated. Let X_n denote the total number of patients in the waiting room at time n, including both the new patients U_n and the patients still waiting from previous times. You may assume the following:

1. The patients' waiting times are independent of each other and of the U_n's.

2. The U_n's are independent Poisson(λ) random variables, with λ > 0.

3. X_0 = 0.

4. If the sum of a patient's arrival time and waiting time equals n + 1, then that patient is not included in the count X_{n+1}. For example, if a patient arrives at time 4 and has waiting time 3, then that patient is included in X_4, X_5, and X_6 but not in X_7.

Note also that for this problem, we are using the version of the Geometric distribution with pmf

    p(k) = (1 − p)^{k−1} p  for k = 1, 2, ...,  and 0 otherwise.

(a) Show that the Geometric(p) distribution above has the memoryless property

    P{W > j + k | W > k} = P{W > j},  for non-negative integers j, k.

(b) Define indicators W_{n,i}, for i = 1, ..., X_n, that patient i (out of the X_n in the waiting room at time n) is still waiting at time n + 1 (i.e., counted in X_{n+1}). Note that, by the assumptions above, the W_{n,i} are independent and identically distributed random variables. Show that W_{n,i} has a Bernoulli(1 − p) distribution.
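Parts (a) and (b) can be illustrated numerically. The sketch below is not part of the exam (p = 0.4 is arbitrary): it sums the given pmf directly to get survival probabilities, which work out to P{W > m} = (1 − p)^m, and checks the memoryless property and the Bernoulli(1 − p) survival probability from that.

```python
p = 0.4

def survival(m, terms=10_000):
    # P{W > m}, summed directly from the pmf (1-p)^{k-1} p; the truncated
    # tail beyond `terms` is negligible for this p.
    return sum((1 - p) ** (k - 1) * p for k in range(m + 1, terms))

# Memoryless property: P{W > j + k | W > k} = P{W > j}.
for j, k in [(0, 0), (2, 3), (5, 1)]:
    lhs = survival(j + k) / survival(k)
    rhs = survival(j)
    print(j, k, lhs, rhs)
    assert abs(lhs - rhs) < 1e-9

# Part (b): the chance a waiting patient is still present one step later
# is P{W > k + 1 | W > k} = P{W > 1} = 1 - p.
assert abs(survival(1) - (1 - p)) < 1e-9
print("memoryless property verified for p =", p)
```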

(c) Express X_{n+1} in terms of X_n, the W_{n,i}'s, and U_{n+1}, and thus show that X = (X_n)_{n ≥ 0} is a Markov chain. (Hint: your expression will be of the form X_{n+1} = V_n + U_{n+1}, where V_n is the number of patients who entered the room before time n + 1 and are still waiting.)

(d) What is the state space of this chain? Give a good argument that this chain is irreducible.

(e) Suppose that X_n has a Poisson(α) distribution for some α > 0. Find the distribution of X_{n+1}. (Hint: start by finding the distribution of V_n from the hint in part (c).)

(f) Using your answer to part (e), find the stationary distribution of this chain.
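The chain in problem 7 can be simulated directly as a check on part (f). The sketch below is not part of the exam, and λ = 2 and p = 0.5 are arbitrary: given X_n, each current patient stays independently with probability 1 − p and Poisson(λ) new patients arrive, so the long-run mean should be λ/p, consistent with a Poisson(λ/p) stationary distribution.

```python
import random
import math

random.seed(1)
lam, p = 2.0, 0.5

def poisson(rate):
    # Knuth's multiplication method; fine for modest rates like this one.
    L, k, prod = math.exp(-rate), 0, 1.0
    while True:
        prod *= random.random()
        if prod <= L:
            return k
        k += 1

x, total, burn, steps = 0, 0, 1000, 200_000
for n in range(burn + steps):
    # Each of the x current patients stays with probability 1 - p ...
    survivors = sum(1 for _ in range(x) if random.random() < 1 - p)
    # ... and U_{n+1} ~ Poisson(lam) new patients arrive.
    x = survivors + poisson(lam)
    if n >= burn:
        total += x

print(total / steps)  # long-run average; should be close to lam / p = 4.0
```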