Final Mathematics 5010, Section 1, Fall 2004 Instructor: D.A. Levin




Name _______________________

YOU MUST SHOW YOUR WORK TO RECEIVE CREDIT. A CORRECT ANSWER WITHOUT SHOWING YOUR REASONING WILL NOT RECEIVE CREDIT.

Problem   Points Possible   Points Earned
1         10
2         10
3         10
4         10
5         10
6         10
7         10
8         10
9         10
10        10

Problem 1. An instructor gives her class a set of 20 problems with the information that the final exam will consist of a random selection of 10 of them. If a student has figured out how to do 12 of the problems, what is the probability that he or she will correctly answer (a) all 10 problems on the exam; (b) at least 8 of the problems on the exam?

Solution. Imagine that an urn is filled with 20 balls, each representing one of the possible exam problems. Those balls corresponding to problems studied by the student are colored red; there are 12 such balls. The remaining 8 balls are colored black. The instructor draws 10 of these balls at random without replacement. [Obviously, an instructor would not put the same problem twice on the exam! Thus, she samples without replacement.] Let X be the number of red balls drawn; this corresponds to the number of questions which the student can answer correctly.

(a)
$$P\{X = 10\} = \frac{\binom{12}{10}\binom{8}{0}}{\binom{20}{10}}.$$

(b)
$$P\{X \ge 8\} = P\{X = 8\} + P\{X = 9\} + P\{X = 10\}
= \frac{\binom{12}{8}\binom{8}{2} + \binom{12}{9}\binom{8}{1} + \binom{12}{10}\binom{8}{0}}{\binom{20}{10}}.$$
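As a quick numerical check of the hypergeometric answer (not part of the original exam; variable names are mine), the binomial coefficients can be evaluated directly:

```python
from math import comb

# Urn model from the solution: 20 balls, 12 red (studied problems),
# 8 black, with 10 drawn without replacement.
total = comb(20, 10)

# (a) all 10 drawn problems are among the 12 studied ones
p_all = comb(12, 10) * comb(8, 0) / total

# (b) at least 8 studied problems among the 10 drawn
p_at_least_8 = sum(comb(12, k) * comb(8, 10 - k) for k in (8, 9, 10)) / total

print(round(p_all, 6), round(p_at_least_8, 4))  # prints 0.000357 0.0849
```

So answering all 10 correctly is very unlikely, while answering at least 8 correctly has probability about 8.5%.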

Problem 2. Ninety-eight percent of all babies survive delivery. However, 15 percent of all births involve Cesarean (C) sections, and when a C section is performed the baby survives 96 percent of the time. If a randomly chosen pregnant woman does not have a C section, what is the probability that her baby survives?

Solution. Let S be the event of survival, and C the event of a Cesarean section. We want to find $P(S \mid C^c)$, and are given that $P(S) = 0.98$, $P(S \mid C) = 0.96$, and $P(C) = 0.15$. The definition of conditional probability gives us
$$P(S \mid C^c) = \frac{P(S \cap C^c)}{P(C^c)}.$$
We can find $P(S \cap C^c)$ from the information given by using the identity $P(S) = P(S \cap C) + P(S \cap C^c)$. Rearranging and then using the conditional probability identity $P(S \cap C) = P(S \mid C)P(C)$ yields
$$P(S \cap C^c) = P(S) - P(S \cap C) = P(S) - P(S \mid C)P(C) = 0.98 - (0.96)(0.15) = 0.836.$$
Since $P(C^c) = 1 - P(C) = 0.85$, we then have
$$P(S \mid C^c) = \frac{0.836}{0.85} \approx 0.9835.$$
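The arithmetic above can be verified in a couple of lines (a sketch, not part of the original solution; names are mine):

```python
# Numbers from the problem statement.
p_s = 0.98            # P(S)
p_s_given_c = 0.96    # P(S | C)
p_c = 0.15            # P(C)

# P(S ∩ C^c) = P(S) - P(S | C) P(C), then divide by P(C^c).
p_s_and_not_c = p_s - p_s_given_c * p_c
p_s_given_not_c = p_s_and_not_c / (1 - p_c)
print(round(p_s_given_not_c, 4))  # prints 0.9835
```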

Problem 3. Urn A has 5 white and 7 black balls. Urn B has 3 white and 12 black balls. We flip a fair coin. If the outcome is heads, then a ball from urn A is selected, whereas if the outcome is tails, then a ball from urn B is selected. Suppose that a white ball is selected. What is the probability that the coin landed tails?

Solution.
$$P(T \mid W) = \frac{P(T \cap W)}{P(W)}
= \frac{P(W \mid T)P(T)}{P(W \mid H)P(H) + P(W \mid T)P(T)}
= \frac{\frac{3}{15}\cdot\frac{1}{2}}{\frac{5}{12}\cdot\frac{1}{2} + \frac{3}{15}\cdot\frac{1}{2}}
= \frac{12}{37}.$$
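The Bayes' rule computation can be reproduced with exact rational arithmetic (a check I added; not part of the original solution):

```python
from fractions import Fraction

# Conditional probabilities of drawing white from each urn.
p_w_given_h = Fraction(5, 12)   # urn A: 5 white out of 12
p_w_given_t = Fraction(3, 15)   # urn B: 3 white out of 15
half = Fraction(1, 2)           # fair coin

p_t_given_w = (p_w_given_t * half) / (p_w_given_h * half + p_w_given_t * half)
print(p_t_given_w)  # prints 12/37
```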

Problem 4. Suppose that X has the distribution function
$$F(t) = \begin{cases} 0 & \text{if } t < 1 \\ 1 - \dfrac{1}{t^3} & \text{if } t \ge 1. \end{cases}$$
Find E(X).

Solution. The density is given by differentiating F:
$$f(t) = \begin{cases} 0 & \text{if } t < 1 \\ 3t^{-4} & \text{if } t > 1. \end{cases}$$
Thus,
$$E(X) = \int_{-\infty}^{\infty} t f(t)\,dt = \int_1^{\infty} t \cdot 3t^{-4}\,dt = \int_1^{\infty} 3t^{-3}\,dt = \left[-\frac{3}{2}t^{-2}\right]_1^{\infty} = \frac{3}{2}.$$
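A crude numeric integration confirms the value 3/2 (a sanity check I added, using a midpoint rule; the tail of the integrand beyond t = 100 contributes only about 1.5e-4):

```python
# Midpoint-rule approximation of E(X) = integral of t * 3 t^{-4} over [1, 100].
n = 200_000
a, b = 1.0, 100.0
h = (b - a) / n
ex = 0.0
for i in range(n):
    t = a + (i + 0.5) * h
    ex += t * 3 * t**-4 * h   # t * f(t) * dt
print(round(ex, 3))  # prints 1.5
```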

Problem 5. Find the probability density function of $R = A\sin\theta$, where A is a constant and $\theta$ is uniformly distributed on $(-\pi/2, \pi/2)$. Hint: the derivative of $\arcsin(t)$ is $\frac{1}{\sqrt{1-t^2}}$.

Solution. Instructor's Note: the derivative of arcsin was incorrectly given on the test. If you used this misinformation, you were not penalized!

Notice that $\sin\theta$ is one-to-one on $(-\pi/2, \pi/2)$:

[Figure: plot of $\sin\theta$ on $(-\pi/2, \pi/2)$, increasing from $-1$ to $1$.]

Clearly then $A\sin\theta$ is also one-to-one, but it ranges from $-A$ to $A$. For $-A \le t \le A$,
$$F_R(t) = P\{R \le t\} = P\{A\sin\theta \le t\} = P\{\sin\theta \le t/A\} = P\{\theta \le \arcsin(t/A)\} = \frac{\arcsin(t/A) + \pi/2}{\pi}.$$
We conclude that
$$F_R(t) = \begin{cases} 0 & t < -A \\ \dfrac{\arcsin(t/A) + \pi/2}{\pi} & -A \le t \le A \\ 1 & t > A. \end{cases}$$
To get the pdf, we differentiate:
$$f_R(t) = \begin{cases} \dfrac{1}{\pi A \sqrt{1 - (t/A)^2}} & -A < t < A \\ 0 & \text{otherwise.} \end{cases}$$
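A seeded Monte Carlo check of the CDF formula (my addition, not part of the original solution; A = 2 and t = 1 are arbitrary illustrative choices):

```python
import math
import random

# Compare the empirical CDF of R = A sin(theta) at t against
# F_R(t) = (arcsin(t/A) + pi/2) / pi.
random.seed(0)
A, t = 2.0, 1.0
n = 200_000
hits = sum(A * math.sin(random.uniform(-math.pi / 2, math.pi / 2)) <= t
           for _ in range(n))
empirical = hits / n
theoretical = (math.asin(t / A) + math.pi / 2) / math.pi  # = 2/3 here
print(round(empirical, 3), round(theoretical, 4))
```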

Problem 6. Let X and Y have joint pdf
$$f(s,t) = \begin{cases} s e^{-s(t+1)} & \text{if } s > 0,\ t > 0, \\ 0 & \text{otherwise.} \end{cases}$$
(a) Find the conditional probability density function of Y given X = s. (b) Find the density function of Z = XY.

Solution. (a) The marginal density of X is
$$f_X(s) = \int_0^{\infty} s e^{-s(t+1)}\,dt = \left[-e^{-s(t+1)}\right]_{t=0}^{\infty} = e^{-s}.$$
Consequently, if $s > 0$ and $t > 0$,
$$f_{Y\mid X}(t \mid s) = \frac{f(s,t)}{f_X(s)} = \frac{s e^{-s(t+1)}}{e^{-s}} = s e^{-st}.$$
That is, given X = s, the conditional distribution of Y is exponential with parameter s.

(b) We compute the density of Z in two ways. For $u > 0$,
$$F_Z(u) = P\{XY \le u\} = \int_0^{\infty}\int_0^{u/s} s e^{-s(t+1)}\,dt\,ds
= \int_0^{\infty} e^{-s}\left[-e^{-st}\right]_0^{u/s} ds
= \int_0^{\infty} e^{-s}\left[1 - e^{-u}\right] ds = 1 - e^{-u}.$$
Differentiating,
$$f_Z(u) = \begin{cases} e^{-u} & \text{if } u \ge 0, \\ 0 & \text{otherwise.} \end{cases}$$
We can also compute as follows:
$$P\{XY \le u\} = \int_0^{\infty} P\{Y \le u/s \mid X = s\}\, e^{-s}\,ds
= \int_0^{\infty} (1 - e^{-u}) e^{-s}\,ds = 1 - e^{-u}.$$
The second equality follows from part (a).
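A seeded simulation check that Z = XY is Exponential(1) (my addition; it samples X from its marginal, which is Exponential(1) since $f_X(s) = e^{-s}$, then Y from the conditional Exponential(s) found in part (a), and checks the mean of Z):

```python
import random

random.seed(1)
n = 200_000
total = 0.0
for _ in range(n):
    x = random.expovariate(1.0)   # X with density e^{-s}
    y = random.expovariate(x)     # Y | X = s is Exponential(rate s)
    total += x * y                # Z = XY
mean_z = total / n
print(round(mean_z, 2))           # should be near 1, the Exp(1) mean
```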

Problem 7. Twelve people get on an elevator on the ground floor of a 10-story building. Each person selects a floor; assume that each person selects independently and each person picks one of the 10 possible floors uniformly at random. No new people get on the elevator after the ground floor. Compute the expected number of stops the elevator makes.

Solution. Let X be the number of stops the elevator makes. We can write $X = \sum_{i=1}^{10} I_i$, where
$$I_i = \begin{cases} 1 & \text{if the elevator stops at floor } i, \\ 0 & \text{otherwise.} \end{cases}$$
Since expectation is linear, we have
$$E(X) = E\left(\sum_{i=1}^{10} I_i\right) = \sum_{i=1}^{10} E(I_i) = \sum_{i=1}^{10} P\{\text{stop at floor } i\}.$$
Now
$$P\{\text{stop at floor } i\} = 1 - P\{\text{no one picks floor } i\} = 1 - \left(\frac{9}{10}\right)^{12}.$$
So
$$E(X) = 10\left[1 - \left(\frac{9}{10}\right)^{12}\right].$$
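Evaluating the formula, and checking it by a seeded simulation (my addition, under the same independence and uniformity assumptions as the problem):

```python
import random

# Closed-form expectation from the indicator argument.
e_stops = 10 * (1 - (9 / 10) ** 12)

# Simulation: 12 riders each pick one of 10 floors; count distinct floors.
random.seed(3)
trials = 50_000
sim = sum(len({random.randrange(10) for _ in range(12)})
          for _ in range(trials)) / trials
print(round(e_stops, 3), round(sim, 2))  # formula prints 7.176
```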

Problem 8. Suppose two roommates, Bill and Bob, share a phone line. The number of calls received during the evening is a Poisson random variable with parameter 10. Bill is more popular, so each call independently has probability 0.7 of being for Bill. (a) Find the conditional pmf of X, the number of calls received by Bill, given that the total number of calls is n. (b) Find the pmf of X. (c) Given that Bill receives 5 calls, what is the probability that Bob receives exactly one call?

Solution. Let X be the number of calls for Bill, and Y the number of calls for Bob. Let Z = X + Y.

(a) Given that Z = n, each of the n calls is for Bill with probability 0.7, and not for Bill with probability 0.3, independently of the other calls. Thus the conditional distribution of X given Z = n is Binomial(n, p = 0.7).

(b) Then
$$p_X(k) = \sum_{n \ge k} p_{X\mid Z}(k \mid n)\,p_Z(n)
= \sum_{n \ge k} \binom{n}{k}(0.7)^k (0.3)^{n-k}\,\frac{e^{-10}(10)^n}{n!}
= e^{-10}\,\frac{(0.7)^k(10)^k}{k!} \sum_{n \ge k} \frac{(0.3)^{n-k}(10)^{n-k}}{(n-k)!}
= e^{-10}\,\frac{7^k}{k!}\,e^{3} = \frac{e^{-7}\,7^k}{k!}.$$
Thus X has a Poisson(7) distribution. Similarly, Y has a Poisson(3) distribution.

(c) We want $P\{Y = 1 \mid X = 5\}$:
$$P\{Y = 1 \mid X = 5\} = \frac{P\{X = 5, Y = 1\}}{P\{X = 5\}}
= \frac{P\{X = 5 \mid Z = 6\}\,P\{Z = 6\}}{P\{X = 5\}}
= \frac{\binom{6}{5}(0.7)^5(0.3)\,\dfrac{e^{-10}(10)^6}{6!}}{\dfrac{e^{-7}\,7^5}{5!}}
= 3e^{-3}.$$
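A numerical check of the Poisson-thinning result in (b) and the answer in (c) (my addition; the truncation of the infinite sum at n = 100 is numerically negligible for a Poisson(10) total):

```python
import math

def pois(lam, k):
    """Poisson pmf e^{-lam} lam^k / k!."""
    return math.exp(-lam) * lam**k / math.factorial(k)

# (b): summing Binomial(n, 0.7) splits over Poisson(10) totals
# should reproduce the Poisson(7) pmf, e.g. at k = 5.
k = 5
p_x5 = sum(math.comb(n, k) * 0.7**k * 0.3**(n - k) * pois(10, n)
           for n in range(k, 100))

# (c): since X and Y come out independent Poisson(7) and Poisson(3),
# P{Y = 1 | X = 5} = P{Y = 1} = 3 e^{-3}.
p_bob_one = pois(3, 1)
print(round(p_x5, 4), round(p_bob_one, 4))  # prints 0.1277 0.1494
```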

Problem 9. A column bet in roulette wins $2 with probability 12/38, and loses $1 with probability 26/38. (a) Compute the mean and standard deviation of your winnings on a single game. (b) You place this bet 25 times. Estimate the probability that you have won a positive amount. (c) You place this bet 1000 times. Estimate the probability that you have won a positive amount.

Solution. Suppose that $X_i$ is the amount won on the ith game; $X_1, X_2, \ldots$ are independent and all have the same distribution.

(a)
$$E(X_1) = 2\cdot\frac{12}{38} - \frac{26}{38} = -\frac{1}{19} \approx -0.0526,$$
and
$$E(X_1^2) = 2^2\cdot\frac{12}{38} + \frac{26}{38} = \frac{74}{38} = \frac{37}{19} \approx 1.947.$$
Thus
$$V(X_1) = E(X_1^2) - [E(X_1)]^2 = \frac{37}{19} - \frac{1}{361} = \frac{702}{361} \approx 1.945,$$
and so $SD(X_1) \approx 1.394$. Let $S_n = \sum_{i=1}^n X_i$. Write $\mu$ for $E(X_1)$ and $\sigma$ for $SD(X_1)$. By the central limit theorem:

(b)
$$P\{S_{25} > 0\} = P\left\{\frac{S_{25} - 25\mu}{5\sigma} > \frac{-25\mu}{5\sigma}\right\}
= P\left\{\frac{S_{25} - 25\mu}{5\sigma} > \frac{1.316}{6.97}\right\}
= P\left\{\frac{S_{25} - 25\mu}{5\sigma} > 0.189\right\}
\approx 1 - \Phi(0.189) \approx 0.425.$$

(c)
$$P\{S_{1000} > 0\} = P\left\{\frac{S_{1000} - 1000\mu}{\sqrt{1000}\,\sigma} > \frac{52.63}{44.08}\right\}
= P\left\{\frac{S_{1000} - 1000\mu}{\sqrt{1000}\,\sigma} > 1.194\right\}
\approx 1 - \Phi(1.194) \approx 0.116.$$
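The normal approximation can be evaluated with the standard normal cdf $\Phi(x) = \tfrac{1}{2}(1 + \operatorname{erf}(x/\sqrt{2}))$ (my addition, not part of the original solution):

```python
import math

def phi(x):
    """Standard normal cdf via the error function."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

mu = 2 * 12 / 38 - 26 / 38                 # -1/19, mean per game
var = (4 * 12 / 38 + 26 / 38) - mu**2      # about 1.945
sd = math.sqrt(var)                        # about 1.394

results = {}
for n in (25, 1000):
    z = -n * mu / (math.sqrt(n) * sd)      # standardized threshold for S_n > 0
    results[n] = 1 - phi(z)
    print(n, round(results[n], 3))          # prints 25 0.425 then 1000 0.116
```

Note how the chance of being ahead shrinks as the number of bets grows, as the law of large numbers predicts for a negative-expectation game.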

Problem 10. Let X have a Gamma(α, λ) distribution, and let Y be an independent Gamma(β, λ) random variable. Let Z = X + Y. (a) Find the MGF of Z. You may use Table 7.2. (b) What is the distribution of Z?

Solution. (a) Since X and Y are independent,
$$M_Z(t) = M_X(t)\,M_Y(t) = \left(\frac{\lambda}{\lambda - t}\right)^{\alpha}\left(\frac{\lambda}{\lambda - t}\right)^{\beta} = \left(\frac{\lambda}{\lambda - t}\right)^{\alpha + \beta}.$$
(b) Using Table 7.2, we see that Z has a Gamma(α + β, λ) distribution.
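A seeded simulation gives a weak but quick check of the conclusion, comparing the sample mean of Z against the Gamma(α + β, λ) mean (α + β)/λ (my addition; the parameter values are arbitrary illustrative choices, and this checks only the mean, not the full distribution):

```python
import random

# random.gammavariate takes (shape, scale); scale = 1/lambda.
random.seed(2)
alpha, beta, lam = 2.0, 3.0, 1.5
n = 100_000
total = 0.0
for _ in range(n):
    z = random.gammavariate(alpha, 1 / lam) + random.gammavariate(beta, 1 / lam)
    total += z
mean_z = total / n
print(round(mean_z, 2))   # should be near (alpha + beta) / lam = 3.33
```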