2 Random Variables

2.1 Random Variables

Real-valued functions defined on the sample space are known as random variables (r.v.'s):

X : S → R.

Example.
X is a randomly selected number from the set {1, 2, 4, 5, 6, 10}.
Y is the number of heads that occur when a coin is tossed 10 times.
V is the height of a randomly selected student.
U is a randomly selected number from the interval (0, 1).

Discrete and Continuous Random Variables

A random variable may take either a finite or a countable number of possible values; such random variables are called discrete. However, there also exist random variables that take on a continuum of possible values; these are known as continuous random variables.

Example. Let X be the number of tosses needed to get the first head.
Example. Let U be a number randomly selected from the interval [0, 1].

Distribution Function

The cumulative distribution function (c.d.f., or simply the distribution function) of the random variable X, say F, is the function defined by

F(x) = P(X ≤ x),  x ∈ R.

Here are some properties of the c.d.f. F:
(i) F(x) is a nondecreasing function,
(ii) lim_{x→∞} F(x) = 1,
(iii) lim_{x→−∞} F(x) = 0.

All probability questions about X can be answered in terms of the c.d.f. F. For instance,

P(a < X ≤ b) = F(b) − F(a).

If we desire the probability that X is strictly smaller than b, we may calculate this probability by

P(X < b) = lim_{h→0+} P(X ≤ b − h) = lim_{h→0+} F(b − h).

Remark. Note that P(X < b) does not necessarily equal F(b).
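The c.d.f. identities above are easy to check numerically. The following short Python sketch (our addition, not part of the original notes) builds F for the discrete example X above, uniform over {1, 2, 4, 5, 6, 10}, and verifies that P(a < X ≤ b) = F(b) − F(a).

```python
from fractions import Fraction

# X is equally likely to be any value in this set (example from the notes).
values = [1, 2, 4, 5, 6, 10]
pmf = {v: Fraction(1, len(values)) for v in values}

def F(x):
    """Cumulative distribution function F(x) = P(X <= x)."""
    return sum(p for v, p in pmf.items() if v <= x)

def prob_interval(a, b):
    """P(a < X <= b) computed directly from the p.m.f."""
    return sum(p for v, p in pmf.items() if a < v <= b)

# Check P(a < X <= b) = F(b) - F(a) on a few intervals.
for a, b in [(1, 5), (2, 6), (0, 10), (4, 4.5)]:
    assert prob_interval(a, b) == F(b) - F(a)

print(F(5))            # P(X <= 5) = 2/3
print(F(5) - F(1))     # P(1 < X <= 5) = 1/2
```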

2.2 Discrete Random Variables

Definition (Discrete Random Variable). A random variable that can take on at most a countable number of possible values is said to be discrete. For a discrete random variable X, we define the probability mass function (or probability density function, p.d.f.) of X by

p(a) = P(X = a).

Let X be a random variable taking the values x_1, x_2, .... Then we must have

∑_{i=1}^{∞} p(x_i) = 1.

The distribution function F can be expressed in terms of the mass function by

F(a) = ∑_{x_i ≤ a} p(x_i).

Example. Let X be a number randomly selected from the set of numbers 0, 1, 2, 3, 4, 5. Find the probability P(X ≤ 4).

The Binomial Random Variable

Suppose that n independent trials, each of which results in a success with probability p and in a failure with probability 1 − p, are to be performed. If X represents the number of successes that occur in the n trials, then X is said to be a binomial random variable with parameters (n, p), denoted X ∼ B(n, p). The probability mass function of a binomial random variable with parameters (n, p) is given by

P(X = k) = p(k) = C(n, k) p^k (1 − p)^{n−k},  k = 0, 1, 2, ..., n,

where

C(n, k) = n! / (k!(n − k)!).

Note that

∑_{k=0}^{n} p(k) = ∑_{k=0}^{n} C(n, k) p^k (1 − p)^{n−k} = (p + (1 − p))^n = 1.

Example. According to a CNN/USA Today poll, approximately 70% of Americans believe the IRS abuses its power. Let X equal the number of people who believe the IRS abuses its power in a random sample of n = 20 Americans. Assuming that the poll results are still valid, find the probability that
(a) X is at least 13,
(b) X is at most 11.
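As a worked check on the binomial formulas, here is a small Python sketch of how the IRS-poll example could be evaluated; the helper binom_pmf and the use of math.comb are our choices, not something prescribed by the notes.

```python
from math import comb

def binom_pmf(k, n, p):
    """P(X = k) for X ~ B(n, p)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

n, p = 20, 0.7

# Sanity check: the p.m.f. sums to 1 (the binomial theorem identity above).
assert abs(sum(binom_pmf(k, n, p) for k in range(n + 1)) - 1.0) < 1e-12

# (a) P(X >= 13) and (b) P(X <= 11) for the IRS-poll example.
p_at_least_13 = sum(binom_pmf(k, n, p) for k in range(13, n + 1))
p_at_most_11 = sum(binom_pmf(k, n, p) for k in range(0, 12))
print(round(p_at_least_13, 4), round(p_at_most_11, 4))
```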

The Geometric Random Variable

Suppose that independent trials, each having probability p of being a success, are performed until a success occurs. If we let X be the number of trials required until the first success, then X is said to be a geometric random variable with parameter p. Its probability mass function is given by

p(n) = P(X = n) = (1 − p)^{n−1} p,  n = 1, 2, ....

Note that

∑_{n=1}^{∞} p(n) = p ∑_{n=1}^{∞} (1 − p)^{n−1} = 1.

Example. Let X be the number of tosses needed to get the first head. Then

P(X = n) = 1/2^n,  n = 1, 2, 3, ....

The mass function of X is thus p(x) = 1/2^x. Hence,

∑_{all x} p(x) = ∑_{x=1}^{∞} 1/2^x = 1.

Example. Signals are transmitted according to a Poisson process with rate λ. Each signal is successfully transmitted with probability p and lost with probability 1 − p. The fates of different signals are independent. What is the distribution of the number of signals lost before the first one is successfully transmitted?

The Poisson Random Variable

A random variable X, taking on one of the values 0, 1, 2, ..., is said to be a Poisson random variable with parameter λ if

p(k) = P(X = k) = e^{−λ} λ^k / k!,  k = 0, 1, 2, ....

This equation defines a probability mass function, since

∑_{k=0}^{∞} p(k) = e^{−λ} ∑_{k=0}^{∞} λ^k / k! = e^{−λ} e^{λ} = 1.

A Poisson random variable typically arises when counting discrete events that occur in a continuous interval of time, length, or space.

Example. Suppose that the number of typographical errors on a single page of a book has a Poisson distribution with parameter λ = 1. Calculate the probability that there is at least one error on a page.
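The geometric and Poisson mass functions above are easy to evaluate directly. The sketch below (our addition, using only the Python standard library) checks numerically that the coin-tossing geometric mass sums to 1 and computes the probability of at least one typographical error when λ = 1.

```python
from math import exp, factorial

def geom_pmf(n, p):
    """P(X = n) = (1 - p)**(n - 1) * p for the geometric distribution."""
    return (1 - p) ** (n - 1) * p

def poisson_pmf(k, lam):
    """P(X = k) = e**(-lam) * lam**k / k! for the Poisson distribution."""
    return exp(-lam) * lam**k / factorial(k)

# Coin-tossing example: p = 1/2, so P(X = n) = 1/2**n; the mass sums to 1.
total = sum(geom_pmf(n, 0.5) for n in range(1, 200))
print(round(total, 12))                    # ~1.0 (truncated series)

# Typographical-errors example: lambda = 1, P(at least one error) = 1 - P(X = 0).
lam = 1.0
print(round(1 - poisson_pmf(0, lam), 4))   # 1 - e**(-1) ~ 0.6321
```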

Assume that events occur at an average rate of λ per unit of time, and let N(t) be the number of occurrences of the event in t units of time. Then N(t) is a Poisson random variable with parameter λt, that is,

P(N(t) = k) = e^{−λt} (λt)^k / k!,  k = 0, 1, 2, ....

Example. People enter a casino at a rate of 1 for every 2 minutes.
(a) What is the probability that no one enters between 12:00 and 12:05?
(b) What is the probability that at least 4 people enter the casino during that time?

Theorem. Let X ∼ B(n, p). If n is very large and p is small, with λ = np, then

P(X = x) = C(n, x) p^x (1 − p)^{n−x} ≈ e^{−λ} λ^x / x!.

In other words, B(n, p) ≈ Poisson(λ), where λ = np.

Proof. Let λ = np, i.e., p = λ/n. Then

P(X = x) = C(n, x) p^x (1 − p)^{n−x}
         = [n! / (x!(n − x)!)] (λ/n)^x (1 − λ/n)^{n−x}
         = (λ^x / x!) · [n(n − 1)···(n − x + 1) / (n − λ)^x] · (1 − λ/n)^n.

As n → ∞, the middle factor tends to 1 and (1 − λ/n)^n → e^{−λ}, so P(X = x) ≈ e^{−λ} λ^x / x!.

Example. Suppose that the probability that a randomly chosen item is defective is ..., and 800 items are shipped to a warehouse. What is the probability that there will be at most 5 defective items among the 800?
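A short numerical sketch (our addition) makes both results on this page concrete. The casino probabilities follow from the Poisson process formula with λt = 5/2. For the warehouse example the defect probability is not recoverable from the notes, so the value p = 0.005 below is purely an illustrative stand-in, not the value intended by the author.

```python
from math import comb, exp, factorial

def poisson_pmf(k, lam):
    return exp(-lam) * lam**k / factorial(k)

# Casino example: rate = 1 person per 2 minutes, so lambda*t = 5/2 over 5 minutes.
lam_t = 5 / 2
p_none = poisson_pmf(0, lam_t)                                     # (a) nobody enters
p_at_least_4 = 1 - sum(poisson_pmf(k, lam_t) for k in range(4))    # (b) at least 4 enter
print(round(p_none, 4), round(p_at_least_4, 4))

# Poisson approximation to the binomial (theorem above).
# The defect probability in the warehouse example was lost in the source,
# so p = 0.005 is used here purely as a stand-in value.
n, p = 800, 0.005
lam = n * p
exact = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(6))
approx = sum(poisson_pmf(k, lam) for k in range(6))
print(round(exact, 4), round(approx, 4))    # P(at most 5 defectives), exact vs. approx
```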

2.3 Continuous Random Variables

Let X be a random variable whose set of possible values is uncountable. Such a random variable is called continuous.

Definition. A random variable X is continuous if there exists a nonnegative function f(x), defined for all real x ∈ (−∞, ∞), having the property that for any set B of real numbers

P(X ∈ B) = ∫_B f(x) dx.

The function f(x) is called the probability density function of the random variable X. A density function must satisfy

∫_{−∞}^{∞} f(x) dx = P(X ∈ (−∞, ∞)) = 1,

and

P(a ≤ X ≤ b) = ∫_a^b f(x) dx.

The relationship between the c.d.f. F(x) and the p.d.f. f(x) is expressed by

d/dx F(x) = f(x).

Remark. The density function itself is not a probability. However,

P(a − ε ≤ X ≤ a + ε) = ∫_{a−ε}^{a+ε} f(x) dx ≈ 2ε f(a)

when ε is small. From this we see that f(a) is a measure of how likely it is that the random variable will be near a.

The Uniform Random Variable

A random variable is said to be uniformly distributed over the interval (0, 1) if its probability density function is given by

f(x) = 1 for 0 < x < 1, and f(x) = 0 otherwise.

Note that the preceding is a density function, since f(x) ≥ 0 and

∫_{−∞}^{∞} f(x) dx = ∫_0^1 1 dx = 1.

For any 0 < a < b < 1,

P(a ≤ X ≤ b) = ∫_a^b f(x) dx = ∫_a^b 1 dx = b − a.
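A quick Monte Carlo sketch (our addition) illustrates the two points above for the Uniform(0, 1) density: probabilities are areas under f, and f(a) behaves like a local probability per unit length.

```python
import random

random.seed(0)
n = 200_000
sample = [random.random() for _ in range(n)]   # U ~ Uniform(0, 1)

# P(a <= X <= b) should be close to b - a.
a, b = 0.25, 0.8
frac_in_ab = sum(a <= u <= b for u in sample) / n
print(round(frac_in_ab, 3), round(b - a, 3))   # both ~ 0.55

# f(a) as a local probability per unit length: P(a - eps <= X <= a + eps) / (2*eps).
eps = 0.01
local = sum(a - eps <= u <= a + eps for u in sample) / n / (2 * eps)
print(round(local, 2))                         # ~ 1.0, the uniform density at a
```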

In general, we say that X is a uniform random variable on the interval (α, β) if its p.d.f. is given by

f(x) = 1/(β − α) for α < x < β, and f(x) = 0 otherwise.

Exponential Random Variables

A continuous random variable whose p.d.f. is given, for some λ > 0, by

f(x) = λ e^{−λx} for x ≥ 0, and f(x) = 0 for x < 0,

is said to be an exponential random variable with parameter λ. The c.d.f. of X is

F(x) = ∫_0^x f(t) dt = ∫_0^x λ e^{−λt} dt = 1 − e^{−λx},  x ≥ 0.
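As a sanity check on the exponential c.d.f., the sketch below (our addition) compares a numerical integral of the density with the closed form 1 − e^{−λx}; the step count and the values λ = 2, x = 1.5 are arbitrary choices.

```python
from math import exp

def exp_pdf(t, lam):
    """Density lam * e**(-lam*t) for t >= 0, and 0 otherwise."""
    return lam * exp(-lam * t) if t >= 0 else 0.0

def cdf_numeric(x, lam, steps=10_000):
    """Midpoint-rule approximation of the integral of the density from 0 to x."""
    h = x / steps
    return sum(exp_pdf((i + 0.5) * h, lam) for i in range(steps)) * h

lam, x = 2.0, 1.5
print(round(cdf_numeric(x, lam), 6))   # numerical integral of the p.d.f.
print(round(1 - exp(-lam * x), 6))     # closed form F(x) = 1 - e**(-lam*x)
```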

2.4 Expectation of a Random Variable

The Discrete Case

If X is a discrete random variable having probability mass function p(x), then the expected value of X is defined by

E(X) = ∑_{all x} x p(x),

provided ∑_{all x} |x| p(x) < ∞.

Lemma. If X is a non-negative integer-valued random variable, then

E(X) = ∑_{k=0}^{∞} P(X > k).

Example.
(a) (Expectation of a Binomial Random Variable) Let X ∼ B(n, p). Calculate E(X).
(b) (Expectation of a Geometric Random Variable) Calculate the expectation of a geometric random variable having parameter p.
(c) (Expectation of a Poisson Random Variable) Calculate the expectation of a Poisson random variable having parameter λ.

The Continuous Case

The expected value of a continuous random variable is defined by

E(X) = ∫_{−∞}^{∞} x f(x) dx,

provided ∫_{−∞}^{∞} |x| f(x) dx < ∞.

Lemma. If X is a non-negative random variable, then

E(X) = ∫_0^{∞} P(X > x) dx.

Example.
(a) (Expectation of a Uniform Random Variable) Let X be uniformly distributed on (α, β). Calculate E(X).
(b) (Expectation of an Exponential Random Variable) Calculate the expectation of an exponential random variable having parameter λ.
(c) (Expectation of a Normal Random Variable) Calculate the expectation of a normal random variable having parameters µ and σ².
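The tail-sum lemma above can be verified numerically for the geometric and Poisson cases, whose means are 1/p and λ respectively. The following sketch (our addition) truncates the infinite sums at a cutoff large enough that the remaining terms are negligible.

```python
from math import exp, factorial

# Geometric with parameter p: P(X > k) = (1 - p)**k, so the tail-sum lemma
# gives E(X) = sum over k >= 0 of (1 - p)**k = 1/p.
p = 0.3
tail_sum = sum((1 - p) ** k for k in range(2000))                  # truncated tail sum
direct = sum(n * (1 - p) ** (n - 1) * p for n in range(1, 2000))   # sum of n * p(n)
print(round(tail_sum, 6), round(direct, 6), round(1 / p, 6))       # all ~ 3.3333

# Poisson with parameter lam: E(X) = sum of k * p(k) = lam
# (terms beyond k = 60 are negligible for lam = 4).
lam = 4.0
mean = sum(k * exp(-lam) * lam**k / factorial(k) for k in range(60))
print(round(mean, 6))                                              # ~ 4.0
```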

2.5 Expectation of a Function of a Random Variable

Now we are interested in calculating not the expected value of X, but the expected value of some function of X, say g(X).

Proposition 1.
(a) If X is a discrete random variable with probability mass function p(x), then for any real-valued function g,

E[g(X)] = ∑_{all x} g(x) p(x).

(b) If X is a continuous random variable with probability density function f(x), then for any real-valued function g,

E[g(X)] = ∫_{−∞}^{∞} g(x) f(x) dx.

Proposition 2. If a and b are constants, then

E(aX + b) = aE(X) + b,

and

E(X + Y) = E(X) + E(Y).

Variance of a Random Variable

The expected value of a random variable X, E(X), is also referred to as the mean or the first moment. The quantity E(X^n), n ≥ 1, is called the nth moment of X. The variance of X, denoted by Var(X), is defined by

Var(X) = E[(X − E(X))²].

A useful formula to compute the variance is

Var(X) = E(X²) − [E(X)]².
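Proposition 1(a) and the two variance formulas are straightforward to check on a small discrete distribution. In the sketch below (our addition), the p.m.f. is an arbitrary illustrative choice, not one taken from the notes.

```python
# A small discrete distribution: values and their probabilities.
pmf = {0: 0.2, 1: 0.5, 3: 0.3}

def expect(g):
    """E[g(X)] = sum over x of g(x) * p(x)  (Proposition 1(a))."""
    return sum(g(x) * p for x, p in pmf.items())

mean = expect(lambda x: x)
var_def = expect(lambda x: (x - mean) ** 2)        # Var(X) = E[(X - E(X))^2]
var_short = expect(lambda x: x ** 2) - mean ** 2   # Var(X) = E(X^2) - [E(X)]^2
print(round(mean, 6), round(var_def, 6), round(var_short, 6))

# Linearity check: E(aX + b) = a E(X) + b.
a, b = 2.5, -1.0
assert abs(expect(lambda x: a * x + b) - (a * mean + b)) < 1e-12
```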

2.6 Jointly Distributed Random Variables

Thus far, we have concerned ourselves with the probability distribution of a single random variable. However, we are often interested in probability statements concerning two or more random variables.

Joint Distribution Function

To deal with probabilities of two random variables X and Y, we define the joint distribution function of X and Y by

F_{X,Y}(a, b) = P(X ≤ a, Y ≤ b),  −∞ < a, b < ∞.

The distribution function of X can be obtained from the joint c.d.f. as follows:

F_X(a) = P(X ≤ a, Y < ∞) = F(a, ∞).

Similarly, the c.d.f. of Y is given by

F_Y(b) = P(X < ∞, Y ≤ b) = F(∞, b).

Joint Probability Mass Function

If X and Y are both discrete random variables, then the joint mass function of X and Y is given by

p(x, y) = P(X = x, Y = y).

The probability mass function of X may be obtained from p(x, y) by

p_X(x) = ∑_{all y} p(x, y),

and similarly, the mass function of Y is

p_Y(y) = ∑_{all x} p(x, y).

Joint Probability Density Function

We say that X and Y are jointly continuous if there exists a function f(x, y), defined for all real x and y, having the property that for all sets A and B of real numbers

P(X ∈ A, Y ∈ B) = ∫_B ∫_A f(x, y) dx dy.

The function f(x, y) is called the joint probability density function of X and Y. The p.d.f.'s of X and Y can be obtained from their joint p.d.f. by

P(X ∈ A) = ∫_A ∫_{−∞}^{∞} f(x, y) dy dx

and

P(Y ∈ B) = ∫_B ∫_{−∞}^{∞} f(x, y) dx dy.

The integrals

f_X(x) = ∫_{−∞}^{∞} f(x, y) dy  and  f_Y(y) = ∫_{−∞}^{∞} f(x, y) dx

are called the density functions of X and Y, respectively.

Expectation of a Function of Two Random Variables

If X and Y are random variables and g is a function of two variables, then

E[g(X, Y)] = ∑_y ∑_x g(x, y) p(x, y)  in the discrete case,
E[g(X, Y)] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} g(x, y) f(x, y) dx dy  in the continuous case.

For instance, if g(X, Y) = X + Y, then, in the continuous case,

E(X + Y) = ∫∫ (x + y) f(x, y) dx dy = ∫∫ x f(x, y) dx dy + ∫∫ y f(x, y) dx dy = E(X) + E(Y).

Proposition. For any constants a and b,

E(aX + bY) = aE(X) + bE(Y).

Example. Let us compute the expectation of a binomial random variable X ∼ B(n, p) with parameters n and p.

Solution. Write X = X_1 + X_2 + ··· + X_n, where

X_i = 1 if the ith trial is a success, and X_i = 0 if the ith trial is a failure.

Hence E(X_i) = 0 · (1 − p) + 1 · p = p, and thus

E(X) = E(X_1 + X_2 + ··· + X_n) = np.

Example. At a party, N men throw their hats into the center of a room. The hats are mixed up and each man randomly selects one. Find the expected number of men who select their own hat.

Solution. Let X denote the number of men who select their own hat. Define X_i by

X_i = 1 if the ith man selects his own hat, and X_i = 0 otherwise.

Hence, X = X_1 + X_2 + ··· + X_N and E(X_i) = 1/N. Thus,

E(X) = E(X_1) + ··· + E(X_N) = N · (1/N) = 1.

That is, no matter how many people are at the party, on average exactly one of them will select his own hat.

Example. Suppose there are 4 different types of coupons and suppose that each time one obtains a coupon, it is equally likely to be any one of the 4 types. Compute the expected number of different types that are contained in a set of 10 coupons.

Solution. Define

X_i = 1 if at least one type-i coupon is in the set of 10, and X_i = 0 otherwise.

Then X = X_1 + X_2 + X_3 + X_4. Now,

E(X_i) = P(X_i = 1) = P(at least one type-i coupon is in the set of 10)
       = 1 − P(no type-i coupons are in the set of 10)
       = 1 − (3/4)^{10}.

Hence,

E(X) = E(X_1) + E(X_2) + E(X_3) + E(X_4) = 4[1 − (3/4)^{10}].
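Both indicator-variable computations above lend themselves to a quick simulation check. The sketch below (our addition) estimates the two expectations by Monte Carlo; the choice N = 10 for the hat problem and the number of trials are arbitrary.

```python
import random

random.seed(1)
trials = 100_000

# Hat-matching: the expected number of men who get their own hat is 1 for any N.
N = 10
matches = 0
for _ in range(trials):
    perm = list(range(N))
    random.shuffle(perm)                       # hat received by each man
    matches += sum(perm[i] == i for i in range(N))
print(round(matches / trials, 3))              # ~ 1.0

# Coupon types: expected number of distinct types among 10 coupons, 4 types.
distinct = sum(len({random.randrange(4) for _ in range(10)}) for _ in range(trials))
print(round(distinct / trials, 3), round(4 * (1 - (3 / 4) ** 10), 3))   # both ~ 3.77
```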

Independent Random Variables

The random variables X and Y are said to be independent if, for all a, b,

P(X ≤ a, Y ≤ b) = P(X ≤ a) P(Y ≤ b).

When X and Y are discrete, the condition of independence reduces to

p(x, y) = p_X(x) p_Y(y),

and if X and Y are jointly continuous, independence reduces to

f(x, y) = f_X(x) f_Y(y).

Proposition. If X and Y are independent, then for any functions h and g, g(X) and h(Y) are independent and

E[g(X) h(Y)] = E[g(X)] E[h(Y)].

Remark. The converse is not true: having E[g(X) h(Y)] = E[g(X)] E[h(Y)] for particular functions g and h (for example, E(XY) = E(X)E(Y)) does NOT imply that X and Y are independent.
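For a concrete discrete example of these factorizations, the sketch below (our addition) takes X and Y to be two independent fair dice, recovers the marginals from the joint p.m.f., and checks the product rule for expectations with one particular pair of functions g and h.

```python
from fractions import Fraction
from itertools import product

# Joint p.m.f. of two independent fair dice: p(x, y) = 1/36 for every pair.
joint = {(x, y): Fraction(1, 36) for x, y in product(range(1, 7), repeat=2)}

# Marginals obtained by summing the joint mass function.
p_X = {x: sum(p for (a, _), p in joint.items() if a == x) for x in range(1, 7)}
p_Y = {y: sum(p for (_, b), p in joint.items() if b == y) for y in range(1, 7)}

# Independence: the joint p.m.f. factorizes into the product of the marginals.
assert all(joint[(x, y)] == p_X[x] * p_Y[y] for x, y in joint)

# E[g(X) h(Y)] = E[g(X)] E[h(Y)] for, e.g., g(x) = x**2 and h(y) = y.
lhs = sum(x**2 * y * p for (x, y), p in joint.items())
rhs = sum(x**2 * p for x, p in p_X.items()) * sum(y * p for y, p in p_Y.items())
assert lhs == rhs
print(lhs)   # E[X**2 * Y] = (91/6) * (7/2) = 637/12
```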

Covariance

The covariance of any two random variables X and Y, denoted by Cov(X, Y), is defined by

Cov(X, Y) = E[(X − E(X))(Y − E(Y))].

The following is a useful formula to compute the covariance:

Cov(X, Y) = E(XY) − E(X)E(Y).

Proposition. If X and Y are independent, then Cov(X, Y) = 0.

Properties of Covariance

For any random variables X, Y, Z and a constant c,

Cov(X, X) = Var(X),
Cov(X, Y) = Cov(Y, X),
Cov(cX, Y) = c Cov(X, Y),
Cov(X, Y + Z) = Cov(X, Y) + Cov(X, Z).

Sums of Random Variables

Let X_1, X_2, ..., X_n be a sequence of random variables. Then

Var(∑_{i=1}^{n} X_i) = ∑_{i=1}^{n} Var(X_i) + 2 ∑_{i=2}^{n} ∑_{j=1}^{i−1} Cov(X_i, X_j).

If X_1, X_2, ..., X_n are independent, then

Var(∑_{i=1}^{n} X_i) = ∑_{i=1}^{n} Var(X_i).
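The covariance formula and the variance-of-a-sum identity can be illustrated with a small simulation. In the sketch below (our addition), X and Y are independent Uniform(0, 1) samples, so the sample covariance should be near 0 and Var(X + Y) should be near Var(X) + Var(Y) = 1/12 + 1/12.

```python
import random

random.seed(2)
n = 200_000
X = [random.uniform(0, 1) for _ in range(n)]
Y = [random.uniform(0, 1) for _ in range(n)]   # drawn independently of X

def mean(v):
    return sum(v) / len(v)

def cov(u, v):
    """Sample version of Cov(U, V) = E(UV) - E(U)E(V)."""
    return mean([a * b for a, b in zip(u, v)]) - mean(u) * mean(v)

def var(v):
    return cov(v, v)                           # Cov(V, V) = Var(V)

S = [x + y for x, y in zip(X, Y)]
print(round(cov(X, Y), 4))                     # ~ 0 for independent X, Y
print(round(var(S), 4), round(var(X) + var(Y), 4))   # ~ equal: 1/12 + 1/12 ~ 0.1667
```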
