6.041/6.431 Spring 2008 Quiz 2. Wednesday, April 16, 7:30-9:30 PM. SOLUTIONS


Name:            Recitation Instructor:            TA:

Scores (out of):
Question 1, all parts: 36
Question 2: a 4, b 5, c 5, d 8, e 5, f 6
Question 3: a 4, b 6, c 6, d 6, e 6
Total

Write your solutions in this quiz packet; only solutions in the quiz packet will be graded. Question one, the multiple choice section, will receive no partial credit. Partial credit for questions two and three will be awarded. You are allowed two two-sided 8.5-by-11 formula sheets plus a calculator. You have 120 minutes to complete the quiz. Be neat! You will not get credit if we can't read it. We will send out an email with more information on how to obtain your quiz before drop date. Good Luck!
6.041/6.431: Probabilistic Systems Analysis (Spring 2008)

Question 1. Multiple choice questions. CLEARLY circle the best answer for each question below. Each question is worth 4 points, with no partial credit given.

a. Let X1, X2, and X3 be independent random variables with the continuous uniform distribution over [0, 1]. Then P(X1 < X2 < X3) =

(i) 1/6   (ii) 1/3   (iii) 1/2   (iv) 1/4

Solution: (i). To understand the principle, first consider the simpler problem with only X1 and X2 as given above. Note that P(X1 < X2) + P(X2 < X1) + P(X1 = X2) = 1, since the corresponding events are disjoint and exhaust all the possibilities. But P(X1 < X2) = P(X2 < X1) by symmetry. Furthermore, P(X1 = X2) = 0 since the random variables are continuous. Therefore, P(X1 < X2) = 1/2. Analogously, omitting the events with zero probability but making sure to exhaust all other possibilities, we have

P(X1 < X2 < X3) + P(X1 < X3 < X2) + P(X2 < X1 < X3) + P(X2 < X3 < X1) + P(X3 < X1 < X2) + P(X3 < X2 < X1) = 1.

And, by symmetry, all six probabilities are equal. Thus, P(X1 < X2 < X3) = 1/6.

b. Let X and Y be two continuous random variables. Then

(i) E[XY] = E[X]E[Y]   (ii) E[X^2 + Y^2] = E[X^2] + E[Y^2]   (iii) f_{X+Y}(x + y) = f_X(x) f_Y(y)   (iv) var(X + Y) = var(X) + var(Y)

Solution: (ii). Since X^2 and Y^2 are random variables, the result follows by the linearity of expectation.

c. Suppose X is uniformly distributed over [0, 4] and Y is uniformly distributed over [0, 1]. Assume X and Y are independent. Let Z = X + Y. Then

(i) f_Z(4.5) = 0   (ii) f_Z(4.5) = 1/8   (iii) f_Z(4.5) = 1/4   (iv) f_Z(4.5) = 1/2

Solution: (ii). Since X and Y are independent, the result follows by convolution:

f_Z(4.5) = ∫ f_X(α) f_Y(4.5 − α) dα = ∫_3.5^4 (1/4)(1) dα = 1/8.
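The 1/6 answer in part (a) is easy to sanity-check by simulation. The sketch below is not part of the original solutions (the function and variable names are our own); it estimates the ordering probability from independent uniform draws:

```python
import random

random.seed(0)

def estimate_ordering_prob(trials=200_000):
    """Estimate P(X1 < X2 < X3) for three iid Uniform[0,1] random variables."""
    hits = 0
    for _ in range(trials):
        x1, x2, x3 = random.random(), random.random(), random.random()
        if x1 < x2 < x3:
            hits += 1
    return hits / trials

est = estimate_ordering_prob()
# The symmetry argument predicts the true value 1/6 ≈ 0.1667.
```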
d. For the random variables defined in part (c), P(max(X, Y) > 3) is equal to

(i) 1   (ii) 9/16   (iii) 3/4   (iv) 1/4

Solution: (iv). Note that P(max(X, Y) > 3) = 1 − P(max(X, Y) ≤ 3) = 1 − P({X ≤ 3} ∩ {Y ≤ 3}). But X and Y are independent, so P({X ≤ 3} ∩ {Y ≤ 3}) = P(X ≤ 3) P(Y ≤ 3). Finally, computing the probabilities, we have P(X ≤ 3) = 3/4 and P(Y ≤ 3) = 1. Thus, P(max(X, Y) > 3) = 1 − 3/4 = 1/4.

e. Recall the hat problem from lecture: N people put their hats in a closet at the start of a party, where each hat is uniquely identified. At the end of the party each person randomly selects a hat from the closet. Suppose N is a Poisson random variable with parameter λ. If X is the number of people who pick their own hats, then E[X] is equal to

(i) λ   (ii) 1/λ^2   (iii) 1/λ   (iv) 1

Solution: (iv). Let X = X1 + ... + XN, where Xi is the indicator random variable such that Xi = 1 if the i-th person picks their own hat and Xi = 0 otherwise. By the linearity of expectation, E[X | N = n] = E[X1 | N = n] + ... + E[Xn | N = n]. But E[Xi | N = n] = 1/n for all i = 1, ..., n. Thus, E[X | N = n] = n E[Xi | N = n] = 1. Finally, E[X] = E[E[X | N]] = 1.

f. Suppose X and Y are Poisson random variables with parameters λ1 and λ2 respectively, where X and Y are independent. Define W = X + Y. Then

(i) W is Poisson with parameter min(λ1, λ2)   (ii) W is Poisson with parameter λ1 + λ2   (iii) W may not be Poisson but has mean equal to min(λ1, λ2)   (iv) W may not be Poisson but has mean equal to λ1 + λ2

Solution: (ii). The quickest way to obtain the answer is through transforms: M_X(s) = e^(λ1(e^s − 1)) and M_Y(s) = e^(λ2(e^s − 1)). Since X and Y are independent, we have that

M_W(s) = e^(λ1(e^s − 1)) e^(λ2(e^s − 1)) = e^((λ1 + λ2)(e^s − 1)),

which equals the transform of a Poisson random variable with mean λ1 + λ2.
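The transform argument in part (f) can be checked numerically: the sum of independent Poisson(λ1) and Poisson(λ2) samples should have mean and variance both equal to λ1 + λ2, the Poisson signature. A sketch (ours, not from the quiz; it samples Poissons with Knuth's classical multiply-uniforms method):

```python
import math
import random

random.seed(2)

def sample_poisson(lam):
    """Knuth's method: multiply uniforms until the product drops below e^(-lam)."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= random.random()
        if p <= threshold:
            return k
        k += 1

lam1, lam2, trials = 2.0, 3.5, 200_000
samples = [sample_poisson(lam1) + sample_poisson(lam2) for _ in range(trials)]
mean_w = sum(samples) / trials
var_w = sum((w - mean_w) ** 2 for w in samples) / trials
# For a Poisson(lam1 + lam2) variable, mean and variance should both be near 5.5.
```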
g. Let X be a random variable whose transform is given by M_X(s) = (0.4 + 0.6 e^s)^5. Then

(i) P(X = 0) = P(X = 5)   (ii) P(X > 5) > 0   (iii) P(X = 0) = (0.4)^5   (iv) P(X = 5) = 0.6

Solution: (iii). Note that M_X(s) is the transform of a binomial random variable X with n = 5 trials and probability of success p = 0.6. Thus, P(X = 0) = (0.4)^5.

h. Let Xi, i = 1, 2, ..., be independent random variables all distributed according to the PDF f_X(x) = x/8 for 0 ≤ x ≤ 4. Let S = X1 + X2 + ... + X100. Then P(S > 300) is approximately equal to

(i) 1 − Φ(5)   (ii) (1 − Φ(5))^5   (iii) 1 − Φ(5/√2)   (iv) 1 − Φ(√2)

Solution: (iii). Since the Xi are i.i.d., by the central limit theorem the distribution of S is approximately normal with mean E[S] = 100 E[Xi] and variance var(S) = 100 var(Xi). Now,

E[Xi] = ∫_0^4 x (x/8) dx = [x^3/24]_0^4 = 8/3,
E[Xi^2] = ∫_0^4 x^2 (x/8) dx = [x^4/32]_0^4 = 8,
var(Xi) = E[Xi^2] − (E[Xi])^2 = 8 − 64/9 = 8/9.

Therefore E[S] = 800/3 and var(S) = 800/9, and

P(S > 300) = 1 − P(S ≤ 300) ≈ 1 − Φ((300 − 800/3)/√(800/9)) = 1 − Φ((100/3)/(20√2/3)) = 1 − Φ(5/√2).
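Part (h) rests on the moments E[Xi] = 8/3 and var(Xi) = 8/9 and on the standardized threshold 5/√2. These are easy to verify numerically; the sketch below (ours, not part of the solutions) samples from f_X by inverse-CDF, using F(x) = x^2/16 so that X = 4√U:

```python
import math
import random

random.seed(3)

def sample_x():
    """Inverse-CDF sample for f(x) = x/8 on [0, 4]: F(x) = x^2/16, so X = 4*sqrt(U)."""
    return 4.0 * math.sqrt(random.random())

trials = 200_000
xs = [sample_x() for _ in range(trials)]
mean_x = sum(xs) / trials
var_x = sum((x - mean_x) ** 2 for x in xs) / trials
# Predictions: mean 8/3 ≈ 2.667 and variance 8/9 ≈ 0.889.

# Standardized threshold used in the solution; it should equal 5/sqrt(2) ≈ 3.536.
z = (300 - 100 * 8 / 3) / math.sqrt(100 * 8 / 9)
```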
i. Let Xi, i = 1, 2, ..., be independent random variables all distributed according to the PDF f_X(x) = 1, 0 ≤ x ≤ 1. Define Y_n = X1 X2 X3 ... Xn, for some integer n. Then var(Y_n) is equal to

(i) 1/2^n   (ii) 1/3^n − 1/4^n   (iii) 1/(2n)   (iv) 1/12

Solution: (ii). Since X1, ..., Xn are independent, we have that E[Y_n] = E[X1] ... E[Xn]. Similarly, E[Y_n^2] = E[X1^2] ... E[Xn^2]. Since E[Xi] = 1/2 and E[Xi^2] = 1/3 for i = 1, ..., n, it follows that

var(Y_n) = E[Y_n^2] − (E[Y_n])^2 = 1/3^n − 1/4^n.

Question 2. Each MacBook has a lifetime that is exponentially distributed with parameter λ. The lifetimes of MacBooks are independent of each other. Suppose you have two MacBooks, which you begin using at the same time. Define T1 as the time of the first laptop failure and T2 as the time of the second laptop failure.

a. Compute f_{T1}(t1).

Let M1 be the lifetime of MacBook 1 and M2 the lifetime of MacBook 2, where M1 and M2 are i.i.d. exponential random variables with CDF F_M(m) = 1 − e^(−λm). T1, the time of the first MacBook failure, is the minimum of M1 and M2. To derive the distribution of T1, we first find the CDF F_{T1}(t), and then differentiate to find the PDF f_{T1}(t):

F_{T1}(t) = P(min(M1, M2) ≤ t) = 1 − P(min(M1, M2) > t) = 1 − P(M1 > t) P(M2 > t) = 1 − (1 − F_M(t))^2 = 1 − e^(−2λt), t ≥ 0.

Differentiating F_{T1}(t) with respect to t yields

f_{T1}(t) = 2λ e^(−2λt), t ≥ 0.
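The variance formula in part (i) can be sanity-checked by simulating products of uniforms; this sketch (ours, not from the quiz) compares the Monte Carlo variance against 1/3^n − 1/4^n for n = 5:

```python
import random

random.seed(4)

def var_of_product(n, trials=400_000):
    """Monte Carlo variance of Y_n = X1*...*Xn for iid Uniform[0,1] factors."""
    prods = []
    for _ in range(trials):
        y = 1.0
        for _ in range(n):
            y *= random.random()
        prods.append(y)
    mean_y = sum(prods) / trials
    return sum((y - mean_y) ** 2 for y in prods) / trials

n = 5
est = var_of_product(n)
exact = (1 / 3) ** n - (1 / 4) ** n  # ≈ 0.003139
```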
b. Let X = T2 − T1. Compute f_{X|T1}(x | t1).

Conditioned on the time of the first MacBook failure, the time until the other MacBook fails is an exponential random variable by the memoryless property. The memoryless property tells us that regardless of the elapsed lifetime of the MacBook, the time until failure has the same exponential CDF. Consequently,

f_{X|T1}(x | t1) = λ e^(−λx), x ≥ 0.

c. Is X independent of T1? Give a mathematical justification for your answer.

Since we have shown in 2(b) that f_{X|T1}(x | t1) does not depend on t1, X and T1 are independent.

d. Compute f_{T2}(t2) and E[T2].

The time of the second laptop failure T2 is equal to T1 + X. Since X and T1 were shown to be independent in 2(c), we convolve the densities found in 2(a) and 2(b) to determine f_{T2}(t):

f_{T2}(t) = ∫_0^t f_{T1}(τ) f_X(t − τ) dτ = ∫_0^t 2λ^2 e^(−2λτ) e^(−λ(t − τ)) dτ = 2λ e^(−λt) ∫_0^t λ e^(−λτ) dτ = 2λ e^(−λt) (1 − e^(−λt)), t ≥ 0.

Also, by the linearity of expectation, we have that E[T2] = E[T1] + E[X] = 1/(2λ) + 1/λ = 3/(2λ).

An equivalent method for solving this problem is to note that T2 is the maximum of M1 and M2, and to derive the distribution of T2 in our standard CDF-to-PDF method:

F_{T2}(t) = P(max(M1, M2) ≤ t) = P(M1 ≤ t) P(M2 ≤ t) = F_M(t)^2 = 1 − 2 e^(−λt) + e^(−2λt), t ≥ 0.

Differentiating F_{T2}(t) with respect to t yields

f_{T2}(t) = 2λ e^(−λt) − 2λ e^(−2λt), t ≥ 0,

which is equivalent to our solution by convolution above. Finally, from the above density we obtain E[T2] = 2/λ − 1/(2λ) = 3/(2λ), which matches our earlier solution.
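The expectations E[T1] = 1/(2λ) and E[T2] = 3/(2λ) derived above can be confirmed by simulating the two exponential lifetimes directly (a sketch of ours, not part of the solutions; it uses Python's random.expovariate, whose parameter is the rate λ):

```python
import random

random.seed(5)

lam, trials = 0.5, 200_000
total_min, total_max = 0.0, 0.0
for _ in range(trials):
    m1 = random.expovariate(lam)  # lifetime of MacBook 1
    m2 = random.expovariate(lam)  # lifetime of MacBook 2
    total_min += min(m1, m2)      # T1, time of the first failure
    total_max += max(m1, m2)      # T2, time of the second failure
mean_t1 = total_min / trials
mean_t2 = total_max / trials
# Predictions for lam = 0.5: E[T1] = 1/(2*lam) = 1.0 and E[T2] = 3/(2*lam) = 3.0.
```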
e. Now suppose you have 100 MacBooks, and let Y be the time of the first laptop failure. Find the best answer for P(Y < 0.01).

Y is equal to the minimum of 100 independent exponential random variables. Following the derivation in (a), we determine by analogy:

f_Y(y) = 100λ e^(−100λy), y ≥ 0.

Integrating over y from 0 to 0.01, we find P(Y < 0.01) = 1 − e^(−λ).

Your friend, Charlie, loves MacBooks so much he buys S_i new MacBooks every day! On any given day S_i is equally likely to be 4 or 8, and all days are independent from each other. Let S be the number of MacBooks Charlie buys over the next 100 days.

f. (6 pts) Find the best approximation for P(S ≤ 608). Express your final answer in terms of Φ(·), the CDF of the standard normal.

Each day's purchase has mean 6 and variance ((4 − 6)^2 + (8 − 6)^2)/2 = 4, so E[S] = 600 and var(S) = 400. Using the De Moivre-Laplace approximation to the binomial, and noting that the step size between values that S can take on is 4 (so the half-step correction is 2),

P(S ≤ 608) ≈ Φ((608 + 2 − 600)/20) = Φ(10/20) = Φ(0.5).

Question 3. Saif is a well-intentioned though slightly indecisive fellow. Every morning he flips a coin to decide where to go. If the coin comes up heads he drives to the mall; if it comes up tails he volunteers at the local shelter. Saif's coin is not necessarily fair; rather, it possesses a probability of heads equal to q. We do not know q, but we do know it is well modeled by a random variable Q, where the density of Q is

f_Q(q) = 2q for 0 ≤ q ≤ 1, and 0 otherwise.

Assume that, conditioned on Q, each coin flip is independent. Note: parts a, b, c, and {d, e} may be answered independently of each other.
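The normal approximation for the purchase total can be checked against a direct simulation of 100 days of 4-or-8 purchases (our own sketch, not part of the solutions); the approximation predicts Φ(0.5) ≈ 0.6915:

```python
import random

random.seed(6)

trials, days = 100_000, 100
hits = 0
for _ in range(trials):
    # Total MacBooks bought in 100 days, each day 4 or 8 with equal probability.
    s = sum(random.choice((4, 8)) for _ in range(days))
    if s <= 608:
        hits += 1
est = hits / trials
# De Moivre-Laplace with half-step correction 2 predicts Phi(0.5) ≈ 0.6915.
```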
a. (4 pts) What is the probability that Saif goes to the local shelter if he flips the coin once?

Let X be the outcome of the coin toss, where X = 1 if the coin lands heads and X = 0 if the coin lands tails. By the total probability theorem:

P(X = 0) = ∫_0^1 P(X = 0 | Q = q) f_Q(q) dq = ∫_0^1 (1 − q) 2q dq = 1/3.

In an attempt to promote virtuous behavior, Saif's father offers to pay him $4 every day he volunteers at the local shelter. Define X as Saif's payout if he flips the coin every morning for the next 30 days.

b. Find var(X).

Let Yi be a Bernoulli random variable describing the outcome of the coin tossed on morning i: Yi = 1 corresponds to the event that on morning i Saif goes to the local shelter (tails), and Yi = 0 to the event that he goes to the mall. Given Q = q, we have P(Yi = 1 | Q = q) = 1 − q for i = 1, ..., 30. Saif's payout for the next 30 days is described by the random variable X = 4(Y1 + Y2 + ... + Y30), so by the law of total variance

var(X) = 16 var(Y1 + ... + Y30) = 16 (var(E[Y1 + ... + Y30 | Q]) + E[var(Y1 + ... + Y30 | Q)]).

Now note that, conditioned on Q = q, the Y1, ..., Y30 are independent. Thus var(Y1 + ... + Y30 | Q) = var(Y1 | Q) + ... + var(Y30 | Q) = 30 Q(1 − Q), and E[Y1 + ... + Y30 | Q] = 30(1 − Q). So,

var(X) = 16 var(30(1 − Q)) + 16 E[30 Q(1 − Q)]
       = 16 · 900 (E[Q^2] − (E[Q])^2) + 16 · 30 (E[Q] − E[Q^2])
       = 16 · 900 (1/2 − 4/9) + 16 · 30 (2/3 − 1/2)
       = 800 + 80 = 880,

since E[Q] = ∫_0^1 2q^2 dq = 2/3 and E[Q^2] = ∫_0^1 2q^3 dq = 1/2.
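Parts (a) and (b) can both be checked with one simulation: draw Q by inverse-CDF (F_Q(q) = q^2, so Q = √U), flip 30 coins, and pay $4 per tails day. This sketch is ours, not part of the solutions:

```python
import random

random.seed(7)

def sample_q():
    """Inverse-CDF sample for f_Q(q) = 2q on [0,1]: F(q) = q^2, so Q = sqrt(U)."""
    return random.random() ** 0.5

def one_month_payout(days=30):
    """$4 for each tails (shelter) day, with a single Q drawn for the coin."""
    q = sample_q()
    tails = sum(1 for _ in range(days) if random.random() >= q)  # tails w.p. 1 - q
    return 4 * tails

trials = 100_000
payouts = [one_month_payout() for _ in range(trials)]
mean_x = sum(payouts) / trials
var_x = sum((x - mean_x) ** 2 for x in payouts) / trials
# Part (a) gives P(tails) = 1/3, so E[X] = 4*30*(1/3) = 40; part (b) gives var(X) = 880.
```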
Let event B be that Saif goes to the local shelter at least once in k days.

c. Find the conditional density of Q given B, f_{Q|B}(q).

Given Q = q, the probability of no shelter visit in k days is q^k, so P(B | Q = q) = 1 − q^k. By Bayes' rule:

f_{Q|B}(q) = P(B | Q = q) f_Q(q) / ∫_0^1 P(B | Q = q) f_Q(q) dq = (1 − q^k) 2q / ∫_0^1 (1 − q^k) 2q dq = 2q(1 − q^k) / (1 − 2/(k + 2)), 0 ≤ q ≤ 1,

and 0 otherwise.

While shopping at the mall, Saif gets a call from his sister Mais. They agree to meet at the Coco Cabana Courtyard at exactly 1:30 PM. Unfortunately, Mais arrives Z minutes late, where Z is a continuous uniform random variable from zero to ten minutes. Saif is furious that Mais has kept him waiting, and demands Mais pay him R dollars, where R = e^(Z + 2).

d. Find Saif's expected payout, E[R].

E[R] = ∫_0^10 e^(z + 2) f_Z(z) dz = (1/10) ∫_0^10 e^(z + 2) dz = (e^12 − e^2)/10.

e. Find the density of Saif's payout, f_R(r).

F_R(r) = P(e^(Z + 2) ≤ r) = P(Z ≤ ln(r) − 2) = (ln(r) − 2)/10 for e^2 ≤ r ≤ e^12. Differentiating,

f_R(r) = 1/(10r) for e^2 ≤ r ≤ e^12, and 0 otherwise.
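The expected payout and the derived CDF of R are easy to confirm by simulating Z directly (a sketch of ours, not part of the solutions; it checks the mean against (e^12 − e^2)/10 and the CDF at r = e^7, where the derived formula gives (7 − 2)/10 = 0.5):

```python
import math
import random

random.seed(8)

trials = 500_000
# R = e^(Z + 2) with Z ~ Uniform[0, 10] minutes of lateness.
samples = [math.exp(random.uniform(0.0, 10.0) + 2.0) for _ in range(trials)]

mean_r = sum(samples) / trials
exact_mean = (math.exp(12) - math.exp(2)) / 10  # ≈ 16274.7

# Empirical CDF at r = e^7; the derived F_R gives 0.5 there.
frac_below = sum(1 for r in samples if r <= math.exp(7)) / trials
```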
MIT OpenCourseWare
6.041 / 6.431 Probabilistic Systems Analysis and Applied Probability, Fall 2010
For information about citing these materials or our Terms of Use, visit:
MATH4427 Notebook 1 Spring 2016 prepared by Professor Jenny Baglivo c Copyright 20092016 by Jenny A. Baglivo. All Rights Reserved. Contents 1 MATH4427 Notebook 1 3 1.1 Introduction, and Review of Probability
More informationThe Normal Distribution. Alan T. Arnholt Department of Mathematical Sciences Appalachian State University
The Normal Distribution Alan T. Arnholt Department of Mathematical Sciences Appalachian State University arnholt@math.appstate.edu Spring 2006 R Notes 1 Copyright c 2006 Alan T. Arnholt 2 Continuous Random
More informationLahore University of Management Sciences
Lahore University of Management Sciences CMPE 501: Applied Probability (Fall 2010) Homework 3: Solution 1. A candy factory has an endless supply of red, orange, yellow, green, blue and violet jelly beans.
More information3.4. The Binomial Probability Distribution. Copyright Cengage Learning. All rights reserved.
3.4 The Binomial Probability Distribution Copyright Cengage Learning. All rights reserved. The Binomial Probability Distribution There are many experiments that conform either exactly or approximately
More informationDefinition 6.1.1. A r.v. X has a normal distribution with mean µ and variance σ 2, where µ R, and σ > 0, if its density is f(x) = 1. 2σ 2.
Chapter 6 Brownian Motion 6. Normal Distribution Definition 6... A r.v. X has a normal distribution with mean µ and variance σ, where µ R, and σ > 0, if its density is fx = πσ e x µ σ. The previous definition
More information4 Sums of Random Variables
Sums of a Random Variables 47 4 Sums of Random Variables Many of the variables dealt with in physics can be expressed as a sum of other variables; often the components of the sum are statistically independent.
More informationProbability Generating Functions
page 39 Chapter 3 Probability Generating Functions 3 Preamble: Generating Functions Generating functions are widely used in mathematics, and play an important role in probability theory Consider a sequence
More informationMath 141. Lecture 7: Variance, Covariance, and Sums. Albyn Jones 1. 1 Library 304. jones/courses/141
Math 141 Lecture 7: Variance, Covariance, and Sums Albyn Jones 1 1 Library 304 jones@reed.edu www.people.reed.edu/ jones/courses/141 Last Time Variance: expected squared deviation from the mean: Standard
More informationContinuous Random Variables
Continuous Random Variables COMP 245 STATISTICS Dr N A Heard Contents 1 Continuous Random Variables 2 11 Introduction 2 12 Probability Density Functions 3 13 Transformations 5 2 Mean, Variance and Quantiles
More informationSummary of Formulas and Concepts. Descriptive Statistics (Ch. 14)
Summary of Formulas and Concepts Descriptive Statistics (Ch. 14) Definitions Population: The complete set of numerical information on a particular quantity in which an investigator is interested. We assume
More informationStats on the TI 83 and TI 84 Calculator
Stats on the TI 83 and TI 84 Calculator Entering the sample values STAT button Left bracket { Right bracket } Store (STO) List L1 Comma Enter Example: Sample data are {5, 10, 15, 20} 1. Press 2 ND and
More informationUNIT I: RANDOM VARIABLES PART A TWO MARKS
UNIT I: RANDOM VARIABLES PART A TWO MARKS 1. Given the probability density function of a continuous random variable X as follows f(x) = 6x (1x) 0
More informationExercises with solutions (1)
Exercises with solutions (). Investigate the relationship between independence and correlation. (a) Two random variables X and Y are said to be correlated if and only if their covariance C XY is not equal
More informationPractice Exam 1: Long List 18.05, Spring 2014
Practice Eam : Long List 8.05, Spring 204 Counting and Probability. A full house in poker is a hand where three cards share one rank and two cards share another rank. How many ways are there to get a fullhouse?
More information2WB05 Simulation Lecture 8: Generating random variables
2WB05 Simulation Lecture 8: Generating random variables Marko Boon http://www.win.tue.nl/courses/2wb05 January 7, 2013 Outline 2/36 1. How do we generate random variables? 2. Fitting distributions Generating
More informationIntroduction to Probability
Introduction to Probability EE 179, Lecture 15, Handout #24 Probability theory gives a mathematical characterization for experiments with random outcomes. coin toss life of lightbulb binary data sequence
More informationPSTAT 120B Probability and Statistics
 Week University of California, Santa Barbara April 10, 013 Discussion section for 10B Information about TA: FangI CHU Office: South Hall 5431 T Office hour: TBA email: chu@pstat.ucsb.edu Slides will
More informationRandom Variable: A function that assigns numerical values to all the outcomes in the sample space.
STAT 509 Section 3.2: Discrete Random Variables Random Variable: A function that assigns numerical values to all the outcomes in the sample space. Notation: Capital letters (like Y) denote a random variable.
More informationConditional expectation
A Conditional expectation A.1 Review of conditional densities, expectations We start with the continuous case. This is sections 6.6 and 6.8 in the book. Let X, Y be continuous random variables. We defined
More information2. Discrete random variables
2. Discrete random variables Statistics and probability: 21 If the chance outcome of the experiment is a number, it is called a random variable. Discrete random variable: the possible outcomes can be
More informationProbability Theory. Florian Herzog. A random variable is neither random nor variable. GianCarlo Rota, M.I.T..
Probability Theory A random variable is neither random nor variable. GianCarlo Rota, M.I.T.. Florian Herzog 2013 Probability space Probability space A probability space W is a unique triple W = {Ω, F,
More information