# PROBABILITY AND LIKELIHOOD, A BRIEF INTRODUCTION IN SUPPORT OF A COURSE ON MOLECULAR EVOLUTION (BIOL 3046)


## Probability

The subject of PROBABILITY is a branch of mathematics dedicated to building models that describe conditions of uncertainty and to providing tools for making decisions or drawing conclusions on the basis of such models. In the broad sense, a PROBABILITY is a measure of the degree to which an occurrence is certain [or uncertain].

## A statistical definition of probability

People have thought about, and defined, probability in different ways. It is important to note the consequences of the definition:

1. All definitions agree on the algebraic and arithmetic procedures that must be followed; hence, the definition does not influence the outcome.
2. The definition has a fundamental impact on the meaning of the result!

We will consider the frequentist definition of probability, as it is currently the most widely held. To do this we need to define two concepts: (i) sample space, and (ii) relative frequency.

1. The sample space, S, is the collection [sometimes called the universe] of all possible outcomes. For a stochastic system, or an experiment, the sample space is a set in which each outcome comprises one element.
2. The relative frequency is the proportion of the sample space on which an event E occurs. In an experiment with 100 outcomes, if E occurs 81 times, the relative frequency is 81/100, or 0.81.

The frequentist approach is based on the notion of statistical regularity; i.e., in the long run, over replicates, the cumulative relative frequency of an event (E) stabilizes. The best way to illustrate this is with an example experiment that we run many times while measuring the cumulative relative frequency (crf). The crf is simply the relative frequency computed cumulatively over some number of replicates of samples, each with a space S.

Let's take a look at an example of statistical regularity. Suppose we have a treatment for high blood pressure.
The event, E, we are interested in is successfully controlling the blood pressure. So, we want to be able to make a prediction about the probability that a patient treated in the future will have

blood pressure under control, P(E). To estimate this probability we conduct an experiment that is replicated over time, in months. The data are presented in the table below [data for the example are after McColl (1995)], with columns: Month, Number of subjects (S), Number controlled (E), Cumulative S, Cumulative E, and crf.

The crf values down the rightmost column fluctuate the most in the beginning, but rapidly stabilize. Statistical regularity is the stabilization of the crf in the face of individual fluctuations from month to month in the relative frequency of E.

Finally, we are in a position where we can obtain a definition of probability. Here goes: In words, the probability of an event E, written as P(E), is the long-run (cumulative) relative frequency of E. More formally, we define P(E) as follows:

$$P(E) = \lim_{n \to \infty} \mathrm{crf}_n(E)$$

We can get an idea of this by using an example with nearly infinite replications.

[Figure: hypothetical plot of the crf of an event]
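Statistical regularity can be sketched with a short simulation. This is a hypothetical illustration, not the McColl data: we assume a true P(E) of 0.8 and watch the crf settle toward it.

```python
import random

random.seed(1)

def cumulative_relative_frequency(n_replicates, p_success=0.8):
    """Simulate Bernoulli trials and return the crf after each replicate."""
    successes = 0
    crf = []
    for i in range(1, n_replicates + 1):
        successes += random.random() < p_success  # 1 if event E occurred
        crf.append(successes / i)
    return crf

crf = cumulative_relative_frequency(10_000)
# Early values fluctuate widely; later values stabilize near the true P(E) = 0.8.
print(crf[9], crf[99], crf[-1])
```

The first few entries bounce around, but by the last replicates the crf is pinned close to 0.8, which is exactly the stabilization the frequentist definition relies on.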

## Probability models

For all probability models to give consistent results about the outcomes of future events, they need to obey four simple axioms (Kolmogorov 1933).

Probability axioms:

1. The probability scale runs from 0 to 1; hence, 0 ≤ P(E) ≤ 1.
2. Probabilities are derived from the relative frequency of an event (E) in the space of all possible outcomes (S), where P(S) = 1. Hence, if the probability of an event (E) is P(E), then the probability that E does not occur is 1 − P(E).
3. When events E and F are disjoint, they cannot occur together. The probability of disjoint events E or F is P(E or F) = P(E) + P(F).
4. Axiom 3 above deals with a finite sequence of events. Axiom 4 is an extension of axiom 3 to an infinite sequence of events.

For the purpose of modelling in molecular evolution, we need to assume these probability axioms and just one additional theorem, the multiplication theorem. I will not provide a detailed explanation of this theorem. However, a consequence of this theorem is what is sometimes referred to as the "product rule" or "multiplication rule"; see the box below for an explanation.

Product rule: The product rule applies when two events E1 and E2 are independent. E1 and E2 are independent if the occurrence or non-occurrence of E1 does not change the probability of E2 [and vice versa]. [A further statistical definition requires the use of the multiplication theorem.] It is important to note that a proof of statistical independence for a specific case by using the multiplication theorem is rarely possible; hence, most models incorporate independence as a model assumption.

Typically, probability refers to the occurrence of some future event: for example, the probability that a tossed [fair] coin will be heads is ½. When E1 and E2 occur together they are joint events. The joint probability of the independent events E1 and E2 is P(E1,E2) = P(E1) × P(E2).
Hence the term "product rule" or "multiplication principle", or whatever you call it [for example: what is the probability of getting 5H and 6T if the coin is fair?].

Conditional probability is very useful, as it allows us to express a probability given some further information; specifically, it is the probability of event E2 assuming that event E1 has already occurred. We assume the events E1 and E2 are in a given sample space, S, and that P(E1) > 0. We write the conditional probability as P(E2 | E1); the vertical bar is read as "given".
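Both the product rule and conditional probability can be checked by enumerating a small sample space. A minimal sketch, using two independent tosses of a fair coin:

```python
from fractions import Fraction

# Sample space: ordered pairs of outcomes from two fair-coin tosses.
S = [(a, b) for a in "HT" for b in "HT"]
E1 = {s for s in S if s[0] == "H"}   # event: first toss is heads
E2 = {s for s in S if s[1] == "H"}   # event: second toss is heads

def P(event):
    # Relative frequency of the event within the sample space S.
    return Fraction(len(event), len(S))

# Product rule for independent events: P(E1 and E2) = P(E1) * P(E2).
print(P(E1 & E2) == P(E1) * P(E2))   # True

# Conditional probability: P(E2 | E1) = P(E1 and E2) / P(E1).
print(P(E1 & E2) / P(E1))            # 1/2: E1 tells us nothing about E2
```

Because the conditional probability P(E2 | E1) equals the unconditional P(E2) = ½, the tosses are independent, which is exactly when the product rule applies.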

Let's look at an example of a probability model. The familiar binomial distribution provides the appropriate model for describing the probability of the outcomes of flipping a coin. The binomial model is as follows:

$$P(k) = \binom{n}{k} p^k (1-p)^{n-k}, \qquad \binom{n}{k} = \frac{n!}{k!(n-k)!}$$

If we had a fair coin we could predict the probability of specific outcomes (e.g., 1 head & 1 tail in two tosses) by setting the parameter p equal to 0.5. Note that the model does not require this. In the case of the coin toss, we are interested in a conditional probability; i.e., what is the probability of obtaining, say, 5 heads given a fair coin (p = 0.5) and 12 tosses, or P(k = 5 | p = 0.5, n = 12)?

## Probability and likelihood are inverted

Probability refers to the occurrence of some future outcome. For example: If I toss a fair coin 12 times, what is the probability that I will obtain 5 heads and 7 tails?

Likelihood refers to a past event with a known outcome. For example: What is the probability that my coin is fair if I tossed it 12 times and observed 5 heads and 7 tails?

Let's continue to use the familiar coin-tossing experiment to examine this inversion. For a fair coin:

$$P(k) = \binom{n}{k} (1/2)^k (1/2)^{n-k}$$

where n is the number of flips and k is the number of successes.
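The binomial model translates directly into code. A minimal sketch (the function name `binomial_pmf` is ours):

```python
from math import comb

def binomial_pmf(k, n, p):
    """P(k | n, p): probability of k successes in n trials with success probability p."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# P(k = 5 | p = 0.5, n = 12): five heads in twelve tosses of a fair coin.
print(binomial_pmf(5, 12, 0.5))   # 0.193359375, i.e. 792/4096
```

The same function answers any such conditional-probability question by varying k, n, or the parameter p.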

CASE 1: PROBABILITY. The question is the same: If I toss a fair coin 12 times, what is the probability that I will obtain 5 heads and 7 tails? The answer comes directly from the above formula, with n = 12 and k = 5. The probability of such a future event is 792/4096 ≈ 0.193.

From the probability perspective we can look at the distribution of all possible outcomes.

[Figure: the binomial distribution of all possible outcomes for n = 12, p = 0.5, with our outcome of 5 heads & 7 tails marked]

This is the distribution of mutually exclusive outcomes that comprise the set of all possible outcomes under the model where p = 0.5. Remember probability axiom 2, where P(S) = 1; the probabilities of each outcome (i.e., 0 to 12 heads) sum to 1.

CASE 2: LIKELIHOOD. The second question is: What is the probability that my coin is fair if I tossed it 12 times and observed 5 heads and 7 tails? We have inverted the problem. In the previous case (1) we were interested in the probability of a future outcome given that my coin is fair. In this case (2) we are interested in the probability that my coin is fair, given a particular outcome. So, in the likelihood framework we have inverted the question such that the hypothesis (H) is variable, and the outcome (let's call it the data, D) is constant.

A problem: What we want to measure is P(H | D). The problem is that we can't work with the probability of a hypothesis, only the relative frequencies of outcomes. The
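Axiom 2 (P(S) = 1) can be verified for this model by enumerating all thirteen mutually exclusive outcomes; a short sketch:

```python
from math import comb

def binomial_pmf(k, n, p):
    # Binomial probability of k successes in n trials.
    return comb(n, k) * p**k * (1 - p)**(n - k)

# All mutually exclusive outcomes (0..12 heads) under the model with p = 0.5.
dist = {k: binomial_pmf(k, 12, 0.5) for k in range(13)}
for k, pr in dist.items():
    print(f"{k:2d} heads: {pr:.4f}")

# Axiom 2: the probabilities over the whole sample space S sum to 1.
print(sum(dist.values()))
```

The printed probabilities trace out the distribution described above, peaking at 6 heads, and their total is exactly 1.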

solution comes from the knowledge that there is a relationship between P(H | D) and P(D | H):

$$P(H \mid D) = \alpha \, P(D \mid H)$$

where α is a constant of proportionality. The likelihood of the hypothesis given the data, L(H | D), is proportional to the probability of the data given the hypothesis, P(D | H). As long as we stick to comparing hypotheses on the same data and probability model, the constant remains the same, and we can compare the likelihood scores. We cannot use likelihoods to make comparisons across different data. Just remember: with likelihoods, the hypotheses are the variables!

Let's use the binomial model to look at the application of probability as compared with likelihood.

PROBABILITIES

| Hypotheses | D1: 1H & 1T | D2: 2H |
| --- | --- | --- |
| H1: p(H) = 1/4 | 3/8 | 1/16 |
| H2: p(H) = 1/2 | 1/2 | 1/4 |

Following the probability axioms, and as we saw in the binomial distribution above, given a single hypothesis (e.g., H2: p(H) = 0.5), the probabilities of different outcomes can be summed. For example, P(D1 or D2 | H2) = P(D1 | H2) + P(D2 | H2), a well-known result, with all possible outcomes summing to 1. However, we cannot use the addition axiom over different hypotheses H1 and H2; i.e., P(D1 | H1 or D2 | H2) ≠ P(D1 | H1) + P(D2 | H2).

LIKELIHOODS

| Hypotheses | D1: 1H & 1T | D2: 2H |
| --- | --- | --- |
| H1: p(H) = 1/4 | (3/8)α | (1/16)α |
| H2: p(H) = 1/2 | (1/2)α | (1/4)α |

Under likelihood we can work with different hypotheses as long as we stick to the same dataset. Take the likelihoods of H1 and H2 under D1: we can infer that H1 is ¾ as likely as H2. Note that when working with likelihoods, we compute the probabilities and drop the constant for convenience. The likelihoods do not sum to 1 because the probability terms are for the same outcome drawn from different distributions [the probabilities for the total set of outcomes S in the same distribution sum to 1].

## An example of likelihood in action

Let's use likelihood to follow through on our question of the probability that the coin is fair given 12 tosses with 5 heads and 7 tails. As always, our tosses are independent. Then

$$L(p = 0.5 \mid 12, 5) = \alpha \, P(12, 5 \mid p = 0.5)$$
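The likelihood comparison for D1 can be reproduced with the binomial model; a sketch using exact fractions, with the proportionality constant α dropped as in the text:

```python
from fractions import Fraction
from math import comb

def binomial_pmf(k, n, p):
    # Binomial probability of k successes in n trials.
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Data D1: 1 head & 1 tail in two tosses (k = 1, n = 2).
L_H1 = binomial_pmf(1, 2, Fraction(1, 4))   # likelihood of H1: p(H) = 1/4
L_H2 = binomial_pmf(1, 2, Fraction(1, 2))   # likelihood of H2: p(H) = 1/2

print(L_H1, L_H2)    # 3/8 1/2
print(L_H1 / L_H2)   # 3/4: H1 is three-quarters as likely as H2
```

Because both likelihoods share the same data D1 and the same constant α, their ratio of ¾ is meaningful even though the two values do not sum to 1.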

[It's easy to use the binomial formula to get the probability term.]

L = α × 0.193 [we drop the constant for convenience]

L = 0.193

Perhaps there is an alternative hypothesis, i.e., one where p ≠ 0.5, that has a higher likelihood. To explore this possibility we take the binomial formula as our likelihood function and evaluate the resulting likelihoods with respect to various values of p and the given data. The results can be plotted as a curve; this curve is sometimes called the likelihood surface. The curve for our data (12,5) is shown below.

[Figure: the likelihood surface for the data (12,5); the maximum likelihood score occurs at the ML estimate of p = 0.42]

IMPORTANT NOTE: It looks like a distribution, but don't be fooled: the area under the curve does not sum to 1. The curve reflects the probabilities of different values of p (a parameter of the model) under the same data, and these are not mutually exclusive outcomes within a single set of all possible outcomes.
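The likelihood surface and its maximum can be traced numerically by evaluating the likelihood function over a grid of candidate p values; a minimal sketch:

```python
from math import comb

def likelihood(p, n=12, k=5):
    # Binomial likelihood of p given the data (12 tosses, 5 heads); constant kept.
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Evaluate the likelihood over a fine grid of p values and locate the peak.
grid = [i / 1000 for i in range(1, 1000)]
ml_p = max(grid, key=likelihood)
print(round(ml_p, 3))   # near the analytic MLE k/n = 5/12 ≈ 0.417
```

Plotting `likelihood(p)` against the grid reproduces the likelihood surface; for the binomial model the grid maximum agrees with the analytic maximum-likelihood estimate k/n.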


### Lecture 2: Introduction to belief (Bayesian) networks

Lecture 2: Introduction to belief (Bayesian) networks Conditional independence What is a belief network? Independence maps (I-maps) January 7, 2008 1 COMP-526 Lecture 2 Recall from last time: Conditional

### Slides for Risk Management

Slides for Risk Management Introduction to the modeling of assets Groll Seminar für Finanzökonometrie Prof. Mittnik, PhD Groll (Seminar für Finanzökonometrie) Slides for Risk Management Prof. Mittnik,

### Concepts of Probability

Concepts of Probability Trial question: we are given a die. How can we determine the probability that any given throw results in a six? Try doing many tosses: Plot cumulative proportion of sixes Also look

### 1. The sample space S is the set of all possible outcomes. 2. An event is a set of one or more outcomes for an experiment. It is a sub set of S.

1 Probability Theory 1.1 Experiment, Outcomes, Sample Space Example 1 n psychologist examined the response of people standing in line at a copying machines. Student volunteers approached the person first

### Statistical Inference. Prof. Kate Calder. If the coin is fair (chance of heads = chance of tails) then

Probability Statistical Inference Question: How often would this method give the correct answer if I used it many times? Answer: Use laws of probability. 1 Example: Tossing a coin If the coin is fair (chance

### Basic Probability. Probability: The part of Mathematics devoted to quantify uncertainty

AMS 5 PROBABILITY Basic Probability Probability: The part of Mathematics devoted to quantify uncertainty Frequency Theory Bayesian Theory Game: Playing Backgammon. The chance of getting (6,6) is 1/36.

### Probability, Binomial Distributions and Hypothesis Testing Vartanian, SW 540

Probability, Binomial Distributions and Hypothesis Testing Vartanian, SW 540 1. Assume you are tossing a coin 11 times. The following distribution gives the likelihoods of getting a particular number of

### Lecture I. Definition 1. Statistics is the science of collecting, organizing, summarizing and analyzing the information in order to draw conclusions.

Lecture 1 1 Lecture I Definition 1. Statistics is the science of collecting, organizing, summarizing and analyzing the information in order to draw conclusions. It is a process consisting of 3 parts. Lecture

### CONTENTS OF DAY 2. II. Why Random Sampling is Important 9 A myth, an urban legend, and the real reason NOTES FOR SUMMER STATISTICS INSTITUTE COURSE

1 2 CONTENTS OF DAY 2 I. More Precise Definition of Simple Random Sample 3 Connection with independent random variables 3 Problems with small populations 8 II. Why Random Sampling is Important 9 A myth,

### 6.2 Permutations continued

6.2 Permutations continued Theorem A permutation on a finite set A is either a cycle or can be expressed as a product (composition of disjoint cycles. Proof is by (strong induction on the number, r, of

### Chapter 2 - Graphical Summaries of Data

Chapter 2 - Graphical Summaries of Data Data recorded in the sequence in which they are collected and before they are processed or ranked are called raw data. Raw data is often difficult to make sense

### Sampling Distribution of the Mean & Hypothesis Testing

Sampling Distribution of the Mean & Hypothesis Testing Let s first review what we know about sampling distributions of the mean (Central Limit Theorem): 1. The mean of the sampling distribution will be

### Basic Proof Techniques

Basic Proof Techniques David Ferry dsf43@truman.edu September 13, 010 1 Four Fundamental Proof Techniques When one wishes to prove the statement P Q there are four fundamental approaches. This document

The result of the bayesian analysis is the probability distribution of every possible hypothesis H, given one real data set D. This prestatistical approach to our problem was the standard approach of Laplace

### Unit 29 Chi-Square Goodness-of-Fit Test

Unit 29 Chi-Square Goodness-of-Fit Test Objectives: To perform the chi-square hypothesis test concerning proportions corresponding to more than two categories of a qualitative variable To perform the Bonferroni