# RNR / ENTO Assumptions for Simple Linear Regression


Statistical statements (hypothesis tests and CI estimation) based on least squares estimates depend on four assumptions:

1. Linearity of the mean responses
2. Constant variance of the responses around the straight line
3. Normality of the subpopulations (the Y's) at the different X values
4. Independence within and across subpopulations

The relative importance of these assumptions depends on the type of inference:

| Proposed inference | Required assumptions |
|---|---|
| Predicting a single Y at X, or a single X at Y (prediction interval) | All assumptions of the model are necessary (normality is assumed) |
| Estimating the mean Y at X, or the mean X at Y (confidence interval) | All assumptions except normality (normality of the distribution of the mean is met because of the Central Limit Theorem) |
| Estimating the value of the slope | Only linearity matters (lack of independence affects the SE but not the estimate of the slope) |

## Model assessment with graphical tools

### Scatterplot of the response variable versus the explanatory variable

A simple scatterplot of Y vs. X is useful for evaluating compliance with the assumptions of the linear regression model. The pattern of the means and variability of the responses suggests a strategy for analysis. <<Fig. 8.6 Sleuth>>

1. The regression is a straight line and the variability of Y is about the same at all locations along the line: use simple linear regression.
2. The regression is not a straight line but the SD is constant: transform X.
3. The regression line is not monotonic: use multiple regression.
4. The regression is not a straight line and the SD increases as a function of X: transform Y.
5. The regression is a straight line with homogeneous SD, but there are outliers: use the simple regression tool and report the presence of the outliers.
6. The regression is a straight line but the SD increases with X: use weighted regression.

Note on weighted regression: the variance may increase with the explanatory variable even if the regression line is linear. This suggests that the chance fluctuation (i.e., the variance of the estimation error) affecting Y increases with increasing values of X.

For example, on days when many people drive to the park, many other people may also get there by other means. Thus the more people in the park, the greater the error about the regression line: SD(Y) would be proportional to X. With such a pattern, predicting single or average values of Y or X with a simple linear regression approach is not recommended, because no single estimate of the SE applies across all values of X. However, the slope of the line can be estimated without bias using weighted regression, which gives less influence to values of Y that are subject to larger error when estimating the parameters of the line. In weighted regression, the least squares procedure minimizes

Σᵢ₌₁ⁿ wᵢ (Yᵢ − Ŷᵢ)²

where wᵢ is a weight given to each observation. Depending on the pattern of the relation between X and the variance of Y, one may choose wᵢ = 1/Xᵢ (variance proportional to X) or wᵢ = 1/Xᵢ² (variance proportional to X², i.e., SD proportional to X: horn-shaped residual plot). JMP allows computation of weighted regression models (in the Fit Model platform).

### Scatterplot of residuals versus fitted values

Assessment of the adequacy of regression models is done with plots of residuals vs. fitted values. These show the error (residual) about the fitted line once the linear component of variation has been removed. Looking at the distribution of the residuals about a horizontal line makes it easier to assess curvature and spread of the observations. These plots are essential for evaluating nonlinearity, nonconstant variance, and the presence of outliers. <<Residual vs. VEHICLE plot, the Park example>>
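A minimal sketch of the weighted least-squares criterion above, in plain NumPy with made-up data (the function name and data are illustrative, not part of the course material):

```python
import numpy as np

# Weighted least squares: minimize sum of w_i * (y_i - b0 - b1*x_i)^2.
# Weights are the reciprocal of the assumed variance pattern:
#   Var(Y) proportional to X    ->  w_i = 1 / x_i
#   Var(Y) proportional to X^2  ->  w_i = 1 / x_i^2
def weighted_fit(x, y, w):
    X = np.column_stack([np.ones_like(x), x])   # design matrix [1, x]
    W = np.diag(w)
    # Weighted normal equations: (X' W X) b = X' W y
    b = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    return b  # (intercept, slope)

# Illustration on exact (noiseless) data: any positive weights recover
# the true line, showing the estimator is unbiased for the slope.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = 3.0 + 2.0 * x
b0, b1 = weighted_fit(x, y, w=1.0 / x)
print(round(b0, 6), round(b1, 6))  # 3.0 2.0
```

With noisy data the weighted fit differs from ordinary least squares by down-weighting the high-variance observations, which is exactly what JMP's Fit Model platform does when a weight column is supplied.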

Assessing whether a transformation improves the fit of the model is done by trial and error. Example: breakdown times for insulating fluids under different voltages (Sleuth case 8.1.2). Three models were fit:

- Time vs. Voltage: <<JMP Parameter Estimates table>>
- SQRT(Time) vs. Voltage: <<JMP Parameter Estimates table>>
- Ln(Time) vs. Voltage: <<JMP Parameter Estimates table>>
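The trial-and-error idea can be sketched numerically: simulate data whose error grows with the mean (as in the insulating-fluid case) and check that a log transform of Y stabilizes the residual spread. The data and diagnostic below are simulated illustrations, not the Sleuth data set:

```python
import numpy as np

# Simulate multiplicative-error data: SD grows with the mean response.
rng = np.random.default_rng(0)
x = np.linspace(1, 10, 200)
y = np.exp(1.0 + 0.4 * x) * rng.lognormal(0.0, 0.5, size=x.size)

def abs_resid_trend(x, response):
    """Correlation of |residuals| with x: large values -> non-constant variance."""
    slope, intercept = np.polyfit(x, response, 1)
    resid = response - (intercept + slope * x)
    return np.corrcoef(np.abs(resid), x)[0, 1]

raw_trend = abs_resid_trend(x, y)          # fanning residuals on the raw scale
log_trend = abs_resid_trend(x, np.log(y))  # roughly constant spread after ln(Y)
print(raw_trend > log_trend)
```

On the raw scale the residual magnitudes trend upward with x; after the log transform the trend largely disappears, mirroring why only the Ln(Time) model satisfies the assumptions.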

Testing whether the regression parameters differ from zero is not assessing model fit. In the above examples, the null hypothesis that β₁ = 0 is rejected in all three analyses, but the assumptions of linearity and homogeneity of variance are met only when a logarithmic transformation of Y is used.

## Interpretation of linear relationships after log transformation

Only Y is log transformed: a one-unit change in X results in a multiplicative change of exp(β) in the median of Y:

Median{Y | X + 1} / Median{Y | X} = exp(β)

Example: breakdown time versus voltage of insulating fluids, with an estimated slope of about −0.5 on ln(Time). Each increase of 1 kV results in a multiplicative change of e^(−0.5) = 0.61 in the median breakdown time. Thus each 1 kV increase results in a 0.61-fold change (i.e., a 39% decrease) in median breakdown time. The 95% CI for β was −0.62 to −0.39, so the 95% CI for the multiplicative change is exp(−0.62) to exp(−0.39), or 0.54 to 0.68.

Only X is log transformed: a doubling of X results in a β ln(2) change in the mean of Y:

μ{Y | ln(2X)} − μ{Y | ln(X)} = β ln(2)

Example: change in pH as a function of time in meat, with an estimated slope of −0.726 on ln(Time). A doubling of time after slaughter is associated with a ln(2)(−0.726) = −0.503 unit change in pH, i.e., the mean pH is reduced by 0.503 for each doubling of time after slaughter. The 95% CI for β was from −0.805 to −0.646, so the 95% CI for the reduction in pH per doubling of time is from ln(2)(0.646) to ln(2)(0.805), or from 0.448 to 0.558.

Both X and Y are log transformed: a doubling of X results in a multiplicative change of 2^β in the median of Y:

Median{Y | 2X} / Median{Y | X} = 2^β
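The three back-transformation rules can be checked with a few lines of arithmetic. The slopes and CI endpoints below are the estimates quoted in the examples (−0.5 per kV, −0.726 on ln(time), 0.25 on log(area)); treat them as given:

```python
import math

# Y log-transformed: slope of about -0.5 per kV -> multiplicative change in median
mult = math.exp(-0.5)
print(round(mult, 2))                                        # 0.61, a 39% decrease
print(round(math.exp(-0.62), 2), round(math.exp(-0.39), 2))  # 0.54 0.68 (95% CI)

# X log-transformed: slope -0.726 on ln(time) -> additive change per doubling of X
print(round(math.log(2) * -0.726, 3))                        # -0.503 pH units

# Both log-transformed: slope 0.25 -> multiplicative change per doubling of X
print(round(2 ** 0.25, 2))                                   # 1.19, about 19% more
```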

Example: island size versus number of species, with an estimated slope of 0.25 for log(Species) on log(Area). Thus an island of area 2A will have a median number of species 2^0.25 times (i.e., about 1.19-fold, or 19%) greater than an island of area A. The 95% CI for β was from 0.219 to 0.281, so the 95% CI for the multiplicative change in the median is 2^0.219 to 2^0.281, or 1.16 to 1.22. <<Display 8.2>>

Interpretation for other types of transformations may be difficult. Interpretation is not required: (1) if the regression is used only to detect the presence of an association between two variables, or (2) to predict values of X and Y; here the only requirement is to back-transform the predicted values (and their CIs).

## Simple regression analysis with an F-test (extra-sum-of-squares F-test)

The approach to linear regression described so far used the distribution of the least squares estimates and t-tools to draw statistical inferences. Another approach uses the extra-sum-of-squares method to compare the regression model (full model) to the grand-mean model (reduced model). To test the linear model, we compare the difference in explanatory power between the reduced and the full model. The residual sum of squares measures the variability in the observations that remains unexplained after we fit a model. The sum of squares (SS) calculated after fitting the grand mean is:

Total SS = Σᵢ₌₁ⁿ (Yᵢ − Ȳ)²

The SS calculated after fitting the regression line is:

Error SS = Σᵢ₌₁ⁿ (Yᵢ − Ŷᵢ)²

The difference between the two is the extra variability explained by the regression model:

Total SS − Error SS = Model SS

- Total SS: unexplained by the grand mean, df = n − 1
- Error SS: unexplained by the regression, df = n − 2
- Model SS: extra variation explained by the regression, df = 1 = [n − 1] − [n − 2]

The model comparison (regression vs. grand mean) is given by:

F = (Model SS / model df) / (Error SS / error df)

If the null hypothesis of equal mean responses (i.e., β₁ = 0) is correct, then the numerator and denominator of the F-statistic both estimate the same population variance (σ²), so the F-statistic should be close to one. If the null hypothesis is not correct, the F-statistic will tend to be greater than 1.

The Park example: <<JMP Analysis of Variance and Parameter Estimates tables, VEHICLE>> (The F-statistic is equal to the square of the t-statistic for the slope.)

## Assessment of the regression fit using ANOVA (lack-of-fit test)

When the response variable is measured repeatedly for each (or some) level of the explanatory variable (i.e., with replicates), the extra-SS approach provides a way of comparing the fit of the simple linear regression model (reduced model) to the separate-means model (full model, i.e., one-way ANOVA). <<Display 8.4>>

The question: is the straight-line regression model sufficient to describe the data, or should we instead use a separate-means model (because the mean responses are not linear)?

For I groups, we can fit the following hierarchical models:
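A minimal numeric sketch of the sum-of-squares decomposition and the F-test, on made-up data (plain NumPy; the data are illustrative, not the Park data). It also verifies the parenthetical fact that F equals the square of the slope's t-statistic in simple regression:

```python
import numpy as np

# Toy data for illustration: any roughly linear data would do.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0])
y = np.array([3.2, 4.1, 6.0, 5.9, 8.2, 8.9, 10.1, 11.0])
n = x.size

slope, intercept = np.polyfit(x, y, 1)
fitted = intercept + slope * x

total_ss = np.sum((y - y.mean()) ** 2)       # grand-mean model,  df = n - 1
error_ss = np.sum((y - fitted) ** 2)         # regression model,  df = n - 2
model_ss = np.sum((fitted - y.mean()) ** 2)  # extra SS explained, df = 1

# The least-squares decomposition is exact: Total SS = Model SS + Error SS.
assert np.isclose(total_ss, model_ss + error_ss)

# F compares the explained mean square to the residual mean square.
F = (model_ss / 1) / (error_ss / (n - 2))

# In simple linear regression, F is exactly the square of the slope's t-statistic.
se_slope = np.sqrt((error_ss / (n - 2)) / np.sum((x - x.mean()) ** 2))
t = slope / se_slope
print(np.isclose(F, t ** 2))  # True
```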

1. Separate-means model (ANOVA): μ{Y | Xᵢ} = μᵢ (I parameters)
2. Simple linear regression model: μ{Y | Xᵢ} = β₀ + β₁Xᵢ (2 parameters)
3. Equal-means model: μ{Y | Xᵢ} = μ (1 parameter)

We can now use the extra-sum-of-squares method to compare the simple regression and ANOVA models (i.e., perform a lack-of-fit test).

Example: insulating fluid data (7 voltage treatments).

- ANOVA (X coded as ordinal): <<JMP Analysis of Variance table>>
- Regression (X coded as continuous): <<JMP Analysis of Variance table>>

In both cases, the Total SS is the variability left unexplained by fitting a single mean, while the Error SS is the variability left unexplained by each model. The Error SS is smaller for the ANOVA than for the regression because the separate means explain slightly more variation than the regression line (but the ANOVA uses more parameters).

## Lack-of-fit F-test

We compare the fit of the separate means (ANOVA) to the fit of the regression line by comparing the residual SS from both models with the extra-sum-of-squares F-test:

F = [(SS_Res(LR) − SS_Res(SM)) / (df_LR − df_SM)] / σ̂²_SM

where LR and SM stand for the linear regression and separate-means models.

Insulating fluid example: SS_Res(LR) = 180.07, SS_Res(SM) = 173.75, df_LR = 74, df_SM = 69, σ̂²_SM = 2.518.

F₅,₆₉ = [(180.07 − 173.75) / (74 − 69)] / 2.518 = 0.50, p = 0.78
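The lack-of-fit arithmetic is short enough to check directly. The residual sums of squares and degrees of freedom below are the values quoted for the insulating-fluid example (as reconstructed in the text; treat the inputs as given):

```python
# Lack-of-fit F-statistic for the insulating-fluid example.
ss_res_lr, df_lr = 180.07, 74   # simple linear regression (reduced model)
ss_res_sm, df_sm = 173.75, 69   # separate-means / ANOVA (full model)

sigma2_sm = ss_res_sm / df_sm   # pure-error mean square, best estimate of sigma^2
F = ((ss_res_lr - ss_res_sm) / (df_lr - df_sm)) / sigma2_sm

print(round(sigma2_sm, 3), round(F, 2))  # 2.518 0.5
```

An F this far below 1 on (5, 69) df cannot be significant, which is why the p-value of 0.78 gives no evidence of lack of fit.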

This large p-value, corresponding to an F-statistic below 1, provides no evidence of a lack of fit of the simple linear model. A small p-value would suggest that the ANOVA model fits better, or in other words that the variability between group means is not adequately explained by a simple linear regression model.

In JMP, you get the lack-of-fit test on both the Fit Model platform and the Fit Y by X platform, as long as there are replicated Y's at some levels of X. For the insulating fluid data (7 voltage treatments):

<<JMP Lack Of Fit table: Lack of Fit, Pure Error, and Total Error sums of squares, mean squares, and F ratio; Max RSq = 0.5307>>

Pure error is the SS for the ANOVA model (i.e., the best estimate of the population variance), whereas total error is the SS for the simple regression model. The lack-of-fit test (above) assesses the null hypothesis that both models fit equally well against the alternative hypothesis that the ANOVA fits better.

## A composite analysis of variance table

A composite ANOVA table can be used to summarize the results of the lack-of-fit test. The separate-means model (I parameters, here I = 7) is a generalization of the simple linear model (2 parameters), which in turn is a generalization of the equal-means model (1 parameter). The reduction in SS obtained when fitting the separate-means model instead of the equal-means model can be decomposed into:

1. the reduction in SS obtained when fitting a simple linear regression instead of an equal-means model (190.15), and
2. the reduction in SS obtained when a separate-means model is fitted instead of a linear regression model (180.07 − 173.75 = 6.32 = the lack-of-fit SS).

Each of these components can be represented in a composite ANOVA table, which is the same table used to represent the separate-means model, except that the between-group SS (treatment SS) is decomposed into two components:

1. one that represents the variability explained by the linear regression line (190.15), and
2. one that estimates the variability that arises because the separate group means do not fall exactly on a straight line (the lack-of-fit component, i.e., 6.32). <<Fig. 8.9 Sleuth>>

## R²: the proportion of variation explained

R², the coefficient of determination, is a measure of the percentage of the total response variation that is explained by the explanatory variable. Thus,

R² = (Total SS − Residual SS) / Total SS × 100% = Model SS / Total SS × 100%

The proportion (or %) of the total variation in Y that is explained by the fitted regression (or by any model) is a measure of the strength of the fitted relationship. In the insulating fluids example analysed with a linear regression, R² = (190.15 / 370.22) × 100% = 51.4%. Thus, fifty-one percent of the variation in log breakdown times was explained by the linear regression on voltage.

R² provides useful information for comparing the fit of two models with equal numbers of parameters. R² alone should not be used to assess the adequacy of the linear model, because R² can be large even when the linear model is not adequate (e.g., when the response is not linear).

## Simple linear regression or one-way ANOVA?

When data are collected from groups that correspond to different levels of an explanatory variable (e.g., the insulating fluids), both one-way ANOVA and simple linear regression can be used for the analysis. The choice between the two is easy: if the simple linear regression fits, then it is preferred. When appropriate, regression is better than ANOVA because it:

1. allows for interpolation;
2. provides more degrees of freedom for error estimation (thus a smaller error MS and higher power);
3. gives smaller SEs for estimates of mean responses (i.e., narrower CIs) <<Display 8. Sleuth>>; and
4. provides a simpler model.
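The R² computation for the insulating-fluid regression, using the sums of squares quoted above (values as reconstructed in the text; treat them as given):

```python
# R^2 = Model SS / Total SS for the insulating-fluid regression.
model_ss = 190.15   # SS explained by the regression on voltage
total_ss = 370.22   # Model SS + Error SS (190.15 + 180.07)

r2 = model_ss / total_ss
print(round(100 * r2, 1))  # 51.4
```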

In general, we seek the simplest model, the one with the fewest parameters that adequately fits the data. This is called the principle of parsimony, or Occam's (Ockham's) razor.


### Recall this chart that showed how most of our course would be organized:

Chapter 4 One-Way ANOVA Recall this chart that showed how most of our course would be organized: Explanatory Variable(s) Response Variable Methods Categorical Categorical Contingency Tables Categorical

### AP Statistics Section :12.2 Transforming to Achieve Linearity

AP Statistics Section :12.2 Transforming to Achieve Linearity In Chapter 3, we learned how to analyze relationships between two quantitative variables that showed a linear pattern. When two-variable data

### Yiming Peng, Department of Statistics. February 12, 2013

Regression Analysis Using JMP Yiming Peng, Department of Statistics February 12, 2013 2 Presentation and Data http://www.lisa.stat.vt.edu Short Courses Regression Analysis Using JMP Download Data to Desktop

### Generalized Linear Models

Generalized Linear Models We have previously worked with regression models where the response variable is quantitative and normally distributed. Now we turn our attention to two types of models where the

### Assumptions. Assumptions of linear models. Boxplot. Data exploration. Apply to response variable. Apply to error terms from linear model

Assumptions Assumptions of linear models Apply to response variable within each group if predictor categorical Apply to error terms from linear model check by analysing residuals Normality Homogeneity

### Introduction to Stata

Introduction to Stata September 23, 2014 Stata is one of a few statistical analysis programs that social scientists use. Stata is in the mid-range of how easy it is to use. Other options include SPSS,

### An analysis method for a quantitative outcome and two categorical explanatory variables.

Chapter 11 Two-Way ANOVA An analysis method for a quantitative outcome and two categorical explanatory variables. If an experiment has a quantitative outcome and two categorical explanatory variables that

### 11. Analysis of Case-control Studies Logistic Regression

Research methods II 113 11. Analysis of Case-control Studies Logistic Regression This chapter builds upon and further develops the concepts and strategies described in Ch.6 of Mother and Child Health:

### The aspect of the data that we want to describe/measure is the degree of linear relationship between and The statistic r describes/measures the degree

PS 511: Advanced Statistics for Psychological and Behavioral Research 1 Both examine linear (straight line) relationships Correlation works with a pair of scores One score on each of two variables ( and

### 1. The parameters to be estimated in the simple linear regression model Y=α+βx+ε ε~n(0,σ) are: a) α, β, σ b) α, β, ε c) a, b, s d) ε, 0, σ

STA 3024 Practice Problems Exam 2 NOTE: These are just Practice Problems. This is NOT meant to look just like the test, and it is NOT the only thing that you should study. Make sure you know all the material

### Stat 412/512 CASE INFLUENCE STATISTICS. Charlotte Wickham. stat512.cwick.co.nz. Feb 2 2015

Stat 412/512 CASE INFLUENCE STATISTICS Feb 2 2015 Charlotte Wickham stat512.cwick.co.nz Regression in your field See website. You may complete this assignment in pairs. Find a journal article in your field

### Case Study in Data Analysis Does a drug prevent cardiomegaly in heart failure?

Case Study in Data Analysis Does a drug prevent cardiomegaly in heart failure? Harvey Motulsky hmotulsky@graphpad.com This is the first case in what I expect will be a series of case studies. While I mention

### An analysis appropriate for a quantitative outcome and a single quantitative explanatory. 9.1 The model behind linear regression

Chapter 9 Simple Linear Regression An analysis appropriate for a quantitative outcome and a single quantitative explanatory variable. 9.1 The model behind linear regression When we are examining the relationship

### Basic Statistics and Data Analysis for Health Researchers from Foreign Countries

Basic Statistics and Data Analysis for Health Researchers from Foreign Countries Volkert Siersma siersma@sund.ku.dk The Research Unit for General Practice in Copenhagen Dias 1 Content Quantifying association

### Part 2: Analysis of Relationship Between Two Variables

Part 2: Analysis of Relationship Between Two Variables Linear Regression Linear correlation Significance Tests Multiple regression Linear Regression Y = a X + b Dependent Variable Independent Variable

### Multiple Regression - Selecting the Best Equation An Example Techniques for Selecting the "Best" Regression Equation

Multiple Regression - Selecting the Best Equation When fitting a multiple linear regression model, a researcher will likely include independent variables that are not important in predicting the dependent

### 2013 MBA Jump Start Program. Statistics Module Part 3

2013 MBA Jump Start Program Module 1: Statistics Thomas Gilbert Part 3 Statistics Module Part 3 Hypothesis Testing (Inference) Regressions 2 1 Making an Investment Decision A researcher in your firm just

### MULTIPLE LINEAR REGRESSION ANALYSIS USING MICROSOFT EXCEL. by Michael L. Orlov Chemistry Department, Oregon State University (1996)

MULTIPLE LINEAR REGRESSION ANALYSIS USING MICROSOFT EXCEL by Michael L. Orlov Chemistry Department, Oregon State University (1996) INTRODUCTION In modern science, regression analysis is a necessary part

### Simple Linear Regression in SPSS STAT 314

Simple Linear Regression in SPSS STAT 314 1. Ten Corvettes between 1 and 6 years old were randomly selected from last year s sales records in Virginia Beach, Virginia. The following data were obtained,

Lecture 16: Generalized Additive Models Regression III: Advanced Methods Bill Jacoby Michigan State University http://polisci.msu.edu/jacoby/icpsr/regress3 Goals of the Lecture Introduce Additive Models

### SAS Software to Fit the Generalized Linear Model

SAS Software to Fit the Generalized Linear Model Gordon Johnston, SAS Institute Inc., Cary, NC Abstract In recent years, the class of generalized linear models has gained popularity as a statistical modeling

### Exercise 1.12 (Pg. 22-23)

Individuals: The objects that are described by a set of data. They may be people, animals, things, etc. (Also referred to as Cases or Records) Variables: The characteristics recorded about each individual.

### X X X a) perfect linear correlation b) no correlation c) positive correlation (r = 1) (r = 0) (0 < r < 1)

CORRELATION AND REGRESSION / 47 CHAPTER EIGHT CORRELATION AND REGRESSION Correlation and regression are statistical methods that are commonly used in the medical literature to compare two or more variables.