# Model selection in R featuring the lasso. Chris Franck LISA Short Course March 26, 2013


## Transcription


2 Goals: Overview of LISA. Classic data example: the prostate data (Stamey et al., 1989). Brief review of regression and model selection. Description of the lasso. Discussion and comparison of approaches.

3 Laboratory for Interdisciplinary Statistical Analysis. LISA helps VT researchers benefit from the use of statistics. Collaboration: visit our website to request personalized statistical advice and assistance with experimental design, data analysis, interpreting results, grant proposals, and software (R, SAS, JMP, SPSS...). LISA statistical collaborators aim to explain concepts in ways useful for your research. Great advice right now: meet with LISA before collecting your data. LISA also offers Educational Short Courses, designed to help graduate students apply statistics in their research, and Walk-In Consulting: M-F 1-3 PM in the GLC Video Conference Room for questions requiring <30 mins; also 3-5 PM Port (Library/Torg Bridge) and 9-11 AM ICTAS Café X. All services are FREE for VT researchers. We assist with research, not class projects or homework.

4 The goal is to demonstrate the lasso technique using real-world data. Lasso stands for "least absolute shrinkage and selection operator." It is a continuous subset selection algorithm: it can shrink the effects of unimportant predictors, and can set effects exactly to zero. It requires more technical work to implement than other common methods. Note: the analysis closely follows Tibshirani (1996) and Friedman, Hastie, and Tibshirani (2009).

5 In addition to the lasso, these statistical concepts will be discussed: exploratory data analysis and graphing; ordinary least squares regression; cross validation; and model selection, including forward, backward, and stepwise selection and information criteria (e.g. AIC, BIC).

6 The prostate data were originally described in Stamey et al. (1989): 97 men who were about to undergo radical prostatectomy. Research goal: measure the association between prostate specific antigen and 8 other clinical measures.

7 The clinical measures are:

| Index | Variable | Label |
|-------|----------|-------|
| 1 | lcavol | log(cancer volume) |
| 2 | lweight | log(prostate weight) |
| 3 | age | age |
| 4 | lbph | log(benign prostatic hyperplasia) |
| 5 | svi | seminal vesicle invasion |
| 6 | lcp | log(capsular penetration) |
| 7 | gleason | Gleason score |
| 8 | pgg45 | percent Gleason scores 4 or 5 |
| y | lpsa | log(prostate specific antigen) |

8 Regression brief review. Simple case: we wish to use a single predictor variable x to predict some outcome y using ordinary least squares (OLS). E.g. x = lcavol, y = lpsa. Quiz question 1: What do you see in the plot?

9 Here is the same plot with the regression line included

10 What important property does the regression line have? Question 2: Why not use these lines?

11 The simple linear regression model is $y_i = \beta_0 + \beta_1 x_{1i} + \varepsilon_i$. Which values are known/unknown? Which are data, which are parameters? Which term is the slope? The intercept? Common assumption about the error structure (Question 3: fill in the blanks): $\varepsilon_i \sim (\;\;,\;\;)$. Question 4: What is the difference between $\beta_1$ and $\hat{\beta}_1$?
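The model above can be fit in R with `lm()`. A minimal sketch on simulated data (the variable roles mirror the prostate example, but the numbers here are made up):

```r
# Simulated example: recover a known intercept and slope with lm()
set.seed(1)
x <- rnorm(50)                        # predictor (plays the role of lcavol)
y <- 1 + 2 * x + rnorm(50, sd = 0.5)  # response (plays the role of lpsa)

fit <- lm(y ~ x)
coef(fit)    # estimates of beta_0 (intercept) and beta_1 (slope)
plot(x, y)
abline(fit)  # the least-squares line minimizes the sum of squared residuals
```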

12 Frequently there are many predictors that we want to use simultaneously. Multiple linear regression model: $y_i = \beta_0 + \beta_1 x_{1i} + \beta_2 x_{2i} + \cdots + \beta_p x_{pi} + \varepsilon_i$. In this situation each $\beta_j$ represents the partial slope of predictor $j = 1, \ldots, p$. Question 5: Interpretation? In our case we have 8 candidate predictors (see slide 7). Which set should we use to model the response?

13 Cross validation is used to determine whether a model has good predictive ability for a new data set. Parameter estimates $\hat{\beta}$ are chosen on the basis of the available data, and we expect a good model to perform well on the data used to fit (or "train") the model. But would your model perform well on new data (e.g. new patients)? If not, the model may be overfit. Cross validation: hold out a portion of the data (the validation set), fit the model to the rest of the data (the training set), and determine whether the model based on the training set performs well on the validation set. Metric to assess prediction error: mean squared error, $\mathrm{MSE} = \frac{1}{n} \sum_{i=1}^{n} (y_i - \hat{y}_i)^2$, where $\hat{y}_i$ is the predicted value of $y_i$ based on the model.
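A hold-out validation run of the kind described above takes only a few lines of base R (simulated data; the 70/30 split is an arbitrary choice for illustration):

```r
# Hold-out validation sketch: fit on a training set, score MSE on a validation set
set.seed(2)
x <- rnorm(100)
y <- 1 + 2 * x + rnorm(100)
train <- sample(100, 70)            # 70 rows for training, 30 held out

fit  <- lm(y ~ x, subset = train)
yhat <- predict(fit, newdata = data.frame(x = x[-train]))
mse  <- mean((y[-train] - yhat)^2)  # MSE = (1/n) * sum((y_i - yhat_i)^2)
mse
```

With noise variance 1, the validation MSE should land near 1 for a well-specified model.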

14 Now complete code section 1: import the data into RStudio, view the data, and plot the data, adding regression lines.
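A sketch of what code section 1 involves. The file name below is hypothetical (the course distributes its own data file), so a simulated stand-in data frame is built here to make the sketch self-contained:

```r
# In the course: prostate <- read.csv("prostate.csv")  # hypothetical file name
# Stand-in with the same two columns so the plotting code runs on its own:
set.seed(3)
prostate <- data.frame(lcavol = rnorm(97))
prostate$lpsa <- 1.5 + 0.7 * prostate$lcavol + rnorm(97, sd = 0.7)

head(prostate)                              # view the data
plot(lpsa ~ lcavol, data = prostate)        # plot the data
abline(lm(lpsa ~ lcavol, data = prostate))  # add the OLS regression line
```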

15 Variable subset selection uses statistical criteria to identify a set of predictors. Variable subset selection: among a set of candidate predictors, choose a subset to include in the model based on some statistical criterion, e.g. p-values. Forward selection: add variables one at a time, starting with the x most strongly associated with y. Stop when no other significant variables are identified.

16 Variable subset selection, continued. Backward elimination: start with every candidate predictor in the model, and remove variables one at a time until all remaining variables are significantly associated with the response. Stepwise selection: as forward selection, but at each iteration remove variables that are made obsolete by new additions. Stop when nothing new is added, or when a term is removed immediately after it was added.
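In base R, forward selection and backward elimination can be run with `step()`, which ranks models by AIC rather than the p-value rules described above; a sketch on simulated data with three real and two noise predictors:

```r
# step() performs selection by AIC (a stand-in for the p-value-based procedures
# described above); X1-X3 carry signal, X4-X5 are pure noise
set.seed(4)
n <- 200
dat   <- data.frame(matrix(rnorm(n * 5), n, 5))   # columns X1..X5
dat$y <- 1 + dat$X1 + dat$X2 + dat$X3 + rnorm(n)

full <- lm(y ~ ., data = dat)
back <- step(full, direction = "backward", trace = 0)   # backward elimination
fwd  <- step(lm(y ~ 1, data = dat), scope = formula(full),
             direction = "forward", trace = 0)          # forward selection
names(coef(back))
```

With this much signal, both procedures should retain X1, X2, and X3; the noise columns may or may not survive the AIC penalty.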

17 Full enumeration methods: given a set of candidate predictors, fit every possible model and use some statistical criterion to decide which is best. $\mathrm{AIC} = -2 \log L + 2k$ and $\mathrm{BIC} = -2 \log L + k \log n$, where $L$ represents the likelihood function and $k$ is the number of parameters. Both of these criteria consider the likelihood of each model with a penalty for model complexity.
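The two formulas can be checked against R's built-in `AIC()` and `BIC()` for a Gaussian `lm` fit (note that $k$ includes the error variance as an estimated parameter):

```r
# Verify AIC = -2 log L + 2k and BIC = -2 log L + k log n against R's built-ins
set.seed(5)
x <- rnorm(40)
y <- 1 + x + rnorm(40)
fit <- lm(y ~ x)

k    <- attr(logLik(fit), "df")   # parameters: intercept, slope, error variance
logL <- as.numeric(logLik(fit))
aic_manual <- -2 * logL + 2 * k
bic_manual <- -2 * logL + log(nobs(fit)) * k
c(aic_manual, AIC(fit), bic_manual, BIC(fit))
```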

18 MANY methods have been proposed to choose and use predictors: shrinkage methods (ridge regression, the garrote, many recent lasso-related developments); tree-based methods; forward stagewise selection (different from forward stepwise regression); maximum adjusted or unadjusted $R^2$; Mallows' $C_p$; Bayes factors; likelihood ratio tests; AICc; the deviance information criterion (DIC); and many others!

19 The lasso algorithm performs variable selection by constraining the sum of the magnitudes of the coefficients. For the model $y_i = \beta_0 + \beta_1 x_{1i} + \beta_2 x_{2i} + \cdots + \beta_p x_{pi} + \varepsilon_i$, the lasso estimator is

$$\hat{\beta}^{lasso} = \arg\min_{\beta} \sum_{i=1}^{n} \Big( y_i - \beta_0 - \sum_{j=1}^{p} x_{ij}\beta_j \Big)^2 \quad \text{subject to} \quad \sum_{j=1}^{p} |\beta_j| \le t.$$

The lasso estimator minimizes the sum of squared differences between the observed outcome and the linear model, so long as the sum of the absolute values of the coefficients is below some value $t$.

20 Why constrain the sum of the absolute value of the coefficients? We want a parsimonious model, or a model which describes the response well but is as simple as possible. The lasso aims for parsimony using the constraint explained on the previous slide. Since the overall magnitude of the coefficients is constrained, important predictors are included in the model, and less important predictors shrink, potentially to zero.

21 A few other important items. An equivalent Lagrangian form of the lasso:

$$\hat{\beta}^{lasso} = \arg\min_{\beta} \left\{ \sum_{i=1}^{n} \Big( y_i - \beta_0 - \sum_{j=1}^{p} x_{ij}\beta_j \Big)^2 + \lambda \sum_{j=1}^{p} |\beta_j| \right\}.$$

Many software packages require specification of $\lambda$. Also, the shrinkage factor $s = t \big/ \sum_{j=1}^{p} |\hat{\beta}_j|$, where the $\hat{\beta}_j$ are the full least squares estimates, lies between zero and one. Question: as $t$ (or $s$) increases, what happens to the coefficient estimates? Question: as $\lambda$ increases, what happens to the coefficient estimates?
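The course fits the lasso with the lars package; purely for illustration, the Lagrangian form can also be minimized with a few lines of coordinate descent in base R. This sketch assumes each column of X is centered with unit sum of squares and that there is no intercept:

```r
# Minimal coordinate-descent lasso (Lagrangian form); illustrative only.
# Assumes columns of X are centered and scaled to unit sum of squares.
soft <- function(z, g) sign(z) * pmax(abs(z) - g, 0)  # soft-thresholding operator

lasso_cd <- function(X, y, lambda, iters = 200) {
  beta <- rep(0, ncol(X))
  for (it in 1:iters)
    for (j in 1:ncol(X)) {
      r       <- y - X[, -j, drop = FALSE] %*% beta[-j]  # partial residual
      beta[j] <- soft(sum(X[, j] * r), lambda)           # 1-D lasso update
    }
  beta
}

# Orthonormal design: the lasso solution is just the soft-thresholded OLS fit,
# so small coefficients get shrunk toward exactly zero
set.seed(6)
Q <- qr.Q(qr(matrix(rnorm(50 * 3), 50, 3)))   # orthonormal columns
y <- drop(Q %*% c(3, -2, 0.1)) + rnorm(50, sd = 0.1)
lasso_cd(Q, y, lambda = 0.5)
```

The soft-thresholding step is where both behaviors on the previous slide come from: coefficients shrink by $\lambda$ in absolute value and hit exactly zero once they are small enough.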

22 Now complete code section 2: fit the lasso model to the prostate data using the lars package; plot the lasso path; observe how the coefficients change as s increases; and obtain estimated coefficients and predicted values for given values of s.

23 The least angle regression algorithm is used to fit the lasso path efficiently. It is an extremely efficient way to obtain the lasso coefficient estimates. It identifies the variable most associated with the response (like forward selection), but then adds only part of that variable at a time, and can switch variables before the first variable is fully added. For more detail, see Efron et al. (2004) and Friedman et al. (2009).

24 The lasso path plot illustrates coefficient behavior for various s. Question: How should we decide which s to use?

25 Cross validation is used both to choose s and to assess the predictive accuracy of the model. Initial training and validation sets are established. The tuning parameter s is chosen based on the training set, and the model is fit to the training set. Performance of the chosen model is then assessed on the validation set: the training model is used to predict outcomes in the validation set, and the MSE is computed. If the training model produces a reasonable MSE on the validation set, the model is adopted.

26 K-fold cross validation splits the data into K parts; here K = 10. The training set is broken into 10 pieces, and 10-fold cross validation is used to determine the value of the shrinkage factor s. The model is then fit on the entire training set at the chosen s, the coefficient estimates are stored, and the MSE is computed.

27 Now complete code section 3: make a 10-fold cross validation ID vector; make a vector of s values to use; perform 10-fold cross validation on the training set at the chosen values of s; determine which value of s minimizes the 10-fold cross validation error; determine how well the chosen model performs on the validation set; and compare the performance of the lasso with AIC and BIC.
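The fold-ID bookkeeping in code section 3 can be sketched in base R. Here a plain `lm` stands in for the lasso fit so the sketch is self-contained:

```r
# Manual 10-fold cross-validation sketch (simulated data; lm stands in for lasso)
set.seed(7)
n <- 100
dat  <- data.frame(x = rnorm(n))
dat$y <- 1 + 2 * dat$x + rnorm(n)
fold <- sample(rep(1:10, length.out = n))   # 10-fold cross validation ID vector

cv_mse <- sapply(1:10, function(k) {
  fit  <- lm(y ~ x, data = dat[fold != k, ])         # fit on 9 folds
  yhat <- predict(fit, newdata = dat[fold == k, ])   # predict the held-out fold
  mean((dat$y[fold == k] - yhat)^2)
})
mean(cv_mse)   # average held-out MSE across the 10 folds
```

In the actual exercise this loop would run once per candidate value of s, keeping the s with the smallest average held-out MSE.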

28 s is chosen to minimize MSE in the training set based on K-fold cross validation. The picture shows the average MSE over the 10 holdout sets for various values of s; the vertical bars depict 1 standard error. Typically a value of s within 1 SE of the lowest value is chosen. Here, 10-fold cross validation suggests s = 0.4 is a good choice.

29 Other interesting notes. Ridge regression is an earlier method similar to the lasso, which invokes the constraint $\sum_{j=1}^{p} \beta_j^2 \le t$. This is also a shrinkage or penalization method, but ridge regression will not set any predictor coefficients exactly to zero. Lasso is preferable when predictors may be highly correlated. For both ridge regression and the lasso, $\lambda$ cannot be estimated directly from the data using maximum likelihood due to an identifiability issue; this is why cross validation is used to fix $\lambda$ at a constant.
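Unlike the lasso, ridge regression has a closed-form solution, which makes its shrinkage easy to see directly (centered, scaled data and no intercept assumed; simulated numbers):

```r
# Ridge regression closed form: beta = (X'X + lambda * I)^{-1} X'y.
# All coefficients shrink as lambda grows, but none is set exactly to zero.
set.seed(8)
X <- scale(matrix(rnorm(60 * 4), 60, 4))   # centered, scaled predictors
y <- drop(X %*% c(2, 0, 1, 0)) + rnorm(60)
y <- y - mean(y)                           # center the response (no intercept)

ridge <- function(X, y, lambda)
  drop(solve(t(X) %*% X + lambda * diag(ncol(X)), t(X) %*% y))

ridge(X, y, lambda = 10)   # shrunken, but every coefficient stays nonzero
```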

30 Acknowledgements. Thanks to the following: Dhruva Sharma, Scotland Leman, Andy Hoege.

31 References

- Efron, B., Hastie, T., Johnstone, I. and Tibshirani, R. (2004). Least angle regression (with discussion). Annals of Statistics 32(2).
- Friedman, J., Hastie, T. and Tibshirani, R. (2009). The Elements of Statistical Learning: Data Mining, Inference, and Prediction, Second Edition. Springer Series in Statistics. Springer.
- Stamey, T., Kabalin, J., McNeal, J., Johnstone, I., Freiha, F., Redwine, E. and Yang, N. (1989). Prostate specific antigen in the diagnosis and treatment of adenocarcinoma of the prostate II: radical prostatectomy treated patients. Journal of Urology 16.
- Tibshirani, R. (1996). Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society, Series B 58(1).


Chapter 340 Principal Components Regression Introduction is a technique for analyzing multiple regression data that suffer from multicollinearity. When multicollinearity occurs, least squares estimates

### PREDICTIVE MODELING IN HEALTHCARE COSTS USING REGRESSION TECHNIQUES

PREDICTIVE MODELING IN HEALTHCARE COSTS USING REGRESSION TECHNIQUES Michael Loginov, Emily Marlow, Victoria Potruch University of California, Santa Barbara Introduction Building a model that predicts an

### ID X Y

Dale Berger SPSS Step-by-Step Regression Introduction: MRC01 This step-by-step example shows how to enter data into SPSS and conduct a simple regression analysis to develop an equation to predict from.

### Performance Metrics for Graph Mining Tasks

Performance Metrics for Graph Mining Tasks 1 Outline Introduction to Performance Metrics Supervised Learning Performance Metrics Unsupervised Learning Performance Metrics Optimizing Metrics Statistical

### SPSS Guide: Regression Analysis

SPSS Guide: Regression Analysis I put this together to give you a step-by-step guide for replicating what we did in the computer lab. It should help you run the tests we covered. The best way to get familiar

### Statistical issues in the analysis of microarray data

Statistical issues in the analysis of microarray data Daniel Gerhard Institute of Biostatistics Leibniz University of Hannover ESNATS Summerschool, Zermatt D. Gerhard (LUH) Analysis of microarray data

College Readiness LINKING STUDY A Study of the Alignment of the RIT Scales of NWEA s MAP Assessments with the College Readiness Benchmarks of EXPLORE, PLAN, and ACT December 2011 (updated January 17, 2012)

### Chapter Seven. Multiple regression An introduction to multiple regression Performing a multiple regression on SPSS

Chapter Seven Multiple regression An introduction to multiple regression Performing a multiple regression on SPSS Section : An introduction to multiple regression WHAT IS MULTIPLE REGRESSION? Multiple

### Identifying SPAM with Predictive Models

Identifying SPAM with Predictive Models Dan Steinberg and Mikhaylo Golovnya Salford Systems 1 Introduction The ECML-PKDD 2006 Discovery Challenge posed a topical problem for predictive modelers: how to

### 17. SIMPLE LINEAR REGRESSION II

17. SIMPLE LINEAR REGRESSION II The Model In linear regression analysis, we assume that the relationship between X and Y is linear. This does not mean, however, that Y can be perfectly predicted from X.

### BOOSTED REGRESSION TREES: A MODERN WAY TO ENHANCE ACTUARIAL MODELLING

BOOSTED REGRESSION TREES: A MODERN WAY TO ENHANCE ACTUARIAL MODELLING Xavier Conort xavier.conort@gear-analytics.com Session Number: TBR14 Insurance has always been a data business The industry has successfully

### HLM software has been one of the leading statistical packages for hierarchical

Introductory Guide to HLM With HLM 7 Software 3 G. David Garson HLM software has been one of the leading statistical packages for hierarchical linear modeling due to the pioneering work of Stephen Raudenbush

### Answer: C. The strength of a correlation does not change if units change by a linear transformation such as: Fahrenheit = 32 + (5/9) * Centigrade

Statistics Quiz Correlation and Regression -- ANSWERS 1. Temperature and air pollution are known to be correlated. We collect data from two laboratories, in Boston and Montreal. Boston makes their measurements

### Chapter 15. Mixed Models. 15.1 Overview. A flexible approach to correlated data.

Chapter 15 Mixed Models A flexible approach to correlated data. 15.1 Overview Correlated data arise frequently in statistical analyses. This may be due to grouping of subjects, e.g., students within classrooms,

### Class 6: Chapter 12. Key Ideas. Explanatory Design. Correlational Designs

Class 6: Chapter 12 Correlational Designs l 1 Key Ideas Explanatory and predictor designs Characteristics of correlational research Scatterplots and calculating associations Steps in conducting a correlational