ORTHOGONAL POLYNOMIAL CONTRASTS INDIVIDUAL DF COMPARISONS: EQUALLY SPACED TREATMENTS




Many treatments are equally spaced (incremented). This provides us with the opportunity to look at the response curve of the data (a form of multiple regression). Regression analysis could be performed using the data; however, when there are equal increments between successive levels of a factor, a simple coding procedure may be used to accomplish the analysis with much less effort. This technique is related to the one we will use for linear contrasts.

We will partition the factor in the ANOVA table into separate single degree of freedom comparisons. The number of possible comparisons equals the number of levels of the factor minus one. For example, if there are three levels of a factor, there are two possible comparisons. The comparisons are called orthogonal polynomial contrasts or comparisons.

Orthogonal polynomials are equations such that each is associated with a power of the independent variable (e.g. X, linear; X², quadratic; X³, cubic, etc.):

  1st order comparisons measure linear relationships.
  2nd order comparisons measure quadratic relationships.
  3rd order comparisons measure cubic relationships.

Example 1

We wish to determine the effect of drying temperature on germination of barley. We can set up an experiment that uses three equally spaced temperatures (90°F, 100°F, 110°F) and four replicates. The ANOVA would look like:

  Sources of variation    df
  Rep                     r-1 = 3
  Treatment               t-1 = 2
  Error                   (r-1)(t-1) = 6
  Total                   rt-1 = 11

Since there are three treatments equally spaced by 10°F, treatment can be partitioned into two single df polynomial contrasts (i.e. linear and quadratic). The new ANOVA table would look like:

  Sources of variation    df
  Replicate               r-1
  Treatment               t-1
    Linear                1
    Quadratic             1
  Error                   (r-1)(t-1)
  Total                   rt-1

The sum of squares for each comparison can be calculated using the formula:

  SS = Q² / (k*r)   where Q = Σci Yi., k = Σci², c = coefficients, and r = number of replicates.

F-tests can be calculated for each of the polynomial contrasts. The coefficients used for calculating the sums of squares are:

  Number of   Degree of              Treatment totals                 Divisor
  treatments  polynomial    T1    T2    T3    T4    T5    T6          k = Σci²
      2           1         -1    +1                                       2
      3           1         -1     0    +1                                 2
                  2         +1    -2    +1                                 6
      4           1         -3    -1    +1    +3                          20
                  2         +1    -1    -1    +1                           4
                  3         -1    +3    -3    +1                          20
      5           1         -2    -1     0    +1    +2                    10
                  2         +2    -1    -2    -1    +2                    14
                  3         -1    +2     0    -2    +1                    10
                  4         +1    -4    +6    -4    +1                    70
      6           1         -5    -3    -1    +1    +3    +5              70
                  2         +5    -1    -4    -4    -1    +5              84
                  3         -5    +7    +4    -4    -7    +5             180
                  4         +1    -3    +2    +2    -3    +1              28
                  5         -1    +5   -10   +10    -5    +1             252
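In SAS, these coefficients are supplied directly to CONTRAST statements in PROC GLM; a complete worked example is given at the end of this handout. The following is only a minimal sketch for the three-temperature design of Example 1, with the data set and variable names (barley, temp, rep, germ) assumed here for illustration:

proc glm data=barley;
   class rep temp;
   model germ = rep temp;
   *Treatment order---------------90 100 110;
   contrast 'linear'    temp -1  0  1;   /* 3-treatment linear coefficients    */
   contrast 'quadratic' temp  1 -2  1;   /* 3-treatment quadratic coefficients */
run;

For a balanced design, the contrast sum of squares and F-test reported by PROC GLM should match SS = Q²/(k*r) computed by hand.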

Coefficients for these comparisons would be as follows:

                 Drying temperature
               90°    100°    110°
  Linear       -1       0       1
  Quadratic     1      -2       1

Using the formulas SS = Q² / (k*r), Q = Σci Yi., and k = Σci²:

  SS Linear    = [(-1)(Y1.) + (0)(Y2.) + (1)(Y3.)]² / (2*4)
  SS Quadratic = [(1)(Y1.) + (-2)(Y2.) + (1)(Y3.)]² / (6*4)

Example 2

Effect of row spacing on yield (bu/ac) of soybean.

                       Row spacing (inches)
  Block       18      24      30      36      42      ΣY.j
    1        33.6    31.1    33.0    28.4    31.4    157.5
    2        37.1    34.5    29.5    29.9    28.3    159.3
    3        34.1    30.5    29.2    31.6    28.9    154.3
    4        34.6    32.7    30.7    32.3    28.6    158.9
    5        35.4    30.7    30.7    28.1    29.6    154.5
    6        36.1    30.3    27.9    26.9    33.4    154.6
  Yi.       210.9   189.8   181.0   177.2   180.2    939.1
  Ȳi.        35.15   31.63   30.17   29.53   30.03     31.3

Step 1. ANOVA of the data as an RCBD:

  SOV          df      SS        MS        F
  Rep           5      5.41
  Treatment     4    125.66    31.415    8.4997 **
  Error        20     73.92     3.696
  Total        29    204.99

Step 2. Partition the Treatment source of variation into four single degree of freedom orthogonal polynomial contrasts.

                       Row spacing (inches)
  Contrast      18      24      30      36      42
  ΣYi.        210.9   189.8   181.0   177.2   180.2
  Linear        -2      -1       0       1       2
  Quadratic      2      -1      -2      -1       2
  Cubic         -1       2       0      -2       1
  Quartic        1      -4       6      -4       1

Step 3. Calculate the sum of squares for each contrast.

                       Row spacing (inches)
  Contrast      18      24      30      36      42        Q      r*Σci²     SS
  ΣYi.        210.9   189.8   181.0   177.2   180.2
  Linear        -2      -1       0       1       2      -74.0    6(10)    91.27
  Quadratic      2      -1      -2      -1       2       53.2    6(14)    33.69
  Cubic         -1       2       0      -2       1       -5.5    6(10)     0.50
  Quartic        1      -4       6      -4       1        9.1    6(70)     0.20

Remember: Q = Σci Yi.

  e.g. Q for Linear = (-2)210.9 + (-1)189.8 + (0)181.0 + (1)177.2 + (2)180.2 = -74.0

and SS = Q²/(rk).

Step 4. Rewrite the ANOVA.

  SOV           df      SS       MS       F
  Rep            5      5.41
  Treatment      4    125.66    31.42    Trt MS/Error MS    =  8.50 **
    Linear       1     91.27    91.27    Lin. MS/Error MS   = 24.69 **
    Quadratic    1     33.69    33.69    Quad. MS/Error MS  =  9.12 **
    Cubic        1      0.50     0.50    Cubic MS/Error MS  =  0.14
    Quartic      1      0.20     0.20    Quart. MS/Error MS =  0.05
  Error         20     73.92     3.70
  Total         29    204.99
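The hand calculation of Q, k, and SS in Step 3 can be checked with a short SAS DATA step. This is only a sketch for verification (the data set and variable names are made up); it uses the treatment totals and linear coefficients from the tables above:

data linear_contrast;
   array total{5} _temporary_ (210.9 189.8 181.0 177.2 180.2);  /* treatment totals Yi. */
   array coef{5}  _temporary_ (-2 -1 0 1 2);                    /* linear coefficients  */
   r = 6;                               /* number of blocks */
   Q = 0;
   k = 0;
   do i = 1 to 5;
      Q = Q + coef{i}*total{i};         /* Q = sum(ci * Yi.) */
      k = k + coef{i}**2;               /* k = sum(ci**2)    */
   end;
   SS = Q**2 / (r*k);                   /* SS = Q**2 / (r*k) */
   put Q= k= SS=;                       /* writes Q, k, and SS to the log */
run;

The log should show Q = -74, k = 10, and SS ≈ 91.27, matching the Linear row of Step 3.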

Step 5. Conclusions

This analysis shows highly significant linear and quadratic effects for the row spacing treatments.

The linear component is the portion of the SS attributable to the linear regression of yield on spacing, i.e.

  b1 = SSCP / (r*SSX) = [ΣXY - (ΣX)(ΣY)/n] / (r*SSX) = Q / (r*Σci²)

Therefore b1 = -74.0/60 = -1.23.

The quadratic component measures the additional improvement due to fitting the X² component, i.e. (with the quadratic coefficients)

  b2 = Q / (r*Σci²)

Therefore b2 = 53.2/(6*14) = 0.63.

[Figure: Effect of row spacing on soybean yield — Yield (bu/ac) plotted against row spacing (inches).]
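Because the row spacings are equally spaced, the same response curve can also be examined by regressing yield directly on row spacing and its square. The sketch below assumes the orthpoly data set created in the SAS section that follows; note that the fitted coefficients are on the inch scale, so they will not numerically equal b1 and b2, which are expressed in units of the coded contrast coefficients:

data orthpoly2;
   set orthpoly;           /* data set created in the SAS commands below */
   row2 = row*row;         /* squared term for the quadratic component   */
run;

proc reg data=orthpoly2;
   model yield = row row2;   /* direct quadratic regression of yield on spacing */
   title 'Quadratic regression of yield on row spacing';
run;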

Example 3

Given the following data set, partition the A and B main effects, and the AxB interaction, using orthogonal polynomial contrasts.

  Treatment    Rep 1   Rep 2   Rep 3   Trt Total
  a0b0           10      12      14       36
  a0b1           16      19      21       56
  a0b2           24      27      32       82
  a1b0           12      15      17       44
  a1b1           19      23      29       71
  a1b2           33      34      37      104
  a2b0           24      27      29       80
  a2b1           39      41      43      123
  a2b2           45      47      52      144
  Rep Total     222     245     273      740

Step 1. Complete the ANOVA as an RCBD.

  SOV        df      SS       MS       F
  Rep         2    144.9     72.5     60.4 **
  A           2   1790.3    895.2    746.0 **
  B           2   1607.4    803.7    669.8 **
  AxB         4     58.8     14.7      7.4 **
  Error      16     19.1      1.2
  Total      26   3620.5

Step 2. Partition A, B, and AxB into their polynomial components.

  Contrast   a0b0  a0b1  a0b2  a1b0  a1b1  a1b2  a2b0  a2b1  a2b2     Q     k
  AL          -1    -1    -1     0     0     0     1     1     1     173     6
  AQ           1     1     1    -2    -2    -2     1     1     1      83    18
  BL          -1     0     1    -1     0     1    -1     0     1     170     6
  BQ           1    -2     1     1    -2     1     1    -2     1     -10    18
  AL x BL      1     0    -1     0     0     0    -1     0     1      18     4
  AL x BQ     -1     2    -1     0     0     0     1    -2     1     -28    12
  AQ x BL     -1     0     1     2     0    -2    -1     0     1     -10    12
  AQ x BQ      1    -2     1    -2     4    -2     1    -2     1     -28    36

Step 3. Calculate the SS for each contrast.

  Contrast              Q²/rk            SS
  SS AL       =  173²/(3x6)    =     1662.7
  SS AQ       =   83²/(3x18)   =      127.6
  SS BL       =  170²/(3x6)    =     1605.6
  SS BQ       = (-10)²/(3x18)  =        1.9
  SS AL x BL  =   18²/(3x4)    =       27.0
  SS AL x BQ  = (-28)²/(3x12)  =       21.8
  SS AQ x BL  = (-10)²/(3x12)  =        2.8
  SS AQ x BQ  = (-28)²/(3x36)  =        7.3

Step 4. Rewrite the ANOVA.

  SOV          df      SS        MS        F (assuming A and B fixed)
  Rep           2    144.9      72.5       60.4 **
  A             2   1790.3     895.2      746.0 **
    AL          1   1662.7    1662.7     1385.6 **
    AQ          1    127.6     127.6      106.3 **
  B             2   1607.4     803.7      669.8 **
    BL          1   1605.6    1605.6     1338.0 **
    BQ          1      1.9       1.9        1.6
  AxB           4     58.8      14.7        7.4 **
    AL x BL     1     27.0      27.0       22.5 **
    AL x BQ     1     21.8      21.8       18.2 **
    AQ x BL     1      2.8       2.8        2.3
    AQ x BQ     1      7.3       7.3        6.1 *
  Error        16     19.1       1.2
  Total        26   3620.5

Step 5. Conclusions

The AL, AQ, BL, AL x BL, AL x BQ, and AQ x BQ sources of variation all contribute significantly to the model.
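Example 3 can also be run in PROC GLM by supplying the main-effect and interaction coefficients from Step 2 as CONTRAST statements. The sketch below is an assumption about how the data might be entered (the data set example3, class variables rep, a, and b, and response y are made-up names); the interaction coefficients are listed assuming the cells are ordered a0b0, a0b1, ..., a2b2 with b varying fastest, so the ordering should be checked against the GLM output:

proc glm data=example3;
   class rep a b;
   model y = rep a b a*b;
   contrast 'A linear'    a -1  0  1;
   contrast 'A quadratic' a  1 -2  1;
   contrast 'B linear'    b -1  0  1;
   contrast 'B quadratic' b  1 -2  1;
   *Interaction contrasts use the cell coefficients from Step 2;
   contrast 'AL x BL' a*b  1  0 -1  0  0  0 -1  0  1;
   contrast 'AL x BQ' a*b -1  2 -1  0  0  0  1 -2  1;
   contrast 'AQ x BL' a*b -1  0  1  2  0 -2 -1  0  1;
   contrast 'AQ x BQ' a*b  1 -2  1 -2  4 -2  1 -2  1;
run;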

Analysis of Orthogonal Polynomial Contrasts

SAS Commands

options pageno=1;

data orthpoly;
   input row rep yield;
   datalines;
18 1 33.6
24 1 31.1
30 1 33
36 1 28.4
42 1 31.4
18 2 37.1
24 2 34.5
30 2 29.5
36 2 29.9
42 2 28.3
18 3 34.1
24 3 30.5
30 3 29.2
36 3 31.6
42 3 28.9
18 4 34.6
24 4 32.7
30 4 30.7
36 4 32.3
42 4 28.6
18 5 35.4
24 5 30.7
30 5 30.7
36 5 28.1
42 5 29.6
18 6 36.1
24 6 30.3
30 6 27.9
36 6 26.9
42 6 33.4
;

ods rtf file='polynomial contrast SAS output.rtf';

proc print;
   title 'Printout of Orthogonal Polynomial Data';
run;

proc glm;
   class rep row;
   model yield = rep row;
   *Treatment order-------------------18 24 30 36 42;
   contrast 'linear'    row -2 -1  0  1  2;
   contrast 'quadratic' row  2 -1 -2 -1  2;
   contrast 'cubic'     row -1  2  0 -2  1;
   contrast 'quartic'   row  1 -4  6 -4  1;
   title 'Anova with Polynomial Contrasts';
run;

ods rtf close;
run;

Printout of Orthogonal Polynomial Data

  Obs   row   rep   yield
    1    18     1    33.6
    2    24     1    31.1
    3    30     1    33.0
    4    36     1    28.4
    5    42     1    31.4
    6    18     2    37.1
    7    24     2    34.5
    8    30     2    29.5
    9    36     2    29.9
   10    42     2    28.3
   11    18     3    34.1
   12    24     3    30.5
   13    30     3    29.2
   14    36     3    31.6
   15    42     3    28.9
   16    18     4    34.6
   17    24     4    32.7
   18    30     4    30.7
   19    36     4    32.3
   20    42     4    28.6
   21    18     5    35.4
   22    24     5    30.7
   23    30     5    30.7
   24    36     5    28.1
   25    42     5    29.6
   26    18     6    36.1
   27    24     6    30.3
   28    30     6    27.9
   29    36     6    26.9
   30    42     6    33.4

Anova with Polynomial Contrasts

The GLM Procedure

Class Level Information

  Class   Levels   Values
  rep        6     1 2 3 4 5 6
  row        5     18 24 30 36 42

  Number of Observations Read   30
  Number of Observations Used   30

Anova with Polynomial Contrasts

The GLM Procedure

  Source             DF   Sum of Squares   Mean Square   F Value   Pr > F
  Model               9      131.0710000    14.5634444      3.94   0.0051
  Error              20       73.9186667     3.6959333
  Corrected Total    29      204.9896667

  R-Square   Coeff Var   Root MSE   yield Mean
  0.639403    6.141458   1.922481     31.30333

  Source   DF   Type I SS     Mean Square   F Value   Pr > F
  rep       5     5.4096667     1.0819333      0.29   0.9113
  row       4   125.6613333    31.4153333      8.50   0.0004

  Source   DF   Type III SS   Mean Square   F Value   Pr > F
  rep       5     5.4096667     1.0819333      0.29   0.9113
  row       4   125.6613333    31.4153333      8.50   0.0004

  Contrast    DF   Contrast SS   Mean Square   F Value   Pr > F
  linear       1   91.26666667   91.26666667     24.69   <.0001
  quadratic    1   33.69333333   33.69333333      9.12   0.0068
  cubic        1    0.50416667    0.50416667      0.14   0.7158
  quartic      1    0.19716667    0.19716667      0.05   0.8197