CS 147: Computer Systems Performance Analysis




One-Factor Experiments

Overview
- Introduction
- Finding Effects
- Calculating Errors
- ANOVA Allocation
- ANOVA Analysis
- Verifying Assumptions
- Unequal Sample Sizes

Introduction: Characteristics of One-Factor Experiments
- Useful if there's only one important categorical factor with more than two interesting alternatives
- Methods reduce to 2^1 factorial designs if there are only two choices
- If the single variable isn't categorical, use regression instead
- The method allows multiple replications

Introduction: Comparing Truly Comparable Options
- Evaluating a single workload on multiple machines
- Trying different options for a single component
- Applying a single suite of programs to different compilers

Introduction: When to Avoid It
- Incomparable factors, e.g., measuring vastly different workloads on a single system
- Numerical factors: won't predict any untested levels; regression is usually the better choice
- Related entries across levels: use a two-factor design instead

An Example One-Factor Experiment
- Choosing an authentication server for single-sized messages
- Four different servers are available
- Performance measured by response time (lower is better)

The One-Factor Model

    y_ij = µ + α_j + e_ij

- y_ij is the i-th response with the factor set at level j
- µ is the mean response
- α_j is the effect of alternative j, with Σ_j α_j = 0
- e_ij is the error term, with Σ_ij e_ij = 0

One-Factor Experiments With Replications
- Initially, assume r replications at each alternative of the factor
- With a alternatives, we have a total of ar observations
- Summing the model over all observations gives

    Σ_{i=1..r} Σ_{j=1..a} y_ij = ar·µ + r·Σ_{j=1..a} α_j + Σ_{i=1..r} Σ_{j=1..a} e_ij

Sample Data for Our Example

Four alternatives, with four replications each (measured in seconds):

      A     B     C     D
    0.96  0.75  1.01  0.93
    1.05  1.22  0.89  1.02
    0.82  1.13  0.94  1.06
    0.94  0.98  1.38  1.21

Finding Effects: Computing Effects
- Need to figure out µ and the α_j; we have the various y_ij
- Errors should add to zero: Σ_{i=1..r} Σ_{j=1..a} e_ij = 0
- Similarly, effects should add to zero: Σ_{j=1..a} α_j = 0

Finding Effects: Calculating µ
By definition, the sum of the errors and the sum of the effects are both zero:

    Σ_{i=1..r} Σ_{j=1..a} y_ij = ar·µ + 0 + 0

Thus µ is equal to the grand mean of all responses:

    µ = (1/ar) Σ_{i=1..r} Σ_{j=1..a} y_ij = ȳ..

Finding Effects: Calculating µ for Our Example
Thus,

    µ = (1/16) Σ_{i=1..4} Σ_{j=1..4} y_ij = 16.29/16 = 1.018
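A minimal sketch of the grand-mean calculation in plain Python (no external libraries), using the measurements from the "Sample Data" slide:

```python
# Response times in seconds; columns A-D are the four servers,
# rows are the four replications.
data = {
    "A": [0.96, 1.05, 0.82, 0.94],
    "B": [0.75, 1.22, 1.13, 0.98],
    "C": [1.01, 0.89, 0.94, 1.38],
    "D": [0.93, 1.02, 1.06, 1.21],
}
obs = [y for col in data.values() for y in col]  # all ar = 16 observations
mu = sum(obs) / len(obs)                         # grand mean
print(round(mu, 3))                              # 1.018
```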

Finding Effects: Calculating α_j
- α_j is a vector of effects, one for each alternative of the factor
- To find the vector, find the column means:

    ȳ.j = (1/r) Σ_{i=1..r} y_ij

- A separate mean for each j, which can be calculated directly from the observations

Finding Effects: Calculating the Column Mean
We know that y_ij is defined to be y_ij = µ + α_j + e_ij. So

    ȳ.j = (1/r) Σ_{i=1..r} (µ + α_j + e_ij)
        = (1/r) (rµ + rα_j + Σ_{i=1..r} e_ij)

Finding Effects: Calculating Parameters
The sum of errors within any given column is zero, so

    ȳ.j = (1/r)(rµ + rα_j + 0) = µ + α_j

So we can solve for α_j:

    α_j = ȳ.j − µ = ȳ.j − ȳ..

Finding Effects: Parameters for Our Example

    Server      A      B      C      D
    Col. mean  .9425   1.02   1.055  1.055

Subtract µ from the column means to get the parameters:

    Parameters  −.076   .002   .037   .037
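The column means and effects above can be sketched in plain Python (α_j is simply the column mean minus the grand mean):

```python
# Example data from the "Sample Data" slide.
data = {
    "A": [0.96, 1.05, 0.82, 0.94],
    "B": [0.75, 1.22, 1.13, 0.98],
    "C": [1.01, 0.89, 0.94, 1.38],
    "D": [0.93, 1.02, 1.06, 1.21],
}
mu = sum(y for col in data.values() for y in col) / 16   # grand mean
col_means = {k: sum(col) / len(col) for k, col in data.items()}
alphas = {k: m - mu for k, m in col_means.items()}       # effects alpha_j
for k in data:
    print(k, round(col_means[k], 4), round(alphas[k], 3))
```

Note the built-in sanity check: because each α_j is a deviation from the overall mean, the effects sum to zero by construction.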

Calculating Errors: Estimating Experimental Errors
- Estimated response is ŷ_ij = µ + α_j
- But we measured actual responses, with multiple responses per alternative
- So we can estimate the amount of error in the estimated response
- Use methods similar to those used in other types of experiment designs

Calculating Errors: Sum of Squared Errors
SSE estimates the variance of the errors:

    SSE = Σ_{i=1..r} Σ_{j=1..a} e_ij²

- We can calculate SSE directly from the model and observations
- We can also find it indirectly from its relationship to other error terms

Calculating Errors: SSE for Our Example
Calculated directly:

    SSE = (.96 − (1.018 − .076))² + (1.05 − (1.018 − .076))² + ...
        + (.75 − (1.018 + .002))² + (1.22 − (1.018 + .002))² + ...
        + (.93 − (1.018 + .037))² + ...
        = .3425
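The direct SSE calculation above can be sketched in plain Python, summing the squared residuals e_ij = y_ij − (µ + α_j):

```python
# Example data from the "Sample Data" slide.
data = {
    "A": [0.96, 1.05, 0.82, 0.94],
    "B": [0.75, 1.22, 1.13, 0.98],
    "C": [1.01, 0.89, 0.94, 1.38],
    "D": [0.93, 1.02, 1.06, 1.21],
}
mu = sum(y for col in data.values() for y in col) / 16
sse = 0.0
for col in data.values():
    alpha = sum(col) / len(col) - mu            # effect for this alternative
    sse += sum((y - (mu + alpha)) ** 2 for y in col)
print(sse)  # ≈ 0.3425, matching the slide
```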

ANOVA Allocation: Allocating Variation
To allocate the variation for the model, start by squaring both sides of the model equation:

    y_ij² = µ² + α_j² + e_ij² + 2µα_j + 2µe_ij + 2α_j e_ij

    Σ_{i,j} y_ij² = Σ_{i,j} µ² + Σ_{i,j} α_j² + Σ_{i,j} e_ij² + cross-products

The cross-product terms add up to zero.

ANOVA Allocation: Variation in Sum of Squares Terms

    SSY = SS0 + SSA + SSE

where

    SSY = Σ_{i,j} y_ij²
    SS0 = Σ_{i,j} µ² = ar·µ²
    SSA = Σ_{i=1..r} Σ_{j=1..a} α_j² = r·Σ_{j=1..a} α_j²

This gives another way to calculate SSE.

ANOVA Allocation: Sum of Squares Terms for Our Example

    SSY = 16.9615
    SS0 = 16.58526
    SSA = .03377

So SSE must equal 16.9615 − 16.58526 − .03377 = .3425, which matches our earlier SSE calculation.
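The identity SSY = SS0 + SSA + SSE can be verified on the example data with a short plain-Python sketch:

```python
# Example data from the "Sample Data" slide.
data = {
    "A": [0.96, 1.05, 0.82, 0.94],
    "B": [0.75, 1.22, 1.13, 0.98],
    "C": [1.01, 0.89, 0.94, 1.38],
    "D": [0.93, 1.02, 1.06, 1.21],
}
a, r = len(data), 4           # alternatives, replications
N = a * r
obs = [y for col in data.values() for y in col]
mu = sum(obs) / N
ssy = sum(y * y for y in obs)                                     # ≈ 16.9615
ss0 = N * mu ** 2                                                 # ≈ 16.5853
ssa = r * sum((sum(col) / r - mu) ** 2 for col in data.values())  # ≈ .0338
sse = ssy - ss0 - ssa                                             # ≈ .3425
```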

ANOVA Allocation: Assigning Variation
SST is the total variation:

    SST = SSY − SS0 = SSA + SSE

- Part of the total variation comes from the model, part from experimental errors
- A good model explains a lot of the variation

ANOVA Allocation: Assigning Variation in Our Example

    SST = SSY − SS0 = 0.376244
    SSA = .03377
    SSE = .3425

Percentage of variation explained by server choice:

    100 × .03377 / .3762 = 8.97%
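The variation split above can be reproduced with a small plain-Python sketch:

```python
# Example data from the "Sample Data" slide.
data = {
    "A": [0.96, 1.05, 0.82, 0.94],
    "B": [0.75, 1.22, 1.13, 0.98],
    "C": [1.01, 0.89, 0.94, 1.38],
    "D": [0.93, 1.02, 1.06, 1.21],
}
N, r = 16, 4
obs = [y for col in data.values() for y in col]
mu = sum(obs) / N
ssy = sum(y * y for y in obs)
sst = ssy - N * mu ** 2                                           # total variation
ssa = r * sum((sum(col) / r - mu) ** 2 for col in data.values())  # model's share
pct_explained = 100 * ssa / sst   # ≈ 8.97%: most variation is unexplained error
```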

ANOVA Analysis: Analysis of Variance
- The percentage of variation explained can be large or small
- Regardless of size, it may or may not be statistically significant
- To determine significance, use the ANOVA procedure
- Assumes normally distributed errors

ANOVA Analysis: Running ANOVA
- Easiest to set up the tabular method, like the method used in regression models, with only slight differences
- Basically, determine the ratio of the mean square of A (the parameters) to the mean squared error
- Then check against the F-table value for the appropriate degrees of freedom

ANOVA Analysis: ANOVA Table for One-Factor Experiments

    Component  Sum of Squares   % of Variation  Deg. of Freedom  Mean Square      F-Computed  F-Table
    y          SSY = Σ y_ij²                    N = ar
    ȳ          SS0 = Nµ²                        1
    y − ȳ      SST = SSY − SS0  100             N − 1
    A          SSA = r Σ α_j²   100·SSA/SST     a − 1            MSA = SSA/(a−1)  MSA/MSE     F[1−α; a−1, N−a]
    e          SSE = SST − SSA  100·SSE/SST     N − a            MSE = SSE/(N−a)

    s_e = sqrt(MSE)

ANOVA Analysis: ANOVA Procedure for Our Example

    Component  Sum of Squares  % of Variation  Deg. of Freedom  Mean Square  F-Computed  F-Table
    y          16.96                           16
    ȳ          16.58                           1
    y − ȳ      0.376           100             15
    A          .034            9.0             3                .011         0.394       2.61
    e          .342            91.0            12               .028

ANOVA Analysis: Interpretation of Sample ANOVA
- Done at the 90% level
- F-computed is .394
- Table entry at the 90% level with n = 3 and m = 12 degrees of freedom is 2.61
- Since .394 < 2.61, the servers are not significantly different
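The whole tabular ANOVA computation can be sketched in plain Python. The critical value F[0.90; 3, 12] = 2.61 is the slide's F-table lookup, hard-coded here rather than computed (to avoid assuming a SciPy dependency):

```python
# Example data from the "Sample Data" slide.
data = {
    "A": [0.96, 1.05, 0.82, 0.94],
    "B": [0.75, 1.22, 1.13, 0.98],
    "C": [1.01, 0.89, 0.94, 1.38],
    "D": [0.93, 1.02, 1.06, 1.21],
}
a, r = len(data), 4
N = a * r
obs = [y for col in data.values() for y in col]
mu = sum(obs) / N
ssy = sum(y * y for y in obs)
sst = ssy - N * mu ** 2
ssa = r * sum((sum(col) / r - mu) ** 2 for col in data.values())
sse = sst - ssa
msa = ssa / (a - 1)            # mean square of A
mse = sse / (N - a)            # mean squared error
f_computed = msa / mse         # ≈ 0.394
F_TABLE = 2.61                 # F[0.90; 3, 12], from the slide's table lookup
print(f_computed > F_TABLE)    # False: servers not significantly different
```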

Verifying Assumptions: One-Factor Experiment Assumptions
Analysis of one-factor experiments makes the usual assumptions:
- Effects of factors are additive
- Errors are additive
- Errors are independent of the factor alternatives
- Errors are normally distributed
- Errors have the same variance at all alternatives
How do we tell if these are correct?

Verifying Assumptions: Visual Diagnostic Tests
Similar to those done before:
- Residuals vs. predicted response
- Normal quantile-quantile plot
- Residuals vs. experiment number

Verifying Assumptions: Residuals vs. Predicted for Example

[Scatter plot of residuals (about −0.2 to 0.4) against predicted response (about 0.9 to 1.1).]

Verifying Assumptions: Residuals vs. Predicted, Slightly Revised

[Same scatter plot; in this alternate rendering, the predictions for server D are shown in blue so they can be distinguished from server C.]

Verifying Assumptions: What Does The Plot Tell Us?
- The analysis assumed the size of the errors was unrelated to the factor alternatives
- The plot tells us something entirely different: a very different spread of residuals for different alternatives
- Thus, one-factor analysis is not appropriate for this data
- Compare individual alternatives instead, using pairwise confidence intervals
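A numeric stand-in for the residuals-vs-predicted plot: the per-alternative standard deviation of the residuals makes the unequal spreads visible without plotting. This is a sketch using only the standard library:

```python
import statistics

# Example data from the "Sample Data" slide.
data = {
    "A": [0.96, 1.05, 0.82, 0.94],
    "B": [0.75, 1.22, 1.13, 0.98],
    "C": [1.01, 0.89, 0.94, 1.38],
    "D": [0.93, 1.02, 1.06, 1.21],
}
spread = {}
for name, col in data.items():
    mean = sum(col) / len(col)
    residuals = [y - mean for y in col]        # e_ij for this alternative
    spread[name] = statistics.pstdev(residuals)
print({k: round(v, 3) for k, v in spread.items()})
# A's residuals are far less spread out than C's (≈ 0.08 vs ≈ 0.19),
# which is the unequal-variance problem the plot reveals.
```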

Verifying Assumptions: Could We Have Figured This Out Sooner?
Yes!
- Look at the original data
- Look at the calculated parameters
- The model says C & D are identical
- Even a cursory examination of the data suggests otherwise

Verifying Assumptions: Looking Back at the Data

      A     B     C     D
    0.96  0.75  1.01  0.93
    1.05  1.22  0.89  1.02
    0.82  1.13  0.94  1.06
    0.94  0.98  1.38  1.21

    Parameters: −.076  .002  .037  .037

Verifying Assumptions: Quantile-Quantile Plot for Example

[Normal quantile-quantile plot of the residuals (−0.4 to 0.4) against normal quantiles (−2 to 2).]

Verifying Assumptions: What Does This Plot Tell Us?
- Overall, the errors are normally distributed
- If we had only done the quantile-quantile plot, we'd think everything was fine
- The lesson: test ALL assumptions, not just one or two

Verifying Assumptions: One-Factor Confidence Intervals
- Estimated parameters are random variables, so we can compute confidence intervals
- The basic method is the same as for confidence intervals on 2^k·r design effects
- Find the standard deviation of the parameters, and use that to calculate the confidence intervals
- Possible typo in the book, p. 336, example 20.6, in the formula for calculating α_j
- Also might be a typo on p. 335: the degrees of freedom is a(r − 1), not r(a − 1)

Verifying Assumptions: Confidence Intervals for Example Parameters

    s_e = .158
    Standard deviation of µ = .040
    Standard deviation of α_j = .069
    95% confidence interval for µ = (.932, 1.10)
    95% CI for α_1 = (−.225, .074)
    95% CI for α_2 = (−.148, .151)
    95% CI for α_3 = (−.113, .186)
    95% CI for α_4 = (−.113, .186)
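The intervals above can be reproduced from the slide's reported standard deviations. The multiplier t[0.975; 12] ≈ 2.179 is the standard t-table value for 12 degrees of freedom, hard-coded here to avoid assuming SciPy; small differences from the slide's printed bounds are rounding:

```python
# Values taken from the slide.
mu = 1.018
alphas = [-0.076, 0.002, 0.037, 0.037]   # effects for servers A-D
sd_mu, sd_alpha = 0.040, 0.069           # standard deviations of mu and alpha_j
t = 2.179                                # t[0.975; 12] from a t-table

ci_mu = (mu - t * sd_mu, mu + t * sd_mu)
ci_alphas = [(a - t * sd_alpha, a + t * sd_alpha) for a in alphas]
# Every alpha_j interval contains zero, consistent with the ANOVA finding
# that the servers are not significantly different.
print(all(lo < 0 < hi for lo, hi in ci_alphas))  # True
```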

Unequal Sample Sizes: Unequal Sample Sizes in One-Factor Experiments
- Don't really need identical replications for all alternatives
- Only slightly more difficult
- See the book's example for full details

Unequal Sample Sizes: Changes To Handle Unequal Sample Sizes
- The model is the same
- Effects are weighted by the number of replications for that alternative:

    Σ_{j=1..a} r_j α_j = 0

- Slightly different formulas for degrees of freedom
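A sketch of the weighted-effects condition: with unequal replication counts r_j, µ is still the grand mean over all observations and α_j is the column mean minus µ, so the effects satisfy Σ_j r_j·α_j = 0 by construction. The data here are the example measurements with one observation hypothetically dropped from server D to make the sample sizes unequal:

```python
# Example data, with one observation dropped from D (hypothetical).
data = {
    "A": [0.96, 1.05, 0.82, 0.94],
    "B": [0.75, 1.22, 1.13, 0.98],
    "C": [1.01, 0.89, 0.94, 1.38],
    "D": [0.93, 1.02, 1.06],        # r_D = 3: unequal sample size
}
obs = [y for col in data.values() for y in col]
mu = sum(obs) / len(obs)                                  # grand mean
alphas = {k: sum(col) / len(col) - mu for k, col in data.items()}
weighted_sum = sum(len(data[k]) * alphas[k] for k in data)
print(abs(weighted_sum) < 1e-9)   # True: weighted effects sum to zero
```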