
MSU Scorecard Analysis - Executive Summary #1

MSU Center for Construction Project Performance Assessment and Improvement (C2P2ai)
Funding provided by the MSU Office of the Vice President for Finance and Operations
March 6, 2009

Tim Mrozowski, A.I.A., Professor
Tariq Abdelhamid, PhD, Associate Professor
Samarth Jain, Graduate Research Assistant
Don Schafer, Graduate Research Assistant
Sagata Bhawani, Graduate Research Assistant
Jeremiah Williams, Graduate Research Assistant
Kim Nguyen, Graduate Research Assistant

Construction Management, School of Planning, Design and Construction
Michigan State University

Introduction and Project Overview

This report outlines the analysis of the MSU Construction Vendor Scorecard undertaken by the Center for Construction Project Performance Assessment and Improvement (C2P2ai), School of Planning, Design and Construction, Michigan State University. The analysis is part of a larger C2P2ai project, Vendor Performance Assessment Methods, which began in June 2007. The purpose of the vendor performance research was to survey the literature on current research and applications of vendor performance assessment methods in the construction industry and to assess the status quo of the methods implemented by Michigan State University. The research was an exploratory study intended to support the development of a vendor performance scorecard for MSU construction projects through the collaborative effort of the research team and members of CPA and EAS. It investigated contractor/vendor performance measurement systems employed by universities, primarily Michigan State University, Kent State, the University of Connecticut, and the University of Colorado at Boulder.

MSU Scorecard

Michigan State University developed its own vendor performance scorecard based on models used in other industries; its framework is a modified version of the scorecard used by Intel. During the literature review, the research team also reviewed the MSU scorecard and gave its recommendations to the University. The modified scorecard was subsequently implemented by MSU's construction representatives; the effects of the modifications have yet to be measured by the research team.

The MSU scorecard comprises five critical success factors that MSU considers important to the success of its construction projects: quality, schedule, cost, management system, and close-out. These success factors form the five sections on which a general contractor's performance is assessed. Each success factor has been assigned a relative weight reflecting its contribution to overall project success, established through discussion among the construction management staff of the Physical Plant and review of the performance of previous construction projects. For instance, quality carries the maximum weight, 30% of the overall performance assessment, implying that quality contributes the most to the success of MSU's construction projects. Under each success factor are multiple performance indicators that collectively define it. Quality, for example, is defined by: a) timely closure of rework items, b) interruptions to operations, c) responsiveness to MSU comments/feedback, and d) a qualitative quality rating assessment. The reviewer measures each indicator either qualitatively or quantitatively and assigns a score from 1 to 4, with 1 representing poor performance and 4 representing excellent performance. For each performance indicator, the scorecard provides brief guidelines directing the reviewer to relevant data before scoring.
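To make the weighted roll-up concrete, here is a minimal sketch of how an overall score could be computed from indicator scores. Only the 30% quality weight and the 1-to-4 scale come from the report; the remaining weights, the indicator scores, and the function itself are illustrative assumptions.

```python
# Minimal sketch of the scorecard roll-up described above. Only the 30%
# quality weight and the 1-4 scale are stated in the report; the other
# weights and all indicator scores below are illustrative placeholders.

# Relative weights of the five critical success factors (sum to 1.0).
WEIGHTS = {
    "quality": 0.30,            # stated in the report
    "schedule": 0.20,           # assumed
    "cost": 0.20,               # assumed
    "management_system": 0.15,  # assumed
    "close_out": 0.15,          # assumed
}

def overall_score(indicator_scores: dict[str, list[float]]) -> float:
    """Average the 1-4 indicator scores within each success factor,
    then combine the factor averages using the relative weights."""
    total = 0.0
    for factor, weight in WEIGHTS.items():
        scores = indicator_scores[factor]
        total += weight * (sum(scores) / len(scores))
    return total

# Example: reviewer scores per factor (1 = poor, 4 = excellent).
example = {
    "quality": [3, 2, 4, 3],  # e.g., indicators a) through d) above
    "schedule": [2, 2, 3],
    "cost": [3, 2],
    "management_system": [4, 3, 3],
    "close_out": [2, 3, 2],
}
print(f"Overall score: {overall_score(example):.2f} out of 4")
```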

This scorecard is primarily used to assess general contractors' performance on construction projects, which will in turn establish benchmarks for contractor prequalification on MSU construction projects. For a typical construction project of up to $5 million, the scorecard is used at two stages of the project: 1) substantial completion and 2) final payment to the general contractor. For projects worth more than $5 million with a duration of more than six months, the scorecard is used quarterly to obtain better feedback for improving performance during project execution. The scorecard has recently been implemented on construction projects and was completed by MSU's construction representatives.

Contractor Scorecard Application and Results

The EAS group evaluated the performance of 40 contractors using the MSU contractor scorecard, based on work on 120 projects. The results were analyzed, and the following box plot was produced as an aggregation of all the results. A detailed analysis of the scorecards will appear in the final report of the Vendor Performance Assessment Methods project. The line in the box plot connects the mean scores in each category. The graph indicates that, in MSU's assessment, the contractors in this sample underperform most on scheduling and least on project management. The average total score was 2.57 (out of four) with a standard deviation of ±0.64.

[Figure 1. Box plot of the scores received in each of the scorecard's five main dimensions]
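As a rough illustration of how a Figure 1-style box plot can be produced, the sketch below uses matplotlib with randomly simulated category scores; the data are stand-ins, not the actual EAS evaluations.

```python
# Sketch of a Figure 1-style box plot with a line through the category
# means. The scores are randomly simulated stand-ins, not the EAS data.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(seed=1)
categories = ["Quality", "Schedule", "Cost", "Mgmt System", "Close-out"]
# One array of 1-4 scores per scorecard dimension (simulated).
scores = [np.clip(rng.normal(2.6, 0.6, 120), 1, 4) for _ in categories]

fig, ax = plt.subplots()
ax.boxplot(scores)
ax.set_xticks(range(1, len(categories) + 1))
ax.set_xticklabels(categories)
# Connect the mean score of each category, as in the report's figure.
means = [s.mean() for s in scores]
ax.plot(range(1, len(categories) + 1), means, marker="o")
ax.set_ylabel("Score (1 = poor, 4 = excellent)")
ax.set_title("Scores by scorecard dimension (simulated)")
plt.show()
```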

The in-depth analysis of each performance category in the contractor scorecard revealed the following trends, as collectively viewed by EAS staff:

Quality
o On average, contractors performed better at avoiding interruptions to other operations on site than at timely completion of rework items.

Scheduling
o This was one of the worst-performing aspects. Contractors' performance in meeting project schedules and generating schedule reports was average; performance against milestones was slightly better.
o The poor performances in this category were addressed after EAS staff held discussions with other project partners.

Cost
o Contractors seem to perform better at preventing cost escalation from project changes than at documenting and processing those changes. This may result from a lack of communication between owners and contractors on the requirements for change document processing.

Project Management
o Contractors' performances were the best in this category compared with all other categories. MSU seemed most satisfied with the contractors' commitment and responsiveness to MSU.
o Contractors were rated slightly above average on project coordination and on managing RFIs and other documents.

Project Closeout
o Despite some very poor performances in this category, MSU rated the contractors' performance above average overall.
o The results indicated that performance could be improved by using commissioning on projects to achieve better results with warranty and other equipment issues.
o Punch list items and completion of closeout documentation were mentioned as items requiring more attention from project participants.

It is important to note that evaluating performance must consider both the average value and the standard deviation of a data set. The standard deviation (sigma) is (or should be) invariably associated with the calculation of the mean (average) value for a particular data set. Reporting sigma with the mean indicates how the data points vary around the mean. This is important because the mean value alone is misleading, as demonstrated by the brilliant analogy of the person who had both feet in a hot oven and head in a bucket of ice but was, on average, doing OK (Fellows and Liu 2003).
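The point about sigma can be illustrated with a small example: two invented score sets share the same mean but have very different spreads, so reporting the mean alone would hide the difference.

```python
# Two invented sets of 1-4 scores with identical means but very
# different spreads, illustrating why sigma belongs next to the mean.
from statistics import mean, stdev

consistent = [2.4, 2.5, 2.6, 2.5, 2.5]
erratic = [1.0, 4.0, 1.2, 3.8, 2.5]

for name, data in [("consistent", consistent), ("erratic", erratic)]:
    print(f"{name}: mean = {mean(data):.2f}, sigma = {stdev(data):.2f}")
# consistent: mean = 2.50, sigma = 0.07
# erratic: mean = 2.50, sigma = 1.40
```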

The following graph indicates the distribution and frequency of scores in the Total Scores category. The values do not represent the score of any particular company but the collective scores of all companies evaluated by MSU. Figure 2 illustrates that the overall performance of the 16 contractors evaluated by MSU varies significantly; the distribution in fact tends toward bimodal, with two pronounced peaks around scores of 1.9 and 3, respectively.

[Figure 2. Average Total Performance Scores (frequency distribution). Summary statistics: Mean 2.57, Median 2.62, Mode 2.00, Stdev 0.64, Min 0.94, Max 3.94]
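A histogram like Figure 2 could be rebuilt along the following lines; the scores here are simulated to mimic the roughly bimodal shape noted above (peaks near 1.9 and 3.0) and are not the actual MSU data.

```python
# Sketch of a Figure 2-style frequency histogram of average total
# scores. The data are simulated to mimic the bimodal shape described
# in the text; they are not the actual MSU evaluations.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(seed=2)
low_group = rng.normal(1.9, 0.3, 20)   # cluster of weaker performers
high_group = rng.normal(3.0, 0.3, 20)  # cluster of stronger performers
totals = np.clip(np.concatenate([low_group, high_group]), 0, 4)

fig, ax = plt.subplots()
ax.hist(totals, bins=15, edgecolor="black")
ax.set_xlabel("Average total score (out of 4)")
ax.set_ylabel("Frequency")
ax.set_title("Average total performance scores (simulated)")
plt.show()
```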

Owner Scorecard Application and Results

The second segment of the report discusses the evaluation of MSU as an owner by different contractors across a range of construction projects, including renovation, new construction, and maintenance projects. The evaluation covered a total of 16 projects spread among 11 contractors. The histogram in Figure 3 shows the distribution and frequency of the average total scores MSU received from its contractors. The values do not reflect the performance or score of any particular MSU employee but of the whole MSU team. The average total score was 3.33 (out of four) with a standard deviation of ±0.38.

[Figure 3. Frequency distribution of the average scores received for the owner performance]

Overall, MSU is rated in the top 50% of the owners the contractors have worked with on quality, cost, scheduling, management systems, and project closeout. A detailed analysis of the scorecards will appear in the final report of the Vendor Performance Assessment Methods project. It is worth noting the following:

Quality aspects
o On a majority of occasions, MSU performed fairly, on average, in the design management category and in communicating design expectations.
o The quality of bidding documents was rated fair on average, implying that MSU does a good job of preparing bid documents and answers most questions to clear up ambiguity.
o MSU received high average ratings in the understanding-work category, its highest-rated area within quality, reflecting MSU's standing as an experienced and knowledgeable owner.
o MSU is also rated a good owner in interacting with contractors and responding to their feedback and suggestions.

Under the scheduling category:
o MSU fared as a fair owner. Apart from a few instances, MSU is rated, on average, a fair owner in making timely decisions that allow contractors to keep to their schedules.
o Contractors also rated highly MSU's ability to self-perform work on time, allowing the contractors to stay on schedule.
o MSU is rated a fair owner in estimating the initial schedule; the majority of contractors think MSU specifies a reasonable schedule.

In the Cost category, MSU is shown to be an inconsistent performer.
o With the exception of a few instances, the majority of contractors think MSU does a good job of paying invoices within 30 days.
o MSU is rated an above-fair owner in evaluating change requests and handling any disputes fairly.
o The time taken to process changes is MSU's worst-performing area as an owner.

The Management Systems category is the highest-performing category for MSU as an owner.
o On average, MSU performs between fair and good on all performance indicators: acting in a fair and reasonable manner during the project, setting and enforcing contract terms fairly and reasonably, communicating effectively among MSU's construction teams so the contractor can work unhindered, and enforcing a fair procurement process (almost all contractors rated MSU a good owner in this last area).

In the closeout category, MSU is rated an above-fair owner in all subcategories.
o MSU is rated above fair, on average, in the punch list category, implying that it promptly identifies punch list items and inspects completed work.
o With the exception of one instance, contractors also rated MSU fair to good at closing out the project and releasing final payment on time.

The scorecards received by the C2P2ai team for vendor performance and for MSU performance belonged to separate projects. It would be helpful to rate MSU as an owner and the vendors on the same projects, so that the perspectives of the different parties could be understood by comparing their ratings of each performance indicator on the same project.

The analysis in this report reveals that there is certainly room for improvement in both vendor and MSU performance on construction projects. Improvement initiatives are being investigated in the form of joint research projects between C2P2ai, CPA, the Physical Plant, and HFS. In general, accumulating scorecard data over time will help MSU establish a baseline for benchmark performance on its construction projects, whether for MSU's own performance or that of its vendors.

References

Alarcón, L., et al. (2000). "Learning from Collaborative Benchmarking in the Construction Industry." Chilean Chamber of Construction, Santiago, Chile.

Attalla, M., et al. (2003). "Reconstruction of the Building Infrastructure: Two Performance Prediction Models." Journal of Construction Engineering and Management, Vol. 9, No. 1, March 2003.

Ballard, H. (2000). "Last Planner System of Production Control." Doctoral thesis, University of Birmingham, May 2000.

Bassioni, H., et al. (2004). "Performance Measurement in Construction." Journal of Management in Engineering, Vol. 20, No. 2, April 2004.

Fellows, R., and Liu, A. (2003). Research Methods for Construction. 2nd Edition, Blackwell Publishing, Malden, MA.

Mitropoulos, P. (2003). "'Planned Work Ready': A Proactive Metric for Project Control." Webb School of Construction, Arizona State University.

Nassar, N. (2005). "An Integrated Framework for Evaluation, Forecasting and Optimization of Performance of Construction Projects." University of Alberta, Canada.