Vendor Performance Evaluation
Canadian Public Procurement Council, Oct. 27-29, 2014

Agenda
1) Objectives and Development
2) Evaluation Design
3) Evaluation Templates and Scores
4) Use of Vendor Score
Objectives and Development
The City procures over $500 million yearly in construction. In 2015 the City is launching a performance evaluation system for construction contracts as a result of:
- Council direction.
- An Auditor General recommendation in the Construction Services Audit.
- Best practices.
Evaluations and scores will be stored in an online database.

Objectives
The objectives of performance evaluation are to:
- Improve communication between vendors and the City.
- Provide feedback with the goal of performance excellence.
- Create and track standard Key Performance Indicators (KPIs).
- Support the contract administration process used to address non-performance.
Development
Reviewed reports and evaluation systems in use:
- Federal: PWGSC, Defence Construction Canada
- Provincial: MTO, Infrastructure Ontario
- Municipal: Calgary, Toronto, Hamilton, Oakville, Mississauga
- Additional: Office of the Procurement Ombudsman 2010/14 Reports; CMC Canada 2012 VPM Study

Development
Consulted with numerous industry organizations:
- Contractors: Ottawa Construction Association / Canadian Construction Association; General Contractors Association of Ottawa; National Capital Heavy Construction Association
- Consultants: Consulting Engineers of Ontario; Ontario Association of Architects / Ottawa Regional Society of Architects; Ontario Association of Landscape Architects
The construction industry was supportive of an evaluation system.
Basic Outline
Vendors are notified up front on eligible contracts. Operations departments go through the evaluation process:
- Start-up meeting to discuss expectations.
- Progress meetings to discuss performance to date and areas for improvement.
- Interim evaluations may be used to help give feedback.
- Debriefing: the formal evaluation is discussed at project completion.
- Appeal: vendors may request a score review.

Evaluation Design
A detailed process is being built with a focus on consistency and fairness. The design has incorporated key industry feedback:
1) Industry wanted better-defined scoring criteria.
- Standard templates describe each criterion and performance level.
- A start-up meeting will further clarify the criteria as they relate to project-specific expectations.
Evaluation Design
2) Industry wanted multiple controls to ensure consistency and fairness.
- Project Managers (the evaluators) will receive technical and process training.
- All evaluations are reviewed and approved by higher-level managers.
- Evaluations are based on supporting documentation.
- More rigorous reviews apply to very high scores and those below Satisfactory.
- Trends will be monitored internally.

Evaluation Design
3) Industry wanted to give written feedback in the online system prior to approving or appealing the evaluation.
- The system enables open-text vendor feedback.
4) Industry wanted a chance to review and discuss the evaluation prior to it being entered into the system.
- The performance evaluation will be a mandatory topic at monthly update meetings and at project completion. Feedback will be ongoing.
Evaluation Design
5) Industry wanted the evaluation to reflect certain extra costs and delays outside their control.
- Vendors will not be penalized for things outside their control (e.g., requested scope changes).
- Regular communication and documentation will ensure these issues are addressed at the time of occurrence and reflected as such in the evaluation.

Evaluation Design
6) Industry wanted an appeals process.
- Appeals are incorporated into the evaluation process.
- Vendors are given a fair period (approximately 15 calendar days) to file an appeal.
- The evaluation score is suspended until a final decision is rendered.

General contractors, design consultants, and contract administrators all contribute to the success of a construction project. All three are subject to evaluation to help ensure consistency and fairness.
KPI Templates
There are 3 templates: general contractor, design consultant, and contract administration. A rating guide describes each KPI with 5 performance levels and a corresponding rating:

Performance Level (Highest to Lowest) | KPI Rating (% of total possible points)
Outstanding                           | 100%
Commendable                           |  85%
Satisfactory                          |  70%
Needs Improvement                     |  50%
Not Satisfactory                      |  25%

General Contractor Template
8 KPIs for general contractors with the following point weightings (a worked scoring sketch follows below):

Key Performance Indicator        | Points
Overall Project Management       |  15
Supervision                      |  10
Quality                          |  15
Health and Safety                |  15
Cooperation and Client Relations |  10
Cost Control                     |  10
Site Management                  |  10
Schedule Management              |  15
Total                            | 100
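To make the arithmetic concrete, here is a minimal sketch (illustrative, not from the deck) of how per-KPI ratings roll up into a total evaluation score. The point weightings come from the general contractor template above and the rating percentages from the rating guide; the sample ratings in the example are invented.

```python
# Sketch of KPI scoring: each KPI's points are multiplied by the rating
# percentage for the assessed performance level, then summed to /100.
RATING_GUIDE = {
    "Outstanding": 1.00,
    "Commendable": 0.85,
    "Satisfactory": 0.70,
    "Needs Improvement": 0.50,
    "Not Satisfactory": 0.25,
}

# Point weightings from the general contractor template (total = 100).
GC_TEMPLATE = {
    "Overall Project Management": 15,
    "Supervision": 10,
    "Quality": 15,
    "Health and Safety": 15,
    "Cooperation and Client Relations": 10,
    "Cost Control": 10,
    "Site Management": 10,
    "Schedule Management": 15,
}

def evaluation_score(ratings: dict[str, str], template: dict[str, int]) -> float:
    """Total score out of 100: sum of points x rating % for each KPI."""
    return sum(points * RATING_GUIDE[ratings[kpi]]
               for kpi, points in template.items())

# Invented sample ratings, for illustration only.
sample = {kpi: "Commendable" for kpi in GC_TEMPLATE}
sample["Health and Safety"] = "Outstanding"
sample["Cost Control"] = "Satisfactory"
print(evaluation_score(sample, GC_TEMPLATE))  # 85.75
```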
General Contractor Template
Sample general contractor KPIs: [two template excerpts shown on slides; images not reproduced]
Design Consultant Template
6 KPIs for design consultants with the following point weightings:

Key Performance Indicator     | Points
Overall Project Management    |  20
Schedule Management           |  15
Budget Management             |  20
Quality of Design             |  20
Issue and Risk Management     |  15
Communication and Cooperation |  10
Total                         | 100

Design Consultant Template
Sample design consultant KPI: [template excerpt shown on slide; image not reproduced]
Design Consultant Template
Sample design consultant KPI: [template excerpt shown on slide; image not reproduced]

Contract Administration Template
8 KPIs for contract administrators with the following point weightings:

Key Performance Indicator        | Points
Overall Project Management       |  15
Schedule Monitoring              |  15
Cost Control                     |  15
Technical Support                |  10
Oversight of Contract Compliance |  15
Issue and Risk Management        |  10
Communication and Cooperation    |  10
Records Management               |  10
Total                            | 100
Contract Administration Template
Sample contract admin KPIs: [two template excerpts shown on slides; images not reproduced]
Individual evaluation scores and an Overall Vendor Score (OVS) will be tracked. The OVS is a weighted average of evaluation scores over the last 3 years, where Year 3 refers to the most recent year (a rolling approach also used by MTO and IO). Low scores will have supporting comments and documentation. A hedged sketch of the calculation follows after the table below.

Individual evaluations and OVS scores fall into 5 performance levels:

Performance Level | Score Range
Outstanding       | 90-100%
Commendable       | 80-89%
Satisfactory      | 70-79%
Needs Improvement | 50-69%
Not Satisfactory  | <50%

Vendors with an OVS below Satisfactory (highlighted in yellow on the slide) will be monitored more closely.
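The deck does not reproduce the year weights in this text, so the sketch below assumes illustrative weights (50/30/20, most recent year heaviest); the actual MTO/IO-style weights may differ. The performance-level thresholds come from the table above.

```python
# Sketch of an Overall Vendor Score (OVS) as a 3-year weighted average.
# ASSUMPTION: the 50/30/20 year weights are illustrative only; the deck's
# actual weights (per the MTO/IO approach) are not reproduced in this text.
YEAR_WEIGHTS = {3: 0.50, 2: 0.30, 1: 0.20}  # Year 3 = most recent year

def overall_vendor_score(yearly_scores: dict[int, list[float]]) -> float:
    """Weighted average of each year's mean evaluation score (0-100)."""
    total, weight_used = 0.0, 0.0
    for year, weight in YEAR_WEIGHTS.items():
        scores = yearly_scores.get(year, [])
        if scores:  # skip years with no completed evaluations
            total += weight * (sum(scores) / len(scores))
            weight_used += weight
    return total / weight_used if weight_used else 0.0

def performance_level(score: float) -> str:
    """Map a score (0-100) to the 5 performance levels from the deck."""
    for floor, level in [(90, "Outstanding"), (80, "Commendable"),
                         (70, "Satisfactory"), (50, "Needs Improvement")]:
        if score >= floor:
            return level
    return "Not Satisfactory"

# Invented example: two evaluations in Year 3, one in each earlier year.
ovs = overall_vendor_score({3: [88, 92], 2: [75], 1: [70]})
print(round(ovs, 1), performance_level(ovs))  # 81.5 Commendable
```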
Rationale for Using Past Performance
The initial lowest-priced bidder isn't always the best value, or even the lowest cost in the end (cost overruns come later). The City's mandate is to get best value on purchases; best value is the optimal balance of performance and cost. Evaluations reflect the vendor's ability to perform: deliver good quality and service, meet deadlines, control costs, etc.

Use of Vendor Score - General Contractors
The Overall Vendor Score will (in the future) be a criterion in tender evaluations.
- The City may bypass the lowest bidder for a better-performing contractor.
- This reduces the pressure to make trade-offs that comes from competing on price alone.

Tender Evaluation Example:
Price                  /70
Vendor Score (OVS)     /30
Total Evaluation Score /100
Use of Vendor Score - General Contractors

             | Bid          | Vendor Score
Contractor 1 | $1.0 million | 60%
Contractor 2 | $1.1 million | 90%

Price Component (/70 points):
- Points are awarded based on deviation from the lowest bid.
- The lowest responsive bid gets full points.
- Other bids lose points relative to how much they exceed the lowest bid.

Use of Vendor Score - General Contractors
Price Component Calculation (/70 points):
- Contractor 1: $1.0M is the lowest bid, so it gets 70 points.
- Contractor 2: $1.1M is 10% over the lowest bid, so it gets 63 points (10% fewer points).

Vendor Score Component (/30 points): Vendor Score (OVS) x total possible points.
- Contractor 1: 60% x 30 = 18 points.
- Contractor 2: 90% x 30 = 27 points.

A sketch of the full calculation follows below, ahead of the results.
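To tie the two components together, here is a minimal sketch (illustrative, not from the deck) of the tender scoring just described. The proportional price formula is inferred from the worked example (a bid 10% over the lowest loses 10% of the price points); the deck's exact formula may differ.

```python
# Sketch of the /70 price + /30 vendor-score tender evaluation.
# ASSUMPTION: the price formula is inferred from the deck's example
# (a bid 10% over the lowest loses 10% of the 70 price points).
PRICE_POINTS, OVS_POINTS = 70, 30

def price_points(bid: float, lowest_bid: float) -> float:
    """Lowest responsive bid gets full points; other bids lose points
    in proportion to how far they exceed the lowest bid."""
    excess = (bid - lowest_bid) / lowest_bid
    return max(0.0, PRICE_POINTS * (1 - excess))

def total_score(bid: float, lowest_bid: float, ovs: float) -> float:
    """Price component plus OVS x total possible vendor-score points."""
    return price_points(bid, lowest_bid) + ovs * OVS_POINTS

bids = {"Contractor 1": (1_000_000, 0.60), "Contractor 2": (1_100_000, 0.90)}
lowest = min(bid for bid, _ in bids.values())
for name, (bid, ovs) in bids.items():
    print(name, round(total_score(bid, lowest, ovs)))
# Contractor 1: 70 + 18 = 88 points
# Contractor 2: 63 + 27 = 90 points (best value winner)
```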
Use of Vendor Score - General Contractors
Results:

                                 | Bid                   | Vendor Score | Evaluation Score
Contractor 1                     | $1.0 million (70 pts) | 60% (18 pts) | 88 points
Contractor 2 (BEST VALUE WINNER) | $1.1 million (63 pts) | 90% (27 pts) | 90 points

Contractor 2 bid higher but still wins given their much better Overall Vendor Score.

Use of Vendor Score - Consultants
The Overall Vendor Score would be included as an additional criterion in Requests for Proposals for design and contract administration services. A sensitivity analysis was conducted to determine the weighting for this criterion in each RFP; in practice the weight can also be fixed.

RFP Weighting Analysis: From a sample of City engineering RFPs (80% technical portion, 20% financial), the average score difference between the winner and the runner-up is 8.3%.
Use of Vendor Score - Consultants
Scenario: Proponent A has full points and an 8.3% lead going into the vendor score component. Proponent A has an OVS of 60%. What OVS would runner-up Proponent B need to win the RFP?

Proponent A RFP Evaluation (/100 points):

Weighting for Vendor Score | Technical + Financial Points | Vendor Score Points | RFP Rating
10                         | 90/90                        | 60% x 10 = 6        | 96/100
25                         | 75/75                        | 60% x 25 = 15       | 90/100
50                         | 50/50                        | 60% x 50 = 30       | 80/100

Use of Vendor Score - Consultants
Proponent B RFP Evaluation (/100 points):

Weighting for Vendor Score | Technical + Financial Points | Points Needed to Tie Proponent A | OVS Needed to Win
10                         | 82.5/90                      | 13.5                             | Not Possible
25                         | 68.8/75                      | 21.2                             | >84.9%
50                         | 45.9/50                      | 34.2                             | >68.3%

Observation: At low weightings, only a significantly better performer (a Proponent B with a much higher OVS) could still win the RFP. With higher weightings, small differences have a larger impact. A sketch of this breakeven calculation follows below.
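Here is a minimal sketch (illustrative, not from the deck) reproducing the sensitivity table: for each vendor-score weighting w, Proponent B trails by 8.3% on the (100 - w)-point technical + financial portion, and we solve for the OVS that ties Proponent A's total. The computed values match the slide's table to within rounding.

```python
# Sketch of the RFP sensitivity analysis: what OVS does runner-up
# Proponent B need to beat Proponent A for a given vendor-score weighting?
LEAD = 0.083   # B trails A by 8.3% on the technical + financial portion
OVS_A = 0.60   # Proponent A's Overall Vendor Score

def breakeven_ovs(weighting: float) -> float | None:
    """OVS (as a fraction) Proponent B needs to tie A; None if impossible."""
    tech_max = 100 - weighting               # technical + financial points
    a_total = tech_max + OVS_A * weighting   # A has full technical points
    b_tech = tech_max * (1 - LEAD)           # B trails by the 8.3% gap
    needed = a_total - b_tech                # vendor-score points B must earn
    return needed / weighting if needed <= weighting else None

for w in (10, 25, 50):
    ovs = breakeven_ovs(w)
    print(w, "Not possible" if ovs is None else f">{ovs:.1%}")
# 10 -> Not possible (B would need ~13.5 of 10 available points)
# 25 -> >84.9%
# 50 -> >68.3%
```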
Use of Vendor Score - Consultants
In practice, it is ideal to apply the vendor performance criterion after the initial pass/fail technical component. Placing it inside the technical component means losing control over the technical pass/fail threshold, and the technical rating is also intended to be project-specific.

Wider application of vendor scores in contract awards requires a more supported and detailed evaluation system to be defensible. Consider phasing it in only once the system is established.

Timeline - Next Steps
- Oct.-Dec. 2014: Providing industry updates and collecting ongoing feedback.
- Nov. 2014: System development completed.
- Dec. 2014: System acceptance testing conducted. Internal training begins. Communication to vendors to register on the MERX system.
Timeline - Next Steps
- Jan. 2015: Vendor training offered online and in 2 sessions.
- Throughout 2015: Feedback on the process in action collected through industry meetings.
- Jan. 2016: Formal review conducted to determine when the score can be phased into bid evaluations.

QUESTIONS?