Sound Transit Internal Audit Report - No. 2014-3

IT Project Management
Report Date: Dec. 26, 2014

Table of Contents
Background
Audit Approach and Methodology
Summary of Results
Findings & Management Response
Appendix: Table summarizing process maturity assessments

Audit Timeline
Audit Entrance Meeting: May 6, 2014
First draft report: Dec. 26, 2014
Audit Exit Meeting: Jan. 6, 2015
Management Technical Review received: Jan. 6, 2015
Revised report issued: Jan. 6, 2015
Final Management responses received: Feb. 20, 2015
Final report issued: Feb. 20, 2015
Presented to Audit & Reporting Committee: Mar. 19, 2015

Background

The Sound Transit Information Technology Division utilizes a project management process modeled after the Sound Transit Phase Gate process. Phase Gate is a process enabling project reviews to determine the readiness of projects to advance to the next stage. Each stage is a gate that represents key transition and/or decision points in a project's progression through planning, execution, closure and production. The process provides the Technology Governance Team (TGT) [1] with visibility into projects and control over key project decisions, specifically scope, schedule, and budget.

In the third quarter of 2012, a new Project Management Office (PMO) organizational structure was implemented. A PMO Manager was hired and the following changes were made to the IT project management process:
1. Employees assigned as project managers directly report to the PMO Manager.
2. Implementation of Primavera to track IT projects' scope, schedule and budget.
3. Development of a standard weekly project status report for distribution to all project participants.
4. Development of a Sound Transit IT PMO Integrated Project Delivery Methodology.
5. Creation of a standardized presentation to the TGT, which includes budget information (completed in June 2013).
6. Creation of the IT PMO Project Work Products Workbook report, which is completed for each project by the project manager during the life of the project (completed in August 2013).

IT management selected the COBIT 5 framework [2] in 2013 and is currently integrating it into division strategic planning and performance monitoring. COBIT 5 defines business processes and describes a manageable and logical structure of internal controls to monitor the efficiency and effectiveness of those processes.
The IT Project Management process is covered by COBIT process BAI01 (Manage Programs and Projects), which describes a methodology to ensure IT projects align with enterprise strategy, and how to initiate, plan, control, execute and close out projects, including performing a post-implementation review.

Audit Approach & Methodology

Internal Audit approached the IT Project Management audit by gaining an understanding of the PMO process described above. Additionally, IA reviewed the IT Management Control Phase One audit issued on June 27, 2013. That audit identified changes made to the governance of the IT Division. We met with IT Division management to discuss audit scope, objectives and timing, and to obtain a general knowledge of current practices. Based on analysis of the data gathered and discussion with management, the following objectives were developed:
1. Evaluate the maturity of the IT project management program using the COBIT 5 model.
2. Determine whether projects are appropriately documented and reported to the Technology Governance Team (TGT) during the Phase Gate process with regard to scope, schedule and budget.
3. Evaluate the Project Management Office's use and reporting of performance metrics.

[1] The TGT is a cross-agency team, jointly accountable for shared plans that are optimized to support the agency's top priorities:
1. Approving ST's IT Business Plan and budget
2. Managing ST's IT Investment Portfolio, especially the top 20 projects
3. Approving IT Business Cases at Phase Gates 2 and 3
4. Overseeing the IT department scorecard
5. Nominating and overseeing Business and IT liaisons
Source: Technology Governance Team charter

[2] COBIT (an acronym for "Control Objectives for Information and Related Technology") describes a set of processes for managing information technology. The processes are defined; inputs and outputs are identified along with key activities, objectives, performance measures and a maturity model. This framework supports governance by aligning business goals with IT goals and processes.

During the fieldwork phase of the audit, all collected information was examined, including the PMO methodology, performance metrics, TGT presentations and project management documents. All information collected was used to formulate conclusions and recommendations.

The final phase was reporting. All information was summarized and organized, preliminary results were communicated to management, findings were clarified, and conclusions and recommendations were presented. The report was provided to appropriate Sound Transit personnel for review and comment, and was then revised to include the required management responses.

The Internal Audit Division conducted this audit in accordance with generally accepted government auditing standards. These standards require that the audit is adequately planned, performed, and supervised, and that sufficient, appropriate evidence is obtained to provide a reasonable basis for the findings and conclusions contained herein.

Summary of Results

Internal Audit performed procedures to test each objective. The following is a summary of the test results:

Objective One: Evaluate the maturity of the IT project management program using the COBIT 5 model.

Summary of Results: COBIT 5 has five subject areas used to capture the main functions of an IT department, and 37 processes that identify the risks and internal controls needed to manage them. These processes were reviewed, and it was determined, and confirmed with management, that nine of them pertain to project management. Six of the nine project management processes reviewed received a rating of three (out of five), although the rating was only partially achieved. A rating of three means that controls within the process are in place and documented; the rating was partial because the PMO does not self-monitor compliance. The remaining three project management processes received a rating of two, also qualified as partially achieved. A rating of two means that controls within the process are in place but not documented. (Refer to Appendix A: Table summarizing process maturity assessments.)

The ratings in the current assessment are higher than the IT Division's maturity self-assessment performed in 2013, indicating that the division made improvements in documenting the processes involved in project approval, project management, project status updates and project reviews. Because these changes were made recently, the effectiveness of the new policies and procedures could not be evaluated in this engagement. (Refer to Finding Five)

Objective Two: Determine whether projects are appropriately documented and reported to the Technology Governance Team (TGT) during the Phase Gate process with regard to scope, schedule and budget.

Summary of Results: Six projects were evaluated to address this objective, four of which were completed. The project management process went through changes during the period in which the projects tested were undertaken; therefore, not all decisions were documented as described in current PMO policies.

The IT SCADA Program and Enterprise Asset Management System (EAMS) projects did not fully adhere to the project management process. Actual costs for the IT SCADA Program have not been provided to the TGT. The budget for the EAMS project went through significant changes before Gate 3 approval, and no documented explanation was provided to the TGT. The schedule for the project changed significantly after the baseline was created, and the status updates to the TGT did not include detailed explanations. (Refer to Finding Two)

The scope, schedule and budget for three of the four completed projects tested were identified during the initial phase of the projects, and regular status updates were provided to the project planning team and the TGT. However, the status updates did not provide enough detail to allow the TGT to adequately monitor the projects. Two of these projects were not delivered on time and within budget, and these issues were not reported to the TGT at project closeout. (Refer to Finding One)

Formal post-implementation reviews were not performed, and project justification was not compared to the actual benefits realized from the project. (Refer to Finding Three)

Objective Three: Evaluate the Project Management Office's use and reporting of performance metrics.

Summary of Results: The Project Management Office currently uses a single stoplight performance metric to report status, and the metric is only reported for projects in the execution

phase. The stoplight metric is not defined with regard to scope, schedule and budget. Further, the metric is reported as an aggregate of all projects, which has the potential to not adequately alert management to project delivery problems in a timely manner. Additional performance metrics are available that would give management a better understanding of project status. (Refer to Finding Four)
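The aggregation concern above can be illustrated with a small sketch. The thresholds, field names and rating logic below are hypothetical assumptions for illustration only, not the PMO's actual reporting rules:

```python
# Hypothetical sketch of a per-dimension stoplight rating.
# Thresholds and field names are illustrative assumptions.

def variance_status(actual, baseline, yellow=0.05, red=0.10):
    """Rate one dimension by its variance from the approved baseline."""
    if baseline == 0:
        return "green"
    variance = (actual - baseline) / baseline
    if variance > red:
        return "red"
    if variance > yellow:
        return "yellow"
    return "green"

def project_status(project):
    """Rate scope, schedule and budget separately, then take the worst
    of the three as the overall rating, so a red schedule is never
    hidden behind a green budget in an aggregate."""
    ratings = {
        "schedule": variance_status(project["actual_days"], project["baseline_days"]),
        "budget": variance_status(project["actual_cost"], project["baseline_cost"]),
        "scope": "green" if project["scope_changes"] == 0 else "yellow",
    }
    order = ["green", "yellow", "red"]
    ratings["overall"] = max(ratings.values(), key=order.index)
    return ratings

# A 20% schedule overrun turns the overall rating red even though
# the budget dimension, viewed alone, is still green.
example = {"baseline_days": 100, "actual_days": 120,
           "baseline_cost": 500_000, "actual_cost": 510_000,
           "scope_changes": 0}
print(project_status(example))
```

Reporting each dimension separately, as sketched here, is one way to make a stoplight indicator actionable for an oversight body such as the TGT.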

Finding One: System acceptance and contract closeout should be formalized.

Condition: The IT Division has not established and documented a system acceptance and contract closeout process for IT projects. System acceptance requires IT and the business end-users to review the results of an IT project and confirm that all requirements have been met prior to final payment. System testing is performed and documented; however, final acceptance is not adequately documented prior to final payment. Further, project closeout does not include formal review and approval by business end-users.

Audit testing identified projects for which the date and terms of system acceptance were not documented. Vendor contracts require payments based on final acceptance dates; however, because of the lack of clarity regarding acceptance dates, testing was unable to determine whether payments were properly made.

The IT system acceptance and contract closeout process is not performed consistently. IT projects tracked and reported to the TGT were removed from the active project list without explanation during the period covered by this engagement. Another key aspect of system acceptance, system documentation, is not clearly defined. System documentation provided for the projects audited was inconsistent with regard to key information such as source, purpose, dates, and sign-off.

Criteria: The COBIT 5 framework [3] strongly recommends that system acceptance and contract closeout be performed consistently across all IT projects and documented. The following are industry standards within COBIT 5 applicable to project closeout:
1. Process EDM2: Ensure Benefits Delivered
2. Process APO5: Manage Portfolio
3. Process APO6: Manage Budget and Costs
4. Process BAI01: Manage Programs and Projects
5. Process BAI04: Manage Availability and Capacity
6. Process MEA1: Monitor, Evaluate and Assess Performance and Conformance

Cause: Contract closeout requires performance of specific actions by the Project Manager, Project Sponsor and business end-users before final payment. IT currently does not have an adequately documented contract closeout process or procedures. The Procurement & Contracts Division has developed a Project Manager's Checklist to facilitate the agency's contract closeout process; however, it is currently not used by IT.

Effect: The TGT lacks assurance that system acceptance and contract closeout processes are completed before final payment, and that the completed project met the criteria established for scope, schedule, budget and business case requirements at Gate 3. This can lead to the agency not receiving all deliverables included in contracts. Because vendor contracts require payments based on final acceptance dates, testing was unable to verify whether payments were properly made, including payment of annual maintenance fees, given that Sound Transit's obligation to begin paying such fees commences upon system acceptance.

We recommend the Information Technology Division consider the following:

1.1 Create policy and procedures concerning system acceptance and contract closeout. This process should build upon agency documents currently used for project acceptance and contract closeout. It should also include mechanisms to receive and incorporate feedback from business end-users.
1.2 Clearly define the system acceptance terms, including key deliverables, documents and evidence of acceptance. Note that contract terms generally require payment based on system acceptance.
1.3 Create formal and standardized system documentation for all major systems at ST.

[3] Refer to Appendix A for additional discussion and analysis of COBIT 5 processes, and an assessment of the Sound Transit IT Division's maturity level with respect to applicable processes.

Management Response: The IT PMO concurs with the recommendations and will take the following action: incorporate closeout requirements into the Charter Document. This allows the reader a view of the project from concept to final production, including:
1. Review and approval of business requirements by business owners
2. Review and approval of service level agreements, system requirements and maintenance requirements by systems owners
3. A clear path to trace final payments
4. Lessons learned to identify deltas between initial concepts and the final delivered project

These actions are planned to be implemented by Q2 2015.

Finding Two: Project reporting should include baseline scope, schedule and budget, and should provide actuals against baseline.

Condition: The IT Division has not clarified reporting requirements, including the appropriate reporting of project scope, schedule and budget to the Technology Governance Team (TGT). At Gate 3, the TGT approves projects based on estimated scope, schedule, budget, and business case justification. Subsequently, at Gate 5, a baseline scope, schedule and budget are determined and approved by the Chief Information Officer and Project Sponsor; however, the IT Phase Gate process does not require communication of the baseline to the TGT. The PMO Manager provides status updates for all projects in the Execution Phase (Gates 5 to 7) to the TGT and IT management on a quarterly basis. This update is based on a stoplight system (green, yellow and red); however, the rating system is not detailed. (Further, the stoplight rating system is not defined, as reported separately in Finding Four.)

Criteria: The COBIT 5 framework strongly recommends that scope, schedule and budget be tracked and reported during the life of the project. The following are industry standards within COBIT 5 applicable to project reporting:
1. Process EDM4: Ensure Resource Optimization
2. Process APO5: Manage Portfolio
3. Process APO6: Manage Budget and Costs
4. Process BAI01: Manage Programs and Projects
5. Process BAI04: Manage Availability and Capacity
6. Process MEA1: Monitor, Evaluate and Assess Performance and Conformance

Cause: IT PMO policy and procedures do not require project managers to track and report the scope, schedule and budget of each project during the life of the project. Internal staff costs are not tracked by IT. The green, yellow and red rating system is not well enough defined to allow the TGT to provide consistent oversight.

Effect: The TGT lacks assurance that projects are delivered within scope, schedule, and budget. The lack of specific information on project implementation does not allow the TGT to perform adequate oversight of the projects it approved.

We recommend the Information Technology Division consider the following:

2.1 Improve reporting to the TGT during the execution phase (Gate 5, when the baseline scope, schedule and budget are established, through Gate 7, when the project is put into production). The current reporting process does not provide a sufficient level of detail regarding scope, schedule and budget to facilitate the TGT's oversight role. IA made a similar recommendation in the IT Management Control (Phase One) audit.
2.2 Track IT staff costs in the budget. According to the IT PMO Manager, staff costs for the four projects were estimated to be between ten and forty percent of the actual costs.
2.3 Report actual costs for each project.
2.4 Provide additional information in status updates to the TGT. The information should include budget information and clear definitions of the color codes used to indicate issues with the project schedule. IA made a similar recommendation in the IT Management Control (Phase One) audit.
2.5 Consolidate IT project budgets within IT to allow for better management. Currently, some IT projects are budgeted within IT and others are budgeted within the requesting department, which inhibits the ability of the PMO to track costs during the implementation of the project. (PMO-suggested recommendation.)
2.6 Track budgets and actual project costs within Primavera. The PMO tracks budgets and actuals where possible, but not consistently, as the information sources are disparate. (PMO-suggested recommendation.)

Management Response: The IT PMO concurs with the recommendations. The PMO had previously recognized this discrepancy and was already putting the infrastructure and discipline in place to address it at the time of the audit.

Improvements in process: The following will be included within Primavera P6 as an enterprise method of consolidating all project information:
- Project overall green, yellow, red stoplight indicators, based on the following:
  - Addition of scope, schedule, and cost; each will have a green, yellow, red indicator and will include explanations
- Close coordination with the IT Business Manager to coordinate all project costs
- Project costs in Primavera: summary-level budget/cost information, enough for the PM to monitor their project costs. These will be reconciled with E1.
  - Inclusion of internal staff loaded rates
  - Inclusion of budget costs using top-down/bottom-up project cost monitoring
  - Inclusion of actuals

These actions are planned to be implemented by Q3 2015.

Finding Three: A post-implementation review should be documented and reported to the TGT for major IT projects.

Condition: The IT Department has not established a post-implementation project review process. At the end of each project, the Project Manager documents the technical details of the implementation within the IT PMO Integrated Project Work Products Workbook. This workbook is not shared with the TGT. When projects are complete, they are removed from the status update provided to the TGT without explanation. The TGT is not informed of how the project implementation measures against the approved baseline concerning scope, schedule, budget and business case requirements. Feedback from the business end-users that developed the business case requirements is not provided to the TGT.

Criteria: The COBIT 5 framework strongly recommends that a post-implementation review be performed and documented for all major IT projects. The following are industry standards within COBIT 5 applicable to post-implementation reviews:
1. Process EDM2: Ensure Benefits Delivered
2. Process APO5: Manage Portfolio
3. Process APO6: Manage Budget and Costs
4. Process BAI01: Manage Programs and Projects
5. Process BAI04: Manage Availability and Capacity
6. Process MEA1: Monitor, Evaluate and Assess Performance and Conformance

Cause: IT PMO policy and procedures do not require a post-implementation review. The agency staff that presents the business case for IT projects does not provide documented feedback regarding how the project was delivered compared to the expectations established during Gate 3 approval.

Effect: The TGT lacks assurance that projects are delivered within scope, schedule, budget and business case requirements, which would enable oversight of projects. Further, the agency lacks assurance that its information technology investment supports its strategic objectives.

We recommend the Information Technology Division consider the following:

3.1 Create policy and procedures concerning a post-implementation review of all significant projects. The post-implementation review should include information and participation from business end-users.
3.2 Include the PMO in the intake and planning phases of the project development life cycle.
3.3 Formally document, disseminate and train all participants in the IT project management process.

Management Response: The IT PMO concurs with the recommendations. The PMO had previously recognized this discrepancy and was already putting the infrastructure and discipline in place to address it at the time of the audit.

Improvements in process: A Lessons Learned (post-implementation) document has been included in the Project Charter. This gives the reader a view of the project from initial plan to final production in one package, including change pages and the deltas between the original plan and the major changes that led to the final deliverables, along with what generated those deltas. These lessons-learned pages can easily be pulled or transferred into any other medium.

These actions are planned to be implemented by Q3 2015.

Finding Four: Project Management Office performance metrics should be improved.

Condition: The IT PMO reports the following performance metric quarterly: "85% of projects in green status at year end." This performance metric only includes projects in the execution phase of IT Phase Gate, which is from Gate 5 to Gate 6. Further, the meaning of "green status" is not defined. The performance metric is reported in the IT Department's Quarterly Performance Report.

Criteria: Additional performance metrics are suggested by the COBIT 5 internal control framework.

Cause: The metric is not clearly defined with regard to scope, schedule and budget.

Effect: A single metric is unlikely to provide management with adequate information to oversee projects. Additional performance metrics would likely give management a better understanding of the progress made in delivering projects.

We recommend the Information Technology Division consider the following:

4.1 Provide definitions of the red, yellow and green status indicators used to report project status to the TGT, clarifying how to interpret them with regard to scope, schedule, and budget.
4.2 Include performance metrics that cover the following areas in the project management process:
a. Budget to actual: project costs should be accounted for each project in the portfolio and rolled up at the portfolio level.
b. Schedule performance metric: indicates whether the current project is expected to be completed by the initial project completion date.
c. On-time project delivery percentage: reports how often completed projects meet the initial project completion date.
d. End-user satisfaction metric: provides an opportunity for the business end-users to give feedback concerning the improvements to the business process the project was initiated to improve.
e. Project benefits realization: reported by project sponsors and operational managers. All projects list the expected business benefits before Gate 3 approval. After each project is completed, the benefits should be assessed, and the extent to which they realize the benefits anticipated in the initial Business Case should be reported.

Management Response: The IT PMO concurs with the recommendations. The PMO had previously recognized this discrepancy and was already putting the infrastructure and discipline in place to address it at the time of the audit.

Improvements in process: The following will be included within Primavera P6 as an enterprise method of consolidating all project information:
- Project overall green, yellow, red stoplight indicators, based on the following:
  - Addition of scope, schedule, and cost; each will have a green, yellow, red indicator and will include explanations
- Close coordination with the IT Business Manager to coordinate all project costs

- Project costs in Primavera: summary-level budget/cost information, enough for the PM to monitor their project costs. These will be reconciled with E1.
  - Inclusion of internal staff loaded rates
  - Inclusion of budget costs using top-down/bottom-up project cost monitoring
  - Inclusion of actuals

The progression of metrics improvement has already begun. The effort will go through an iterative improvement process, as previous experience has shown that hard metrics applied to each unique project produce skewed results. Thus, in order to refine the metrics for accuracy, a program of study will be implemented to address each unique aspect of the areas listed above. This effort is inter-related with the improvements outlined previously, which will form the basis for refining these metrics.

These actions are planned to be implemented by Q4 2015.
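Two of the metrics recommended in 4.2, budget to actual and on-time project delivery percentage, reduce to simple arithmetic once baselines and actuals are recorded. The sketch below uses hypothetical project records and field names; the figures are illustrative only and do not reflect actual Sound Transit project data:

```python
# Illustrative calculation of two recommended metrics.
# Project names, costs and dates are hypothetical sample data.

completed = [
    {"name": "Project A", "baseline_cost": 900_000, "actual_cost": 1_150_000,
     "planned_finish": "2013-06-30", "actual_finish": "2013-11-15"},
    {"name": "Project B", "baseline_cost": 500_000, "actual_cost": 480_000,
     "planned_finish": "2014-03-31", "actual_finish": "2014-03-20"},
]

# On-time project delivery percentage (4.2c): share of completed projects
# that finished by their initial completion date. ISO date strings
# compare correctly as plain strings.
on_time = sum(p["actual_finish"] <= p["planned_finish"] for p in completed)
on_time_pct = 100 * on_time / len(completed)

# Budget to actual (4.2a): per-project variance, rolled up to the
# portfolio level as recommended.
for p in completed:
    p["variance"] = p["actual_cost"] - p["baseline_cost"]
portfolio_variance = sum(p["variance"] for p in completed)

print(f"On-time delivery: {on_time_pct:.0f}%")              # 50%
print(f"Portfolio cost variance: {portfolio_variance:+,}")  # +230,000
```

The same records could also feed the stoplight definitions recommended in 4.1, since both metrics compare an actual value against an approved baseline.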

Finding Five
Condition: Evaluation of project management maturity level. This audit evaluated the maturity of nine business processes that contribute to project management. According to COBIT 5 standards and the ISO rating methodology, six of the nine business processes are currently rated at Level 3, per the COBIT 5 definitions in Table One below; the remaining three processes are currently rated at Level 2. We also noted that these process maturity levels are qualified as partially implemented, which indicates that controls within the processes are not adequate to ensure predictable outcomes. The key to achieving Level 4 (predictable) lies in improved documentation and in developing formal processes that validate the effectiveness of control processes.
Criteria: For their 2012 self-assessment, the IT Division used the COBIT 4.1 maturity assessment rating system. This audit used the following COBIT 5 maturity assessment ratings.
Table One: COBIT 5 Process Capability Ratings (based on ISO/IEC 15504)
Level 5, Optimizing process: The level 4 predictable process is continuously improved to meet relevant current and projected business goals.
Level 4, Predictable process: The level 3 established process now operates within defined limits to achieve its process outcomes.
Level 3, Established process: The level 2 managed process is now implemented using a defined process that is capable of achieving its process outcomes.
Level 2, Managed process: The level 1 performed process is now implemented in a managed fashion (planned, monitored and adjusted) and its work products are appropriately established, controlled and maintained.
Level 1, Performed process: The implemented process achieves its process purpose.
Level 0, Incomplete process: The process is not implemented or fails to achieve its purpose.
Cause: Controls were not consistently followed; management evaluations were not adequately documented; a self-evaluation structure is not currently in place.
Effect: A control framework is in place and functioning, but not at a level that would ensure predictable outcomes.
We recommend the Information Technology Division consider the following:
5.1 Create an effective internal control and risk management environment by implementing procedures that self-evaluate the effectiveness of internal controls:
a. Document the control evaluation process and procedures.
b. Perform and document evaluations of controls.
c. Implement improvements that address identified control weaknesses, as applicable.
5.2 Codify project management practices into a formal policy and procedure manual.
Management Response
The IT PMO concurs with the recommendations. The following processes were in progress at the time of the audit:
- Reviews of continued/incremental process improvement, to be performed at regular intervals
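For reference, the way a capability level follows from ISO/IEC 15504 attribute ratings (the scale in Table One) can be sketched as follows. The derivation rule, that all attributes below the target level must be Fully achieved and the target level's attributes at least Largely achieved, is the standard ISO/IEC 15504 convention; the example data are hypothetical and not taken from this audit.

```python
# Sketch of the ISO/IEC 15504 capability-level derivation used with COBIT 5.
# Illustrative only: the attribute data below are hypothetical examples.
# A process reaches capability level N when every attribute below level N is
# Fully achieved and the level-N attributes are at least Largely achieved.

ORDER = {"N": 0, "P": 1, "L": 2, "F": 3}  # Not / Partially / Largely / Fully


def capability_level(attribute_ratings):
    """attribute_ratings: one inner list of N/P/L/F attribute ratings per
    capability level, starting at level 1. Returns the level achieved (0-5)."""
    level = 0
    for i, ratings in enumerate(attribute_ratings, start=1):
        if all(ORDER[r] >= ORDER["L"] for r in ratings):   # at least Largely
            level = i
            if not all(r == "F" for r in ratings):         # not Fully: stop here
                break
        else:
            break
    return level


# Example: level-1 and level-2 attributes Fully achieved, level-3 attributes
# only Largely achieved: the process rates level 3 and cannot claim level 4.
print(capability_level([["F"], ["F", "F"], ["L", "L"]]))  # -> 3
```

This is why "partially implemented" controls cap the rating: a Largely (rather than Fully) achieved attribute at level 3 blocks any claim to Level 4 predictability, which matches the finding above.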

- Risk management registers, used to better manage risks, are already built into Primavera, the enterprise tool for project management
The following recommendation will be implemented:
- Codify project management practices into a formal policy and procedure manual
  o This documentation already exists, but not in a formal document aligned with this audit; it will be written as such
These actions are planned to be implemented by Q4 2015.

Appendix A: Table summarizing process maturity assessments. This table describes each of the nine business processes applicable to project management and its purpose, the IT Division's initial rating performed in 2013, and the Internal Audit Division's current rating. The Internal Audit Division's current assessment is generally higher than the IT Division's self-assessment, which indicates a significant improvement in the processes involved in project approval, project management, project communications and project reviews. Changes made to these processes are recent; therefore, their effectiveness could not be fully evaluated during this engagement. Further, as described in this report, the IT Division's lack of self-monitoring processes drives the partially achieved ratings in our assessment.

Subject Area: Evaluate, Direct and Monitor (EDM)

Ensure benefits delivery (EDM2)
Process Description: Optimize the value contribution to the business from the business processes, IT services and IT assets resulting from investments made by IT at acceptable costs.
Process Purpose Statement: Secure optimal value from IT-enabled initiatives, services and assets; cost-efficient delivery of solutions and services; and a reliable and accurate picture of costs and likely benefits so that business needs are supported effectively and efficiently.
IT Division Rating (2013): 1.81. Internal Audit Rating (2014): 3.00 Achieved.

Ensure resource optimization (EDM4)
Process Description: Ensure that adequate and sufficient IT-related capabilities (people, process and technology) are available to support enterprise objectives effectively at optimal cost.
Process Purpose Statement: Ensure that the resource needs of the enterprise are met in the optimal manner, IT costs are optimized, and there is an increased likelihood of benefit realization and readiness for future change.
IT Division Rating (2013): 2.11. Internal Audit Rating (2014): 3.00 Achieved.

Subject Area: Align, Plan and Organize (APO)

Manage portfolio (APO5)
Process Description: Execute the strategic direction set for investments in line with the enterprise architecture vision and the desired characteristics of the investment and related services portfolios, and consider the different categories of investments and the resources and funding constraints. Evaluate, prioritize and balance programs and services, managing demand within resource and funding constraints, based on their alignment with strategic objectives, enterprise worth and risk. Move selected programs into the active services portfolio for execution. Monitor the performance of the overall portfolio of services and programs, proposing adjustments as necessary in response to program and service performance or changing enterprise priorities.
Process Purpose Statement: Optimize the performance of the overall portfolio of programs in response to program and service performance and changing enterprise priorities and demands.
IT Division Rating (2013): 2.76. Internal Audit Rating (2014): 3.00 Achieved.

Manage budget and costs (APO6)
Process Description: Manage the IT-related financial activities in both the business and IT functions, covering budgeting, cost and benefit management, and prioritization of spending through the use of formal budgeting practices and a fair and equitable system of allocating costs to the enterprise. Consult stakeholders to identify and control the total costs and benefits within the context of the IT strategic and tactical plans, and initiate corrective action where needed.
Process Purpose Statement: Foster partnership between IT and enterprise stakeholders to enable the effective and efficient use of IT-related resources and provide transparency and accountability of the cost and business value of solutions and services. Enable the enterprise to make informed decisions regarding the use of IT solutions and services.
IT Division Rating (2013): 1.81. Internal Audit Rating (2014): 3.00 Achieved.

Note: Each attribute is rated using a standard rating scale defined in the ISO/IEC 15504 standard. These ratings consist of:
N (Not achieved): There is little or no evidence of achievement of the defined attribute in the assessed process.
P (Partially achieved): There is some evidence of an approach to, and some achievement of, the defined attribute in the assessed process. Some aspects of achievement of the attribute may be unpredictable.
L (Largely achieved): There is evidence of a systematic approach to, and significant achievement of, the defined attribute in the assessed process. Some weakness related to this attribute may exist in the assessed process.
F (Fully achieved): There is evidence of a complete and systematic approach to, and full achievement of, the defined attribute in the assessed process. No significant weaknesses related to this attribute exist in the assessed process.
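The N/P/L/F scale above is defined in ISO/IEC 15504 in terms of achievement-percentage bands. The sketch below uses the boundaries from the standard (15, 50 and 85 percent); those figures come from ISO/IEC 15504 itself, not from this report, so treat them as an assumption if a different assessment method is in use.

```python
# Sketch of the ISO/IEC 15504 rating bands behind the N/P/L/F scale.
# Band boundaries (15/50/85 percent) are taken from the standard, not from
# this audit report.

def attribute_rating(percent_achieved):
    """Map an attribute's achievement percentage (0-100) to N/P/L/F."""
    if not 0 <= percent_achieved <= 100:
        raise ValueError("percentage must be between 0 and 100")
    if percent_achieved <= 15:
        return "N"   # Not achieved
    if percent_achieved <= 50:
        return "P"   # Partially achieved
    if percent_achieved <= 85:
        return "L"   # Largely achieved
    return "F"       # Fully achieved


print(attribute_rating(60))  # -> L
```

An attribute judged about 60 percent achieved would therefore be rated L (Largely achieved), which still leaves the "some weakness may exist" caveat noted in the scale definitions.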

Subject Area: Build, Acquire and Implement (BAI)

Manage programs and projects (BAI1)
Process Description: Manage all programs and projects from the investment portfolio in alignment with enterprise strategy and in a coordinated way. Initiate, plan, control, and execute programs and projects, and close with a post-implementation review.
Process Purpose Statement: Realize business benefits and reduce the risk of unexpected delays, costs and value erosion by improving communication to and involvement of business and end users, ensuring the value and quality of project deliverables, and maximizing their contribution to the investment and services portfolio.
IT Division Rating (2013): 2.33. Internal Audit Rating (2014): 3.00 Achieved.

Manage requirements definition (BAI2)
Process Description: Identify solutions and analyze requirements before acquisition or creation to ensure that they are in line with enterprise strategic requirements covering business processes, applications, information/data, infrastructure and services. Coordinate with affected stakeholders the review of feasible options, including relative costs and benefits, risk analysis, and approval of requirements and proposed solutions.
Process Purpose Statement: Create feasible optimal solutions that meet enterprise needs while minimizing risk.
IT Division Rating (2013): 2.06. Internal Audit Rating (2014): 3.00 Achieved.

Manage solutions identification and build (BAI3)
Process Description: Establish and maintain identified solutions in line with enterprise requirements covering design, development, procurement/sourcing and partnering with suppliers/vendors. Manage configuration, test preparation, testing, requirements management and maintenance of business processes, applications, information/data, infrastructure and services.
Process Purpose Statement: Establish timely and cost-effective solutions capable of supporting enterprise strategic and operational objectives.
IT Division Rating (2013): 1.72. Internal Audit Rating (2014): 2.00 Achieved.

Manage availability and capacity (BAI4)
Process Description: Balance current and future needs for availability, performance and capacity with cost-effective service provision. Include assessment of current capabilities, forecasting of future needs based on business requirements, analysis of business impacts, and assessment of risk to plan and implement actions to meet the identified requirements.
Process Purpose Statement: Maintain service availability, efficient management of resources, and optimization of system performance through prediction of future performance and capacity requirements.
IT Division Rating (2013): 2.00. Internal Audit Rating (2014): 3.00 Achieved.

Subject Area: Monitor, Evaluate and Assess (MEA)

Monitor, evaluate and assess performance and conformance (MEA1)
Process Description: Collect, validate and evaluate business, IT and process goals and metrics. Monitor that processes are performing against agreed-on performance and conformance goals and metrics, and provide reporting that is systematic and timely.
Process Purpose Statement: Provide transparency of performance and conformance and drive achievement of goals.
IT Division Rating (2013): 1.56. Internal Audit Rating (2014): 2.00 Achieved.