Practice Guide
Selecting, Using, and Creating Maturity Models: A Tool for Assurance and Consulting Engagements
July 2013
Table of Contents
- Executive Summary
- Introduction
- Example of Maturity Model Use by Internal Auditors
- Selecting Maturity Models
- Building and Using Maturity Models
- A Commonly Accepted Internal Control Environment Maturity Model
- Key Points for Review
- Maturity Model Examples
- Additional Resources
- About the Authors and Reviewers
Executive Summary

Maturity models establish a systematic basis of measurement for describing the "as is" state of a process. A process's maturity can then be compared to management's expectations or contrasted with the maturity of other similar processes for benchmarking purposes. Insights also can be derived from the model for determining improvement options that help a process satisfy its intended objectives over time.

A maturity model describes process components that are believed to lead to better outputs and better outcomes. A low level of maturity implies a lower probability of success in consistently meeting an objective, while a higher level of maturity implies a higher probability of success. The organization's risk tolerance should be considered when determining the level of maturity that management expects to have in place.

Auditors may want to use maturity models as criteria to assess business processes as part of assurance engagements, thus providing an easy-to-communicate understanding of the governance, risk, or control environment under review. In the absence of defined criteria for a process, the auditor can work with management to define adequate criteria using a maturity model.

This practice guide provides guidance on the uses of maturity models, identifies considerations for their selection, and provides instructions on how to build them. Care must be taken to appropriately apply maturity models in assurance or consulting engagements, including validating their applicability to the process under review. Components of existing maturity models are provided for use as is or as the foundation for a model tailored specifically to an organization's process.
Introduction

Organizations may use a maturity model to describe their developmental state or their processes in relation to established expectations of control and management. The classification mechanisms within a maturity model can help organizations simplify the determination of when control and process management is acceptable, or alternatively identify the actions necessary to improve the maturity of the organization or process.

Outcome metrics (e.g., financial return, program compliance, sales, and customer satisfaction) provide in many cases the ultimate criteria for measuring the success of a process. However, management and auditors may want to understand how well the processes leading to those outcomes are designed and functioning. Unfortunately, an assessment of the adequacy of efforts to achieve a given set of outcomes can be difficult to develop given the many variables that drive business performance. An appropriately constructed maturity model can make such an assessment more consistent and repeatable.

The concept for maturity models grew out of total quality management programs, which emphasized continuous improvement. One of the best-known models is the Capability Maturity Model (CMM) developed by Carnegie Mellon University to help improve software development. While many variations of maturity models exist, all models generally have levels from 0 to 5 that describe an organization, management process, control set, or other element of an organization's operations (i.e., they describe inputs or processes believed to lead to better execution and improved consistency of outcomes). Level 0 is usually some variation of nonexistent or ad hoc execution, while level 5 is usually considered a high-maturity, sustainable, and/or optimized process. Level 5 may not be an organization's goal, as the cost to achieve level 5 may at times exceed the benefits. In other words, management's risk tolerance may be high enough to allow for the process to be less exact or consistent, or it may not be strategically important enough to invest in certain processes to consistently achieve level 5.

Maturity models, when appropriately designed, provide:
- A framework for envisioning the future, the desired state, and the development of improvement plans.
- Benchmarks for the organization to compare its processes internally or externally.
- A mechanism to provide insight into the improvement path from an immature to a mature process.
- A disciplined method that is comparatively easy to understand and implement.

As suggested by the word maturity, an organization's governance, risk, and control processes evolve over time and may move up or down the maturity scale (the 0 to 5 scale noted previously).

Standard 2210.A3 of the International Standards for the Professional Practice of Internal Auditing (Standards) is important for auditors to understand and apply when using maturity models. It states: "Adequate criteria are needed to evaluate governance, risk management, and controls. Internal auditors must ascertain the extent to which management and/or the board has established adequate criteria to determine whether objectives and goals have been accomplished. If adequate, internal auditors must use such criteria in their evaluation. If inadequate, internal auditors must work with management and/or the board to develop appropriate evaluation criteria."
When using or developing maturity models, the auditor should determine whether management and/or the board has established adequate criteria in the selection and application of the model. This practice guide expands on this concept in subsequent sections. Generally, however, consider the following two points:
- An auditor planning to use a maturity model in an assurance engagement should first consider whether the model is fit for purpose.(1) Assuming the model is used correctly, is its predictive ability relevant to the business objective being measured? For instance, a maturity model that assesses compliance elements would not be appropriate to provide a perspective on how well an operational business objective is managed.
- An auditor planning to use a maturity model in an assurance engagement should independently determine what maturity level of the model is adequate to meet an objective. For instance, level 5 of a customer satisfaction maturity model may not be necessary to achieve a desired business outcome. However, an auditor, after understanding the model and its design, may not agree with a management position that a level 1 process is acceptable for the objective of meeting customer needs.

(1) Pöppelbuß, Jens and Röglinger, Maximilian, What Makes a Useful Maturity Model? A Framework of General Design Principles for Maturity Models and Its Demonstration in Business Process Management (2011).

Example of Maturity Model Use by Internal Auditors

Assume that an organization has just established a community investment group to make donations to charities and other worthwhile causes. The organization expects it will take two years to develop all the policies and procedures necessary to perform as intended. The long-range vision is for the community investment program to be recognized as one of the top 100 in the country. After performing an audit planning risk assessment, the chief audit executive (CAE) has included this subject in the annual plan six months after the group has been formed. Potential internal audit objectives could be to evaluate:
- Whether the community investment controls are compliant with relevant laws and regulations.
- Whether an adequate strategic plan is in place to identify and evaluate the impact of charities that are provided donations.

Through the use of a maturity model, the audit function could validate that current charity evaluation efforts are adequate (hypothetically, adequate is level 3 maturity, where an understanding and survey of charity outcomes is in place) but recommend that a sustained level 4 be in place (hypothetically, level 4 calls for proactive monitoring of charity reporting and periodic management validation of charity results). This continuum of maturity would be in contrast to a pass/fail or satisfactory/unsatisfactory rating process. The maturity model lends itself to providing the criteria, the plotting of the current condition, and the recommendation to move to the next level if such a recommendation is warranted. In the example, the model provides a great method for assessing a process that is under development (in this case, the community investment group).

The use of maturity models will not be the best evaluation method for auditors to deploy in all cases. When determining whether to use a maturity model, evaluate the situations described below and consider the potential implications of different evaluation and reporting methods.

Potential Implications by Reporting Methodology

Situation: Organization Needs a Clear Opinion
- Pass/Fail (Satisfactory/Unsatisfactory) Reporting: Provides a clear understanding of the auditor's opinion.
- Maturity Model Level Reporting: Unless stated explicitly, readers may not have clarity on what is good enough. When is achievement at a certain level acceptable versus not acceptable?

Situation: Management Buy-in Is Important to Cultivate
- Pass/Fail (Satisfactory/Unsatisfactory) Reporting: Yes/no verdicts may be difficult to deliver. They may also be counterproductive in terms of the time spent sharing, negotiating, and confirming the pass/fail opinion versus the time spent discussing the actual improvement options.
- Maturity Model Level Reporting: Focuses discussion on the level of consistent execution on a continuum, allowing for discussion of continuous improvement options.

Situation: Audit of a Complex or Undefined Process
- Pass/Fail (Satisfactory/Unsatisfactory) Reporting: Harder to apply a clear pass/fail approach.
- Maturity Model Level Reporting: Allows for plotting of distribution along a continuum of process expectations.

Situation: Compliance Objective
- Pass/Fail (Satisfactory/Unsatisfactory) Reporting: Provides a clear opinion on whether compliance is met.
- Maturity Model Level Reporting: Given the expectation of meeting compliance requirements, anything less than the highest level of maturity could be misconstrued as a concern.

Situation: Operational Objective
- Pass/Fail (Satisfactory/Unsatisfactory) Reporting: More difficult for management and the auditor to identify the exact process deficiencies that constitute a fail versus a pass.
- Maturity Model Level Reporting: Allows management an easier role in communicating a level of expectation of maturity.

Situation: Aspirational/Continuous Improvement Objective
- Pass/Fail (Satisfactory/Unsatisfactory) Reporting: May be impossible to create useful pass/fail criteria, as all processes over time may succeed or fail depending on the ease or difficulty of set expectations.
- Maturity Model Level Reporting: Allows for a maturity level that all processes can reach yet also includes higher potential levels of maturity that drive aspirational performance.

Selecting Maturity Models

Management may have defined a maturity model for its use within the organization. If so, the internal auditor could adopt that model as a tool after carefully evaluating the relevance and adequacy of the model to the assessment or opinion being provided. Alternatively, there are numerous maturity models available for use from industry groups and associations. These models also must be assessed as fit for purpose before use.

Maturity models involve a certain level of subjectivity; therefore, caution is warranted when providing assurance to management that a process is adequately controlled based on an assessment driven by a maturity model. The auditor should ensure the model is fit for purpose and properly implemented. Models may be used to describe the "as is" state of the process, provide prescriptive guidelines on improvement, or compare one process implementation to another.(1)

The use of a maturity model versus other audit techniques and methodologies should not alter the level of proficiency and due professional care auditors employ. A maturity model should not be deployed as a checklist, supplanting the auditor's responsibility for independently and objectively identifying unmitigated risk and the potential inadequacy of control. The model should provide a framework and guide for discussion of governance, risk, and control maturity.

In selecting a maturity model, auditors should understand the management objective and the appropriateness of the model in supporting that management objective. Consider the following:
- What is the desired management outcome? For example, does management want to assess systems development lifecycle success, sales process excellence, or environmental safety?
- What quantitative metrics or qualitative statements describe the desired management outcome?
- Is the model under consideration appropriate for driving the management outcome? The model should have been built by credible subject matter experts, either inside or outside the organization, who understand the correlation between certain process functions and the organization's desired outcome. The level of diligence in confirming the predictability of the model will vary from an internally developed model created by experienced business leaders inside the organization to an externally developed and researched model that considers experiences across many organizations. Either approach could be appropriate for a given situation. The two key factors to control for in the selection of a model are:
  - Would following the model improve the probability that the outcome would be achieved? Alternatively, would the model encourage actions that are counterproductive or focus management's attention on process improvements that do not correlate to driving the desired outcome?
  - Would management have a false sense of confidence that the outcome would be achieved if an assessment using the model shows a high state of process maturity? Although following the model and increasing a process's maturity is expected to improve the chances of a successful outcome, there still may be substantial risk and uncertainty that the outcome will be achieved.
- Will use of the model provide the appropriate level of confidence?
- Disclose the source of the model. Auditors should disclose in their report the source of the model, how the model was constructed, who participated in the construction of the model, and why the auditor (and management, as appropriate) believes the selected model is valid for the process and objective under review.

Building and Using Maturity Models

Auditors with the appropriate proficiency, or in conjunction with management or outside experts, can construct models that are fit for purpose. Auditors who have limited experience with maturity models, or who want to explore more detailed research into their design, should consider reviewing the research paper What Makes a Useful Maturity Model?(2)

(2) Pöppelbuß, Jens and Röglinger, Maximilian, What Makes a Useful Maturity Model? A Framework of General Design Principles for Maturity Models and Its Demonstration in Business Process Management (2011).

Building a model involves three steps:
1. Determine the purpose of the model and its components.
2. Determine the scale.
3. Develop the expectations for each component level.

Using a maturity model involves these additional steps:
1. Set targets for each component.
2. Assess the level of maturity by component.
3. Consider what the model may have missed.
4. Report conclusions.
5. Revisit the model regularly.

For the purposes of this section, start by skimming Example 3, Public Sector Internal Audit Capability Maturity Model (in the Maturity Model Examples section), and then return here to continue with the discussion on building a model.

Building a Maturity Model

Step 1: Determine the purpose of the model and its components.

The objective to be addressed should first be defined in the same way as if the auditor were going to select a model that was already built. Auditors should consider these questions:
- What does management want to assess (e.g., systems development lifecycle success, sales process excellence, or environmental safety)?
- What business processes are involved?
- Will the model be applied across many different types of management processes to improve general compliance, controls, or organizational governance?
- Is internal audit assessing an industry- or company-specific set of tasks that requires some degree of specialized process knowledge, tools, techniques, or skills?
- How can internal audit state the expected outcome from the process in terms of metrics or a qualitative statement?

With the objective in mind, the components that drive that objective are then identified. This is the most important part of the model's development, in that the auditor is identifying the critical elements that, based on the model builders' judgment, will improve the probability of achievement of the objective and outcome. Auditors will want to document their plan for developing the model, outlining the research and data gathering techniques (such as facilitation of subject matter experts) that help determine which components should be part of the model.

Auditors should consider the following when selecting components:
- Will the component, if managed consistently, improve the probability of achieving the outcome?
- If a component is not included, will that decrease the probability of achieving the outcome? Use caution here to focus on including the critical few components that deserve attention, improvement, and consistent execution versus everything that management could be doing to oversee the process.
- Can the model builder evaluate the correlation between the component and the desired outcome? Is that correlation based on a study or research comparing processes with high and low maturity to the outcomes across those processes? Alternatively, do subject matter experts and experienced professionals (who could include management and auditors) believe the component contributes to increasing the probability of achieving the outcome? What research, evidence, or subject matter expertise can internal audit rely on in making this determination?

In the reference model, Example 3, one component is Professional Practices. One can assume the authors of that model felt that a higher state of maturity in following professional practices contributes to the desired outcome of public sector internal audit functions. This component apparently was part of the critical few areas that, if left out or not consistently managed, would be detrimental to the management objective. Finally, one can assume that research shows a distinction between the level of outcomes achieved by public sector internal audit functions without high maturity in professional practices and those with high maturity. That research might be quantitatively driven through statistical correlation or qualitatively driven through interviews with subject matter experts and CAEs in the field.

The components will vary based on the management objective. When compliance is the objective, specific compliance controls, governance expectations, relevant regulatory skill sets, and other elements may be important components of the model. If assessment of the general control environment is the objective, then basic segregation of duties, control mapping, and risk assessment concepts may be important components. In an assessment of an organization's field sales offices, certain practices on sales prospect tracking or market analysis may be considered key components.
Components are those categories of process attributes relevant and necessary to meet, or at least improve the likelihood of meeting, the objective being assessed. Turning back to Example 3, the research study on public sector internal audit capability found these components relevant:
- Services & Role of IA
- People Management
- Professional Practices
- Performance Management & Accountability
- Organizational Relationships & Culture
- Governance Structures

Practitioners participating in The IIA's research that created model Example 3 determined that an assessment of the capability of a public sector audit function needed to consider these six components. These components are the drivers of success or failure, capability building or capability destruction, for the internal audit function under review.

Caution: Determining the components could range from an exercise as simple as a single meeting to gather perspectives from experienced subject matter experts in the organization to an extensive, fully funded empirical research study that determines, through statistical analysis across many processes and organizations, what components truly impact the desired outcome under review. Auditors should be clear in assessing the level of predictability they want their model to have. In most cases, a formal gathering of subject matter experts in the organization may be adequate for the component selection necessary to provide insights and improvements to the organization and some reasonable level of assurance regarding furthering the process objective.

Step 2: Determine the scale.

Once the components are identified, the auditor should determine what scale will be used. The examples shown in this practice guide use level 0 or level 1 as the base level, going up to level 5 as the highest level of maturity. Generally, the lowest level is an absence of controls and process discipline, while level 5 is reserved only for those very few processes that exhibit optimized or best practice execution. In the reference model, Example 3, the Public Sector Internal Audit Capability Maturity Model, five levels are used:
- Level 5: Optimizing
- Level 4: Managed
- Level 3: Integrated
- Level 2: Infrastructure
- Level 1: Initial

Caution: When developing the model, the auditor should carefully consider the words used to title each level. "Best Practice," for instance, is a catch phrase that can be misapplied and cause confusion. Every level may not need to be "Best Practice," as that may be beyond the risk tolerance needs of the organization. The titles should help convey the achievement expected at each level.

Step 3: Develop the expectations for each component level.

The next step is to define the expectations regarding what should be in place for a process to have met a given level for each component being assessed. Example 1, Process Capability Maturity Model, has six components that the model developers felt are key to general process governance:
- Strategic Planning/Financial Management
- Customer/Stakeholder Expectations
- Risk
- Metrics
- Human Capital
- Process Management and Self-assessment

Using Customer/Stakeholder Expectations, one can review the expectations set for each level. In this grid, each level builds on the level before, meaning that to achieve level 4 it is expected that the requirements in levels 1-3 have also been demonstrated.
Customer/Stakeholder Expectations
- Level 1 (Reactive): Stakeholder expectations are identified or tracked informally.
- Level 2 (Repeatable): Process decision-making is based on stakeholder expectations and feedback. Key stakeholders are identified.
- Level 3 (Defined & Managed): Expectations critical to quality satisfaction are documented. Process success in meeting expectations and feedback is monitored.
- Level 4 (Sustained): Stakeholder feedback is collected via surveys, focus groups, and innovative voice of the customer methodologies. Rework/mistakes impacting stakeholder expectations have improvement projects underway.
- Level 5 (Optimized): Stakeholder feedback validates that the process meets or exceeds stakeholder expectations. Proactive initiatives are in place to minimize or eliminate rework/mistakes.

In this example, the model builders created a progressively higher set of expectations culminating in level 5, Optimized: a process whose stakeholders confirm the process meets their expectations and for which management has proactive efforts to reduce mistakes. In this case, the model builders built the model with the intention that all processes should achieve level 3, while only critical processes are expected to expend the effort to reach level 5.

To build out this component, the team that created the model would have considered the range of options for managing customer and stakeholder expectations and then created the expectations within each level. Just as with the determination of which components to use, the actual requirements within a component may be determined through in-depth research or through facilitated conversation with subject matter experts. A maturity model focused on general processes, such as the one used in this example, will generally be applicable to any process. A maturity model focused on a specific industry or function may require specific diligence and demonstrated achievements regarding specialized people, process, and technology.

Considerations during this step include:
- How well does each level build on the previous level?
- How well do the expectations in each level align to the expectation to have a process meet a certain level of maturity (say, level 3 versus level 5)?
- For the expectations in each box, will a process or organization that achieves that requirement have a reasonable chance of achieving the outcome envisioned for that level (say, being Defined & Managed for level 3 or Sustained for level 4)?
- Are the expectations for a given level consistent across components? For instance, are the requirements for level 3 for the Customer/Stakeholder Expectations component appropriately equivalent to the level 3 requirements for Human Capital?

Caution: The same level of diligence that was applied in determining what key components should exist in the model should be applied when setting the expectations within each level of the model. Determine the key requirements versus everything that could be done.
The auditor's model should now resemble the grid below, with specific components and expectations by level inserted. The auditor may have selected more components or a different number of levels for the model.

Component 1: Level 1 expectation | Level 2 expectation | Level 3 expectation | Level 4 expectation | Level 5 expectation
Component 2: Level 1 expectation | Level 2 expectation | Level 3 expectation | Level 4 expectation | Level 5 expectation
Component 3: Level 1 expectation | Level 2 expectation | Level 3 expectation | Level 4 expectation | Level 5 expectation

Using a Maturity Model

Step 4: Set a target for each component.

Once the scale and components are defined, the next step is to determine the organization's target maturity level for each component. Generally speaking, cost/benefit analysis shows that not all components of a process should operate at the highest level of maturity. In conjunction with the assessment of an organization's risk appetite, the target maturity for some components may be, for instance, level 3 or level 4. The organization may not want to expend the resources to move those components to a high level of maturity and accepts the risk that the process's objectives have a higher probability of failure as a result. Auditors should refer to The IIA's Standards regarding risk and communicating risk acceptance for further guidance. Consider the following contrasting examples:
- Management has built a sales office maturity model for assessing the process maturity of its 100 sales offices. Management expects all sales offices to achieve level 5 (optimized) over time, but allows each office to prioritize what will be addressed first across the components being assessed. However, any component assessed at level 1 or 2 is considered a red flag, requiring an immediate intervention plan by regional leadership.
- The organization has adopted a general process governance maturity model. All processes are expected to evaluate their adherence to the model; however, only level 3 achievement is expected for all processes. Each management function determines which specific processes are critical to the organization and thus require a level 4 or 5 maturity. Non-critical processes may be specifically excluded from the requirement to deploy the resources to reach the highest state of maturity, as that would not be an optimized allocation of resources across the organization.

Caution: Auditors should not assume that managers should seek to obtain the highest level of maturity for all maturity model components across all processes being assessed. That may be too costly or too risk averse for the organization. The goal of a model is to present the range of possibilities, assess the current maturity of the process, and then set goals for improvements where such improvements make sense and are in alignment with organizational objectives.
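To make the structure described in Steps 1 through 4 more concrete, the following is a minimal sketch (in Python) of one way a model's components, level expectations, and targets could be recorded. Everything in it is hypothetical: the component names and expectation text paraphrase Example 1, and the target levels are illustrative, not prescribed by this guide.

```python
from dataclasses import dataclass

# Level titles for a hypothetical five-level scale (Step 2).
LEVEL_TITLES = {
    1: "Reactive",
    2: "Repeatable",
    3: "Defined & Managed",
    4: "Sustained",
    5: "Optimized",
}

@dataclass
class Component:
    """One component of the model (Step 1), with an expectation per level
    (Step 3) and a management-agreed target level (Step 4)."""
    name: str
    expectations: dict[int, str]  # level -> what must be in place at that level
    target_level: int             # not every component needs to target level 5

# A hypothetical two-component model; the expectation text paraphrases Example 1.
model = [
    Component(
        name="Customer/Stakeholder Expectations",
        expectations={
            1: "Expectations identified or tracked informally.",
            2: "Key stakeholders identified; decisions consider feedback.",
            3: "Expectations documented; success in meeting them is monitored.",
            4: "Feedback collected via surveys; improvement projects under way.",
            5: "Feedback validates expectations are met; rework proactively prevented.",
        },
        target_level=3,
    ),
    Component(
        name="Risk",
        expectations={
            1: "Limited or no risk assessment.",
            2: "Annual review of process risks.",
            3: "Comprehensive risk assessment with mitigation plans.",
            4: "Assessment updated during the year and used in prioritization.",
            5: "Risks mitigated below management's stated tolerance.",
        },
        target_level=4,
    ),
]

for component in model:
    print(component.name, "- target:", LEVEL_TITLES[component.target_level])
```

Recording the model this way is optional; a spreadsheet or workpaper grid serves the same purpose. The point is simply that a model is, at bottom, a set of components, a scale, an expectation per level, and a target per component.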
Step 5: Assess the level of maturity by component.

Next, the auditors assess the process itself through observation, inquiry, re-performance, and other appropriate tests to validate the current maturity of the process. Most models are built with the presumption that to achieve a given level, all the requirements of that level and all lower levels have been achieved. The task is no different than any other audit, with the maturity model serving as the criteria in the assessment. One method of assessment an audit function could use would be to have management of the process or function under review conduct a self-assessment, including a collection of any evidence of performance. The audit function would then validate that assessment.

Step 6: Consider what the model may have missed.

All maturity models are built on the research, understanding, and perspectives gained from the evaluation of previous business process implementations, not an evaluation of the current execution of the process under review. Moreover, no model can consider all the circumstances that may mitigate the risk that an outcome will not be achieved. Care should be taken not to apply the model as a simple checklist. Auditors should always conduct their work in a way that will allow for the identification of significant risks to the organization's objectives. Accordingly, use of a maturity model does not relieve the auditor of the responsibility to consider, for the specific process under review, what the model may be missing in terms of risk mitigation and control guidance. Auditors must apply due professional care in determining the level of analysis, beyond just the application of the model, necessary to fulfill their engagement scope. That scope should be documented as noted in the next step.

Step 7: Report on conclusions.

As noted previously, the basis for selection of the model, as well as details on how the model was designed, should be clearly disclosed in any reporting for which a model is the basis of the assessment. The purpose of the model, and what the model is providing a perspective on, should be clear. If management has determined the level of maturity that is considered adequate, the auditor should independently determine whether management has established adequate criteria in the selection and application of the model (see Standard 2210.A3). Auditors and management must be cautious, however, not to overstate the probability that a given level of process maturity will achieve a specified outcome over time. Any language that purports to guarantee or ensure achievement of a specific outcome, given that the process has met a given state of maturity, should be avoided.

Auditors should determine how the actual output metrics of the process under review should be provided in the report and validated as appropriate. For instance, a manufacturing process may be assessed at a high level of maturity, yet customers continue to reject manufactured parts. These facts may not invalidate the appropriateness of the maturity model; however, reporting simply on the model assessment (a high level of maturity) may be misleading to a reader without the context of the actual outcome metrics. As noted in the previous step, auditors have an obligation to think outside the model, regardless of how well constructed it may be, to consider whether the specific circumstances under review may lead to other gaps in governance, risk, or control implementation. If that is the case, the auditor should discuss such gaps in the report.
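As an illustration of the cumulative-level presumption noted in Step 5 and the gap reporting discussed in Step 7, the sketch below builds on the hypothetical structure shown earlier: the achieved level is the highest level for which that level and all lower levels are evidenced, and any shortfall against the agreed target is flagged. The evidence values are invented for illustration; the sketch is an aid to organizing results, not a substitute for the auditor's testing and judgment.

```python
def achieved_level(evidence_by_level: dict[int, bool]) -> int:
    """Return the highest level whose requirements, and those of all lower
    levels, are supported by evidence (0 if level 1 is not met)."""
    level = 0
    for lvl in sorted(evidence_by_level):
        if evidence_by_level[lvl]:
            level = lvl
        else:
            break
    return level

def gap_report(name: str, target: int, evidence_by_level: dict[int, bool]) -> str:
    """One-line conclusion comparing assessed maturity to the agreed target."""
    current = achieved_level(evidence_by_level)
    if current >= target:
        return f"{name}: assessed at level {current}, meets target level {target}."
    return (f"{name}: assessed at level {current}, below target level {target}; "
            f"improvement recommendation warranted.")

# Hypothetical test results for one component: levels 1-3 evidenced, 4-5 not.
evidence = {1: True, 2: True, 3: True, 4: False, 5: False}
print(gap_report("Customer/Stakeholder Expectations", target=4,
                 evidence_by_level=evidence))
```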
Alternatively, if the engagement scope was simply to apply the model, without any additional consideration of unmitigated risks, that focused scope should be clearly disclosed. Here is an example of such a disclosure:
"Our evaluation was limited to the application of the maturity model to x process. This maturity model was based on research conducted by x and enhanced using subject matter experts identified by management. We did not conduct additional analysis designed to identify additional unmitigated risks that could impact the probability of the process achieving management's objectives. If we had conducted such additional analysis, other gaps may have come to our attention."

One reporting scheme is to plot results directly on the component/level grid itself, marking each component with two indicators: one (for example, a purple arrow) for the target level and another (for example, an orange arrow) for the current level. Where gaps exist, the auditor will want to work with management to develop recommendations for improvement.

Step 8: Revisit the model regularly.

After applying the model, internal audit will want to revisit how each of the model elements (levels, components, and expectations), when implemented, appears to be achieving the desired process outcomes. Is the expectation to achieve a certain level too high? Alternatively, does the assessment seem too easy, not driving improvements that raise the bar on expected process resiliency? Over time, the auditor will want to understand any process outcome misses and tie that learning into the improvement of the model itself. Was the miss an indication that changes in expectations in a given component at a given level should be considered to increase the probability of achieving the objective going forward?

A Commonly Accepted Internal Control Environment Maturity Model

Below is the internal control environment maturity model from COBIT 4.1 (Control Objectives for Information and Related Technology) released by ISACA.(3) While ISACA has released subsequent versions of COBIT, including COBIT 5, this model still provides a useful reference for considering the maturity of a control environment. ISACA's development of the model in COBIT 4.1 involved research of a variety of maturity models. Accordingly, internal auditors may use the model as a basis for their assessment of the maturity of internal control structures or for development of their own maturity models. The model uses just one component (Internal Control Environment) and six levels ranging from nonexistent to optimized.

(3) COBIT 4.1, 2007 IT Governance Institute, Appendix III. All rights reserved. Used with permission.
COBIT 4.1 Maturity Level: Status of the Internal Control Environment

Level 0, Nonexistent: There is no recognition of the need for internal control. Control is not part of the organization's culture or mission. There is a high risk of control deficiencies and incidents.

Level 1, Initial/ad hoc: There is some recognition of the need for internal control. The approach to risk and control requirements is ad hoc and disorganized, without communication or monitoring. Deficiencies are not identified. Employees are not aware of their responsibilities.

Level 2, Repeatable but intuitive: Controls are in place but are not documented. Their operation is dependent on the knowledge and motivation of individuals. Effectiveness is not adequately evaluated. Many control weaknesses exist and are not adequately addressed; the impact can be severe. Management actions to resolve control issues are not prioritized or consistent. Employees may not be aware of their responsibilities.

Level 3, Defined process: Controls are in place and are adequately documented. Operating effectiveness is evaluated periodically and there are an average number of issues. However, the evaluation process is not documented. Although management is able to deal predictably with most control issues, some control weaknesses persist and impacts could still be severe. Employees are aware of their responsibilities for control.

Level 4, Managed and measurable: There is an effective internal control and risk management environment. A formal, documented evaluation of controls occurs frequently. Many controls are automated and regularly reviewed. Management is likely to detect most control issues, but not all issues are routinely identified. There is consistent follow-up to address identified control weaknesses. A limited, tactical use of technology is applied to automate controls.

Level 5, Optimized: An enterprisewide risk and control program provides continuous and effective control and risk issues resolution. Internal control and risk management are integrated with enterprise practices, supported with automated real-time monitoring with full accountability for control monitoring, risk management, and compliance enforcement. Control evaluation is continuous, based on self-assessments and gap and root cause analyses. Employees are proactively involved in control improvements.
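A single-component scale such as the COBIT 4.1 model above also lends itself to the benchmarking use noted in the introduction: several processes can be plotted against the same 0 to 5 continuum and the distribution summarized for management. The sketch below does this with entirely hypothetical process names and assessed levels.

```python
from collections import Counter

# Level titles from the COBIT 4.1-style internal control environment scale above.
LEVELS = {
    0: "Nonexistent",
    1: "Initial/ad hoc",
    2: "Repeatable but intuitive",
    3: "Defined process",
    4: "Managed and measurable",
    5: "Optimized",
}

# Hypothetical assessed levels for a handful of processes (illustrative only).
assessments = {
    "Accounts payable": 3,
    "Change management": 2,
    "Vendor onboarding": 2,
    "Financial close": 4,
}

# Summarize how many processes sit at each level of the continuum.
distribution = Counter(assessments.values())
for level in sorted(LEVELS):
    count = distribution.get(level, 0)
    print(f"Level {level} ({LEVELS[level]}): {count} process(es)")
```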
Key Points for Review

Given the care that must be taken when applying maturity models, auditors should review these key points over the course of an engagement.

- An auditor planning to use a maturity model in an assurance engagement should first consider whether the model is fit for purpose.
- An auditor planning to use a maturity model in an assurance engagement should independently determine what maturity level of the model is adequate to meet an objective.
- Maturity models involve a certain level of subjectivity; therefore, caution is warranted when providing assurance to management that a process is adequately controlled based on an assessment driven by a maturity model.
- Ask yourself these questions when considering using a model: Would following the model improve the probability that the outcome would be achieved? Would management have a false sense of confidence that the outcome would be achieved if an assessment using the model shows a high state of process maturity?
- Auditors should disclose in their report the source of the model, how the model was constructed, who participated in the construction of the model, and why the auditor (and management, as appropriate) believes the selected model is valid for the process and objective under review.
- Auditors should clearly assess the level of predictability they want their model to have.
- Auditors should not assume that managers should seek to obtain the highest level of maturity for all maturity model components across all processes being assessed. That may be too costly or too risk averse for the organization.
- Care should be taken not to apply the model as a simple checklist. Auditors should always conduct their work in a way that will allow for the identification of significant risks to the organization's objectives. Accordingly, use of a maturity model does not relieve the auditor of the responsibility to consider, for the specific process under review, what the model may be missing in terms of risk mitigation and control guidance.
- Auditors and management must be cautious not to overstate the probability that a given level of process maturity will achieve a specified outcome over time. Any language that purports to guarantee or ensure achievement of a specific outcome, given that the process has met a given state of maturity, should be avoided.
- Auditors should determine how the actual output metrics of the process under review should be provided in the report and validated as appropriate.
- After applying the model, internal audit will want to periodically revisit how each of the model elements (levels, components, and expectations) appears to be achieving the desired process outcomes.
Maturity Model Examples

The following three models are examples that auditors can use as provided or leverage in the development of their own maturity models. The example models each use six components and five levels of maturity to address their objectives:
- Example 1: Fortune 100 Company Process Capability Maturity Model
- Example 2: Compliance and Ethics Program Maturity Model
- Example 3: Public Sector Internal Audit Capability Maturity Model

Example 1: Process Capability Maturity Model

A Fortune 100 company took the concept of the maturity model and tailored it to the organization's environment in the following example. The objective of the model is to address overall process capability maturity across six process components: Strategic Planning/Financial Management, Customer/Stakeholder Expectations, Risk, Metrics, Human Capital, and Process Management/Self-assessment. This framework has been successfully applied for both high-level process reviews and detailed subprocess reviews. The model was constructed using input from experienced audit professionals as well as members of an internal process consortium. Management sets a target for each component (level 1 to level 5) and conducts a self-assessment. The internal audit function then independently audits the process and opines on the level of maturity. Management and internal audit agree on the artifacts that demonstrate each level of maturity.

General Description by Level
- Level 1 (Reactive): Process is not formalized. Inconsistent execution.
- Level 2 (Repeatable): Process is more formalized (documented). Repeatable execution. Management understands the overall process.
- Level 3 (Defined & Managed), the suggested minimal target level: Process is fully defined and executed consistently. Adequate metrics are defined to allow for quality assurance/self-assessment capabilities.
- Level 4 (Sustained): Management decision-making and continuous improvement projects are based on data, metrics, and formal quality assurance/self-assessment feedback.
- Level 5 (Optimized), where the ROI hurdle rate does not justify all processes achieving this level: Perfect service levels are achieved. Independently verified as best in class. Innovative ideas and techniques are piloted on an ongoing basis.
Process Capability Maturity Model

Strategic Planning/Financial Management
- Level 1 (Reactive): Initiatives are identified and tasks assigned. Initiatives are reevaluated annually.
- Level 2 (Repeatable): Project milestones are monitored. Resources are allocated and tracked.
- Level 3 (Defined & Managed): Business unit/department/process strategic planning includes 1-3 year initiatives based on stakeholder expectations. Financial, process, human resource, and risk management elements are included in the planning.
- Level 4 (Sustained): Strategic planning supports the corporate strategic plan in terms of customer growth, segment profit margin, competitive advantage, and strategic intent fulfillment. Prioritization of resources and initiatives is based on ROI or governance/compliance requirements.
- Level 5 (Optimized): 1-3 year strategic planning initiatives consistently meet their milestone goals. Financial and stakeholder expectations are met. The strategic plan incorporates alternatives and options for long-term (3-6 year) industry and regulatory changes.

Customer/Stakeholder Expectations
- Level 1 (Reactive): Stakeholder expectations are identified or tracked informally.
- Level 2 (Repeatable): Process decision-making is based on stakeholder expectations and feedback. Key stakeholders are identified.
- Level 3 (Defined & Managed): Expectations critical to quality satisfaction are documented. Process success in meeting expectations and feedback is monitored.
- Level 4 (Sustained): Stakeholder feedback is collected via surveys, focus groups, and innovative voice of the customer methodologies. Rework/mistakes impacting stakeholder expectations have improvement projects underway.
- Level 5 (Optimized): Stakeholder feedback validates that the process meets or exceeds stakeholder expectations. Proactive initiatives are in place to minimize or eliminate rework/mistakes.

Risk
- Level 1 (Reactive): Limited or no risk assessment occurring.
- Level 2 (Repeatable): At least annually, a review of process risks is performed. Risk is considered in project plans and initiatives.
- Level 3 (Defined & Managed): A comprehensive risk assessment process is developed that covers strategic, financial, compliance, and operational risks. Potential risk hazards or opportunities are formally evaluated for likelihood and impact. Management formally articulates risk tolerance. Specific mitigation plans are implemented based on the assessment and cost/benefit analysis.
- Level 4 (Sustained): The risk assessment is reviewed and updated as appropriate throughout the year. Resource allocation ROI incorporates risk assessment into the prioritization process.
- Level 5 (Optimized): Risks are mitigated below the risk tolerance goals set by management.
Metrics
- Level 1 (Reactive): Few or no metrics are identified, tracked, or reported.
- Level 2 (Repeatable): Key metrics are identified and measurement elements are accurate. Methods are in place to track and report to management on a continuous basis.
- Level 3 (Defined & Managed): Key metrics with target performance indicators are identified for financial, compliance, strategic, operational, human resources, and stakeholder attributes (balanced scorecard). Measurement of actual performance to target metrics is accurate and communicated to management and associates.
- Level 4 (Sustained): Key metrics, targets, and measurement systems are reevaluated and validated continuously for process changes, resource changes, and corporate strategy initiatives. Specific improvement initiatives are developed and prioritized for metrics not meeting performance goals.
- Level 5 (Optimized): Key metric targets are reached consistently for all areas. Proactive activities are implemented so gaps are not incurred between actual and target.

Human Capital
- Level 1 (Reactive): A resource development process does not exist or is informal. A resource training process does not exist or is informal.
- Level 2 (Repeatable): The development process is formalized and documented for all levels of associates. Role descriptions and expectations are documented and communicated. Training programs are implemented.
- Level 3 (Defined & Managed): A formalized resource development process is executed consistently. A formalized training program for all levels is established and its completion is tracked. A formalized succession plan and recruiting plan are in place. Compensation correlates to documented performance management expectations and contributions.
- Level 4 (Sustained): Target metrics on workforce efficiency and effectiveness are identified and measurement methods are in place for actual results. Continuous improvement projects are initiated for gaps between actual performance and targeted metrics.
- Level 5 (Optimized): Key metric targets are reached consistently for all areas. Proactive activities are implemented so gaps are not incurred between actual and target.
Process Management and Self-assessment
- Level 1 (Reactive): Processes and procedures are not documented or are known only informally.
- Level 2 (Repeatable): Policies, high-level procedure documents, and basic templates exist that drive repeatable processes. Controls are identified and noted in the documentation.
- Level 3 (Defined & Managed): Documentation is maintained, communicated, and accurate. Standard evidence is available, including a process control management system, process narratives, and process flows. Documentation is readily available for outside audit without advance notice. Key controls and control execution standards are tracked for current and new processes/products.
- Level 4 (Sustained): Formal quality assurance through self-assessment is executed regularly for key processes. Record retention policies are in place and monitored.
- Level 5 (Optimized): Process documentation and controls are proactively developed and validated before new systems, products, or initiatives. Proactive initiatives are taken based upon gaps identified through self-assessments.
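The introduction to Example 1 notes that management sets a target for each component and performs a self-assessment, after which internal audit independently opines on the level of maturity. The sketch below walks through that workflow with hypothetical values for three of the model's components, flagging any component where the independent assessment falls short of the target or differs from management's self-assessment.

```python
# Hypothetical records: (component, target level, management self-assessment,
# internal audit's independently assessed level).
records = [
    ("Strategic Planning/Financial Management", 3, 3, 3),
    ("Customer/Stakeholder Expectations", 4, 4, 3),
    ("Risk", 3, 3, 2),
]

for component, target, self_assessed, audited in records:
    notes = []
    if audited < target:
        notes.append(f"below target level {target}")
    if audited != self_assessed:
        notes.append(f"differs from self-assessment of level {self_assessed}")
    status = "; ".join(notes) if notes else "meets target; agrees with self-assessment"
    print(f"{component}: audited level {audited} ({status})")
```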
Example 2: Compliance and Ethics Program Maturity Model

This model is adapted from a model published by The IIA's Research Foundation (IIARF) that applies to an organization's compliance and ethics program.(4) Its five maturity stages are Initial, Repeatable, Defined, Mature, and World Class.

Attribute 1. Code of Ethics: How effectively does the code outline management's expectations regarding ethical conduct?
- Initial: There is no formally documented code of ethics. In general, there are no other means of communicating management's expectations regarding ethical conduct.
- Repeatable: A code of ethics has been developed, but it may not be comprehensive or current. Experienced employees generally understand management's expectations regarding ethical conduct, but new employees may not have any way of determining those expectations.
- Defined: A comprehensive code of ethics exists, was approved by the board, and is reviewed every two to three years to determine what updates are needed. All employees must sign off annually that they are in compliance with the code of ethics. New employees must sign a document asserting that they have read and understand the code.
- Mature: The code of ethics is reviewed as appropriate by outside legal counsel to ensure it remains current and appropriate. The code of ethics is reviewed annually and updated as necessary. All employees must complete annual questionnaires that ask more probing questions regarding compliance with the code of ethics. Specific compliance and ethics policies are in place to support and provide additional guidance on key components of the code.
- World Class: Periodic focus groups and/or surveys are conducted with a representative sample of employees to assess their understanding of the code of ethics and their perceptions on the level of compliance throughout the organization.

(4) Adapted from the IIA Research Foundation, Internal Auditing: Assurance & Consulting Services. Altamonte Springs, Fla.: IIARF.
Attribute 2. Culture and Consistency: How does the organization perceive management's commitment to compliance?
- Initial: The organization seems indifferent to compliance and ethics. The program was developed by very few individuals with no outside input. There are perceptions of disciplinary inconsistencies and playing favorites. People are promoted without formal consideration of ethical conduct. Events of noncompliance are typically learned from complaints versus monitoring or audit activities.
- Repeatable: There are perceptions that compliance and ethics are important. The program was developed to address legal ramifications of noncompliance. Discipline is generally left to the discretion of business and department managers and, as such, is not always consistent. Although ethical conduct seems to be considered, it's not a part of job descriptions. Events of noncompliance are generally reported timely, but there are few efforts to report events before they become noncompliant.
- Defined: The perception is that senior management takes compliance and ethics seriously and walks the talk. The program was developed with input from legal, human resources, and internal audit. Human resources is consulted to make sure disciplinary actions are appropriate and compliant with regulations. Job descriptions include expectations for ethical conduct. Many employees raise compliance questions before they become a problem.
- Mature: Compliance and ethics are topics at organization- and department-level meetings, ensuring a consistent cultural message. The program was developed with input from various employee groups. Disciplinary decisions involve an appropriate mix of human resources, legal, and compliance personnel to ensure appropriateness and consistency. Job descriptions and interviews formally cover ethical conduct. Employees feel empowered to raise questions about compliance matters.
- World Class: Periodic surveys or focus groups are conducted to assess the perception of the compliance and ethics culture and make adjustments when needed. Periodic input is solicited from employees to help improve the program. Disciplinary actions are reviewed by an independent group (e.g., internal audit) to support the consistency of such actions. People are recognized for demonstrating ethical conduct. Some employees make recommendations for improving the compliance program.
Attribute 3. Awareness: How aware are employees and outside stakeholders of the compliance and ethics program and its requirements?
- Initial: Employees are generally aware that the program exists but are not sure how to get information. Employees aren't familiar with specific requirements. Employees don't know who manages the compliance and ethics program. Stakeholders know nothing about the program.
- Repeatable: Employees are aware the program exists, went through training once, and intuitively know some of the requirements contained in the program. Employees know who the chief compliance officer is, but not others involved in managing the compliance and ethics program. Stakeholders assume a program exists but don't know anything about it.
- Defined: There is widespread employee awareness of the program. All employees went through training in the last three years. Employees know who the chief compliance officer and the compliance managers are. Stakeholders are aware a program exists and can find references on the company's website.
- Mature: Annual training reinforces the program, with individual modules delivered in more depth. Employees know which individuals are responsible for key compliance areas. Compliance with the program and ethical expectations are covered in the contracts with vendors.
- World Class: Communications occur regularly to remind/update employees on program expectations. The program is part of external sustainability reporting conducted annually.
Attribute 4. Structure and Accountability: How effective is the structure for managing the program and enforcing accountability?
- Initial: There is no formal compliance and ethics program structure. Independent oversight is nonexistent or ad hoc. Accountability is not defined. Investigations are ad hoc. Compliance risks are not understood.
- Repeatable: A compliance officer has been designated, but the responsibilities of the position are not well developed. Oversight and monitoring are inconsistent and reactionary. Accountability is broadly understood, but not formally documented. Investigations are typically conducted by the appropriate personnel. Compliance risks are generally understood but not formally documented.
- Defined: A compliance and ethics structure has been established, with accountability assigned to officers responsible for compliance areas. Oversight is defined from a senior management and board perspective. Monitoring is established, including internal audit and others. There is a focal point for determining who should conduct investigations. Compliance risks and scenarios are documented.
- Mature: Reporting by compliance area officers to the chief compliance officer is timely and consistent. The applicable board committee receives quarterly updates on compliance and ethics matters. Internal audit has a consistent plan for auditing all compliance risks. A formal investigation protocol exists that outlines appropriate resources to use (internal vs. external), documentation requirements, and how investigations are closed. A formal compliance risk assessment has been completed.
- World Class: An integrated monitoring plan has been implemented that involves the chief compliance officer, compliance area officers, and internal audit. Sensitive or significant investigations are conducted in accordance with the protocol by individuals trained in forensic and investigation techniques. Compliance risk scenarios have been identified, assessed, and mapped to compliance controls, and are updated at least annually.
Attribute 5. Process Automation and Integration: How effectively are compliance and ethics controls and processes standardized, integrated, and automated?
- Initial: There are no formal compliance and ethics controls and procedures, although many employees know intuitively how to act. There is no formal protocol for employees or outsiders to report suspected events of noncompliance. Information/data related to compliance and ethics is not available.
- Repeatable: There are some compliance and ethics controls and procedures, but they are not consistent across the organization nor formally documented. There is limited testing of the controls and procedures in place. Employees generally understand that they can contact legal or human resources if they suspect an event of noncompliance. Information/data related to compliance and ethics events is difficult to compile.
- Defined: Compliance and ethics controls and procedures are well documented and standardized across the organization. Compliance and ethics controls and procedures are tested periodically to identify gaps or weaknesses. An external hotline is in place through which employees or outsiders can report suspected events of noncompliance. Some compliance and ethics controls are integrated with other business processes and automated to the extent supported by existing systems. Some standard reports are prepared related to compliance and ethics events.
- Mature: Compliance and ethics controls and procedures are an integral part of business processes. Many compliance and ethics controls address key compliance risks as part of a governance, risk, and compliance (GRC) view of the program. There are multiple avenues through which employees or outsiders can report suspected events of noncompliance, and all follow a consistent protocol for gathering information on the event and escalating it. A consistent test plan is used to ensure compliance and ethics controls and procedures operate effectively. Technology is used to aid in the identification and investigation of compliance and ethics events.
- World Class: The company has established an integrated GRC program that ensures compliance risks are managed to be consistent with the organization's risk appetite. Event management software is used to ensure all key data is gathered and the resolution of events is completely and consistently documented. GRC software is used to provide integrated information on the program. Integrated technology routines are run regularly to prevent or detect potential compliance and ethics events in a timely manner.
Attribute 6: Goals and Metrics
How is the success of the compliance and ethics program measured?

Initial: No formal goals or metrics exist or are contemplated.

Repeatable: Although goals and metrics are not formalized, employees generally understand that the absence of compliance and ethics events is indicative of a successful program.

Defined: Broad compliance and ethics goals are established and communicated. Broad metrics exist to measure the nature and frequency of compliance and ethics events.

Mature: Specific compliance and ethics goals are integrated into the annual goal-setting process for each compliance area. Metrics are established for each compliance area.

World Class: All employees have individual compliance and ethics goals. Metrics are integrated into the overall performance measurement process.

Source: Adapted from The IIA Research Foundation, Internal Auditing: Assurance & Consulting Services, Altamonte Springs, Fla.: IIARF.
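An attribute-by-level matrix such as the one above is, in effect, structured data, and some audit teams may find it helpful to encode it so that assessed and target maturity levels can be recorded and compared consistently across engagements. The sketch below is a minimal illustration only; the attribute names are taken from the tables above, but the MaturityModel class, the gap_report helper, the abridged descriptors, and the sample ratings are hypothetical and would need to be tailored to the organization's own model and risk tolerance.

```python
# Minimal sketch: encoding a maturity model's levels and attributes so that an
# assessed level can be compared with management's target level per attribute.
# The class, abridged descriptors, and example ratings are illustrative only.

LEVELS = ["Initial", "Repeatable", "Defined", "Mature", "World Class"]  # 1..5

class MaturityModel:
    def __init__(self, name, attributes):
        self.name = name
        self.attributes = attributes          # {attribute: {level: descriptor}}

    def gap_report(self, assessed, target):
        """Return per-attribute gaps between assessed and target levels."""
        report = {}
        for attribute in self.attributes:
            a, t = assessed[attribute], target[attribute]
            report[attribute] = {
                "assessed": LEVELS[a - 1],
                "target": LEVELS[t - 1],
                "gap": t - a,                 # positive gap = improvement needed
            }
        return report

# Illustrative excerpt of the compliance and ethics model (descriptors abridged).
model = MaturityModel(
    "Compliance and Ethics Program",
    {
        "Structure and Accountability": {
            1: "No formal program structure; accountability not defined.",
            3: "Structure established; oversight and monitoring defined.",
            5: "Integrated monitoring plan; risks mapped to controls.",
        },
        "Process Automation and Integration": {
            1: "No formal controls or reporting protocol.",
            3: "Controls documented and standardized; hotline in place.",
            5: "Integrated GRC program supported by technology.",
        },
        "Goals and Metrics": {
            1: "No formal goals or metrics.",
            3: "Broad goals and event metrics established.",
            5: "Individual goals; metrics tied to performance measurement.",
        },
    },
)

# Hypothetical engagement results: management targets level 4 for each attribute.
assessed = {"Structure and Accountability": 3,
            "Process Automation and Integration": 2,
            "Goals and Metrics": 3}
target = {attr: 4 for attr in model.attributes}

for attr, row in model.gap_report(assessed, target).items():
    print(f"{attr}: assessed {row['assessed']}, target {row['target']}, gap {row['gap']}")
```

Recording results this way also makes it straightforward to benchmark the same attributes across business units or over successive engagements.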
Example 3: Public Sector Internal Audit Capability Maturity Model

In addition to applying maturity models to other processes within the organization, internal audit can assess its own processes by tailoring the maturity model framework. The example below is adapted from a model published by The IIARF that was built for assessing public sector internal audit activities but can readily be adapted and applied to other sectors.

Internal Audit Capability Model Matrix

Level 5: Optimizing
- Services and Role of Internal Audit: Internal audit is recognized as a key agent of change.
- People Management: Leadership involvement with professional bodies. Workforce projection.
- Professional Practices: Continuous improvement in professional practices. Strategic internal audit planning.
- Performance Management and Accountability: Public reporting of internal audit effectiveness.
- Organizational Relationships and Culture: Effective and ongoing relationships.
- Governance Structures: Independence, power, and authority of the internal audit activity.

Level 4: Managed
- Services and Role of Internal Audit: Overall assurance on governance, risk management, and control.
- People Management: Internal audit contributes to management development. Internal audit actively supports professional bodies. Workforce planning.
- Professional Practices: Audit strategy leverages the organization's management of risk.
- Performance Management and Accountability: Integration of qualitative and quantitative performance measures.
- Organizational Relationships and Culture: The CAE advises and influences top-level management.
- Governance Structures: Independent oversight of the internal audit activity. The CAE reports to a top-level authority.

Level 3: Integrated
- Services and Role of Internal Audit: Advisory services. Performance and value-for-money audits.
- People Management: Team building and competency. Professionally qualified staff. Workforce coordination.
- Professional Practices: Quality management framework. Risk-based audit plans.
- Performance Management and Accountability: Performance measures. Cost information. Internal audit management reports.
- Organizational Relationships and Culture: Coordination with other review groups. Integral component of the management team.
- Governance Structures: Management oversight of the internal audit activity. Funding mechanisms.
Level 2: Infrastructure
- Services and Role of Internal Audit: Compliance auditing.
- People Management: Individual professional development. Skilled people are identified and recruited.
- Professional Practices: Professional practices and process framework. Audit plan is based on management and stakeholder priorities.
- Performance Management and Accountability: Internal audit operating budget. Internal audit business plan.
- Organizational Relationships and Culture: Managing within the internal audit activity.
- Governance Structures: Full access to the organization's information, assets, and people. Reporting relationships established.

Level 1: Initial
Ad hoc and unstructured; isolated single audits or reviews of documents and transactions for accuracy and compliance; outputs dependent on the skills of the specific individuals holding the position; no specific professional practices established other than those provided by professional associations; funding approved by management as needed; absence of infrastructure; auditors likely part of a larger organizational unit; no established capabilities and, therefore, no specific key process areas.

Source: Adapted from The IIA Research Foundation, Internal Audit Capability Model (IA-CM) for the Public Sector, Altamonte Springs, Fla.: IIARF.
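Because the IA-CM is a staged model, an internal audit activity's overall capability is commonly read as the highest level at which every element is judged to be in place, so the weakest element constrains the overall rating. The sketch below illustrates that roll-up logic; the element names follow the matrix above, but the overall_level and constraining_elements helpers and the sample ratings are hypothetical and shown only to make the aggregation explicit under that staged reading.

```python
# Minimal sketch: rolling up an IA-CM style self-assessment.
# Assumes a staged reading in which the overall level is limited by the weakest
# element; the ratings below are hypothetical examples, not prescribed values.

ELEMENTS = [
    "Services and Role of Internal Audit",
    "People Management",
    "Professional Practices",
    "Performance Management and Accountability",
    "Organizational Relationships and Culture",
    "Governance Structures",
]

# For each element, the highest level (2-5) the activity judges fully in place;
# Level 1 (Initial) is the floor for every internal audit activity.
self_assessment = {
    "Services and Role of Internal Audit": 3,
    "People Management": 2,
    "Professional Practices": 3,
    "Performance Management and Accountability": 2,
    "Organizational Relationships and Culture": 3,
    "Governance Structures": 3,
}

def overall_level(ratings):
    """Overall capability is limited by the weakest element (staged roll-up)."""
    return min(ratings.values())

def constraining_elements(ratings):
    """Elements holding the activity at its current overall level."""
    floor = overall_level(ratings)
    return [element for element in ELEMENTS if ratings[element] == floor]

print("Overall capability level:", overall_level(self_assessment))            # 2
print("Improvement focus:", ", ".join(constraining_elements(self_assessment)))
```

Listing the constraining elements in this way points directly at the key process areas to strengthen before the activity can credibly claim the next capability level.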
Additional Resources

Internal auditors may refer to other maturity models for insights when developing their own models. The following are a few examples.

IIA Path to Quality Model (PTQM): The PTQM provides a framework for the CAE to assess the current state of the internal audit activity's quality capability, target an appropriate level of quality capability for the activity, and present the steps along a path for the audit activity to reach its quality capability target. Categories consist of: Beginning (1), Emerging (2), Conforming (3), Leveraging (4), and Leading (5).

For a more in-depth analysis of maturity models, see the paper "What Makes a Useful Maturity Model? A Framework of General Design Principles for Maturity Models and Its Demonstration in Business Process Management" by Jens Pöppelbuß (European Research Center for Information Systems, University of Münster) and Maximilian Röglinger (FIM Research Center, University of Augsburg).

RIMS Risk Maturity Model: A tool for executives in risk management and others charged with risk management responsibilities to develop sustainable enterprise risk management programs. Levels include: Ad Hoc (1), Initial (2), Repeatable (3), Managed (4), and Leadership (5).

Software Engineering Institute (SEI) Capability Maturity Models (CMM): An analytical adaptation of maturity modeling for software engineering processes, people capability, process integration, and other uses. Categories consist of: Initial (1), Managed (2), Defined (3), Quantitatively Managed (4), and Optimizing (5).

ISO/IEC 15504: Developed by the International Organization for Standardization (ISO) and the International Electrotechnical Commission (IEC), ISO/IEC 15504 provides a reference model for process capability assessment. The model consists of capability levels, which in turn consist of process attributes and generic practices, against which assessors place the evidence they collect during an assessment to reach an overall determination of the organization's capability for delivering products (software, systems, and IT services). The six levels include: Incomplete (0), Performed (1), Managed (2), Established (3), Predictable (4), and Optimizing (5).
About the Authors and Reviewers

Author:
James Rose, CIA, CRMA, CPA

Reviewers:
Maria E. Mendes, CIA, CCSA
Steven Jameson, CIA, CCSA, CFSA, CRMA
About the Institute

Established in 1941, The Institute of Internal Auditors (IIA) is an international professional association with global headquarters in Altamonte Springs, Fla., USA. The IIA is the internal audit profession's global voice, recognized authority, acknowledged leader, chief advocate, and principal educator.

About Practice Guides

Practice Guides provide detailed guidance for conducting internal audit activities. They include detailed processes and procedures, such as tools and techniques, programs, and step-by-step approaches, as well as examples of deliverables. Practice Guides are part of The IIA's IPPF. As part of the Strongly Recommended category of guidance, compliance is not mandatory, but it is strongly recommended, and the guidance is endorsed by The IIA through formal review and approval processes. For other authoritative guidance materials provided by The IIA, please visit our website.

Disclaimer

The IIA publishes this document for informational and educational purposes. This guidance material is not intended to provide definitive answers to specific individual circumstances and as such is only intended to be used as a guide. The IIA recommends that you always seek independent expert advice relating directly to any specific situation. The IIA accepts no responsibility for anyone placing sole reliance on this guidance.

Copyright

Copyright 2013 The Institute of Internal Auditors. For permission to reproduce, please contact The IIA at [email protected].