IAM/IAG Maturity Assessment - Dos & Don'ts
Thursday, May 15th, 15:30-16:00
Dr. Horst Walther, Senior Analyst, KuppingerCole
hw@kuppingercole.com
IAM/IAG Maturity Assessment Dos & Don'ts
Rating the maturity of IAM/IAG programs is not easy. Who is the right person to do such a rating? Which input is required? How do you ensure that the rating does not become more complex than the rest of the program? What are the Key Performance Indicators (KPIs) & Key Risk Indicators (KRIs) to look at, & how do you assess them without a years-long collection of such indicators? What are the right benchmarks you can use, & who can help you in benchmarking? Which lessons should you draw from the results? In this session, Dr. Horst Walther will talk about the Dos & Don'ts of Maturity Assessments.
Maturity models
Maturity models are a widespread instrument for improving organizational performance. They identify organizational strengths & weaknesses as well as providing benchmarking information. There are many maturity models, such as OPM3, CMMI, P3M3, PRINCE2, BPMM, Kerzner's model, SPICE, COBIT etc. These models differ from each other in terms of their factors & characteristics; there is no standard covering them. It is important for organizations to be able to assess their situation with a comprehensive and useful model.
KuppingerCole 5/20/2014
P3M3: Portfolio, Programme & Project Management Maturity Model
CMMI: Capability Maturity Model Integration
OPM3: Organizational Project Management Maturity Model
SPICE: Software Process Improvement & Capability Determination
COBIT: Control Objectives for Information & Related Technology
CMM - the forefather of all maturity models
1. Initial (chaotic, ad hoc) - the starting point for use of a new or undocumented process.
2. Repeatable - the process is at least documented sufficiently to enable repeating the same steps.
3. Defined - the process is defined/confirmed as a standard business process.
4. Managed - the process is quantitatively managed in accordance with agreed-upon metrics.
5. Optimizing - process management includes deliberate process optimization/improvement.
CMM gave maturity models a kick start
In 1986, triggered by the U.S. DoD (Department of Defense), the SEI (Software Engineering Institute) at Carnegie Mellon University started the development of a system for assessing the maturity of software processes. In 1991, the model was issued as Capability Maturity Model 1.0. CMMI (Capability Maturity Model Integration) was released in early 2002. CMM led to a proliferation of CM models. Popular models based on the original CMU CMM are SPICE, for maturity assessment of software processes, COBIT, for IT governance processes, & many others. The notion of maturity models will henceforth be tied to one name: Watts S. Humphrey.
Why Maturity Assessments?
Assessments according to a common maturity model enable
- Positioning - current achievements in a framework
- Benchmarking - to compare with others (competitors, best of breed, ...)
- Quantification - of otherwise qualitative information
- Evidence - for compliance & certification purposes
- Orientation - to define the starting point for change activities
- Reputation - as it is fashionable not to rely on gut feelings
- Transparency - serving as the foundation for any good governance
Maturity Models for IAM / IAG & related
There are plenty of models around; you could well craft your own.
Maturity assessments & IAM / IAG
We deem it prudent to assess IAM / IAG processes for maturity too. However, this discipline is inherently immature in itself:
- Terms (like authorisation, provisioning, ...) are weakly defined & poorly understood.
- IT departments carry the burden of solving business tasks without being mandated to do so.
- Few standards or generic practices have been established.
Hence, maturity assessments have to be undertaken with some extra care. Nevertheless, a huge number of maturity models is around. Tailored approaches currently appear to be most promising. KuppingerCole pioneered in this discipline.
What it takes to do Maturity Assessments
- in-depth knowledge of the status of the technology market segment the programs relate to today
- knowledge about the status of other organizations, both in the industry of the organization and in other industries
- a good understanding of trends that will have an impact on the program and investments
- a rigorous methodological approach based on reliable information
Build the assessment on top of KPIs / KRIs
The following generic approach for deriving KPIs/KRIs is recommended:
1. Define goals: Define what should be achieved & how the initiative relates to other initiatives in the organization. There should be one consistent risk management approach in the organization, even when starting small & distributed.
2. Define metrics: The KRIs/KPIs to be used have to be defined. That includes the definition of thresholds which should be met.
3. Define responsibilities: In the beginning, the responsibilities for providing the current values of metrics, the aggregation of these metrics into scorecards & the reporting structures, including alerting & escalations, have to be defined.
4. Define actions: The approach has to result in predefined actions in case a risk increases beyond the defined threshold.
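The metrics-responsibilities-actions chain above can be sketched as a minimal data structure. All names here (`KRI`, `check`, the "orphaned accounts" indicator and its threshold) are illustrative assumptions, not part of any specific KuppingerCole methodology or tool:

```python
# Minimal sketch of the "define metrics / responsibilities / actions" steps.
# KRI, check and the sample indicator are illustrative, not a real product API.
from dataclasses import dataclass
from typing import Optional

@dataclass
class KRI:
    name: str
    owner: str          # responsibility: who provides the current value
    threshold: float    # defined threshold the metric should stay below
    action: str         # predefined action if the threshold is crossed

def check(kri: KRI, current_value: float) -> Optional[str]:
    """Return the predefined action when the risk exceeds its threshold."""
    if current_value > kri.threshold:
        return kri.action
    return None

# Invented example indicator: share of orphaned accounts in the directory.
orphaned = KRI(name="orphaned accounts", owner="IAM team",
               threshold=0.02, action="run account reconciliation")
```

With this shape, a scorecard is simply a list of `KRI` objects evaluated against their current values, and escalation is whatever the organization attaches to the returned action string.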
How to choose KRIs & KPIs
It is most important to choose the appropriate KRIs/KPIs:
1. Choose valid indicators: Indicators have to be directly related to a risk. Changes in the value of the indicator have to indicate increasing/decreasing risks.
2. Choose indicators which can be influenced directly: There have to be actions defined for every indicator. Indicators which can be influenced (& improved) easily are a good choice.
3. Choose indicators which are easy to collect: If you need special tools or increased staff to collect raw data, you may have chosen the wrong metric; collection has to be easy.
Work example: Digital identities per physical person
Indicator: Average number of digital identities per physical person.
Group(s) of indicators: IAM, GRC
Interpretation: Defines the ratio of digital identities (e.g. identifiers to which accounts are mapped) to the number of physical persons (internal, external).
Unit type: Ratio
Direction: Minimize (optimum: 1)
IT risks associated:
- Security risks: Situations in which one person has several digital identities often lead to unmanaged accounts. There are also security risks in preferring elevated accounts or insecure authentication approaches. From a GRC perspective, these situations make it very difficult to analyse and control security.
- Efficiency risks: Having to deal with several identities is more complex and might lead to an increasing number of password losses.
Operational risks associated: Due to the security risks, these situations might lead to undetected SoD conflicts in case the relation of several digital identities to one physical person isn't identified.
How to optimize: Use global identifiers as an abstraction level or map all accounts to one physical identity (if applicable).
Annotations: Some IAM and GRC tools can't deal with multiple layers of identities, e.g. accounts, digital identities and additional global identifiers as an additional mapping layer.
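The indicator above reduces to a simple ratio once a person-to-identity mapping exists. A minimal sketch, with invented sample data (the names and identifiers are placeholders, not a real directory export):

```python
# KPI sketch: average number of digital identities per physical person.
# The mapping below is invented sample data for illustration only.
identities_per_person = {
    "alice": ["alice@ad", "alice@sap"],
    "bob":   ["bob@ad"],
    "carol": ["carol@ad", "c.meyer@legacy", "carol@cloud"],
}

def identity_ratio(mapping):
    """Total digital identities divided by the number of physical persons."""
    total = sum(len(ids) for ids in mapping.values())
    return total / len(mapping)

ratio = identity_ratio(identities_per_person)  # 6 identities / 3 persons = 2.0
```

A value of 2.0 against the optimum of 1 would flag exactly the unmanaged-account and SoD risks the slide describes; the hard part in practice is building the mapping, not the arithmetic.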
Select KPIs / KRIs from these activity domains
A typical assessment will evaluate KPIs / KRIs from the following activity domains against best practice:
- Visibility & Acceptance
- Guidelines & Policies
- Organisational Structure
- Status of Organisation
- Deployment Scope & Coverage
- Risk Awareness
- Technical Master Plan
- Access & Governance Analytics
- Identity Management & Provisioning
- Support for the Extended Enterprise
- Privilege Management & SIEM
- Authentication & Authorisation
Where to assess the Maturity? e.g. in the 7 KC IAM/IAG Maturity domains
KC proposes Maturity Level Matrices for IAM/IAG for 7 major areas:
1. Access Governance
2. Access Management & Federation
3. Authentication
4. Cloud Identity Management
5. Dynamic Authorization Management
6. Identity Provisioning
7. Privilege Management
These matrices cover the most important areas of IAM/IAG, including some minor segments such as Enterprise Single Sign-On. Some of the matrices cover a fairly broad range of topics; Authentication, for example, includes strong authentication, risk- & context-based authentication & authorization, & versatile authentication.
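Rating each of the seven areas on a CMM-style 1-5 scale and aggregating them could look like the sketch below. The area names come from the slide; the levels assigned are invented placeholders, and the unweighted mean is just one plausible aggregation, not the KC scoring method:

```python
# Aggregate per-area maturity levels (1-5, CMM-style) into an overall view.
# The levels assigned here are invented for illustration only.
areas = {
    "Access Governance": 3,
    "Access Management & Federation": 4,
    "Authentication": 3,
    "Cloud Identity Management": 2,
    "Dynamic Authorization Management": 2,
    "Identity Provisioning": 4,
    "Privilege Management": 3,
}

overall = sum(areas.values()) / len(areas)  # simple unweighted mean
weakest = min(areas, key=areas.get)         # first area with the lowest level
```

In practice one would weight the areas by business relevance and track the scores across assessment cycles; the weakest area is the natural candidate for the next working plan.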
Maturity Levels tailored to the domain - KC example for Access Management / Governance
How to visualise the results - evaluation sample 1 (table)
[Table: per-domain maturity on a scale from Maturity Level 3 to Maturity Level 5, with columns "Best of Class", "Good in Class" and "Current Average"; rows:]
- Visibility & Acceptance
- Guidelines & Policies
- Organisational Structure
- Penetration of the Organisation
- Scope & Coverage
- Risk Awareness
- Technical Master Plan
- Access Governance/Analytics
- Identity Management
- Extended Enterprise
- Privilege Management & SIEM
- Authentication & Authorisation
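A plain-text version of such a per-domain comparison table can be produced in a few lines. The domain and column labels follow the slide; the scores here are invented placeholders:

```python
# Render a per-domain maturity comparison as a plain-text table.
# Domains and column labels follow the slide; the scores are placeholders.
rows = [
    ("Visibility & Acceptance", 5, 4, 3),
    ("Guidelines & Policies",   5, 4, 2),
    ("Risk Awareness",          4, 4, 3),
]

header = f"{'Domain':<26}{'Best of Class':>15}{'Good in Class':>15}{'Current':>9}"
table = "\n".join([header] + [
    f"{name:<26}{best:>15}{good:>15}{cur:>9}"
    for name, best, good, cur in rows
])
print(table)
```

The same row data feeds both the table and the radar chart on the next slide, so keeping it in one structure avoids the two views drifting apart.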
How to visualise the results - evaluation sample 2 (graph)
[Radar chart: the customer's status compared to Best of Class, scored 0-9 per activity domain - Visibility and Acceptance, Guidelines and Policies, Organisational Structure, Status of Organisation, Scope and Coverage, Risk Awareness, Technical Master Plan, Access and Governance Analytics, Identity Management & Provisioning, Support for the Extended Enterprise, Privilege Management and SIEM, Authentication and Authorisation]
The recommended actions - example working plan until the next maturity assessment
- Visibility & Acceptance: No actions required
- Guidelines & Policies: Consolidate & harmonise the existing stack
- Organisational Structure: Shift IAG responsibility to business
- Penetration of the Organisation: Extend current practices to a 2nd business line
- Scope & Coverage: Consider including customer direct access
- Risk Awareness: No actions recommended
- Technical Master Plan: Consolidate isolated projects to a controlled program
- Access Governance/Analytics: Employ a big data approach to enable analytics
- Identity Management: No actions required
- Extended Enterprise: Actions recommended, postponed due to low priority
- Privilege Management & SIEM: Apply SIEM to privileged Access Management
- Authentication & Authorisation: Include dynamic authorisation in the enterprise concept
7 Dos & recommendations
1. Tailor oversize maturity models to your specific needs.
2. There is currently no way to avoid proprietary models.
3. They provide (limited) knowledge bases & hence comparability.
4. IAM / IAG's inherent immaturity limits the applicability of benchmarking.
5. Accept IAM / IAG purely as a business task.
6. Invest some effort into a clear, rigorous & logical terminology.
7. You may well define your own custom KPIs / KRIs.
5 Don'ts & warnings
1. No overkill - assessments must not be huge projects.
2. Not for the shelf - assessments should result in actions.
3. Not a one-time effort - assess regularly, at least every 2-3 years.
4. Not just IT - consider business and technology.
5. No introspection - look for an outside view, experts, external knowledge.