2012 45th Hawaii International Conference on System Sciences

Using Quantitative Analyses to Construct a Capability Maturity Model for Business Intelligence

David Raber, Robert Winter, Felix Wortmann
Institute of Information Management, University of St. Gallen, St. Gallen, Switzerland
{david.raber, robert.winter, felix.wortmann}@unisg.ch

Abstract

One important means to explore the strengths and weaknesses of Business Intelligence (BI) initiatives is a comprehensive and accurate BI maturity assessment instrument. It is important that the assessment instrument is developed transparently using the current BI knowledge base. This paper proposes a BI maturity model that is based on an explicit BI maturity concept and on empirical data. The data is transformed into maturity levels by applying the Rasch algorithm and cluster analysis. The resulting BI maturity model is constructed on the basis of 58 items (capabilities). It comprises five levels, which we label initiate, harmonize, integrate, optimize and perpetuate. An evaluation of the model demonstrates its utility.

1. Introduction

Business Intelligence (BI) represents one of the major topics in modern information systems (IS) literature. Having become a widely used term during the early 1990s [50], BI is regarded as a strategic capability for most organizations when it comes to creating, collecting, analyzing and applying information and knowledge. While it was understood and managed as a technology approach in the 1990s, today BI is regarded as an organizational capability of strategic importance with the ability to enhance the competitiveness of organizations [33, 42]. Implementation challenges have increasingly given way to questions of strategic business alignment, business value improvement and optimization of the solution architecture [41, 48]. In fact, BI has recently been named a top business priority, instead of being, as in previous years, a top technology priority [41].
In order to cope with the broad business-to-IT scope of BI, i.e. the wide array of business- as well as technology-related issues, a comprehensive view of its design and transformation is essential. A concept that is able to provide holistic support for such broad design and transformation tasks is the maturity model (MM). Having been used successfully in the software engineering domain [36], MMs represent an established means to support effective management and continuous improvement of complex, multi-faceted phenomena [1, 9]. MMs are also used as an instrument for self- or third-party assessment [15, 20]. We therefore develop a BI MM that focuses on the BI capabilities of organizations. The aim of our BI MM is to identify and explore strengths and weaknesses of organizations and to support them both in evaluating their current state and in identifying necessary improvements with respect to their BI capabilities.

In the context of BI, a large number of MMs have been proposed. The recent state-of-the-art analysis in [27], however, reveals a lack of theoretical foundation, inadequate documentation and especially the dismissal of methodical requirements as common shortcomings of existing MMs in this domain. This conforms to the findings of [6] for the MM concept in general. Furthermore, the underlying BI maturity concept (i.e. the fundamental understanding of BI maturity) is often not specified sufficiently and transparently. In order to address the core issues identified by [6, 27], the goal of this paper is to develop a BI MM which (a) comprehensively covers the business-to-IT range of BI and (b) is developed in a transparent way based on an explicit maturity concept. We believe that the difficulty of a capability should be directly related to maturity, i.e. that a mature BI approach differs from a less mature BI approach by comprising more difficult capabilities. The Rasch algorithm has been successfully used to determine the difficulty of a capability.
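The intuition behind Rasch-based difficulty estimation can be illustrated with a short sketch. In its simplest (dichotomous) form, the probability that an organization of ability θ exhibits a capability of difficulty b depends only on the difference θ − b; this is general Item Response Theory background, not code from the paper:

```python
import math

def rasch_prob(theta, b):
    """Dichotomous Rasch model: probability that an organization with
    ability theta (in logits) exhibits a capability of difficulty b.
    The probability depends only on the difference theta - b."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# Equal ability and difficulty give a 50% chance; higher ability raises it.
print(rasch_prob(0.0, 0.0))                          # 0.5
print(rasch_prob(1.0, 0.0) > rasch_prob(0.0, 0.0))   # True
```

The study itself uses a rating-scale extension of this model (see Section 3.2), but the logit difficulty measures reported later follow the same scale: the higher the measure, the harder the capability.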
Hence, the development of our BI MM is based on a quantitative approach that applies the Rasch algorithm and a subsequent cluster analysis [26]. In order to provide a solid foundation for capability difficulty assessment, a rigorous conceptualization of BI maturity is based on IS kernel theories.

The remainder of this paper is structured as follows. Next, foundations of BI and MMs are presented and related work is analyzed. Subsequently, we introduce our conceptualization of BI maturity and briefly describe the applied methodology for developing the BI MM. The constructed BI MM is presented in detail thereafter, followed by an initial evaluation of the model. In conclusion, we discuss the model as well as limitations of the proposal and give an outlook on future research.

2. Foundations and conceptual background

2.1. Business Intelligence

Since its first mention in 1958 [29], the understanding of BI has broadened from a collective term for data analysis, reporting and query tools [2] to an encompassment of all components of an integrated decision support infrastructure [3]. BI is used as an umbrella term for both BI applications (e.g. dashboards) and BI technologies (e.g. online analytical processing (OLAP)), which are used to develop the BI applications [50]. A central component of BI systems is the data warehouse (DWH), which integrates data from various transactional IS for analytical purposes [23, 24]. The DWH is used by analytical frontends to present complex and competitive information to planners and decision makers [32]. Even though best practices exist for many topics in BI, e.g. reference application architectures or certain processes, there is no universally accepted definition of BI [50]. In the context of this paper, the following BI definition of Wixom and Watson is used: "Business intelligence (BI) is a broad category of technologies, applications, and processes for gathering, storing, accessing, and analyzing data to help its users make better decisions" [50]. This definition emphasizes our understanding of BI as a complex socio-technical system for which organizational as well as technical aspects need to be considered.

2.2.
Maturity models

MMs, or more precisely maturity assessment models, are a widely accepted instrument for systematically documenting and guiding the development and transformation of organizations on the basis of best or common practices [37]. The concept of MMs was initially proposed during the 1970s [19]. Driven by the success of prominent examples (e.g. [1, 9, 22]), numerous MMs have since been developed by academics as well as practitioners. In the field of IS, over a hundred MM instantiations have been published [31]. A MM typically consists of a sequence of maturity levels for a class of objects [4, 25]. Each level requires the objects on that level to achieve certain requirements. Maturity in this context is understood as a measure to evaluate the capabilities of an organization [12], while the term capability is understood as the ability to achieve a predefined goal [45]. Table 1 briefly summarizes the most important characteristics of MMs.

Table 1. Characteristics of MMs

- Object of maturity assessment: MMs allow for the maturity assessment of a variety of different objects. The most frequently assessed objects are technologies / systems [40], processes [8, 37], people / workforce [10] and management capabilities like project or knowledge management [9, 38].
- Dimension: Dimensions are specific capability areas which describe different aspects of the maturity assessment's object. Dimensions should preferably be both exhaustive and distinct [31]. Each dimension of a MM is further specified by a number of characteristics (practices, measures or activities) at each level [15].
- Level: Levels are archetypal states of maturity of the object that is assessed. Each level should have a set of distinct characteristics (practices, measures or activities per dimension) that are empirically testable [34].
- Maturity principle: MMs can be continuous or staged. While continuous models allow a scoring of characteristics at different levels, staged models require that all elements of one distinct level are achieved [15]. Hence, in continuous MMs a maturity rank may be determined as either the (weighted) sum of the individual scores or the individual levels in different dimensions. In contrast, staged MMs clearly specify a set of goals and key practices that need to be implemented in order to reach a certain level.
- Assessment: In order to pursue a maturity assessment, either qualitative (e.g. interviews) or quantitative approaches (e.g. questionnaires with Likert scales) may be used.

With the increasing popularity of MMs, criticism addressed a certain arbitrariness and fuzziness of the MM development and design process [4, 31]. In order to address this drawback, de Bruin et al. proposed a MM
lifecycle model comprising a scope, design, populate, test, deploy, and maintain phase [12]. For this paper, the design and populate phases are of particular interest. Regarding the design phase, two different approaches exist: whereas in the top-down approach levels are defined first and characteristics describing the different dimensions are derived afterwards, the bottom-up approach first derives dimensions and characteristics, which are afterwards assigned to maturity levels. For the derivation of characteristics, dimensions and levels, various exploratory methods such as Delphi studies, case studies, and focus groups [4, 12] have been proposed. As quantitative methods require valid data sets and knowledge of statistical methods, they are less often used for designing MMs [15]. However, a lack of theoretical foundation has been identified as one of the major weaknesses of most MMs [26], because an explicit theoretical foundation, i.e. a rigorous derivation of the underlying maturity concept, makes the relationships between different parts of the MM more comprehensible.

2.3. Existing BI maturity models

In the field of BI, a considerable number of MMs have been proposed [27, 50]. In a recent literature review, ten BI MMs were identified and analyzed with respect to methodology and content [27]. Most of these MMs have their origin in practice and are poorly documented. Only four out of the ten MMs have been developed based on empirical data, and none of the MMs has been evaluated in real-world scenarios. Even for the empirically grounded MMs, no details about the construction process have been published. Not only the construction process, but also the underlying BI maturity concept should be explicated: the maturity concept outlines what exactly is measured and what the MM's purpose is. Also remarkable in this context is the fact that only one out of the ten analyzed BI MMs has a theoretical foundation, i.e.
only one model is explicitly based on (kernel) theories [6]: in their stage model for data warehousing, Watson et al. [47] refer to the stages-of-growth approach [19]. As in this case, an explicit theoretical foundation helps to understand how the different concepts of an MM influence each other. As the analysis of Lahrmann et al. further shows, the comprehensiveness of existing BI MMs seems to be an issue, too. Traditional IT topics, e.g. applications, data, and infrastructure, are highly present, whereas topics such as BI organization and BI strategy are widely neglected. This contrasts with current IS literature, where these two topics have gained high visibility [17, 46]. A comprehensive BI MM should therefore address the business-to-IT range of BI topics in its entirety, even though BI represents a very heterogeneous field.

3. Research methodology

3.1. BI maturity concept

Maturity is defined by Soanes and Stevenson as "a state of being complete, perfect, or ready" or "the fullness of development" [43]. Following this definition, two aspects of maturity need to be differentiated. On the one hand, maturity has to reflect causes, e.g. "definitions of key performance indicators are standardized". On the other hand, effects need to be reflected as well, e.g. "key performance indicators are consistent and transparent beyond functional borders". MMs focusing only on effects do not offer any valuable information on how to improve the state of the object to be measured. MMs that are limited to causes do not provide any insights on the impact that is achieved by following their guidelines. Based on this argumentation, we utilize IS success models and their underlying theoretical foundations in order to develop our concept of BI maturity [14, 16, 39]. Thereby, we intend to complement the relevance of MMs with the rigor of IS theory. The intention of IS success models is to explain which variables or capabilities affect (better: cause) IS success.
Hence, they give guidance on how to conceptualize IS success and the corresponding success drivers. Briefly summarized, IS success is conceptualized on the basis of IS use, which comprises the intention to use as well as the usage itself, and IS impact, i.e. the combined IS net benefits. IS use is directly affected by IS quality, i.e. system quality, information quality and service quality, which is driven by the capabilities of the information system itself (IS). As we want to focus especially on IS capabilities that foster IS success, we have to conceptualize IS in more detail. According to the strategic alignment model [21], IS is more than technology, i.e. it has to be understood as a combination of strategy, processes and infrastructure. Moreover, following socio-technical theory (SST) [7], processes and infrastructure may not solely be comprehended as a collection of different IT components. Rather, SST postulates that the social IS subsystems (people, methodological capabilities and organizational practices) and the technical IS subsystems are interdependent and need to work in conformity with each other in order to maximize the system's benefits.
Figure 1. Concepts representing BI maturity

As outlined in the introduction, we want to focus our analysis on the causes of IS success, i.e. the BI capabilities of organizations. Thereby, we follow the practice of prominent MMs like the Capability Maturity Model Integration (CMMI) [8]. Hence, the basis of our MM is formed by the five concepts strategy (S), social system (SO), technical system (T), quality (Q), and use/impact (U) (cf. Figure 1). The development of the questionnaire was driven by the conceptual results we derived on the basis of the IS success model, the strategic alignment model and SST. Table 2 lists the references that were used for developing the questionnaire.

Table 2. References for questionnaire development (the original table also maps each source to the five concepts; that mapping is not reproduced here)
- Baars and Kemper 2008 [3]
- Davenport 2010 [11]
- Geiger et al. 2010 [18]
- Matney and Larson 2004 [30]
- Negash and Gray 2008 [33]
- Watson et al. 2001 [47]
- Wixom and Watson 2001 [49]
- Wixom and Watson 2010 [50]

3.2. Maturity model construction

Based on the BI maturity concept outlined in the previous section, the next step towards a methodically sound BI MM is its empirically grounded construction. We apply the inductive design approach described in [26] to construct our MM in a transparent way based on empirical methods. In fact, the Rasch algorithm, an Item Response Theory (IRT)-based approach, is used in combination with cluster analysis. This approach adapts and extends the original work of [13] for maturity models in the IS domain. In the following, the main BI MM construction steps are briefly summarized; they are documented in detail in [26]. The applied method differs from traditional Rasch-based MM constructions in three ways. Firstly, the Rasch algorithm is used in combination with rating scales, e.g. a Likert scale from one to five. Due to the complexity of socio-technical systems, in this case BI, the expressive power of rating scales is preferred over dichotomous scales. Secondly, not only the actual situation (as-is) of an item at an organization but also the desired situation (to-be) is addressed in the questionnaire. In accordance with [26], the input for the Rasch algorithm is then computed by taking, for each item, the delta between its as-is value and the median of its to-be values across all organizations. A positive delta value represents a difficult and desired item, whereas negative delta values express easier-to-achieve items. These values are recoded to a Likert scale where five represents the easiest items and one represents the most difficult items. Using such data, the Rasch algorithm yields a single ordinal value, the logit measure, for each item and organization, but not distinct maturity levels. The third modification is therefore to apply an agglomerative cluster analysis to the item logit measures in order to derive distinct maturity levels (i.e. item clusters). As most maturity models use five maturity levels [5, 27], the number of clusters is set to five. Using cluster analysis overcomes subjectivity problems in defining maturity levels.

4. Development of the BI MM

4.1. Data collection

Data was collected using a paper questionnaire distributed at (blinded for review) in May 2010. In addition, an online version of the questionnaire was distributed; it was ensured that participant segments did not overlap. The paper questionnaire was returned by 51 out of 144 participants of the conference, yielding a response rate of 35.4%. The conference was attended by BI/DWH specialists and executives working in business, management, and IT functions. The online questionnaire was sent to a focus group consisting of 28 people. Focus groups are an established approach to explore new ideas and to check the applicability of a research object by practitioners [44].
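The recoding and clustering steps described in Section 3.2 can be sketched as follows. Both the cut-off values and the delta direction (desired minus actual, so that positive deltas mark difficult, desired items) are illustrative assumptions, as the paper does not publish them; note also that the actual study clusters the Rasch logit measures, not the raw recoded scores:

```python
import statistics

def recode_item(as_is, to_be_all):
    """Recode one item for one organization: the delta between the median
    desired (to-be) value across all organizations and the organization's
    actual (as-is) value, mapped to a 1..5 scale where 5 marks the easiest
    and 1 the most difficult items. Cut-offs are illustrative only."""
    delta = statistics.median(to_be_all) - as_is  # positive = difficult, desired
    if delta <= -1.5:
        return 5
    if delta <= -0.5:
        return 4
    if delta < 0.5:
        return 3
    if delta < 1.5:
        return 2
    return 1

def cluster_levels(logits, k=5):
    """Group item logit measures into k maturity levels. For one-dimensional
    data, single-linkage agglomerative clustering is equivalent to cutting
    the sorted values at the k-1 largest gaps between neighbors."""
    order = sorted(range(len(logits)), key=lambda i: logits[i])
    vals = [logits[i] for i in order]
    # positions of the k-1 largest gaps between neighboring sorted values
    cuts = set(sorted(range(1, len(vals)),
                      key=lambda i: vals[i] - vals[i - 1],
                      reverse=True)[:k - 1])
    levels = [0] * len(logits)
    level = 1
    for pos, idx in enumerate(order):
        if pos in cuts:
            level += 1
        levels[idx] = level
    return levels
```

With the published measures from Table 4 as input and k = 5, each item receives a level from 1 (easiest cluster) to 5 (most difficult cluster), matching the ordering used in the paper.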
In this case, the members of the focus group were also BI practitioners,
who meet on a regular basis three times a year to discuss latest developments in BI as well as their experiences. 20 members of the focus group completed the questionnaire, resulting in a response rate of 71.4%. Table 3 summarizes the characteristics of the overall sample.

Table 3. Sample characteristics

Industry sector (abs. / %): Automotive 4 / 6; Chemicals & pharmacy 4 / 6; Services 17 / 24; Utilities 7 / 10; Finance & banking 13 / 18; Healthcare 1 / 1; Wholesale & retail 5 / 7; Technology & telecom. 8 / 11; Logistics 2 / 3; Others 7 / 10; Not available 3 / 4; Sum 71 / 100.

Employees (abs. / %): 0-500: 21 / 30; 500-5,000: 17 / 24; 5,000-10,000: 9 / 13; >10,000: 19 / 27; Not available 5 / 7; Sum 71 / 100.

Respondent's function (abs. / %): Business 10 / 14; IT 36 / 51; Mixed 23 / 32; Not available 2 / 3; Sum 71 / 100.

4.2. Data analysis and interpretation

The BIGSTEPS software version 2.82 [28] was used to obtain the Rasch item calibration. Important output statistics are the measure (of difficulty), the standard error, and a set of standardized fit statistics (infit and outfit) for each item. Infit is more sensitive to unexpected behavior affecting responses to items near the organization's capability level, whereas outfit is more sensitive to unexpected behavior of organizations on items far from the organization's capability level. Values of both fit statistics are expected to be close to one; the data is considered productive for measurement when infit and outfit are between 0.5 and 1.5. Regarding the model at hand, about 98% of infit and outfit values are considered productive for measurement. Therefore, our data set meets the quality criterion described in [13] and we conclude that the data conforms to the model. Table 4 exhibits the results of applying the Rasch algorithm; the items are arranged using the levels derived by the subsequent cluster analysis.

Table 4.
Results of applying the Rasch algorithm ordered by maturity level

Measure | Error | Infit | Outfit | Level | Concept | Short description
1.1 | 0.14 | 0.96 | 1.25 | 5 | S | Comprehensive BI strategy with focus on organization, processes as well as technology and tools
0.88 | 0.13 | 0.58 | 0.63 | 5 | Q | Proactive data quality management
0.72 | 0.12 | 1.25 | 1.49 | 5 | S | Balanced Scorecard incl. quality, cost and user satisfaction
0.69 | 0.12 | 1.09 | 1.3 | 5 | S | Systematic and comprehensive measurement of actual BI usage
0.63 | 0.12 | 1.08 | 1.13 | 5 | S | BI steering committee within business
0.52 | 0.11 | 0.96 | 1.14 | 5 | S | BI strategy is updated on a regular basis
0.41 | 0.11 | 1.03 | 1.08 | 4 | SO | Development of BI solutions based on standardized BI-specific process
0.4 | 0.11 | 0.8 | 0.8 | 4 | S | Portfolio management for systematic BI roadmap
0.39 | 0.11 | 0.79 | 0.84 | 4 | SO | Defined governance & standards for content
0.35 | 0.12 | 0.76 | 0.75 | 4 | S | Value-oriented development of BI, e.g. using business cases
0.35 | 0.11 | 0.66 | 0.68 | 4 | Q | Defined and documented roles for data quality management
0.35 | 0.11 | 0.81 | 0.82 | 4 | Q | Cost-efficient BI operations
0.33 | 0.11 | 0.62 | 0.64 | 4 | U | Use of BI by middle management
0.33 | 0.11 | 1.16 | 1.17 | 4 | T | Flexible, proactive analytics
0.21 | 0.11 | 1.13 | 1.13 | 3 | Q | BI operations based on well-defined service-level agreements (SLAs)
0.2 | 0.11 | 0.83 | 0.87 | 3 | SO | Central operation of BI applications based on ITIL
0.2 | 0.11 | 1.18 | 1.16 | 3 | Q | Standardized definitions for key performance indicators
0.18 | 0.11 | 0.81 | 0.83 | 3 | SO | Defined governance & standards for management
0.17 | 0.11 | 0.68 | 0.68 | 3 | Q | Defined processes for data quality management
0.17 | 0.11 | 0.87 | 0.88 | 3 | Q | Performance is satisfying for users
0.14 | 0.11 | 0.93 | 0.94 | 3 | S | Central, influential sponsor from business
0.13 | 0.11 | 1.38 | 1.38 | 3 | S | Multitude of decentralized sponsors from business
0.09 | 0.11 | 1.03 | 1.06 | 3 | S | BI strategy with focus on technology and tools
0.09 | 0.11 | 1.16 | 1.15 | 3 | SO | Role of IT: business partner - consulting of business lines
0.08 | 0.12 | 1.12 | 1.11 | 3 | S | BI steering committee within IT
0.07 | 0.11 | 0.79 | 0.81 | 3 | Q | Core business objects are consistently defined for the whole enterprise
0.07 | 0.11 | 0.85 | 0.86 | 3 | Q | Usage of up-to-date tools and frontends
0.04 | 0.11 | 0.84 | 0.84 | 3 | SO | Development of BI solutions using agile development methods (e.g. Scrum)
0.04 | 0.11 | 0.97 | 0.97 | 3 | Q | Standardized definitions for master data
0.02 | 0.11 | 1.5 | 1.62 | 3 | U | Use of BI by specialized analysts
0.02 | 0.11 | 0.87 | 0.87 | 3 | T | Integration of different frontends, e.g. "drill-through" from standard reports into OLAP cubes
0.02 | 0.11 | 0.85 | 0.85 | 3 | T | Partial integration of data in global systems (e.g. finance data warehouse)
0.02 | 0.11 | 1.07 | 1.09 | 3 | Q | Homogeneity: usage of a few and coherent BI tools
0.01 | 0.11 | 0.98 | 0.97 | 3 | T | Balanced mix of central and decentralized systems based on organizational structure
-0.02 | 0.11 | 1.3 | 1.29 | 2 | T | Highly centralized data warehouse
-0.03 | 0.12 | 1.09 | 1.1 | 2 | SO | Hybrid development of BI solutions combining agile development and waterfall methods
-0.05 | 0.11 | 0.8 | 0.8 | 2 | SO | Defined governance & standards for development
-0.07 | 0.11 | 1.19 | 1.17 | 2 | U | Operational usage of BI
-0.1 | 0.11 | 0.92 | 0.94 | 2 | SO | Balanced mix of central and decentralized organizational units
-0.11 | 0.11 | 0.54 | 0.53 | 2 | T | Static reports
-0.13 | 0.11 | 0.75 | 0.74 | 2 | SO | Role of IT: operator of infrastructure
-0.13 | 0.11 | 1.08 | 1.06 | 2 | SO | Decentralized BI organization with central CIO organization
-0.16 | 0.11 | 1.02 | 1.03 | 2 | SO | Role of IT: provider of standardized services
-0.17 | 0.11 | 1.21 | 1.19 | 2 | SO | Centralized BI organization and responsibilities
-0.18 | 0.11 | 1.13 | 1.18 | 2 | U | Use of BI by top management
-0.25 | 0.11 | 0.76 | 0.75 | 2 | SO | Defined governance & standards for operations
-0.29 | 0.12 | 0.91 | 0.9 | 2 | SO | Development of BI solutions based on standardized IT process
-0.29 | 0.11 | 0.86 | 0.88 | 2 | Q | High availability: no breakdowns, maintenance in well-defined and short time slots
-0.31 | 0.11 | 0.89 | 0.88 | 2 | SO | Defined governance & standards for tools and applications
-0.38 | 0.12 | 1.42 | 1.4 | 2 | S | Central, influential sponsor from IT
-0.4 | 0.12 | 1.02 | 1.08 | 2 | T | Decentralized, but harmonized systems (e.g. standardized master data)
-0.52 | 0.12 | 1.2 | 1.15 | 1 | SO | Central operation of BI applications
-0.57 | 0.12 | 1.29 | 1.25 | 1 | SO | Project-oriented development
-0.66 | 0.12 | 1.22 | 1.14 | 1 | T | Ad-hoc analyses (OLAP)
-0.68 | 0.13 | 1.29 | 1.39 | 1 | S | Multitude of decentralized sponsors from IT
-0.69 | 0.13 | 1.16 | 1.22 | 1 | T | Decentralized data warehouses and central enterprise data warehouse
-0.74 | 0.13 | 1.29 | 1.37 | 1 | S | Standardized cost and profit calculation for BI
-0.77 | 0.13 | 1.33 | 1.32 | 1 | SO | Decentralized BI organization and responsibilities

These results provide the content for an initial MM that can be built based on the conceptual structure developed in section 3.1. The matrix form of the complete BI MM, consisting of all items assigned to their respective levels, is omitted here due to space limitations. Nonetheless, we developed a condensed and more comprehensible version of our model in which the capabilities are summarized for each level and each concept. This summarized model is presented in Table 5. Several terms have been used in prior MM development efforts for labeling maturity levels; the generic CMMI labels Initial, Managed, Defined, Quantitatively managed, and Optimizing [8], for example, are well known. In the situation at hand, however, item assignments to levels have been derived bottom-up. We therefore need to create maturity level labels through a semantic interpretation of the clustered items instead. Having the management perspective in mind, we decided to use verbs as labels, which reflect the main goal and effect of each respective level.

Level 1 of the BI MM is characterized by a high degree of decentralism with almost no standardization
efforts, representing an early and immature state of BI, and is thus labeled initiate. In more detail, the BI organization, responsibilities, and sponsorship are decentralized, rendering standardization initiatives nearly inapplicable. From a technical point of view, the BI infrastructure is already operated centrally and basic capabilities such as ad-hoc analyses are provided.

Organizations that achieve level 2 are clearly oriented towards centrally managed BI in terms of governance and organizational setup. Standardization efforts regarding operations, development, tools, processes, and applications support this development by providing consistent policies and transparency beyond functional borders. The BI infrastructure at this level of maturity is still mainly decentralized, but central components such as the DWH and the use of standardized master data are important steps towards a harmonized system landscape. We thus label level 2 harmonize. Additional reporting functionalities together with a high overall availability of BI systems create an increased business value potential of BI, which companies on this level use widely in top management and for operational business.

Table 5.
Condensed BI MM (cells marked "-" had no items assigned at that level)

Strategy: Level 1 (Initiate): Decentralized, IT-driven BI. Level 2 (Harmonize): Centralized, IT-driven BI. Level 3 (Integrate): Business sponsor, initial BI strategy. Level 4 (Optimize): BI portfolio management and BI business cases. Level 5 (Perpetuate): Comprehensive BI strategy and BI performance management.

Social system (organization): Level 1: Decentralized, individually acting BI organization. Level 2: Standardization of operations, tools, applications and development. Level 3: Centralized with respect to business model. Level 4: Well-defined governance and business content. Level 5: -.

Technical system (IT): Level 1: Decentralized, non-standardized BI infrastructure. Level 2: Decentralized, but harmonized systems. Level 3: Centralized with respect to business model. Level 4: Flexible, pro-active analytics. Level 5: -.

Quality of service: Level 1: -. Level 2: High availability and proper maintenance. Level 3: Data and system quality is guaranteed. Level 4: Cost-efficient BI operations. Level 5: Pro-active data quality management.

Use/impact: Level 1: -. Level 2: Top management and operational usage. Level 3: Specialized analysts. Level 4: Middle management. Level 5: -.

Level 3 of the BI MM represents the final step towards centralization and integration, as well as an intermediate stage with respect to optimization. This level is therefore labeled integrate. A BI steering committee located within IT centrally defines an initial BI strategy that is focused on technology and tools. Various forms of sponsorship, both from IT and business, are available, showing the corporate acceptance of BI. The organizational setup of BI as well as the BI systems are centralized with respect to organizational structure. Enhanced system and data integration together with standardized definitions of key performance indicators achieve consistency across functional and system boundaries. A further improvement is the organization of BI operations according to ITIL [35]. In terms of quality, this stage of BI maturity is characterized by the usage of SLAs and defined processes for data quality management. The organization now also employs specialized BI analysts.
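Since the model is staged (Section 2.2), an organization's level in one dimension can be read off as the highest level whose capabilities are all in place. The following sketch illustrates this; the capability names are shortened stand-ins loosely based on Table 4, and the function is our illustration, not a tool published with the paper:

```python
def assess_dimension(items_per_level, achieved):
    """Staged assessment for one dimension: the maturity level is the
    highest level L such that all capabilities on levels up to L are
    achieved. Organizations missing even the lowest-level capabilities
    remain on level 1 (initiate)."""
    level = 1
    for lvl in sorted(items_per_level):
        if all(cap in achieved for cap in items_per_level[lvl]):
            level = max(level, lvl)
        else:
            break
    return level

# Illustrative quality-of-service capabilities, loosely based on Table 4
quality = {
    2: {"high availability"},
    3: {"SLA-based operations", "data quality processes"},
    4: {"cost-efficient operations"},
    5: {"proactive data quality management"},
}
print(assess_dimension(quality, {"high availability",
                                 "SLA-based operations",
                                 "data quality processes"}))  # 3
```

Repeating this per concept yields a maturity profile like the per-dimension positioning reported in the evaluation (Section 5).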
On level 4, organizations are realizing the full potential of BI and drive advanced strategic topics such as BI portfolio management and business cases for BI. Governance is now well-defined with regard to content as well. On the technical side, flexible and pro-active analytics are provided to achieve business impact, while the quality of BI systems is increased by improving the cost-efficiency of BI operations. We therefore label this level optimize. At stage 4, middle management is widely engaged in BI usage.

For achieving the highest level of BI maturity, a sustainable and continuous management of BI needs to be established. In terms of capabilities, this stage of maturity requires a comprehensive BI strategy to be specified and regularly updated. In addition, BI performance management and pro-active data quality management need to be established. This most mature BI stage is labeled perpetuate.

5. Evaluation

In the following, an initial evaluation of the developed BI MM is presented. The purpose of the evaluation is to field test the BI MM and to demonstrate its applicability in a real-world scenario. We therefore conducted three interviews with BI experts. The interviews started with documenting general information about the represented company and explaining the general setting and objective of the interview. Then the proposed BI MM was presented to the interviewees and they were asked whether the model (a) is comprehensive with respect to content, (b) allows for a valid self-assessment, and (c) supports the development of a BI roadmap. An overview of the case companies is given in Table 6.

Table 6. Overview of case companies

Company #1: Function/team of interviewee: Senior project manager IT; Industry sector: Finance; Employees: 2000; Headquarters: (blinded for review).
Company #2: Function/team of interviewee: Head of controlling; Industry sector: Manufacturing; Employees: 8000; Headquarters: (blinded for review).
Company #3: Function/team of interviewee: BI strategy and governance; Industry sector: Telecom.; Employees: 3000; Headquarters: (blinded for review).

Briefly summarized, the general reactions to the proposed model were positive. The interviewees emphasized its comprehensiveness with regard to content as well as the well-balanced mix of technical and business-related items. However, all three interviewees criticized the fact that no use/impact items exist on level 1. After brief discussions, it was concluded that the absence of items at this level makes sense, because these items represent rather organization-wide usage of BI, whereas only few users use BI on level 1. One interviewee noted the absence of BI staff-related items (e.g. "BI staff is highly specialized"). Regarding BI staff, the inclusion of corresponding items at the current level of detail is out of scope, but might be considered in future iterations. When it came to self-assessment, all interviewees were quickly able to position their respective company by using the BI MM. Company #3, for example, was positioned on level 1 with respect to the technical system, on level 2 for quality of service and the social system, and on level 3 regarding the maturity of strategy and use/impact. Since BI improvement activities can easily be identified by analyzing the capabilities at the next higher level, the BI MM was found particularly useful as a foundation for BI roadmap development and as a basis for investment decisions.

6.
Discussion and limitations

The approach proposed by [26] for developing maturity models in the IS domain, which was applied in this work, represents a rather innovative and unconventional method compared to MM development processes described elsewhere [4, 12]. A great advantage of this approach is that subjectivity issues, i.e. a certain arbitrariness when assigning capabilities to different maturity levels, can be largely eliminated. Being based on a theoretically sound conceptualization of BI maturity, the proposed BI MM overcomes many weaknesses inherent to existing MMs in the field of BI [6, 27]. However, two limitations need to be mentioned. First, the data set comprises only 71 questionnaires; a larger data set would provide a better empirical basis for our quantitative analyses. Second, the number of levels of the MM (i.e. the number of clusters to be created by cluster analysis) should be subject to further discussion. In our case, we followed previous and common practice by choosing five clusters, but this number was decided subjectively. Finding an appropriate way to determine the optimal number of clusters for the method of Lahrmann et al. [26] could avoid this arbitrariness, but could at the same time lead to MMs that do not meet user expectations due to an unconventional number of maturity levels.

7. Conclusion and future work

MMs have become an established means in the IS community to support organizations when it comes to effective management and continuous improvement of complex, multi-faceted phenomena [1, 9]. In this paper, we proposed a MM for assessing and evaluating the capabilities of organizations in the field of BI. Being constructed in a transparent way and based on an explicit maturity concept, the proposed BI MM overcomes weaknesses of related work in this domain. It also features a broad business-to-IT scope which covers technical as well as business-related aspects of BI.
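One concrete quality criterion for the cluster count discussed in the limitations above is the silhouette coefficient. A minimal one-dimensional version (our sketch, not part of the method in [26]) scores a clustering; evaluating candidate counts and keeping the best-scoring one would replace the fixed choice of five clusters:

```python
from collections import defaultdict

def silhouette_1d(values, labels):
    """Mean silhouette coefficient for a clustering of scalar values;
    distances are absolute differences. Values near 1 indicate compact,
    well-separated clusters; negative values indicate misassignments."""
    clusters = defaultdict(list)
    for v, l in zip(values, labels):
        clusters[l].append(v)
    scores = []
    for v, l in zip(values, labels):
        own = clusters[l]
        if len(own) == 1:
            scores.append(0.0)  # singleton clusters score 0 by convention
            continue
        # mean distance to own cluster (the self-distance of 0 is excluded
        # by dividing by len(own) - 1)
        a = sum(abs(v - o) for o in own) / (len(own) - 1)
        # mean distance to the nearest other cluster
        b = min(sum(abs(v - o) for o in other) / len(other)
                for k, other in clusters.items() if k != l)
        scores.append((b - a) / max(a, b))
    return sum(scores) / len(scores)
```

Applied to the item logit measures, scoring cluster counts from, say, two to seven and choosing the maximum would make the number of levels data-driven, at the risk, noted above, of an unconventional number of maturity levels.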
A first evaluation at three companies showed the potential of our model. Nonetheless, future work might address the choice of the number of maturity levels, e.g. based on quality criteria for cluster analysis. Moreover, the proposed MM needs further extensive testing and validation in practice.

References

[1] Ahern, D. M., A. Clouse and R. Turner: CMMI Distilled: A Practical Introduction to Integrated Process Improvement, Addison-Wesley, Boston, 2003.
[2] Anandarajan, M., A. Anandarajan and C. A. Srinivasan: Business Intelligence Techniques - A Perspective from Accounting and Finance, Springer, Berlin, 2004.
[3] Baars, H. and H.-G. Kemper: "Management Support with Structured and Unstructured Data - An Integrated Business Intelligence Framework", Information Systems Management, Vol. 25, 2, 2008, pp. 132-148.
[4] Becker, J., R. Knackstedt and J. Pöppelbuß: "Developing Maturity Models for IT Management - A Procedure Model and its Application", Business & Information Systems Engineering, Vol. 1, 3, 2009, pp. 213-222.
[5] Becker, J., et al.: "Maturity Models in IS Research", in: Proceedings of the 18th European Conference on Information Systems (ECIS 2010), Pretoria, South Africa, 2010.
[6] Biberoglu, E. and H. Haddad: "A Survey of Industrial Experiences with CMM and the Teaching of CMM Practices", Journal of Computing Sciences in Colleges, Vol. 18, 2, 2002, pp. 143-152.
[7] Bostrom, R. P. and S. Heinen: "MIS Problems and Failures - A Socio-Technical Perspective. Part I - The Causes", MIS Quarterly, Vol. 1, 3, 1977, pp. 17-32.
[8] Chrissis, M. B., M. Konrad and S. Shrum: CMMI: Guidelines for Process Integration and Product Improvement, Addison-Wesley, 2003.
[9] Crawford, J. K.: "The Project Management Maturity Model", Information Systems Management, Vol. 23, 4, 2006, pp. 50-58.
[10] Curtis, B., W. E. Hefley and S. A. Miller: The People Capability Maturity Model: Guidelines for Improving the Workforce, Addison-Wesley, Boston, MA, 2010.
[11] Davenport, T. H.: "Business Intelligence and Organizational Decisions", International Journal of Business Intelligence Research, Vol. 1, 1, 2010, pp. 1-12.
[12] de Bruin, T., et al.: "Understanding the Main Phases of Developing a Maturity Assessment Model", in: Proceedings of the 16th Australasian Conference on Information Systems (ACIS 2005), Campbell, Underwood, Bunker (eds.), Sydney, Australia, 2005, pp. 1-10.
[13] Dekleva, S. and D. Drehmer: "Measuring Software Engineering Evolution: A Rasch Calibration", Information Systems Research, Vol. 8, 1, 1997, pp. 95-104.
[14] DeLone, W. H. and E. R. McLean: "The DeLone and McLean Model of Information Systems Success - A Ten-Year Update", Journal of Management Information Systems, Vol. 19, 4, 2003, pp. 9-30.
[15] Fraser, P., J. Moultrie and M.
Gregory: "The Use of Maturity Models/Grids as a Tool in Assessing Product Development Capability", in: Proceedings of the IEEE International Engineering Management Conference (IEMC 2002), Cambridge, UK, 2002, pp. 244-249.
[16] Gable, G. G., D. Sedera and T. Chan: "Reconceptualizing Information System Success: The IS-Impact Measurement Model", Journal of the Association for Information Systems, Vol. 9, 7, 2008, pp. 377-408.
[17] Gansor, T., A. Totok and S. Stock: Von der Strategie zum Business Intelligence Competency Center (BICC): Konzeption - Betrieb - Praxis, Hanser Fachbuchverlag, München, Wien, 2010.
[18] Geiger, J. G., A. Maydanchik and P. Russom: "BI Experts' Perspective: Pervasive BI", Business Intelligence Journal, Vol. 15, 2010, pp. 36-40.
[19] Gibson, C. F. and R. L. Nolan: "Managing the four stages of EDP growth", Harvard Business Review, Vol. 52, 1, 1974, pp. 76-88.
[20] Hakes, C.: The Corporate Self Assessment Handbook: For Measuring Business Excellence, Chapman & Hall, London, 1996.
[21] Henderson, J. C. and N. Venkatraman: "Strategic alignment: Leveraging information technology for transforming organizations", IBM Systems Journal, Vol. 32, 1, 1993, pp. 4-16.
[22] Humphrey, W. S.: "Characterizing the Software Process: A Maturity Framework", IEEE Software, Vol. 5, 2, 1988, pp. 73-79.
[23] Inmon, W. H., D. Strauss and G. Neushloss: DW 2.0: The Architecture for the Next Generation of Data Warehousing, Elsevier Science, Amsterdam, 2008.
[24] Kimball, R., et al.: The Data Warehouse Lifecycle Toolkit, John Wiley & Sons, New York, 2008.
[25] Klimko, G.: "Knowledge Management and Maturity Models: Building Common Understanding", in: Proceedings of, Bled, Slovenia, 2001, pp. 269-278.
[26] Lahrmann, G., et al.: "Inductive Design of Maturity Models: Applying the Rasch Algorithm for Design Science Research", in: Proceedings of DESRIST 2011, Jain, Sinha, Vitharana (eds.), 2011, pp. 176-191.
[27] Lahrmann, G., et al.: "Business Intelligence Maturity Models: An Overview", in: Proceedings of the VII Conference of the Italian Chapter of AIS (itAIS 2010), D'Atri, Ferrara, George, Spagnoletti (eds.), Naples, Italy, 2010.
[28] Linacre, J. M. and B. D. Wright: "Bigsteps Rasch Software, Version 2.82", http://www.winsteps.com/bigsteps.htm, last accessed: 11.05.2011.
[29] Luhn, H. P.: "A Business Intelligence System", IBM Journal of Research and Development, Vol. 2, 4, 1958, pp. 314-319.
[30] Matney, D. and D. Larson: "The Four Components of BI Governance", Business Intelligence Journal, Vol. 9, 3, 2004, pp. 29-36.
[31] Mettler, T. and P. Rohner: "Situational Maturity Models as Instrumental Artifacts for Organizational Design", in:
Proceedings of the 4th International Conference on Design Science Research in Information Systems and Technology (DESRIST 2009), Philadelphia, PA, 2009, p. 9.
[32] Negash, S.: "Business Intelligence", Communications of the Association for Information Systems, Vol. 13, 2004, pp. 177-195.
[33] Negash, S. and P. Gray: "Business Intelligence", in: Handbook on Decision Support Systems 2, Burstein, Holsapple (eds.), Springer, Berlin, Heidelberg, 2008, pp. 175-193.
[34] Nolan, R. L.: "Managing the Computer Resource: A Stage Hypothesis", Communications of the ACM, Vol. 16, 7, 1973, pp. 399-405.
[35] Office of Government Commerce: ITIL - Official Introduction to the ITIL Service Lifecycle, TSO, London, 2007.
[36] Paulk, M. C., et al.: "Capability Maturity Model (SM) for Software, Version 1.1", Pittsburgh, PA, 1993.
[37] Paulk, M. C., et al.: "Capability Maturity Model, Version 1.1", IEEE Software, Vol. 10, 4, 1993, pp. 18-27.
[38] Paulzen, O., et al.: "A Maturity Model for Quality Improvement in Knowledge Management", in: Proceedings of ACIS 2002, Melbourne, Australia, 2002.
[39] Petter, S., W. DeLone and E. McLean: "Measuring information systems success: models, dimensions, measures, and interrelationships", European Journal of Information Systems, Vol. 17, 2008, pp. 236-263.
[40] Popovic, A., P. S. Coelho and J. Jaklič: "The impact of business intelligence system maturity on information quality", Information Research, Vol. 14, 4, 2009.
[41] Richardson, J. and A. Bitterer: "Findings: The Risks of Losing Faith in BI", Technical Report, Gartner, Stamford, 2010.
[42] IBM Global Business Services: "The New Voice of the CIO", Technical Report, IBM Global Services, Somers, NY, 2009.
[43] Soanes, C. and A. Stevenson: Concise Oxford English Dictionary, Oxford University Press, 2008.
[44] Tremblay, M. C., A. R. Hevner and D. J. Berndt: "Focus Groups for Artifact Refinement and Evaluation in Design Research", Communications of the Association for Information Systems, Vol. 26, 27, 2010, pp. 599-618.
[45] van Steenbergen, M., et al.: "The design of focus area maturity models", in: Proceedings of, St. Gallen, Switzerland, pp. 317-332.
[46] Vierkorn, S. and D. Friedrich: "Organization of Business Intelligence", Technical Report, BARC Institute, Würzburg, 2008.
[47] Watson, H. J., T. R. Ariyachandra and R. J. Matyska Jr.: "Data Warehousing Stages of Growth", Information Systems Management, Vol. 18, 3, 2001, pp. 42-50.
[48] Williams, S. and N. Williams: The Profit Impact of Business Intelligence, Morgan Kaufmann, San Francisco, CA, 2007.
[49] Wixom, B. H. and H. J. Watson: "An Empirical Investigation of the Factors Affecting Data Warehousing Success", MIS Quarterly, Vol. 25, 1, 2001, pp. 17-41.
[50] Wixom, B. H. and H. J. Watson: "The BI-Based Organization", International Journal of Business Intelligence Research, Vol. 1, 1, 2010, pp. 13-28.