Measuring Competitive Intelligence Outcomes: A Case Study

Faculty advisor endorsement: France Bouthillier, School of Information Studies, McGill University

1. Introduction

Through measurement, information services quantify their performance and value, and communicate with stakeholders. Identifying and quantifying value, however, has historically been an accounting-based activity that relies on financial figures, which do not capture significant intangibles for organizations, such as innovation (Lev, 2001). Information services may provide financial benefits to clients and parent organizations, but the results of the services are often cognitive and personal, and cannot be accurately represented in dollars. This has led managers, such as librarians, to ask whether the value of information services can be more accurately demonstrated to stakeholders (Poll & Payne, 2006) and whether the representation of value requires a shift in discourse around measurement to evolve new, non-financial, qualitative measures (Town, 2011). In the field of library and information studies (LIS), scholars are investigating the value of information services to users, asking how managers and researchers can move beyond process measures, such as usage statistics, as arbiters of success toward outcome and impact measures. The LIS literature contains examples of professionals and researchers attempting to develop outcome and impact measures that might better demonstrate the value of information services against organizational purposes and mandates, not just functionality or cost-effectiveness. Poll has defined impact for library-type services as the tangible or intangible difference or change in an individual or group resulting from contact with library services, while outcomes are direct, pre-defined effects of the output related to goals and objectives of the library's planning, such as customer satisfaction levels (2012, p. 123).

This proposal describes a case study that examines one instance of an information service, a competitive intelligence (CI) unit, and how CI interacts with organizational decision-making processes to produce, or not produce, beneficial outcomes and impacts for the organization. The findings will provide valuable fieldwork-based evidence to support the development of outcome-based performance metrics for CI. The purpose of this study is twofold: first, to determine whether the hypothesized causal relationship between CI and decision outcomes (e.g., improved customer service) can be identified and isolated for measurement; and second, to critically examine prescriptive models of CI outcome measurement in the literature against the constraints of practice and measurement theory.

2. Literature Review

A knowledge-based view of the firm (Grant, 1996) suggests that the information capabilities of organizations determine performance. Performance is a complex concept; for this study it denotes the quantity and quality of desired beneficial results accruing from purposeful organizational activities, measured against yardstick standards, internal or external, of productivity and profitability. Competitive intelligence (CI) is a type of information service used by organizations: the process and the products of an organization's data collection and analysis about the competitive environment (Fleisher & Blenkhorn, 2001), with close historic and practical ties to covert intelligence, military intelligence, business intelligence, and market intelligence (Buchda, 2007; Juhari & Stephens, 2006). Surveyed managers have indicated that the primary benefit they expect from CI is improved decision-making (Marin & Poulter, 2004), resulting in savings of time and money from improvements to internal business processes (Herring, 1996), improvements in customer service (Qingjiu & Prescott, 2000), and an improved ability to anticipate threats and opportunities in the marketplace (Hannula & Pirttimaki, 2003), among others.

Most expected outcomes can be loosely grouped under the headings of financial outputs, improved client relationships, and innovation in products and services. No evidence has yet been found to establish the relationship, if any, between CI and these hoped-for benefits, although some correlations have been found between CI use and positive organizational performance (e.g., Adidam, Banerjee, & Shukla, 2012). This situation has complicated the development of performance-based measures for CI.

Scholars and practitioners of competitive intelligence indicate in the literature that significant challenges exist in identifying how to measure the outcomes and impact of intelligence. These challenges are frequently attributed to conceptual and methodological problems of measurement also cited in the intellectual capital (IC), knowledge management (KM), and library and information studies (LIS) measurement literatures. The methodological issues relate to intangible results, secondary effects, and the occasional time lag before results appear (see, for example, Kujansivu & Lönnqvist, 2009). Methodological challenges are further complicated for CI in that, if the purpose of CI is to improve decision-making, any research into CI value as it affects decision-making must necessarily rely on highly subjective data and attempt to quantify cognitive effects. Conceptual problems include a multiplicity of prescriptive measurement models in the literature and descriptions of unique practice. Scholars have applied prescriptive models to measure the value and performance of CI, but these models have not been evaluated (e.g., Kujansivu & Lönnqvist, 2009; McGonagle & Vella, 2002; Davidson, 2000). CI measurement in practice has defaulted to activity measures of process and usage (Ganesh, Miree, & Prescott, 2004), while CI practitioners call for improvements to measurement (Marin & Poulter, 2004).

Significant assumptions are made in the literature about intelligence benefits that are unsubstantiated by research (Lönnqvist & Pirttimäki, 2006), partly because research-based evidence is lacking. In response, scholars interested in determining the value of CI have called for empirical data (Hughes, 2005), case studies (Wright & Calof, 2006), and additional fieldwork, so that measures of competitive intelligence outcomes might be developed (Marin & Poulter, 2004) and the benefit and value of CI to organizations might be determined in relation to decision-making.

3. Objectives and Research Questions

This research proposal takes an instance of an information service, a competitive intelligence unit, and proposes a study to investigate how the value of the service can be identified, quantified, and represented to stakeholders when its primary purpose, to inform and improve decision-making, is intangible. In light of these calls for research, the purpose of this research project is twofold: first, to determine what relationship, if any, exists between CI and decision outcomes; second, to use the data collected to evaluate prescriptive models of CI measurement in the literature. Three research questions have been formulated:

1. How, when, and by whom is CI used as an input into organizational decision-making?
2. When CI is used, what are the organizational outcomes in critical categories such as finances, client relationships, and innovation?
3. In light of organizational constraints, which measurement methods identified in the literature are most appropriate for use in determining CI outcome and impact?

This research design has been developed to identify the intangible strategic outcomes and impact of competitive intelligence (CI), if possible, and how they determine CI value in relationship to the decision-making process.

Identification will then serve as a preliminary step to measurement, taking into account reported challenges and difficulties in representing and quantifying aspects of organizational decision-making, outcomes, and impact. The conceptual and methodological concerns for measurement described briefly above mandate qualitative research methods. The research design is a case study examining decisions affected by CI (the unit of analysis) at an organization with a formal CI unit. This design responds to calls for more field research, such as case studies, investigating specific and particular aspects of the CI cycle and CI measurement (Ganesh, Miree, & Prescott, 2004; Calof & Wright, 2008).

4. Research Design

To address these research questions, a case study is proposed to investigate practice and feasibility in CI measurement. Qualitative and explorative research will take place at a single organization with a CI unit. A CI unit is defined as dedicated resources and staff specifically tasked with generating CI deliverables, such as reports and presentations, for internal CI clients and decision makers. With the help of CI unit employees, decisions informed by CI and made within the organization 2-5 years in the past will be identified for study. This timeframe allows decision outcomes to have manifested while the employees involved are still likely to be with the organization. Interviews and document analysis will be the research methods used. A model of the conceptual framework is provided below.

[Figure 1: A model of the conceptual framework for the study. Decision inputs (decision-maker biases, CI deliverable(s), organizational processes, assumptions) feed problem definition, problem conceptualization, and selection; the decision yields outputs, then outcomes, and in time impact(s) assessed against the strategic plan.]

Here the first stage of organizational decision-making, problem definition, leads to problem conceptualization, in which CI is one input among potentially many. The organizational decision-making process terminates with a selection from an array of possible actions. A decision is defined as a process involving these three stages, and is the unit of analysis. Subsequent to the selection and the ensuing activity, outputs begin to appear, followed in time by more intangible outcomes. These outcomes in turn impact the organization, leading it either toward fulfilment of, or divergence from, the organization's strategic plan. In the proposed model, a good measure becomes one that allows time for impact to manifest, accounts for the role of CI in decision-making, and relates CI to organizational strategy. Indicators of value or benefit in finances, innovation, and client relationships can then be traced through tangible outputs in multiple dimensions of the organization.

The case study site will be an organization that: has had a CI unit in operation for at least three years (to ensure maturity in its operations); employs a minimum of two full-time CI employees (identified by job function, not title); and has had a strategic plan detailing goals for the organization in place for at least three years.
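To illustrate how the elements of Figure 1 might be recorded during analysis, the following is a minimal sketch of the framework as data structures. The field names and example values are hypothetical, chosen for illustration only; they are not instruments of the proposed study.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class DecisionInput:
    """One input to problem conceptualization (e.g., a CI deliverable, a bias, an assumption)."""
    kind: str          # "CI deliverable", "decision-maker bias", "organizational process", "assumption"
    description: str

@dataclass
class Outcome:
    """A less tangible effect that follows an output over time."""
    description: str
    indicator: str     # one of the three indicators: "finances", "innovation", "client relationships"

@dataclass
class Decision:
    """A decision as the unit of analysis: problem definition -> conceptualization -> selection."""
    problem_definition: str
    inputs: List[DecisionInput] = field(default_factory=list)
    selection: str = ""
    outputs: List[str] = field(default_factory=list)
    outcomes: List[Outcome] = field(default_factory=list)
    impacts: List[str] = field(default_factory=list)   # effects relative to the strategic plan

# Hypothetical example of how one traceable decision might be recorded:
decision = Decision(
    problem_definition="Respond to a competitor's anticipated product launch",
    inputs=[DecisionInput("CI deliverable", "Competitor launch briefing"),
            DecisionInput("assumption", "Market demand will remain stable")],
    selection="Accelerate release of product X",
    outputs=["Product X released two quarters early"],
    outcomes=[Outcome("Retained key client accounts", "client relationships")],
    impacts=["Supports the strategic goal of market share retention"],
)
```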

Once access to a case study site has been established, a combination of interviews and document analysis will be used to examine traceable decisions, that is, strategic decisions informed by a CI product 3-5 years ago. Those decisions will be studied to determine if and how CI, as an input, affected the organizational decision-making process. A single decision will serve as the unit of analysis. Decisions will be identified with the cooperation of the case study site manager(s) and decision makers. The number of decisions isolated for analysis will depend entirely on how many decisions are needed to reach saturation. It is expected that the case study as described here will involve interviews with 10-15 people, some of whom may have only one decision example to provide. Although the researcher anticipates that respondents will state that they believe CI to be useful to decision-makers and will be able to provide examples, the interview guides are designed to capture negative responses and experiences, which will also provide useful data.

In this study, document analysis will be used concurrently with the interviews to cross-reference subjective statements whenever possible and to provide some objective data. For example, subjective reports of CI value can be referenced against management's CI unit usage reports, and accounts of meetings held can be checked against meeting minutes.
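As one illustration of how such cross-referencing might be recorded, the sketch below pairs a subjective interview statement with the documents that corroborate or contradict it. The record layout and example values are hypothetical, offered only to make the chain-of-evidence idea concrete.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class EvidenceLink:
    """Links an interview claim to documentary evidence that supports or contradicts it."""
    claim: str                      # subjective statement from an interview
    source: str                     # who made the claim (role, not name)
    supporting_docs: List[str] = field(default_factory=list)
    contradicting_docs: List[str] = field(default_factory=list)

    def status(self) -> str:
        """Summarize whether the claim is corroborated, contested, or uncorroborated."""
        if self.supporting_docs and not self.contradicting_docs:
            return "corroborated"
        if self.contradicting_docs:
            return "contested"
        return "uncorroborated"

# Hypothetical example:
link = EvidenceLink(
    claim="The CI briefing was discussed before the go/no-go decision",
    source="internal CI client",
    supporting_docs=["Meeting minutes, 2014-03-12", "CI unit usage report, Q1 2014"],
)
print(link.status())  # "corroborated"
```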

A list of documents to be requested as a starting point for document research at an organization such as Google or the Coca-Cola Company can be found in Table 1:

- Strategic plan(s) for the past five years
- Shareholders' annual reports for the past five years
- Stock quotes for the past five years
- Chairman/CEO statements for the past five years
- Press releases for the past five years, particularly those relating to the traceable decisions and the three indicators
- Industry reports published by a third party
- Press releases and other documentation published by competitors and industry observers relating to the products and services affected
- Strategic (business) plans for the past five years
- Training documents relating to CI use, processes, and products
- CI deliverables used to inform the traceable decisions
- Emails related to the decision, the CI used, or evaluations of the decision or its results after the fact
- Other documentation, such as sales reports, related to identified outputs/outcomes of decisions

Table 1: Preliminary list of documents for collection at the case study site

Once data collection is complete, decision objectives and outcomes will be compared against the organization's strategic plans from both the time the decision was made and the time the research is undertaken, in order to examine how well the outcomes align with strategic goals and to assess impact. If, for example, a given decision results in the innovative outcome of a new product, the researcher can examine how that product has been used or valued since being introduced to the market and whether it has fulfilled any requirement of the current or past strategic plan.
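This comparison of outcomes against strategic goals could be recorded in a simple structure such as the sketch below. The goal texts, judgement labels, and evidence entries are hypothetical placeholders, since the actual strategic plans will only be known once site access is negotiated.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class StrategicGoal:
    goal_id: str
    text: str
    plan_year: int        # which strategic plan the goal comes from

@dataclass
class OutcomeAlignment:
    """Records whether a decision outcome aligns with, diverges from, or is unrelated to a goal."""
    outcome: str
    goal: Optional[StrategicGoal]
    judgement: str        # "fulfils", "diverges", or "unrelated", assigned by the researcher
    evidence: List[str]   # documents or interview excerpts supporting the judgement

# Hypothetical example:
goal = StrategicGoal("G2", "Grow revenue from new products to 20% of sales", plan_year=2010)
alignment = OutcomeAlignment(
    outcome="New product launched following the traceable decision",
    goal=goal,
    judgement="fulfils",
    evidence=["Annual report 2012, product revenue breakdown"],
)
```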

The data collection and preliminary data analysis have been broken into stages, summarized in Table 2 below. These will be followed by a final data analysis stage, in which recommendations for CI measurement found in the literature will be examined and tested.

Research question 1: How, when, and by whom is CI used as an input into organizational decision-making?
Steps: Understand how CI is used as an input into decision-making at the organization; understand the organization's strategic decision-making process.
Methods: Interviews with CI unit employees, internal clients, and decision makers; review of training documents; interviews with decision makers and managers.
Deliverable: An account of CI practices within the case study organization, for comparison to normative accounts of CI and decision-making practice in the literature.

Research question 2: When CI is used, what are the organizational outcomes in critical categories such as finances, client relationships, and innovation?
Steps: Describe foundational factors, implementation, and strategic/tactical objectives of decisions; identify decision outputs and outcomes; source assessments of the decision outcomes and value; identify where (if) outputs and outcomes align with the strategic plan.
Methods: Interviews with decision-makers and managers involved with the strategic decisions; document analysis; chains of evidence tables.
Deliverable: (1) Model(s) of retrospective decision analysis with CI as an input; (2) chains of evidence tables showing the relationship(s), if any, between inputs and outcomes of decision-making.

Research question 3: In light of organizational constraints, which measurement methods identified in the literature are most appropriate for use in determining CI outcome and impact?
Steps: Test prescribed measurement methods and tools with the data collected from the case site.
Methods: Data analysis.
Deliverable: Recommendations as to the best measurement methods, tools, and approaches.

Table 2: Research questions connected to data collected

Once data has been collected, the researcher will undertake preliminary data analysis as described in the section on data collection. Data will be analysed to discover what role CI had in each traceable decision, identify outcomes, relate outcomes to the strategic plan, consider the value placed on CI and its role in the organization, and identify attitudes and practices in CI measurement. Other decisions will be analysed to determine the role, or lack thereof, of CI within departments and activities of the organization.

5. Significance

The value of this proposed case study research is that it examines CI's role in advising and influencing a decision, which field research assessing CI value has not yet done (see Blenkhorn & Fleisher, 2007; Lönnqvist & Pirttimäki, 2006; Marin & Poulter, 2004), thereby filling a gap in the literature and informing future research methodology into CI value and practices. Another outcome will be the testing of prescriptive measurement methods and tools from the CI literature, which will inform not only CI measurement research but also practice. Most particularly, valuable data and insight will be gained into how the outcomes of intelligence can be identified and assessed objectively. CI practitioners have reported that measurement is a priority area of development for the field (Hannula & Pirttimaki, 2003; Qingjiu & Prescott, 2000). These data will be applied to developing measurement tools and methods for practical use.

6. Conclusion

Although correlations have been found between CI and positive organizational performance (e.g., Adidam, Banerjee, & Shukla, 2012), a cause-and-effect relationship has not yet been identified and described in relation to specific outcomes, which might then serve as indicators of performance. As a result, scholars have called for field research investigating this relationship. At the same time, recognizing that the performance measures currently in practice are of questionable value, multiple calls have been made for research into how CI performance can be measured, taking into account that the primary value of CI lies in how well it informs decision-making (Sawka, in Blenkhorn & Fleisher, 2007; Lönnqvist & Pirttimäki, 2006; Marin & Poulter, 2004).

CI scholars have argued that the primary purpose of intelligence is to inform decision-making, with the intent of increasing the likelihood of optimal outcomes (Bose, 2008). CI units can be described as a specialized type of information service, with CI deliverables being a specialized type of information product, used with the intent of developing more effective decision-making. Understanding the impact of these intelligence services and products on decision-making, and through decision-making on organizational outcomes, has implications for performance measurement to demonstrate the value of information services in LIS and other fields, for the improvement of organizational decision-making processes, and for understanding the role of competitive intelligence services within organizations.