POS Data Quality: Overcoming a Lingering Retail Nightmare

Cognizant 20-20 Insights | January 2014

By embracing a holistic and repeatable framework, retailers can first pilot and then remediate data quality issues incrementally, overcoming the planning challenges created by heterogeneous POS systems.

Executive Summary

At many retail organizations, flawless business execution depends on in-store and back-office systems working together in harmony. Multiple and diverse point-of-sale (POS) systems pump data to major internal applications that drive key processes such as merchandise assortment planning, item allocation, replenishment and key performance indicator (KPI) reporting. In reality, however, significant POS data quality issues and inconsistencies can and often do disrupt business operations.

This white paper details a remediation approach that helps retailers validate and enhance POS data quality. The approach enables access to accurate, on-time data feeds, thus improving strategic decision-making throughout the business.

Issues with Diverse POS Systems

It's not uncommon for retailers to use a mix of POS systems that are as varied as their product portfolios: old and new, open source and proprietary, cloud and on-premises.1 This is often due to the timing of POS system retirement and replacement: new systems are usually deployed at a location or country level, leaving other units within a franchise group or region running incompatible POS systems retained after a merger or for another business reason. Whatever the rationale, retail chains of all sizes often operate a medley of POS systems.

These diverse, heterogeneous IT environments can deliver reasonable business value, but they are not suitable for every organization, especially when POS data is needed by consuming applications in real time. The problem is compounded by differing POS data formats and the absence of proper audit/validation controls. Both issues need to be rationalized before the data reaches the consuming applications, because they can adversely affect weekly allocation and replenishment, decision support, reporting accuracy, planning and various operational processes.

Major issues that can impact POS data include:

Missing data: Data/information is not available in time for use in business processes.

Duplicate data: The same data/information is received more than once.

Suspicious or incorrect data: Data is present but incomplete, erroneous or unreliable.

Delayed data: Data is received beyond the optimal timeframe.
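To make these four categories concrete, the minimal sketch below shows how a data quality validation step might bucket incoming transaction records. It assumes a simplified, hypothetical feed layout: field names such as store_id, ticket_id, business_date, net_amount and received_at are illustrative (with the date fields assumed to be datetime values), not part of the framework described in this paper.

```python
from datetime import timedelta

def classify_pos_records(records, expected_store_dates, sla_hours=24):
    """Bucket POS transaction records into the four issue categories above:
    missing, duplicate, suspicious/incorrect and delayed.

    Illustrative only: field names and the SLA threshold are assumptions.
    """
    seen_keys = set()
    duplicates, suspicious, delayed = [], [], []

    for rec in records:
        key = (rec["store_id"], rec["ticket_id"], rec["business_date"])

        # Duplicate data: the same transaction is received more than once.
        if key in seen_keys:
            duplicates.append(rec)
            continue
        seen_keys.add(key)

        # Suspicious or incorrect data: present but incomplete or implausible.
        if rec.get("net_amount") is None or rec["net_amount"] < 0:
            suspicious.append(rec)

        # Delayed data: received beyond the agreed timeframe.
        if rec["received_at"] - rec["business_date"] > timedelta(hours=sla_hours):
            delayed.append(rec)

    # Missing data: expected store/date combinations with no records at all.
    received_store_dates = {(store, date) for store, _, date in seen_keys}
    missing = expected_store_dates - received_store_dates

    return {
        "missing": missing,
        "duplicate": duplicates,
        "suspicious": suspicious,
        "delayed": delayed,
    }
```

In practice, each bucket would feed the variance reports and remediation steps described in the framework below.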

As an end-state objective, POS data quality must be improved to enable retailers to reach several essential goals, including enhanced planning, forecasting and reporting; improved efficiency through faster data reconciliation; and the ability to read and respond to changes in business performance.

Remediating POS, One Step at a Time

While a long-term solution would be a centralized data clearinghouse, such an implementation would not address the immediate needs of the business. To mitigate this problem, an interim solution can be outlined that strengthens data quality for partial store populations, one population at a time, starting with the most problematic ones. The steps involved in this solution include the following:

Pilot before deployment: For starters, retailers need to take stock of their data quality validation (DQV) process and uncover key issues (see the Quick Take sidebar). DQV processes ensure that programs are operating with clean, correct and useful data. Such a pilot should be carried out with two objectives in mind:

Demonstrate the data quality validation process via a proof of concept on a selected population of stores. Identify and catalog data quality issues, establish root causes, develop possible corrective actions (including the conversion/upgrade of low-performing franchisee stores) and, when appropriate, implement the corrective actions.

Based on the lessons learned from the work above, design, build and document a repeatable framework (tools and processes) aimed at identifying, analyzing and fixing the issues, as well as improving POS data quality for additional or subsequent populations of stores and data feeds. This framework can then be used to carry out similar DQV exercises across other populations or geographies.

Build a repeatable framework: The process of designing and building the repeatable framework to address the immediate needs of the business consists of a series of steps performed by multiple, collaborating stakeholders, such as IT, key business partners and source system data providers. This flow of activities is depicted in Figure 1.

Figure 1: Orchestrating the POS Partner Ecosystem. The figure maps the framework activities (A. Infrastructure requirements; B. Scope definition; C. Interface documentation; D. Requirements gathering for the consuming application; E. Documentation of known POS data issues; F. Use and maintenance of database clearinghouse; G. Collection of POS reports; H. Collection of live data feeds; I. Development and maintenance of variance reports; J. Data fixes and remediation) to the stakeholders involved: source system resources, business partners and IT resources (all steps).

Infrastructure requirements: Capture the environment and access requirements needed prior to the start of the POS data quality validation exercise. Include infrastructure requirements for production platforms, as well as those specific to the DQV platform (e.g., data capture file folders, mini-clearinghouse database, ad hoc programs, report requirements, etc.). Having this completed before starting the DQV exercise ensures hassle-free execution, with the project team focused on actual DQV activities rather than infrastructure problems.

Scope definition: Identify the key elements that need to be defined as in- or out-of-scope for the next POS DQV exercise, such as a list of stores (categorized by affiliate, business model, POS system, etc.), a list of interfaces/data feeds, variance reports, etc. Because a scope definition explains which elements are in or out of scope and why, this activity helps to precisely define the target elements of a particular execution of the DQV exercise/process.

Interface documentation: Describe the key information that needs to be collected, documented and verified to ensure full understanding and usability of the overall POS feeds integration architecture. Interface documentation is a crucial factor in the success of the exercise, as it provides a strong understanding of the data transmitted from the POS systems via the interfaces and helps determine whether adequate information is available to proceed with the DQV exercise. In the case of a data type or data definition mismatch, the interface documentation can expose the failure points in the data channels if they pertain to mapping issues between the various integration points.

Requirements gathering for consuming applications: Capture the requirements that are specific to each consuming application (such as replenishment, allocation and merchandising applications) regarding POS data feeds. This ensures that the validation takes into account and analyzes all critical data elements present in the POS feeds. This step is critical to gaining insight into what the target applications expect the interface data feeds to contain and why. It also provides knowledge of application-specific data processing, filtering and logic for the affiliate POS interface in one or more of the consuming applications.

Documentation of known POS data issues: Catalog historic and live POS data issues and analyze how they can be used in a POS DQV exercise, including the accretion of documented DQV issues from one iteration to the next. This activity gathers all available information on known issues and builds an understanding of the scale and scope of data quality issues, their frequency of occurrence and the most problematic interfaces for the affiliate in question. It also helps the project team in the data analysis exercise down the line.

Use and maintenance of a database clearinghouse: Build the custom mini-clearinghouse database, as well as the copy and load processes for POS reports and live data feeds. These processes will be reused, configured or updated to feed the POS data into the mini-clearinghouse database. This phase involves creating data structures, uploading captured data and performing other supporting processes related to the mini-clearinghouse database (a minimal sketch of such a clearinghouse and its variance queries follows this step list).

Collection of POS reports: This is one of the data collection activities. In this step, the various POS reports are reviewed, captured and pre-processed to make sure they are ready for the mini-clearinghouse.

Collection of live data feeds: This is another data collection step, which includes review, validation, capture and pre-processing of the data feeds inbound to and outbound from each integration point, preparing them for processing in the mini-clearinghouse. This phase deals with the capture of in-scope data feeds as they are made available to the various in-scope integration points.
Development and maintenance of variance reports: Identify and develop the variance reports that capture data exceptions and variances. A follow-up root cause analysis exercise helps identify the possible reasons for the exceptions. This phase explains which activities need to be performed so that the data loaded in the mini-clearinghouse can be analyzed to understand the data quality issues.

Data fixes and recommendations: Review, assess, escalate, resolve and identify recommendations and actions. Follow-up issues and data fixes depend heavily on the local POS and system integrations. The objective of this phase is to build a prioritization, retention/rejection, remediation and implementation process to be followed for identified exceptions.
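As a concrete illustration of the mini-clearinghouse and variance-report steps above, the sketch below loads captured feed records into a small SQLite database and runs a simple variance query that compares what a store was expected to send against what was actually received. It is a hypothetical sketch only: the table layout, column names and variance rule are assumptions, not the actual framework deliverables.

```python
import sqlite3

def build_clearinghouse(records, expected_counts):
    """Create a throwaway mini-clearinghouse and load captured POS feed records.

    Hypothetical sketch: 'records' are dicts with store_id, ticket_id,
    business_date and net_amount; 'expected_counts' maps (store_id,
    business_date) to the transaction count reported by the POS itself.
    """
    db = sqlite3.connect(":memory:")
    db.execute("""CREATE TABLE pos_feed (
                      store_id TEXT, ticket_id TEXT,
                      business_date TEXT, net_amount REAL)""")
    db.execute("""CREATE TABLE pos_report (
                      store_id TEXT, business_date TEXT, expected_count INTEGER)""")
    db.executemany(
        "INSERT INTO pos_feed VALUES (:store_id, :ticket_id, :business_date, :net_amount)",
        records)
    db.executemany(
        "INSERT INTO pos_report VALUES (?, ?, ?)",
        [(store, date, count) for (store, date), count in expected_counts.items()])
    return db

def variance_report(db):
    """Flag store/date combinations where the feed disagrees with the POS report."""
    query = """
        SELECT r.store_id, r.business_date, r.expected_count,
               COUNT(f.ticket_id) AS received_count,
               COUNT(f.ticket_id) - COUNT(DISTINCT f.ticket_id) AS duplicate_count
        FROM pos_report r
        LEFT JOIN pos_feed f
               ON f.store_id = r.store_id AND f.business_date = r.business_date
        GROUP BY r.store_id, r.business_date, r.expected_count
        HAVING COUNT(f.ticket_id) <> r.expected_count
            OR COUNT(f.ticket_id) <> COUNT(DISTINCT f.ticket_id)
    """
    return db.execute(query).fetchall()
```

In a real engagement the clearinghouse would be a persistent database, and the variance rules would be driven by the interface documentation and consuming-application requirements gathered in the earlier steps.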

Implement the framework for all stores: This is a recurring step in which the retailer implements the repeatable framework to resolve data quality issues for all stores. It is very important to acknowledge the differences among dissimilar groups of stores and to adapt or customize the framework before using it for this exercise. In addition, lessons learned from past implementations should be used to improve the framework for speedier and more efficient roll-outs.

Drivers for Success

Key success criteria that should be examined to ensure a successful DQV exercise include:

Data issue fixes and remediation: Since the objective of this exercise is to resolve possible data quality issues, the main success criterion should stipulate that issues be fixed upon identification.

Issue identification: Ensure the tool built as part of the repeatable process framework is capable of identifying issues as and when they occur.

Issue documentation: To better understand issues for their fix and resolution, the tool should be able to document the issues completely and in a timely manner.

Communication: The tool's ability to communicate issues to the team, along with a proper root cause analysis, in near-real time influences timely and effective issue resolution.

Degree of repeatability: The DQV exercise should not be limited to only one geography or one set of stores. The tool/framework must, therefore, be easily replicated to a new geography or set of stores under consideration for data quality validation. Hence, the ability to implement a repeatable process framework in a reasonable timeline for any new geography or set of stores is another important criterion for success.

Data mining: An immense amount of data will be loaded into the mini-clearinghouse database during the sampling window, so the tool should be able to mine the resulting data.

Quick Take: POS Data Validation for a Global Apparel Retailer

We worked with a retailer that was struggling with data quality issues across the 40-plus POS systems that sent data to its internal applications. The issue originated in its European region, where 22 affiliates use five different POS interfaces and eight consuming systems, adding to the data quality complexity. We partnered with the retailer's IT and business teams to carry out a DQV exercise for its Eastern Europe unit and helped build a repeatable framework to be followed in other regions.

The 15-week engagement addressed the DQV needs of 40-plus stores, with data flowing through four different POS systems. The exercise was conducted in two tracks: the first was to build the repeatable process/framework; the second was to test the framework with data from different sets of stores. Figure 2 offers a snapshot of the activities performed to conduct the POS data quality analysis exercise.

Roughly 2,500 issues were identified and fixed through this pilot POS data validation exercise. The client is in the process of rolling out the repeatable process framework to other countries and affiliates in Europe.

Figure 2: Activities Performed During the DQV Exercise. The figure charts the engagement's activities over an 18-week timeline (infrastructure requirements; scope definition; interface documentation; requirements gathering for consuming applications; documentation of known EPOS data issues; use and maintenance of the DB clearinghouse; collection of EPOS reports; collection of live EPOS data feeds; execution and development of variance reports; analysis of variances; data fixes and remediation), grouped into phases: knowledge acquisition, preparation, construction and development, documentation, data load, data capture over the data sampling window, execution and pre-analysis, and root cause analysis of variances.

Looking Forward

POS data utilization is fast becoming a necessity for retailers seeking to maintain their competitive advantage. POS data provides precious insight into consumer behavior and, if properly applied, can help improve customer relationships and maximize internal business efficiencies. Moreover, once normalized, POS data serves as a vital input to various operational planning engines, such as replenishment and allocation.

Such outcomes make a DQV exercise critical to any retailer's short- and long-term future. DQV ensures that the data extracted from diverse POS systems is correct, reliable and actionable. Just as important, this exercise ensures that the right data reaches the consuming applications, enabling better and more accurate decision-making and planning.

Footnotes

1 Shannon Arnold, "The Pros and Cons of Owning Diverse POS Systems," Maitre'D, March 27, 2013, http://web.maitredpos.com/bid/279593/the-pros-and-cons-of-owning-diverse-pos-systems.

About the Authors

Arindam Chakraborty is a Senior Consultant in Cognizant Business Consulting's Retail Practice. He has eight years of experience in merchandising, supply chain and store operations, with specialization in point of sale. Arindam has CPIM and CSCP accreditation and holds an M.B.A. from the Institute of Management & Technology, Ghaziabad, India. He can be reached at Arindam.Chakraborty5@cognizant.com.

Shilpi Varshney is a Consultant in Cognizant Business Consulting's Retail Practice. She has more than five years of experience in store operations, focused on multichannel and supply chain strategy. Shilpi holds an M.B.A. from Management Development Institute, Gurgaon, India, where she earned the Gold Medal. She can be reached at Shilpi.Varshney@cognizant.com.

Doug Dennison is a Senior Manager in Cognizant Business Consulting's Retail Practice. He has more than 20 years of experience in store operations, store systems, loss prevention and workforce management. Doug holds an M.B.A. from Grand Valley State University in Grand Rapids, Michigan. He can be reached at Douglas.Dennison@cognizant.com.

About Cognizant

Cognizant (NASDAQ: CTSH) is a leading provider of information technology, consulting, and business process outsourcing services, dedicated to helping the world's leading companies build stronger businesses. Headquartered in Teaneck, New Jersey (U.S.), Cognizant combines a passion for client satisfaction, technology innovation, deep industry and business process expertise, and a global, collaborative workforce that embodies the future of work. With over 50 delivery centers worldwide and approximately 166,400 employees as of September 30, 2013, Cognizant is a member of the NASDAQ-100, the S&P 500, the Forbes Global 2000, and the Fortune 500 and is ranked among the top performing and fastest growing companies in the world. Visit us online at www.cognizant.com or follow us on Twitter: Cognizant.

World Headquarters
500 Frank W. Burr Blvd.
Teaneck, NJ 07666 USA
Phone: +1 201 801 0233
Fax: +1 201 801 0243
Toll Free: +1 888 937 3277
Email: inquiry@cognizant.com

European Headquarters
1 Kingdom Street
Paddington Central
London W2 6BD
Phone: +44 (0) 20 7297 7600
Fax: +44 (0) 20 7121 0102
Email: infouk@cognizant.com

India Operations Headquarters
#5/535, Old Mahabalipuram Road
Okkiyam Pettai, Thoraipakkam
Chennai, 600 096 India
Phone: +91 (0) 44 4209 6000
Fax: +91 (0) 44 4209 6060
Email: inquiryindia@cognizant.com

Copyright 2014, Cognizant. All rights reserved. No part of this document may be reproduced, stored in a retrieval system, transmitted in any form or by any means, electronic, mechanical, photocopying, recording, or otherwise, without the express written permission from Cognizant. The information contained herein is subject to change without notice. All other trademarks mentioned herein are the property of their respective owners.