A Road Map to Successful Data Masking

Secure your production data

By Huw Price, Managing Director

Introduction

How valuable is data masking to your organisation? According to the Ponemon Institute's 2013 Cost of Data Breach Study, the average data breach in 2012 cost $5.4m ($194 per record) in the US, and $3.1m/approx. £2m ($214/approx. £140 per record) in the UK [1]. In more heavily regulated industries, the risk is far greater: the global averages for healthcare ($233 per record) and finance ($215) far exceed the mean cost of a data breach ($136) [2]. Factor in the cost of customer defections and resultant falls in share price, and it becomes clear that it has never been more important for organisations to de-identify sensitive data.

The case for data masking (known variously as data obfuscation, data de-identification and data desensitisation) becomes even more compelling when you consider that 63% of data breaches are the result of internal causes, such as human error or business/IT process failures [3]. Therefore, simply by securing the data before it is outsourced to third parties and off-site teams, or made available for development and testing, you can mitigate the risk of exposing sensitive content by two-thirds. Presented in these terms, data masking is no longer a nice-to-have, but an essential business process.

Why do you need a Road Map?

Put simply, data masking is not the simple process that the uninitiated might suppose [4]. Gone are the days when replacing personally identifiable information (PII) with random characters will suffice. As you are obfuscating the data for use in development, testing and QA environments, you need to be able to quickly provide teams with secure sets of consistent, meaningful data that can be used again and again. However, this can be difficult to achieve, particularly in geographically dispersed organisations, without adopting a systematic, centralised approach to de-identifying sensitive data.

For starters, not all data is created equal.
In his paper on The Mathematics of Data Masking, Llyr Jones goes into greater depth on the four orders of data masking, which include sensitive commercial trends and transactional data, as well as PII. Whilst obfuscating the latter is usually enough to satisfy the regulators, internal policies may require that pricing rules or trends in stock prices, for example, are desensitised in order to mitigate the risk of leaking them to competitors. Establishing a centralised approach enables organisations to control what data they desensitise, and how this should be achieved.

Modern organisations are also faced with organisational perils, such as outsourcing to third-party vendors, Big Data and migrating data to the Cloud, which exacerbate the risk of data breach and necessitate a more systematic approach to securing your sensitive data. For example, under the proposed reforms to the EU Data Protection Directive, any company which is active in the EU, or is serving customers in the EU, will fall under the jurisdiction of both local and European data protection laws. This creates another potential minefield that can't be navigated without central guidelines.

[1] Ponemon Institute, 2013 Cost of Data Breach Study: Global Analysis, p.5
[2] Ibid., p.6
[3] Ibid., p.7
[4] Howard, P., Data Masking: A Spotlight Paper, Bloor Research, Oct 2012

It should, therefore, come as no surprise that organisations which appointed a CISO (Chief Information Security Officer) to manage a centralised, systematic approach to database security were able to further reduce the cost and risk of data breach [5]. With the potential cost of non-compliance and project failure so high, modern IT environments are simply too complex to secure properly in an ad hoc fashion.

This paper illustrates how entering into your data masking project with a clear, systematic road map enables you to better plan how much time and resource you require to understand:

- What data needs to be masked
- Where the sensitive data is located within your IT infrastructure
- How you need to desensitise the data to maintain compliance with data protection standards

Discovering Your Sensitive Content

The first stage of any data obfuscation project is to understand what data you need to mask and where it is located. The former is usually determined by data protection legislation (HIPAA, PCI DSS, and the EU Data Protection Directive, for example) or by internal database security policies and considerations. However, as Philip Howard suggests, manually locating all of the potentially sensitive records in large, complex, modern IT organisations, faced with the challenge of processing big data stored in various formats across multiple, disparate data sources, is "wholly inappropriate" [6].

To begin with, manual data discovery is expensive, resource-heavy and error-prone. After all, can an individual really be expected to find all of the potentially sensitive information in a database containing hundreds of tables, even with up-to-date documentation? They can't, and herein lies a fatal flaw in relying on manual techniques.

Other pitfalls also await the organisation which continues to sample its data manually, the most pervasive being data quality. Take, for example, a debit card number in the form nnnn nnnn nnnn nnnn. Have you considered whether or not the database supports spaces? Can you guarantee that every single entry has been entered in that format, and not with dashes as separators? In that case, is it actually a debit card number, or just a 16-digit number? Ascertaining this requires a lot of subjective supposition, which can lead to false positives being passed and, more disconcertingly, to PII being overlooked.

Automated data discovery ensures an objective, systematic approach to your data sampling, making it possible to verify that all of the required sensitive content has been identified. Powerful, mathematically-based algorithms also allow you to identify potentially sensitive trends and relationships within the data, then filter them out. Knowing the location of, and the trends and relationships within, your data is essential to performing consistent masking runs which go beyond securing PII; a task that is impossible to complete manually on applications which touch upon multiple data sources and types.

[5] Ponemon Institute, 2013 Cost of Data Breach Study: Global Analysis, p.9
[6] Howard, P., Data Masking: A Spotlight Paper, Bloor Research, Oct 2012
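The debit card pitfall described above can be sketched in a few lines. This is a minimal, illustrative Python example, not any particular discovery product: it matches 16-digit candidates regardless of whether spaces or dashes are used as separators, then applies the Luhn check digit to filter out random 16-digit numbers. The sample values are synthetic, Luhn-testable numbers, not real card data.

```python
import re

# Candidate 16-digit sequences, tolerating spaces or dashes as separators.
CARD_PATTERN = re.compile(r"\b(?:\d{4}[ -]?){3}\d{4}\b")

def luhn_valid(number: str) -> bool:
    """Luhn check digit validation: filters out arbitrary 16-digit
    numbers that cannot be plausible card numbers."""
    digits = [int(d) for d in number if d.isdigit()]
    checksum = 0
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:          # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        checksum += d
    return checksum % 10 == 0

def find_candidate_cards(text: str) -> list[str]:
    """Return matches that both look like card numbers and pass Luhn."""
    return [m for m in CARD_PATTERN.findall(text) if luhn_valid(m)]

sample = "order ref 1234 5678 9012 3456 shipped; card 4539-1488-0343-6467 on file"
print(find_candidate_cards(sample))  # → ['4539-1488-0343-6467']
```

Note that both strings match the naive pattern; only the check digit separates the plausible card number from the order reference, which is exactly the kind of judgement a manual sampler would have to make subjectively for every hit.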

Creating a Process for Auditing

Maintaining full compliance with current data protection standards is something of a moving target. However, organisations are expected to keep pace with regular alterations to regulations, as well as to manage dynamic changes within their own IT infrastructures. Therefore, to demonstrate best efforts, you need to be able to show that you have implemented systematic measures towards compliance. We suggest a three-tier structure, in which masking operations are checked, validated and approved, but there are numerous ways of achieving this. The key is to demonstrate checks and balances.

Introducing rigorous, centralised auditing also allows you to report on the details of the mask: who performed it, how, when, what technology was used, and so on. This enables you to track the process from start to finish in audit reports; reports which data protection regulations increasingly demand can be produced upon request. Thorough reporting also provides before-and-after comparisons of your data source, enabling you to check that all of the sensitive data has been masked.

Improving your Masking Infrastructure

Once you have established what sensitive data is going to be masked and where it is located, you need to consider how you are going to go about it. This involves a number of considerations, the first of which should be performance. Data masking is a quick-win solution for preparing your sensitive data for use in non-production. Therefore, your approach needs to be flexible, fast and easy to use. In the modern market, this means adopting an automated data masking solution; manual approaches are slow, costly and resource-heavy, whilst in-house utilities can be difficult to maintain, with user knowledge often limited to a handful of personnel and lacking good supporting documentation.
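The who/how/when audit detail described under "Creating a Process for Auditing" amounts to one structured record per masking run. The following is a minimal Python sketch of such a record; every field name and value here is illustrative, not a prescribed schema:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class MaskingAuditRecord:
    """One entry in a centralised masking audit trail: who performed
    the mask, on what, how, with which tool, and who signed it off."""
    operator: str        # who performed the masking run
    data_source: str     # which database/table was masked
    technique: str       # e.g. "seed-table substitution"
    tool: str            # technology used for the run
    approved_by: str     # final tier of the check/validate/approve structure
    started_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))

    def summary(self) -> str:
        """One line suitable for an on-request audit report."""
        return (f"{self.started_at:%Y-%m-%d %H:%M} UTC: {self.operator} masked "
                f"{self.data_source} via {self.technique} ({self.tool}); "
                f"approved by {self.approved_by}")

record = MaskingAuditRecord("j.smith", "crm.customers",
                            "seed-table substitution",
                            "automated masking tool", "a.jones")
print(record.summary())
```

Emitting one such record per run, alongside before-and-after checksums of the data source, is enough to reconstruct the start-to-finish trail that regulators increasingly expect on request.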
Valuable automated data masking solutions should be optimised to use native database utilities for masking, particularly on Mainframe and non-Windows platforms. Removing the need to extract the data before treating it ensures the highest possible performance when executing your masking run. This is particularly important on Mainframe platforms, where having to extract, mask and reload the data is expensive, slow and uses significant amounts of CPU time, which can be difficult to secure.

For high-quality, efficient development and testing, you also need to make sure that the de-identified data has the look and feel of production, but without the sensitive content. In the past, it has been common to merely encrypt sensitive records, or to replace them with random characters. However, this does not make for effective testing. For example, many social security numbers have check digits which define them as such. Without these, or, say, the easy readability that marks something out as a name, the data is unintelligible and cannot be re-used across different teams for development or testing.

Although it requires a little work upfront, the answer here is to build and use seed tables, which contain lists of realistic values, or to use automated masking rules which maintain the format of the data. There are a number of benefits to this. The first is that you can replace sensitive content with the realistic, randomly generated values needed for meaningful testing. Secondly, masking your data according to centralised policies allows you to ensure that the data is masked consistently across the enterprise, maintaining all of the business rules and referential integrity inherent within your data. This provides considerable value to modern organisations, particularly when outsourcing to third parties, enabling you to share and re-use the data across multiple teams, projects and environments.
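The seed-table approach can be sketched briefly. This is a minimal Python illustration under stated assumptions, not any vendor's implementation: the seed table here is a tiny hypothetical list, and the keyed hash (HMAC) is one common way to make the mapping deterministic, so the same production value always masks to the same realistic replacement wherever it appears, which is what preserves referential integrity across tables and teams.

```python
import hashlib
import hmac

# Hypothetical seed table of realistic replacement values; a real one
# would hold thousands of entries per data class (names, cities, ...).
SEED_NAMES = ["Alice Morgan", "Rhys Davies", "Priya Shah", "Tomas Novak"]

# Assumption: a per-project secret key, rotated and held centrally.
SECRET_KEY = b"rotate-me-per-project"

def mask_name(original: str) -> str:
    """Deterministically map an original value onto a seed-table entry.
    The same input always yields the same replacement, keeping masked
    data consistent across repeated runs and related tables, while the
    keyed hash cannot be reversed without the key."""
    digest = hmac.new(SECRET_KEY, original.encode("utf-8"),
                      hashlib.sha256).digest()
    index = int.from_bytes(digest[:8], "big") % len(SEED_NAMES)
    return SEED_NAMES[index]

# The same customer name masks identically wherever it appears:
assert mask_name("John Smith") == mask_name("John Smith")
print(mask_name("John Smith"), "|", mask_name("Jane Doe"))
```

The same pattern extends to formatted fields (card numbers, social security numbers) by generating format-preserving replacements and recomputing any check digits, so that the masked data still "looks" like production to downstream validation logic.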

Summary

Modern IT organisations are large, complex and disparately located. They are also required to respond to the needs of the business more quickly than ever before. This means providing development, testing and QA teams with the realistic, consistent, secure data they need to shift left in the Software Development Lifecycle (SDLC). However, any test data provisioning exercise needs to consider the requirements of data protection legislation, and of internal policies for securing sensitive commercial data.

Adopting a structured, systematic approach to data masking allows you to respond to the needs of the business, whilst also ensuring best efforts in meeting compliance with data protection standards. This enables you to mitigate the risk behind at least two-thirds of data breaches, whilst allowing you to accurately scope, and minimise, the cost and effort required to secure your sensitive content, providing a powerful business case for adopting the best practices expected by regulators.

Visit our website: www.grid-tools.com/fastdatamasker
Call us: USA: 1-866-519-3751 / UK: +44 (0) 1865 884600
Or email us at: sales@grid-tools.com