Providing Secure Representative Data Sets
Test Data Protection: Providing Secure Representative Data Sets
By Dr. Ron Indeck

VelociData Inc.
World Headquarters: 321 North Clark Street, Suite 740, Chicago, IL. Telephone:
Research & Development: 349 Marshall Ave, Suite 302, St. Louis, MO. Telephone:
Contents
1. VelociData Enterprise Streaming Compute Appliance (ESCA)
2. Test Data
3. Creating a Representative Model Copy
   3.1 Challenges in Creating the Model Copy
4. VelociData TDP
5. Format Preserving Masking
6. Deterministic Masking
7. TDP Use Cases
   7.1 Use Case 1: Creating a secure, HIPAA-compliant full production dataset from Microsoft SQL Server
   7.2 Use Case 2: Secure data for insertion into an Azure cloud
   7.3 Use Case 3: Securing test data for off-shore developers
   7.4 Use Case 4: Creating daily datasets for Development, QA, and Test Integration
8. Summary
9. Let Us Help You

All content © 2015 VelociData Inc.
1. VelociData Enterprise Streaming Compute Appliance (ESCA)

The VelociData Enterprise Streaming Compute Appliance (ESCA) is the result of over two decades of development and the deployment of hundreds of systems in the most demanding IT environments. The system comprises a unique combination of components dedicated to high-performance processing of streaming and serial information.

Figure 1: The first enterprise streaming compute appliance, performing streaming data masking, encryption, transformation, and distribution between sources (cloud, production databases, streaming data ingestion, mainframe) and targets (enterprise data warehouse, application servers, cloud, Hadoop HDFS).

This white paper focuses on using ESCA to protect sensitive data when it is used for testing software applications. To do this, the data must be rendered unusable yet still retain their format (e.g., obfuscated telephone numbers will still be 10 ASCII digits), their volume (no subsetting is required), and their relations (fields will still join properly). These processes can be applied as the data move from source to target, and representative model copies can differ for different targets without slowing the data down.

2. Test Data

Development, testing, and quality assurance groups need access to data to build and test applications. For better, more rapid development, that data needs to look and feel like real production data. In many organizations, that look and feel is achieved by copying over production data directly.
This is acceptable for some data sets, but when the production environment holds PHI, PCI, or any other PII data, this exposes the company to unnecessary risk, including:

- Exposing sensitive data to a (drastically) broader set of users provides greater opportunity for breaches due to social engineering
- IT organizations need to manage and secure more user accounts, more data centers or network segments, and more copies of data at rest
As an alternative, organizations could offer anyone who doesn't truly need the production data access to a Model Copy that holds the key characteristics of the production data yet doesn't carry any true personally identifying information. To offer this effectively, it's important to differentiate between systems or users that need access to actual production data and those that can use a representative model copy:

Table 1: Example Data Needs

Production Data:
- Transactional Systems
- Billing Systems
- Fraud Detection Applications
- Reporting (user specific)

Characteristic Data (Model Copy):
- Analytics
- Application Development
- Testing / QA
- Reporting (general reports)
- Proof of Concept / Evaluation Projects

The key characteristics of model copies are that they must be representative in data character, distribution, and volume, and they must be fast and easy to generate. When model copies are generated quickly and easily, administrators can strictly limit access to raw production data while safely and easily providing representative data to a broad set of users. This provides several benefits, including:

- Less need for limiting user access, compensating controls, securing environments, etc.
- Less pressure to expose production data to different development groups (especially when the model copy very closely mirrors the production data)
- Faster, more productive development, QA, integration testing, etc.

3. Creating a Representative Model Copy

One of the best ways to generate a truly representative model copy is to perform a selective, deterministic, format-preserving masking operation on the raw production data to generate a derived output. This ensures that the test data very closely mirror production for many different purposes.
- Representative: The test data is derived table for table, row for row from the production data
- Selective: Any sensitive fields (e.g., PHI) within those tables are masked using a NIST-standard algorithm
- Deterministic: All identical input fields map to the same masked output value, so correlations and joins still match on the same keys
- Format-Preserving: Output records maintain the same data format (text, phone numbers, social security numbers, dates, etc.)

When all of these conditions are met, testing environments can use the same database schemas and testing algorithms, run the same processing operations, and observe the same volumes and capacities that will be observed in the production environment.
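Determinism is what keeps joins intact across a masked model copy. The sketch below illustrates the property with a toy digit-substitution scheme; the `mask_ssn` helper, the salt, and the sample tables are hypothetical stand-ins, not the NIST-standard format-preserving encryption a production appliance would use:

```python
import hashlib

SALT = b"demo-salt"  # illustration only; a real deployment needs managed keys

def mask_ssn(ssn: str) -> str:
    """Map a 9-digit SSN to another 9-digit string, deterministically:
    the same input always yields the same masked output."""
    out = []
    for i in range(len(ssn)):
        # Derive one pseudo-random digit per position from the salt,
        # the full input value, and the position index.
        digest = hashlib.sha256(SALT + ssn.encode() + bytes([i])).digest()
        out.append(str(digest[0] % 10))
    return "".join(out)

patients = {"123456789": "John"}        # table 1, keyed by SSN
claims = [("123456789", "claim-001")]   # table 2, referencing the same SSN

# After masking both tables independently, the join key still matches:
masked_patients = {mask_ssn(k): v for k, v in patients.items()}
masked_claims = [(mask_ssn(k), c) for k, c in claims]
assert masked_claims[0][0] in masked_patients                      # join resolves
assert len(masked_claims[0][0]) == 9 and masked_claims[0][0].isdigit()  # format kept
```

Because the masking function is pure, it can be applied to each table in isolation (even in separate runs) and the relations between tables survive.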
Figure 2: Test Data Protection

3.1 Challenges in Creating the Model Copy

Several shortcomings of the current solutions in the market make creating a true model copy in an effective manner challenging:

1. Formatting or schema changes: Many masking solutions require changes to the format of the data elements when encrypting or masking the data.
2. Lack of deterministic behavior: Many simple masking solutions perform pseudo-random operations on the data to mask it, breaking the ability to perform correlations, aggregations, etc.
3. Limited performance: Most software vendors that provide format-preserving encryption transform only a few hundred fields per second, which makes large data copies infeasible given typical time windows.
4. Lack of tool integration: Many masking solutions are not integrated into data movement / data transformation components, requiring users to create complex multi-product, multi-step jobs.
5. Hard-to-use interfaces: Most solutions require complicated tools to access masking functionality.
6. Discovery challenges: Identifying PHI / PII elements is often a time-consuming chore.
7. Insufficient throughput: The inability to perform daily refreshes or offer production-sized volumes for stress and performance testing often results in data sub-setting instead of full model copies.
4. VelociData TDP

VelociData offers a solution that performs format-preserving masking while facilitating the data movement and data transformations required to move data between production and test/development environments. This solution includes:

Table 2: VelociData TDP

- Format Preserving Masking (static and dynamic): Ability to de-identify data without changing its characteristics (permanent or reversible). Both static and dynamic operations are fully deterministic.
- Hashing (MD5, SHA-2): Combine multiple input fields into a hashed surrogate key that can be used for tokenization.
- Field Redaction: Ability to remove / clear sensitive data elements that are not required for the model copy.
- Data Transformation: Ability to connect to a wide variety of data sources and to transform data formats in between (e.g., mainframe EBCDIC to ASCII).
- Lookup / Replace: Ability to perform lookup-based replacements of sensitive terms with non-sensitive values.

5. Format Preserving Masking

VelociData offers a format-preserving masking or format-preserving encryption option that conforms to the NIST SP 800-38G standard. This solution can mask or encrypt data without changing the format of the fields: a credit card number stored as 16 ASCII numeric digits can be deterministically masked into 16 ASCII numeric digits, and a varchar name field in the database can be masked or encrypted into an equivalent number of alphabetic characters.
Figure 3: Example Masking

This format-preserving characteristic allows users to fully secure their data without needing to change the database schema of development or testing systems. Below are the field types currently supported or in development by VelociData:

Table 3: VelociData Masking Data Types

- name: All alphabetic characters and hyphens
- numeric: ASCII numeric digits: 0-9
- alphabetic: Upper- and lowercase characters: a-z and A-Z
- alphabetic_uppercase: All uppercase alphabetic characters: A-Z
- alphabetic_lowercase: All lowercase alphabetic characters: a-z
- alphanumeric: All alphabetic characters and base-10 digits: a-z, A-Z, 0-9
- alphanumeric_uppercase: All uppercase alphabetic characters and base-10 digits: A-Z and 0-9
- alphanumeric_lowercase: All lowercase alphabetic characters and base-10 digits: a-z and 0-9
- hex_uppercase: ASCII numeric digits 0-9 and letters A-F
- hex_lowercase: ASCII numeric digits 0-9 and letters a-f
- date: Dates in ASCII numbers, in the format YYYYMMDD
- printable: All printable ASCII characters
- everything: The full set of ASCII characters
- mailing_address: In development; ability to mask addresses into valid USPS mailing-address output
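A dictionary-driven masker of the kind Table 3 describes can be sketched as follows. The `DICTS` map and `mask_field` helper are illustrative only; a real implementation would use keyed, NIST-standard format-preserving encryption rather than a salted hash:

```python
import hashlib

# Character dictionaries mirroring a few of the masking data types above.
DICTS = {
    "numeric": "0123456789",
    "alphabetic_uppercase": "ABCDEFGHIJKLMNOPQRSTUVWXYZ",
    "hex_lowercase": "0123456789abcdef",
}

def mask_field(value: str, dict_name: str, salt: bytes = b"demo") -> str:
    """Replace each in-dictionary character with another character from
    the same dictionary. Characters outside the dictionary (hyphens,
    parentheses, spaces) pass through, so the field's layout survives."""
    alphabet = DICTS[dict_name]
    out = []
    for i, ch in enumerate(value):
        if ch not in alphabet:
            out.append(ch)  # leave separators untouched
            continue
        digest = hashlib.sha256(salt + value.encode() + bytes([i])).digest()
        out.append(alphabet[digest[0] % len(alphabet)])
    return "".join(out)

masked = mask_field("555-867-5309", "numeric")
assert len(masked) == 12 and masked.count("-") == 2   # layout preserved
assert all(c.isdigit() or c == "-" for c in masked)   # dictionary respected
```

Because the output character is always drawn from the same dictionary as the input, the masked value fits the original column definition and no schema change is needed downstream.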
Also note that VelociData's performance allows data to be masked or encrypted at 10 million fields per second, where competing solutions handle hundreds or thousands of fields per second. When many fields in each record are encrypted, this is the difference between trickling records through the system at dozens per second and moving data through at hundreds of thousands of records per second. When production data sets contain millions or billions of records, it can mean the difference between being forced to mask only a small subset of your data and masking the entire data set in a matter of minutes.

6. Deterministic Masking

The nature of the masking is critical in ensuring that data in the model copy are truly representative of your source data set. To clarify what that means, consider the diagram below:

Figure 4: Deterministic Masking

Notice in this case that John is masked to id Hw each time it is observed in the data, and that the patient's SSN is masked to the same output value every time, even across multiple different tables. This allows data sets to be joined and correlated even when the join keys are masked, and it is a strong feature to consider when choosing a masking solution.

Another feature of the VelociData system is the choice between one-way obfuscation and reversible processing. For most applications involving model copies for test environments, there is no need to ever reverse the process and recover the original information. In the rare circumstances where the original data need to be recovered, VelociData works with key management systems to enable reversible processing when required. These methods and modes can all be applied to data in flight passing through the network or to static data at rest headed for data stores, including data warehouses and HDFS.

Table 4: VelociData Data Masking Processing Types

- Redaction/removal: Removing original information in its entirety (no spaces or other characters left); in some instances a single character, e.g., *, may denote a point of redaction
- Scrambling/shuffling: No fixed algorithm; information is replaced with a series of (pseudo-)random characters; non-deterministic
- Replacement/substitution: A fixed character pattern (usually a single character) replaces sensitive information; e.g., a phone number may become (xxx) xxx-xxxx
- Hashing: NIST-standard MD5 and SHA families; deterministic with the same salt; non-reversible
- Encryption: NIST standard (AES and derivatives); block-oriented; deterministic and reversible with the same key
- Format-preserving Encryption: NIST standard under consideration; field-oriented; retains field character; deterministic; reversible or non-reversible is user-selectable

7. TDP Use Cases

VelociData offers an extremely valuable format-preserving data masking mode. This data security process conforms to the NIST SP 800-38G specification and allows users to encrypt (reversibly) or mask (irreversibly) data without changing its schema or field specifications (lengths and dictionaries are preserved). This enables downstream applications to run without any changes. Use cases include local targets, private and public clouds, and targets where data cross geographic, company, or regulatory boundaries. A data set containing 10 million records with ten sensitive fields in each record can be secured in seconds using VelociData rather than a day using conventional approaches.
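To make the distinctions among the processing types in Table 4 concrete, here are toy versions of three of the obfuscation forms; the function names and the fixed "demo" salt are illustrative, not product APIs:

```python
import hashlib

def redact(value: str) -> str:
    """Removal: nothing of the original is left."""
    return ""

def substitute(value: str) -> str:
    """Fixed pattern: each alphanumeric character becomes 'x',
    separators are kept so the shape remains recognizable."""
    return "".join("x" if c.isalnum() else c for c in value)

def hash_token(value: str, salt: bytes = b"demo") -> str:
    """Hashing: deterministic with the same salt, non-reversible."""
    return hashlib.sha256(salt + value.encode()).hexdigest()

phone = "(314) 555-0100"
assert redact(phone) == ""
assert substitute(phone) == "(xxx) xxx-xxxx"
assert hash_token(phone) == hash_token(phone)   # deterministic
```

Note that only the hashed form is usable as a join key in a model copy; redaction and fixed-pattern substitution destroy the one-to-one mapping that joins require.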
Figure 5: Schematic for creating secure model copies, moving sensitive data from mainframe sources (IMS, DB2, VSAM) and non-mainframe sources (RDBMS, log files, CSV files) in the data center, across a regulatory, company, or geographic boundary, into the application test environment's QA, POC/test, and development databases as masked data (the model copy).

7.1 Use Case 1: Creating a secure, HIPAA-compliant full production dataset from Microsoft SQL Server

A large health benefits provider needs to create a model copy of a full production dataset for access by their developers. All 18 PHI data field types need to be de-identified for HIPAA/HITECH audit compliance. The production data is about 400 GB loaded into Microsoft SQL Server. Following the outline of Figure 5, a workflow is established that:

1. extracts data out of SQL Server;
2. secures the data through the VelociData appliance using format-preserving masking (to ensure data integrity and application usability); and
3. performs a bulk load of the model data into a development set of tables.

As an example, one of the tables contains 1 million records, each comprising 34 fields. For HIPAA Final Rule compliance, 14 of the fields in each record need to be de-identified (14 million fields in total). The dataset includes a number of different field types (names, SSNs, ...) requiring the following dictionaries:

- names
- numbers
- dates
- numerics
- hex_uppercase
- hex_lowercase
- alphanumerics
- alphanumeric_uppercase
- alphanumeric_lowercase
- printable characters
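The three-step workflow above can be sketched end to end. In this minimal illustration, `sqlite3` stands in for SQL Server, and `mask()` is a trivial placeholder for the appliance's format-preserving masking; the table and column names are hypothetical:

```python
import sqlite3

def mask(value: str) -> str:
    # Placeholder deterministic mask (reverses the string); a real
    # pipeline would apply format-preserving masking here instead.
    return value[::-1]

# Source system (stand-in for SQL Server) holding sensitive rows.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE patients (name TEXT, ssn TEXT)")
src.execute("INSERT INTO patients VALUES ('John', '123456789')")

# Development target with an identical schema: no schema changes needed
# because the masked values keep each field's format.
dev = sqlite3.connect(":memory:")
dev.execute("CREATE TABLE patients (name TEXT, ssn TEXT)")

rows = src.execute("SELECT name, ssn FROM patients").fetchall()   # 1. extract
masked = [(mask(n), mask(s)) for n, s in rows]                    # 2. mask
dev.executemany("INSERT INTO patients VALUES (?, ?)", masked)     # 3. bulk load

assert dev.execute("SELECT ssn FROM patients").fetchone()[0] == "987654321"
```

The point of the sketch is the shape of the pipeline, not the masking itself: extraction, masking, and bulk load are independent stages, which is what lets the masking step be delegated to an appliance sitting between source and target.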
The overall processing time for this table, including all database queries, masking operations, and insertion into the resulting database, was just over one minute (65 seconds), with the longest-running step being the database insert.

7.2 Use Case 2: Secure data for insertion into an Azure cloud

A retail company must de-identify PII data in records it needs to share with its business partners. This sensitive data contains names, addresses, phone numbers, and other personally identifying data. The retailer wants to put the data into a hosted environment but cannot let unprotected data leave its firewall. For this reason it has chosen to use VelociData to de-identify the data in its datacenter before the data leave for the cloud. The data contain a large volume of daily transactions, and the business associates require the freshest data for addressing the immediate results of campaigns, implementing changes for agile app development, and preparing model reports.

Figure 6: Schematic for securing data to a cloud datastore from behind the corporate firewall.

As shown in Figure 6, data move through the VelociData appliance, which de-identifies the PII found within the data flow. These records are then allowed to move to cloud-based storage for access by the retailer's business associates. Since no sensitive data remain, there is no risk to the company or the individuals should unauthorized access be gained or a data breach occur.
7.3 Use Case 3: Securing test data for off-shore developers

A major telco would like to move production data to India to leverage faster, round-the-clock development and lower costs. To remove audit deficiencies, they would like to generate a model copy of the data to send off-shore. While de-identification removes the risk of leaking sensitive customer and corporate data, the developers require access to a dataset that closely mirrors fresh production data in character, such as volume, distribution, and relation. The dataset represents 30 million records with 12 fields per record that need to be de-identified. VelociData can provide a fresh test dataset for the off-shore partners in a minute, where the alternative solution takes almost a week before the data are available in test... by then, the developers have a new application built to be tested!

7.4 Use Case 4: Creating daily datasets for Development, QA, and Test Integration

A large financial institution needs to provide model datasets with de-identified data to different parts of the development process. While all data need to be fully de-identified for every user, not all data need to go to all groups; for example, Web Development may not need a field relating to fraud, but Test Integration may need it to complete processing. The VelociData Test Data Protection solution can route different dataset builds to different end users. This routing is fast and efficient and provides the right data, in the right form, to the right individuals. Only the proper data arrive at each location, saving on the storage and maintenance of terabytes of useless replicated data.

8. Summary

The VelociData appliance offers an easy-to-deploy, easy-to-use solution for test data protection. The system does not require any coding for integration and operation within the existing software and database environment. Rather, it operates as a simple network resource for automatically masking sensitive data at wire speed.
The appliance can communicate with all kinds of systems, including mainframes, commodity servers, and cloud services; it can work with relational data, flat files, logs, and XML data; and it requires no additional software or hardware to operate. The VelociData Test Data Protection solution reduces regulatory exposure and hacker risk, and it improves software testing speed and agility.
9. Let Us Help You

For reducing hacker risk and regulatory exposure in test data protection, VelociData offers the fastest time to safety. If you are using custom coding or packaged software for test data protection, VelociData would like to show you how our unique appliance-based solution can significantly reduce your cost and increase the speed of your test data protection workflow. If you are testing software with sensitive data unprotected, you are taking a huge risk and should consider adopting some remedy immediately, either VelociData's or some other. We would like to show you how quickly you can make this problem go away. Please contact us at [email protected] to see what we can do for you.

Author: Ron Indeck

Ron Indeck is the President & CTO of VelociData and has over 25 years of industry and academic experience, most recently as a founder and CTO of Exegy. He was a professor at Washington University in St. Louis, where he was the Das Family Distinguished Professor and Director of the Center for Security Technologies. Among his distinguished professional affiliations, Dr. Indeck was also President of the Institute of Electrical and Electronics Engineers (IEEE) Magnetics Society, and he has been named the Bar Association Inventor of the Year.
PLATFORA INTERACTIVE, IN-MEMORY BUSINESS INTELLIGENCE FOR HADOOP
PLATFORA INTERACTIVE, IN-MEMORY BUSINESS INTELLIGENCE FOR HADOOP Your business is swimming in data, and your business analysts want to use it to answer the questions of today and tomorrow. YOU LOOK TO
HIPAA and HITECH Compliance Simplification. Sol Cates CSO @solcates [email protected]
HIPAA and HITECH Compliance Simplification Sol Cates CSO @solcates [email protected] Quick Agenda Why comply? What does Compliance look like? New Cares vs Rental Cars vs Custom Cars Vormetric Q&A Slide
Solving data residency and privacy compliance challenges Delivering business agility, regulatory compliance and risk reduction
Solving data residency and privacy compliance challenges Delivering business agility, regulatory compliance and risk reduction Introduction In today s dynamic business environment, corporation s intangible
Taming Big Data. 1010data ACCELERATES INSIGHT
Taming Big Data 1010data ACCELERATES INSIGHT Lightning-fast and transparent, 1010data analytics gives you instant access to all your data, without technical expertise or expensive infrastructure. TAMING
Getting Real Real Time Data Integration Patterns and Architectures
Getting Real Real Time Data Integration Patterns and Architectures Nelson Petracek Senior Director, Enterprise Technology Architecture Informatica Digital Government Institute s Enterprise Architecture
SAP Data Services 4.X. An Enterprise Information management Solution
SAP Data Services 4.X An Enterprise Information management Solution Table of Contents I. SAP Data Services 4.X... 3 Highlights Training Objectives Audience Pre Requisites Keys to Success Certification
PLATFORM ENCRYPTlON ARCHlTECTURE. How to protect sensitive data without locking up business functionality.
PLATFORM ENCRYPTlON ARCHlTECTURE How to protect sensitive data without locking up business functionality. 1 Contents 03 The need for encryption Balancing data security with business needs Principles and
Testing Big data is one of the biggest
Infosys Labs Briefings VOL 11 NO 1 2013 Big Data: Testing Approach to Overcome Quality Challenges By Mahesh Gudipati, Shanthi Rao, Naju D. Mohan and Naveen Kumar Gajja Validate data quality by employing
Big Data, Meet Enterprise Security
WHITE PAPER Big Data, Meet Enterprise Security Will Data Security and Compliance Issues Put Big Data Developments on Hold? Large organizations worldwide are working to develop and deploy Big Data analytical
Mainframe Data Protection in an Age of Big Data, Mobile, and Cloud Computing
SOLUTION BRIEF Mainframe Data Protection in an Age of Big Data, Mobile, and Cloud Computing Compelling business value propositions such as improved time-to-insight, customer access, business agility, and
Hayri Tarhan, Sr. Manager, Public Sector Security, Oracle Ron Carovano, Manager, Business Development, F5 Networks
EXTENDING ACCESS WHILE ENHANCING CONTROL FOR YOUR ORGANIZATION S DATA LEVERAGE THE POWER OF F5 AND ORACLE TO DELIVER SECURE ACCESS TO APPLICATIONS AND DATABASES Hayri Tarhan, Sr. Manager, Public Sector
Accelerate Data Loading for Big Data Analytics Attunity Click-2-Load for HP Vertica
Accelerate Data Loading for Big Data Analytics Attunity Click-2-Load for HP Vertica Menachem Brouk, Regional Director - EMEA Agenda» Attunity update» Solutions for : 1. Big Data Analytics 2. Live Reporting
Active Directory User Management System (ADUMS)
Active Directory User Management System (ADUMS) Release 2.9.3 User Guide Revision History Version Author Date Comments (MM/DD/YYYY) i RMA 08/05/2009 Initial Draft Ii RMA 08/20/09 Addl functionality and
Big Data, Big Traffic. And the WAN
Big Data, Big Traffic And the WAN Internet Research Group January, 2012 About The Internet Research Group www.irg-intl.com The Internet Research Group (IRG) provides market research and market strategy
IBM Campaign and IBM Silverpop Engage Version 1 Release 2 August 31, 2015. Integration Guide IBM
IBM Campaign and IBM Silverpop Engage Version 1 Release 2 August 31, 2015 Integration Guide IBM Note Before using this information and the product it supports, read the information in Notices on page 93.
W H I T E P A P E R. Deriving Intelligence from Large Data Using Hadoop and Applying Analytics. Abstract
W H I T E P A P E R Deriving Intelligence from Large Data Using Hadoop and Applying Analytics Abstract This white paper is focused on discussing the challenges facing large scale data processing and the
Alliance Key Manager Solution Brief
Alliance Key Manager Solution Brief KEY MANAGEMENT Enterprise Encryption Key Management On the road to protecting sensitive data assets, data encryption remains one of the most difficult goals. A major
Integrated Data Management: Discovering what you may not know
Integrated Data Management: Discovering what you may not know Eric Naiburg [email protected] Agenda Discovering existing data assets is hard What is Discovery Discovery and archiving Discovery, test
We are Big Data A Sonian Whitepaper
EXECUTIVE SUMMARY Big Data is not an uncommon term in the technology industry anymore. It s of big interest to many leading IT providers and archiving companies. But what is Big Data? While many have formed
Securing and protecting the organization s most sensitive data
Securing and protecting the organization s most sensitive data A comprehensive solution using IBM InfoSphere Guardium Data Activity Monitoring and InfoSphere Guardium Data Encryption to provide layered
PROTECTING ENTERPRISE DATA IN HADOOP
TECHNICAL BRIEF PROTECTING ENTERPRISE DATA IN HADOOP Introduction Big Data is an exciting concept and emerging set of technologies that hold seemingly unlimited promise to enable organizations to gain
Enabling Real-Time Sharing and Synchronization over the WAN
Solace message routers have been optimized to very efficiently distribute large amounts of data over wide area networks, enabling truly game-changing performance by eliminating many of the constraints
Decoding the Big Data Deluge a Virtual Approach. Dan Luongo, Global Lead, Field Solution Engineering Data Virtualization Business Unit, Cisco
Decoding the Big Data Deluge a Virtual Approach Dan Luongo, Global Lead, Field Solution Engineering Data Virtualization Business Unit, Cisco High-volume, velocity and variety information assets that demand
Extraction Transformation Loading ETL Get data out of sources and load into the DW
Lection 5 ETL Definition Extraction Transformation Loading ETL Get data out of sources and load into the DW Data is extracted from OLTP database, transformed to match the DW schema and loaded into the
Protecting Enterprise Data In Hadoop HPE SecureData for Hadoop
Protecting Enterprise Data In Hadoop HPE SecureData for Hadoop Introduction Big Data is an exciting concept and emerging set of technologies that hold seemingly unlimited promise to enable organizations
How To Handle Big Data With A Data Scientist
III Big Data Technologies Today, new technologies make it possible to realize value from Big Data. Big data technologies can replace highly customized, expensive legacy systems with a standard solution
White Paper: Evaluating Big Data Analytical Capabilities For Government Use
CTOlabs.com White Paper: Evaluating Big Data Analytical Capabilities For Government Use March 2012 A White Paper providing context and guidance you can use Inside: The Big Data Tool Landscape Big Data
