Protecting Enterprise Data in Hadoop: HPE SecureData for Hadoop
Introduction

Big Data is an exciting concept and emerging set of technologies that hold seemingly unlimited promise to enable organizations to gain new analytic insights and operational efficiencies. It is a unique architecture that enables low-cost, high-speed, parallel processing of huge data sets of structured and unstructured data. In a recent survey, Frost & Sullivan found that 58% of large enterprises had already implemented one or more Big Data and/or analytic solutions, and the rest were either implementing, evaluating, or planning to investigate in the next 18-24 months (Big Data and Analytics Market Survey: Initial Observations, by Sandy Borthick, Frost & Sullivan, BDA 2-09, July 2014).

This Big Data movement is supported by multiple open source initiatives centered on Apache Hadoop. The core driving forces have been flexibility, performance and scalability through the use of multiple standard, low-cost processing nodes and the Hadoop Distributed File System (HDFS) on commodity off-the-shelf hardware. Security was not a key design criterion. When used in an enterprise environment, however, the importance of data security becomes paramount. Organizations must protect sensitive customer, partner and internal information from an increasing array of advanced threats and risks, and must adhere to a complex set of privacy laws and compliance requirements. But, by its nature, Hadoop poses many unique challenges to properly securing this environment, including:

- Inability to guarantee complete removal of sensitive data once it has entered the cluster (Hadoop does not guarantee when a file is actually removed from the Trash, and there is no secure wipe within HDFS to ensure all data blocks have been purged).
- Rapidly evolving tools with frequent releases, driven by a diverse and well-funded open source developer community.
- Lack of a core set of security controls that are commonplace on commercial relational database management system (RDBMS) products.
- Multiple sources of data coming from multiple enterprise systems and real-time feeds with varying (or unknown) protection requirements.
USE CASE: Global Telecommunications Company

A global communications company wanted to use Hadoop to analyze massive customer data sets, social media and communications feeds for patterns of behavior in order to detect fraud and enhance customer service. HPE FPE technology was used to de-identify sensitive data in several hundred million customer records from Teradata and IBM mainframes as it was ingested into Hadoop in under 90 seconds.

- Multiple types of data (structured RDBMS-based, file-based, text feeds, etc.) combined together in the Hadoop data lake.
- Access by many different users with varying analytic needs and ad-hoc processes, with the likelihood of propagating to other enterprise systems and tools.
- Automatic and largely uncontrollable replication of data across multiple nodes once entered into the HDFS data store.
- Reduced physical and logical access controls if Hadoop clusters are deployed remotely or in a cloud environment.

Further exacerbating these risks is the fact that the aggregation of data found within these Big Data systems makes for an extremely alluring target for hackers and data thieves. Hadoop presents brand new challenges to data risk management: the potential concentration of vast amounts of sensitive corporate and personal data in a low-trust environment. The data-concentration issue alone can lead to far-reaching business risks through the exposure of a company's data assets if data access, analysis, and extraction are not strictly controlled. A bank's portfolio of positions and transactions may be visible and searchable. An insurance company's entire risk posture may be easily determined. A government's military health information may be indirectly exposed from health analytics. Fully securing Hadoop in enterprise environments is essential to mitigate these potentially huge Big Data exposures.

Basic Hadoop Security Controls

There are a number of traditional IT security controls that should be put in place as the basis for securing a Hadoop environment. All the standard perimeter protection controls apply: network firewalls, intrusion detection and prevention systems (IDPS), security information and event management (SIEM), configuration control, vulnerability management, etc. These are the base level of general security controls.

For the next level of security, the open source community has been investing heavily in building Hadoop-specific tools and best practices to provide enterprise-grade security. There are four key areas of focus: authentication, authorization, audit and data protection. Authentication is fundamental to ensuring users are who they claim to be. It can be best accomplished by integrating with existing enterprise identity and access management services. Apache Knox provides a REST API gateway as a single point of access to the Hadoop cluster (or multiple clusters). Using Kerberos provides strong authentication to establish identities for clients, hosts and services. These tools can be integrated with an organization's LDAP server, including Microsoft Active Directory. Much of the issue around Hadoop authorization has been the limited ability to grant or deny access privileges on a granular basis. This is improving, as SQL-style authorization for Hive and HDFS ACLs are now available in Hadoop 2.4 (a brief example of the latter follows below). The capability to audit and track individual user accesses to specific services and HDFS data components is also improving with the use of the above tools. The fourth pillar of Hadoop security is data protection.
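As a concrete illustration of the finer-grained authorization controls mentioned above (HDFS ACLs, available since Hadoop 2.4 when dfs.namenode.acls.enabled is set to true), the short sketch below uses the standard Hadoop FileSystem API to grant a single analyst group read-only access to a directory holding sensitive data. The path and group name are illustrative examples, not taken from this brief.

import java.util.Arrays;
import java.util.List;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.permission.AclEntry;
import org.apache.hadoop.fs.permission.AclEntryScope;
import org.apache.hadoop.fs.permission.AclEntryType;
import org.apache.hadoop.fs.permission.FsAction;

public class GrantAnalystReadAccess {
    public static void main(String[] args) throws Exception {
        // Connect using the cluster configuration found on the classpath
        FileSystem fs = FileSystem.get(new Configuration());

        // Illustrative directory and group name
        Path sensitiveDir = new Path("/data/customer/sensitive");
        List<AclEntry> aclSpec = Arrays.asList(
            new AclEntry.Builder()
                .setScope(AclEntryScope.ACCESS)
                .setType(AclEntryType.GROUP)
                .setName("analysts")
                .setPermission(FsAction.READ_EXECUTE)  // list and read only
                .build());

        // Add the entry to the directory's existing ACL
        fs.modifyAclEntries(sensitiveDir, aclSpec);
    }
}

The equivalent command-line form is hdfs dfs -setfacl -m group:analysts:r-x /data/customer/sensitive.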
Even with all the above controls, it is a given that at some point unauthorized users will gain inappropriate access to Hadoop data. This is why protecting the data itself, through encryption or other de-identification techniques, is of paramount importance. De-identified data in Hadoop is protected data: even in the event of a data breach, it yields nothing of value, avoiding the penalties and costs such an event would otherwise have triggered.

Data Protection Methodologies

There are multiple approaches that can be deployed to directly protect the data in a Hadoop environment:

Volume Storage and File-level Encryption

This is a fairly standard type of encryption frequently deployed by IT to protect sensitive data in file systems, often referred to as data-at-rest encryption. The data is encrypted at the file system or disk volume level, and is protected while residing at rest on the data store. This prevents unauthorized personnel who may have physically obtained a disk from reading anything from it. This is a useful control in a large Hadoop cluster due to frequent disk repairs and swap-outs.
USE CASE: Health Care Insurance Company

A leading health care insurance company wanted to open their massive, previously untapped data sets to their Hadoop developers to enable research, discovery and innovation through developer hackathons. They also had the objective to automate multiple high-value use cases, such as identification of prescription fraud, previously hampered by manual processes. HPE FPE technology enabled field-level de-identification of sensitive ePHI and PII data across a 1,000-node distribution.

USE CASE: Global Financial Services Company

A global financial services firm needed to adopt Hadoop to improve its fraud detection capabilities, analyzing customer transaction patterns across hundreds of millions of consumers. Data de-identification in-stream during ETL ingestion with Informatica enabled secure analytics across fields containing dates, date ranges, card-holder data, and consumer personal data without exposing the live data. After Hadoop processing, the de-identified data is transferred into their traditional BI tools for additional analysis. The data remains protected throughout with end-to-end field-level encryption and tokenization, avoiding compliance and risk challenges, while enabling new approaches to reducing fraud.

Volume-level encryption using base-level Linux OS capabilities has been available for some time, and there is considerable effort right now to add HDFS-level encryption to Hadoop, which would provide more granular control at the file system level. However, both these approaches do nothing to protect the data from any and all access when the disk is running within the system, which, of course, is essentially all the time. Decryption is applied automatically when the data is read by the operating system, and live, vulnerable data is fully exposed to any user or process accessing the system, whether authorized or not. Further, every read and every write invokes encryption and decryption, which adds overhead even if the data being written or read is not sensitive.

Transport-level Encryption

This is another common data encryption technique, used to protect data while being transferred across a network, and is sometimes called data-in-motion or wire encryption. SSL/TLS protocols are a common example. Wire encryption is currently receiving a lot of attention from the Hadoop community to ensure Hadoop programs can support these secure transport-level protocols. This is an important data protection technique, as data moving across the wire is often most vulnerable to interception and theft, whether transferring data across clusters in a rack within the datacenter, nodes in the cloud, or especially across a public network (i.e., the Internet). However, the protection this method affords is limited to the actual transfer itself. Clear, unprotected data exists at the source, and clear, unprotected data is automatically reassembled at the destination. This provides many vulnerable points of access for the potential thief, and like storage-level encryption above, does nothing to protect the data from unauthorized users who have obtained fraudulent access to the system itself.

Data Masking

This is a useful technique for obfuscating sensitive data, commonly used for the creation of test and development data or analytic data from live production information.
A typical system will support multiple transformation techniques, from extremely simple (replacing digits with "X"s) to quite complex (replacing valid addresses with different valid addresses from the same area). Depending on the requirements and how cleverly you select the transformations, masked data can sometimes be used in its protected state for analytic applications. However, there are several limitations of data masking in the Hadoop environment:

- Masked data is intended to be irreversible, which limits its value for many analytic applications and post-processing requirements. For example, a common Hadoop use case is taking data from live systems, performing low-cost analysis in Hadoop, and then pushing downstream processing to existing Business Intelligence (BI) tools where live data is again needed for business action and decision.
- There is no guarantee the specific masking transformation chosen for a specific sensitive data field fully obfuscates it from identification, particularly when analyzed with adjacent, unaltered fields or correlated with other data. For example, in a recently released dataset of supposedly masked information on 173 million taxi rides in New York City, individual taxis and their drivers could be re-identified using related data because of a weak hashing algorithm used to alter the license plate numbers (a short sketch of this kind of attack follows this list).
- Many data masking transformations create duplicates and destroy referential integrity, resulting in join operations on database tables not mapping properly and reducing the ability to perform analytic functions on the data.
- Specific masking techniques may or may not be accepted by auditors and assessors, affecting whether they truly meet compliance requirements and provide safe harbor in the event of a breach.
- The masking process, depending on the transformation, may require mapping tables, making processing quite slow and not scalable. These mapping tables create another place to protect sensitive data, which at Hadoop scale may be very expensive or even impossible to achieve.
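To make the weak-hashing limitation above concrete, the sketch below shows how a masked value produced by an unsalted hash over a small, structured keyspace can be reversed simply by hashing every candidate and comparing. The medallion format and the use of MD5 mirror what was publicly reported in the New York City taxi case; the specific value and format loop here are illustrative only.

import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;

public class MedallionUnmask {

    // Hex-encoded MD5 of a string: the "masking" transform being attacked
    static String md5Hex(String s) throws Exception {
        byte[] d = MessageDigest.getInstance("MD5").digest(s.getBytes(StandardCharsets.UTF_8));
        StringBuilder sb = new StringBuilder();
        for (byte b : d) sb.append(String.format("%02x", b & 0xff));
        return sb.toString();
    }

    public static void main(String[] args) throws Exception {
        // A "masked" medallion from a released data set (here, the hash of the made-up value 5D86)
        String maskedValue = md5Hex("5D86");

        // One common medallion format: digit, letter, digit, digit. Only about 26,000
        // candidates exist, so exhaustive search finishes in well under a second.
        for (char d1 = '0'; d1 <= '9'; d1++)
            for (char l = 'A'; l <= 'Z'; l++)
                for (char d2 = '0'; d2 <= '9'; d2++)
                    for (char d3 = '0'; d3 <= '9'; d3++) {
                        String candidate = "" + d1 + l + d2 + d3;
                        if (md5Hex(candidate).equals(maskedValue)) {
                            System.out.println("Re-identified medallion: " + candidate);
                            return;
                        }
                    }
    }
}

The same attack applies to any masking transform whose input space is small and enumerable, which is exactly the situation for license plates, medallion numbers, ZIP codes and similar identifiers.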
NIST Special Publication 800-38G (Draft), Recommendation for Block Cipher Modes of Operation: Methods for Format-Preserving Encryption, Morris Dworkin:

"FPE facilitates the targeting of encryption to sensitive information, as well as the retrofitting of encryption technology to legacy applications, where a conventional encryption mode might not be feasible because it would disrupt data fields/pathways. FPE has emerged as a useful cryptographic tool, whose applications include financial-information security, data sanitization, and transparent encryption of fields in legacy databases."

Source: US Government NIST SP 800-38G Draft Standard

Data-centric Security

A data-centric security approach is quite different from the above techniques, and calls for de-identifying the data as close to its source as possible, replacing the sensitive data elements with usable, yet de-identified, equivalents that retain their format, behavior and meaning. This is also called end-to-end data protection, and it provides an enterprise-wide solution for data protection that extends beyond the Hadoop environment. This protected form of the data can then be used in subsequent applications, analytic engines, data transfers and data stores. For Hadoop, the best practice is to never allow sensitive information to reach HDFS in its live and vulnerable form.

HPE SecureData: Data-centric Security

To properly protect data in Hadoop, you need to employ a data-centric security approach. As the above examples illustrate, this cannot be achieved by simply deploying piecemeal encryption or data masking within Hadoop. It requires protecting data at rest, in use, and in motion, as close to its source as possible, and doing so in a way that is scalable, format-preserving, and maintains the analytic meaning and logic of the data without live data exposure risk. It requires protecting the data at the field level in a way that it can be used by applications in its de-identified state, while also being selectively and securely re-identified for those specific applications and users that require it. HPE SecureData provides such a solution. Three core technology breakthroughs enable HPE SecureData to meet these requirements:

HPE Format-Preserving Encryption

A fundamental innovation enabling the HPE SecureData data-centric platform is HPE Format-Preserving Encryption (FPE), providing high-strength encryption of data without altering the original data format, and preserving business value and referential integrity across distributed data sets. This enables applications, analytic processes and databases to use the protected data without alteration, even across distributed systems, platforms and tools. Protection is applied at the field or even partial-field level, leaving non-sensitive portions of fields available for applications while protecting the sensitive parts. HPE FPE preserves referential integrity, which means protected data can still be consistently referenced and joined across tables and data sets, a major requirement for proper operations with the mix of data entered into Hadoop, and especially critical where common identifiers like Social Security Numbers or IDs are used as common references across disparate data sets. Policy-controlled secure reversibility enables data to be selectively re-identified in trusted systems which need live data, enabling full end-to-end data processes to be secured without resorting to risky and cumbersome mapping databases.
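The referential-integrity property is easiest to see in code. The sketch below is not HPE FPE (FF1 is a keyed, reversible AES mode); it substitutes a simple keyed, deterministic HMAC-to-digits pseudonym purely to show why a transform that is deterministic under one key and preserves the field's shape lets two tables still join on a protected SSN. The key and SSN values are illustrative.

import java.math.BigInteger;
import java.nio.charset.StandardCharsets;
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;

public class JoinOnProtectedSsn {

    // Stand-in transform: deterministic, keyed, and shape-preserving (9 digits in,
    // 9 digits out), but NOT reversible and NOT FF1/FPE.
    static String protectSsn(String ssn, byte[] key) throws Exception {
        Mac mac = Mac.getInstance("HmacSHA256");
        mac.init(new SecretKeySpec(key, "HmacSHA256"));
        byte[] tag = mac.doFinal(ssn.getBytes(StandardCharsets.UTF_8));
        BigInteger n = new BigInteger(1, tag).mod(BigInteger.valueOf(1_000_000_000L));
        return String.format("%09d", n);  // always 9 digits, like the original field
    }

    public static void main(String[] args) throws Exception {
        byte[] key = "demo-key-not-for-production".getBytes(StandardCharsets.UTF_8);

        // The same person's SSN as it appears in the transaction table and the credit-score table
        String p1 = protectSsn("123456789", key);
        String p2 = protectSsn("123456789", key);

        // Protected under the same key and policy, the two values are identical,
        // so a join on the protected SSN still matches across the two tables.
        System.out.println(p1 + " == " + p2 + " : " + p1.equals(p2));
    }
}

A reversible scheme such as FF1 adds the ability to recover the original value under policy control, which the pseudonym above deliberately does not attempt.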
HPE Format-Preserving Encryption has also received strong attention from the government, and is recognized as a mode of standard AES encryption, specifically FF1 mode AES as defined in NIST SP 800-38G. This gives users confidence in the security proofs and standards underpinning HPE FPE. HPE SecureData has built a robust ecosystem around HPE FPE, providing support across multiple enterprise platforms with proven implementation tools, delivering always-on data protection.

HPE Secure Stateless Tokenization

Augmenting the HPE FPE encryption technology is HPE Secure Stateless Tokenization (SST). Tokenization is the process of replacing live data with a random surrogate. Traditional approaches use databases to map live to surrogate values, which limits performance and scale, rendering them impractical for most Hadoop use cases. HPE SST technology is stateless, eliminating the need for a token database. Instead, the system uses a pre-generated token mapping table containing random numbers, using a proven, independently validated random token mapping method. This eliminates the complexities of token database synchronization, back-up and recovery, and provides a 100% guarantee of data integrity with no data collisions. Tokenization is an important de-identification capability for credit card numbers,
and proper use of this feature can help ensure your Hadoop environment stays out of scope for PCI DSS compliance requirements, avoiding costly and disruptive audit processes. Combining HPE FPE with HPE SST gives the user the necessary flexibility to select the best data protection technique for each data type and regulatory mandate, all within a single solution. Both HPE FPE and HPE SST permit data to be de-identified at scale. De-identification tasks can be processed in parallel across multiple nodes, enabling massive scale and performance without additional infrastructure or complexity.

HPE Stateless Key Management

HPE Stateless Key Management enables keys to be derived as needed, with no storage or database management issues, because database synchronization and frequent backups are not required. Because keys are dynamically generated, there is no need for key backup, and no possibility of key loss. Key management can be linked to existing identity management infrastructure, including external LDAP directories. Permission to decrypt or de-tokenize can be assigned on an application policy basis, and can incorporate user roles and groups to simplify management based on identity management system policy. This role-based access to data at the field level is extremely important in the Hadoop environment for enabling applications and users to decrypt exactly the data they are authorized to access, presenting different views of clear and encrypted fields to match specific user permissions. HPE Stateless Key Management also enables simplified implementation and provides high-performance, scalable, distributed processing that is well matched with the Hadoop architecture (a conceptual sketch of key derivation follows at the end of this section).

HPE SecureData Deployment Architecture

Implementing data-centric security involves installing the HPE SecureData infrastructure components and then interfacing with the appropriate applications and data flows. Infrastructure deployment is simplified by the use of an HPE SecureData virtual software appliance that includes the Key Server, Management Console, Web Services Server, etc. Developer templates, APIs and command line tools enable encryption and tokenization to occur natively on the widest variety of platforms, including Linux, mainframe and mid-range computers, and support integration with a broad range of infrastructure components, including ETL, databases, and, of course, programs running in the Hadoop environment.

[Figure: HPE SecureData Deployment Architecture diagram summarizing these options]
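HPE's actual derivation scheme is not described in this brief; the sketch below only illustrates the general idea behind stateless, derived keys, assuming an HMAC-based derivation from a protected master secret plus the requesting identity and the data-protection format. Because the same inputs always re-derive the same key, nothing has to be stored per key, which is what removes the key database, backup and synchronization burden.

import java.nio.charset.StandardCharsets;
import java.util.Arrays;
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;

public class DerivedKeyDemo {

    // Derive a per-identity, per-format working key from a master secret.
    // Purely illustrative; not HPE's algorithm.
    static byte[] deriveKey(byte[] masterSecret, String identity, String format) throws Exception {
        Mac mac = Mac.getInstance("HmacSHA256");
        mac.init(new SecretKeySpec(masterSecret, "HmacSHA256"));
        // Bind the key to who is asking and which protection format/policy it is for
        return mac.doFinal((identity + "|" + format).getBytes(StandardCharsets.UTF_8));
    }

    public static void main(String[] args) throws Exception {
        byte[] master = "master-secret-held-by-key-server".getBytes(StandardCharsets.UTF_8);

        // The same identity and format always yield the same key, so there is
        // no key database to back up, synchronize or lose.
        byte[] k1 = deriveKey(master, "fraud-analytics-app", "ssn");
        byte[] k2 = deriveKey(master, "fraud-analytics-app", "ssn");
        System.out.println(Arrays.equals(k1, k2));  // true
    }
}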
HPE SecureData Data Protection Options using Hadoop

Using a data-centric security approach, it is necessary to identify the sensitive data fields within all data sources that will be transferred into Hadoop, and make a determination as to the most appropriate point to de-identify the data. The general principle is to protect the data as close to the original source as possible. Put another way, the best place to protect data in Hadoop is before it gets to Hadoop. This is not always possible, and encryption/tokenization can also be invoked during an ETL transfer to a landing zone, or from the Hadoop process transferring the data into HDFS. Once the secure data is in Hadoop, it can be used in its de-identified state for additional processing and analysis without further interaction with the HPE SecureData system. Or the analytic programs running in Hadoop can access the clear text by utilizing the HPE SecureData high-speed decryption/detokenization interfaces, with the appropriate level of authentication and authorization. If processed data needs to be exported to downstream analytics in the clear, such as into a data warehouse for traditional BI analysis, you again have multiple options for re-identifying the data, either as it exits Hadoop using Hadoop tools or as it enters the downstream systems on those platforms.

HPE SecureData provides seven specific options for protecting sensitive data used in Hadoop:

1. Apply data protection at source applications
2. Apply data protection during import into Hadoop (ETL process, Sqoop)
3. Apply data protection within Hadoop (e.g. MapReduce)
4. Using de-identified data within Hadoop (e.g. Hive)
5. Using and exporting re-identified data from Hadoop (e.g. Hive)
6. Exporting data and re-identifying outside Hadoop (ETL process)
7. Using storage-level encryption within Hadoop

Option 1: Apply data protection at source applications

This is the ideal scenario, so that the sensitive data is fully protected wherever it flows, including Hadoop. It also helps ensure that the Hadoop system is not brought into scope for PCI and other compliance policies. It requires access to the source applications outside the Hadoop environment to add the HPE SecureData interface calls for encryption and tokenization. But once protected, this information can be transferred freely into HDFS and Hadoop tools and then be used as-is for many analytic tasks (see option 4), or selectively re-identified when in-the-clear data is required (see options 5 and 6).
To facilitate the discussion, we have constructed two example database tables (each one record in length), the first showing multiple fields related to a credit card transaction, the second showing credit score indexed by social security number (SSN). The records are coded in red to designate unprotected, in-the-clear data.

Table 1:
Name, Street, City, State, Post code, Phone, Email, Birthday, Credit Card, SSN
Tyshawn Medhurst, Verl Plaza, New Lianemouth, LA, (405), [email protected], 3/2/

Table 2:
SSN, Score
62

Email, birthdate, credit card number and social security number have been determined to be sensitive information that must be protected (marked in yellow). Using HPE FPE technology, email, birthdate and SSN are encrypted as close to the source of the data as possible, using whatever HPE SecureData platform interface is required. HPE SST is used to tokenize the credit card number. This results in the following de-identified data records:

Protected Table 1:
Name, Street, City, State, Post code, Phone, Email, Birthday, Credit Card, SSN
Tyshawn Medhurst, Verl Plaza, New Lianemouth, LA, (405), [email protected], 8/2/

Protected Table 2:
SSN, Score

Data from these protected tables is available to be loaded into Hadoop as needed, including subsets of the fields and/or joined data using the social security number as the index, even though that is an encrypted field, for example:

Tyshawn Medhurst LA [email protected] 8/2/

HPE FPE and HPE SST preserve referential integrity, which means protected data can be consistently referenced and joined across tables and data sets.

Option 2: Apply data protection during import into Hadoop (e.g. ETL process or Sqoop ingestion)

This is a good option that does not require interfacing at the source applications: the data is protected as it enters Hadoop. This can either be achieved by interfacing with HPE SecureData from traditional ETL and batch tools in a landing zone outside Hadoop, or by a Hadoop-resident import program such as Sqoop. As part of the HPE SecureData Hadoop Installation Kit, Developer Templates and documentation are included to guide the programmer in properly calling the HPE SecureData interfaces from common Hadoop programs including Sqoop, MapReduce and Hive. In this case, Sqoop would be one of the preferred methods of ingesting data into the Hadoop instance. Sqoop can connect to an SQL database and load data directly into HDFS. While Sqoop is not a full ETL tool, it does provide efficient and fast Extract and Load functionality. Although Sqoop does not provide transformation functionality as a standard capability, HPE Security - Data Security has developed a unique approach that enables encryption operations to be performed during a Sqoop import job. To secure data during the ingestion, the sensitive fields to be protected are identified, and the format to be used for de-identification is defined. A simple excerpt from the HPE SecureData configuration file for protecting credit card data would be as follows:

Sqoop.field.0.column.name = credit_card
Sqoop.field.0.format.name = cc-sst-6-4
That would convert the unencrypted credit card number stored in the data store to follow the encryption settings for policy cc-sst-6-4, which encrypts the middle digits of the credit card number while leaving the first six and last four in their original state. After generation of the import classes, the system then automatically de-identifies the information in the credit_card column when loading the information into Hadoop. With this integration, the source information stored in the database that Sqoop is accessing might be this:

Tyshawn Medhurst LA /2/

While the actual data being imported into Hadoop, assuming de-identification of the four sensitive fields, would be this:

Tyshawn Medhurst LA /2/

When using Sqoop, it is important that you segregate the Hadoop processing in its own landing zone. Clear data will temporarily exist in memory, but only protected data will be written into HDFS.

Option 3: Apply data protection within Hadoop (e.g. MapReduce)

This option utilizes the HPE SecureData interfaces running directly within Hadoop jobs. This provides the opportunity to integrate with other Hadoop preprocessing tasks, as well as to protect data fields once they are identified in Hadoop (such as when schemas are dynamically defined). Since the MapReduce framework is built on Java, it is easy to integrate encryption and decryption into the Map and Reduce jobs that are executed in the Hadoop cluster. The HPE SecureData Hadoop API and web services interfaces are again used, with Developer Templates supplied to assist the programmer in properly calling these HPE SecureData interfaces. In order to perform the encryption function, the writer of the MapReduce job needs to create a Crypto object and pass it information about the object to be encrypted and its format, so that the correct encryption for the field can be performed (i.e., credit card, SSN, etc.). The return value is then the actual encrypted value (a minimal Mapper sketch showing where this call sits in a job appears after the Option 4 overview below):

String[] output = crypto.protectFormattedDataList(input);

The process to decrypt information is very similar: you provide the encrypted value and the decrypted information is then returned. Using our example data, we could assume that the data already stored in HDFS looks like this:

Tyshawn Medhurst LA [email protected] 3/2/

Once the MapReduce job is completed, there would be another file inside of HDFS that contains the encrypted information, as shown below:

Tyshawn Medhurst LA [email protected] 8/2/

The customer needs to be aware that with this approach, unprotected data has already entered the Hadoop system, and the original data cannot easily be removed (due to the automatic data replication and the lack of a secure wipe capability within HDFS). For this reason, some customers might use a dedicated, highly controlled Hadoop cluster specifically for staging, and/or will completely rebuild the cluster once the sensitive data has been protected and transferred out.

Option 4: Using de-identified data within Hadoop

Once the data has been imported into Hadoop, the ideal scenario is performing all analytics and processing on the protected data, which avoids the need for decryption or de-tokenization. This is often possible by utilizing HPE SecureData data transformation options such as leaving leading/trailing digits in the clear, maintaining data ranges, preserving date relationships, etc.
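As referenced under Option 3, a minimal Mapper sketch is shown below to make the integration point concrete: the sensitive fields are protected inside the map task, before anything is written back to HDFS. The FieldCrypto interface is a placeholder standing in for the product's Crypto object from the HPE SecureData Developer Templates; its real type and initialization are product-specific, and the field layout is simply the illustrative record used throughout this brief.

import java.io.IOException;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class ProtectFieldsMapper extends Mapper<LongWritable, Text, Text, Text> {

    // Placeholder for the HPE SecureData crypto helper described in the text;
    // it mirrors only the single call shown in this brief.
    public interface FieldCrypto {
        String[] protectFormattedDataList(String[] fields);
    }

    private FieldCrypto crypto;

    @Override
    protected void setup(Context context) {
        // In a real job this would construct the HPE SecureData Crypto object
        // (key server address, formats, credentials). The pass-through lambda
        // below only keeps the sketch self-contained.
        crypto = fields -> fields.clone();
    }

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        // Incoming record: comma-separated fields, e.g. name,email,birth_date,cc,ssn
        String[] input = value.toString().split(",");

        // De-identify the sensitive fields before the record is written out to HDFS
        String[] output = crypto.protectFormattedDataList(input);

        context.write(new Text(output[0]), new Text(String.join(",", output)));
    }
}

With fields protected by any of the first three options, Option 4 then operates directly on the de-identified data, as the Hive examples that follow illustrate.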
Using Hive on our sample data, you might execute the following query:

SELECT s.id, s.name, s.email, s.birth_date, s.cc, s.ssn, cs.creditscore
FROM voltage_sample s JOIN voltage_sample_creditscore cs ON (s.ssn = cs.ssn)
WHERE s.id <= 10;

The returned data would be:

Tyshawn Medhurst LA [email protected] 8/2/

The consistency of the data is preserved, and analytical operations in Hadoop would work the same way as in the unencrypted state, including table joins. Because HPE FPE provides high-strength encryption while preserving the business value of the data, and referential integrity across distributed data sets, the vast majority of analytics can be securely performed on the de-identified data.

Option 5: Using and exporting re-identified data from Hadoop

There will be situations in which some data fields need to be processed in their clear, unaltered form, and this is possible using the same interfaces as discussed in Options #2 and #3 above. The HPE SecureData architecture enables the distributed, high-speed re-identification of the data so as not to slow down the Hadoop processing. If required, this data can be streamed to downstream applications for additional processing. Taking the same Hive example used in #4 above, only this time assuming that clear text is required, we use access functions to re-identify the information, using the Developer Template for the Hive UDF, as follows:

SELECT s.id, s.name, accessdata(s.email, 'alpha'), accessdata(s.birth_date, 'date'), accessdata(s.cc, 'cc'), accessdata(s.ssn, 'ssn'), cs.creditscore
FROM voltage_sample s JOIN voltage_sample_creditscore cs ON (s.ssn = cs.ssn)
WHERE s.id <= 10;

This results in returning the data in its re-identified form:

Tyshawn Medhurst LA [email protected] 3/2/

This information can be used temporarily by the Hadoop analytic program to facilitate its computations, or passed outside Hadoop to other post-processing applications or analytic programs. This functionality is even available remotely through ODBC and JDBC, allowing customers to utilize their favorite analytic application without the need to add HPE SecureData code on the client side.

Option 6: Exporting data and re-identifying outside Hadoop (ETL process, data export or BI import)

Hadoop data can also be bulk transferred and re-identified during ETL processes and data exports for downstream processing outside of Hadoop. As with options #1 and #2, the interface to HPE SecureData can be invoked from virtually any standard platform. A common use case is to perform large-scale analytics in the Hadoop cluster and then transfer the reduced data set into Teradata or other computing platforms with existing fine-tuned business analytic rules, where it can be processed by existing BI tools using the data either in its protected state or re-identified within the host BI environment, as needed.

Option 7: Using storage-level encryption within Hadoop

HPE SecureStorage enables production use of Transparent Data Encryption (TDE) within Hadoop to create a safe landing zone, at the scale of the full Hadoop cluster, in which to deposit data with granular access controls. From there, data-centric security can be applied as the data is migrated into the broader cluster using tools such as MapReduce. TDE can also be used to provide data-at-rest protection with granular access control for unstructured data in Hadoop.
HPE SecureStorage also offers the option of encryption at the volume level, similar to other data-at-rest Hadoop encryption solutions available in the industry today. It is not necessary for any fields already protected by the data-centric security options listed above, of course, but it is very useful as an extra level of defense for the entire data set stored in Hadoop, particularly unstructured data or sensitive data you might not yet be aware of. As discussed above, this reduces data exposure when cluster drives are disposed of, repaired or re-purposed, but does not protect the data if accessed while the disk is running live within the system. The big advantage of HPE SecureStorage volume-level encryption versus these other Hadoop tools is that it uses the same key servers and encryption infrastructure as the HPE SecureData field-level products, enabling simplified and consistent key management across your organization. With HPE Stateless Key Management, there is no need for key backup, and no possibility of key loss.

Conclusion

Hadoop is an exciting new set of technologies whose unique architecture provides the ability to efficiently process huge sets of data. However, by its nature it is less secure than traditional enterprise computing environments, while presenting a particularly high-value target for hackers and thieves. The only way to ensure sensitive information is protected in Hadoop is to protect the data itself, ideally before it enters the Hadoop data store. This protection needs to prevent unauthorized access to the data while at rest, in use and in motion. The HPE SecureData data-centric security platform provides unique capabilities that deliver this protection while still maintaining the value of the data. Specifically, HPE SecureData provides:

- The ability to protect data as close to its source as possible.
- Support for encryption, tokenization and data masking protection techniques.
- Data that is usable for most applications in its de-identified state.
- The ability to securely re-identify data when required, only by authorized users and applications.
- Protection techniques backed by security proofs and standards.
- High performance and high scalability, well matched with Hadoop speeds.
- Broad platform and application support inside and outside Hadoop.

For more information: voltage.com

Copyright 2015 Hewlett Packard Enterprise Development Company, L.P. The information contained herein is subject to change without notice. The only warranties for Hewlett Packard Enterprise products and services are set forth in the express warranty statements accompanying such products and services. Nothing herein should be construed as constituting an additional warranty. Hewlett Packard Enterprise shall not be liable for technical or editorial errors or omissions contained herein.

4AA6-0857ENW, November 2015