Managing Risk to Sensitive Data with SecureSphere
White Paper

Sensitive information is typically scattered across heterogeneous systems in physical locations around the globe. The rate at which sensitive data grows outpaces organizations' ability to manage and protect it. As a result, most organizations simply can't track where all their sensitive data is. Lack of visibility into the location and content of critical data assets leaves companies exposed to significant risk. Understanding where databases are located, what type of information they hold, and who has access to the stored sensitive data is a critical step in managing risk and achieving data governance and compliance. In this white paper we explore the need to discover and classify sensitive data in enterprise databases. We explain how SecureSphere Discovery and Assessment Server (DAS) enables the assessment of data risk posture through the analysis of discovered data and vulnerabilities on database platforms. Additionally, we explore how the Imperva SecureSphere Data Security Suite identifies and manages risk to sensitive data.
The Role of Discovery and Classification in Database Security

Managing sensitive data such as cardholder data, personal identification information (PII), non-public information (NPI), protected health information (PHI), or other types of sensitive information that should be protected creates a data security and compliance challenge. This is because organizations must ensure that confidential and sensitive data is protected from theft, abuse, and misuse. Regulations and privacy acts require governance of sensitive data. Some examples include:
» PCI-DSS requires protection of credit card information and deletion of cardholder authentication information, i.e., magnetic stripe data
» Sarbanes-Oxley demands that organizations ensure the integrity of corporate financial data
» HIPAA requires protection of patient information
» Privacy acts in 40+ US states and around the world require protection of personal identification information and customer information

Other types of unregulated data, including intellectual property and operational data, must be protected as well, since there is growing evidence that this type of data is being targeted. In order to protect sensitive data, organizations must know where data resides and which data types exist. Once organizations understand how sensitive data is distributed across their data repositories, they can better enforce security policies and apply more effective controls. Continuous discovery ensures new data can be included in security and protection efforts.

Discovering Database Servers on the Enterprise Network

As organizations accumulate more data that needs to be protected, it is important to ensure that all systems holding sensitive information are included within the scope of a data security/compliance project. This is a challenge in most environments, as it becomes increasingly difficult to keep track of the location of systems in different datacenters and the types of data they contain. As a result, the scope of a security or compliance project may not be properly defined, documented, or controlled, and changes to the scope may lead to overrunning the project's budget and delivery dates. To help organizations gain better visibility, Imperva SecureSphere DAS includes network discovery tools that can automatically scan enterprise networks and identify database servers. Users can easily create custom discovery jobs to scan any part of their network. Picture 1 shows a SecureSphere service discovery policy definition. Users can choose the IP range and port range to be scanned, and the services they'd like to discover. Note that not all IPs and ports need to be scanned; defining a limited IP and port range shortens the time it takes to discover services within that range.

Picture 1: Defining a discovery job
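For readers who want a concrete picture of what such a discovery job does under the hood, the following minimal Python sketch probes an IP and port range for listening database services. It is an illustration only, not SecureSphere's implementation; the host range and the port-to-service mapping are assumptions chosen for the example.

import socket

# Hypothetical example ranges; a real discovery job takes these from the
# policy definition shown in Picture 1.
HOSTS = ["10.0.0.%d" % i for i in range(1, 11)]
DB_PORTS = {1433: "MS SQL Server", 1521: "Oracle", 3306: "MySQL", 5432: "PostgreSQL"}

def discover(hosts, ports, timeout=0.5):
    """Return (host, port, service) tuples for ports accepting a TCP connection."""
    found = []
    for host in hosts:
        for port, service in ports.items():
            try:
                with socket.create_connection((host, port), timeout=timeout):
                    found.append((host, port, service))
            except OSError:
                pass  # closed, filtered, or unreachable
    return found

for host, port, service in discover(HOSTS, DB_PORTS):
    print("%s:%d may be running %s" % (host, port, service))

As the note above suggests, narrowing HOSTS and DB_PORTS is exactly what shortens scan time: the work grows with the product of the two ranges.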
Analysis of Database Server Discovery Results

SecureSphere DAS provides the details necessary for managing these data assets, including the IP address and host name of the asset, the ports used by web and database services, and the existence of sensitive data on the server. It also helps organizations understand whether a system is new on the network, or whether any configuration changes were made since the last scan. Picture 2 shows a tabular view of the discovery results. Additional analysis views help organizations analyze discovery results and gain a better understanding of the distribution of these systems within the network. These analysis views are shown in picture 3. A filter option enables users to focus views on specific details. The PDF button at the top left of these views allows users to convert the views into static reports with a single click.

Picture 2: Service discovery results
Picture 3: Pre-defined analysis views for understanding discovered services

To further protect new database servers on the network, once a database server has been identified, SecureSphere can immediately assign it to an existing SecureSphere Server Group and apply security and audit policies to it. This can be done automatically or manually (by administrative users). Workflow analysis helps keep track of systems that are pending assignment, added to a server group, or rejected; a simple sketch of this workflow follows below. Picture 4 shows how new entities are uniquely identified once added into SecureSphere Server Groups.

Picture 4: Adding discovered servers into existing SecureSphere server groups

Through its discovery capabilities, SecureSphere DAS helps organizations better manage their enterprise data assets.
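The assignment workflow described above can be viewed as a simple state machine. The Python sketch below is a hypothetical model of that workflow; the class and state names are invented for illustration and are not SecureSphere's API.

from enum import Enum

class AssignmentStatus(Enum):
    PENDING = "pending assignment"
    ADDED = "added to server group"
    REJECTED = "rejected"

class DiscoveredServer:
    def __init__(self, ip, host_name):
        self.ip = ip
        self.host_name = host_name
        self.status = AssignmentStatus.PENDING
        self.server_group = None

    def assign(self, group):
        # Assigning a server to a group is what brings it under the
        # group's security and audit policies.
        self.server_group = group
        self.status = AssignmentStatus.ADDED

    def reject(self):
        self.status = AssignmentStatus.REJECTED

server = DiscoveredServer("10.0.0.7", "crm-db-01")
server.assign("Production Oracle Servers")
print(server.host_name, server.status.value)  # crm-db-01 added to server group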
Identification and Classification of Data Stored in Databases

Sensitive data must be identified before it can be monitored, audited, and protected from theft, abuse, and misuse. Understanding where data is located is the foundation of a sound framework for assessing governance and compliance risk. Without knowing where data resides and which data types can be found in the organization's databases, it is practically impossible to protect that data. Since today's IT environments are dynamic, and since businesses continue to expand the amount of sensitive data stored in repositories, it is not possible to track sensitive data manually. Organizations must repeatedly scan data repositories to ensure continuous awareness of data that must be protected. Visibility into the data uncovered in data discovery and classification initiatives can help organizations clean up repositories that unnecessarily hold sensitive data. Sensitive data should be stored only where it can be protected and controlled. However, it is common to see cases where sensitive data creeps into systems that are not properly managed and protected. This can result from improper practices, such as cloning sensitive production data into development and test environments, or from application or infrastructure changes. Regardless, this information should either be deleted or protected.

Technical Guidelines for PCI Data Storage

PCI-DSS is an example of a regulation that requires organizations to clean up databases containing unnecessary sensitive data. PCI-DSS prohibits merchants from storing cardholder authentication data, including CVV2, CVC2, and CID codes, track data from the magnetic stripe, or PIN data. Merchants that have such data residing in their databases must identify it and remove it. Unlike non-authentication cardholder data (PAN, cardholder name, service code, and expiration date), which may be stored as long as it is protected by data encryption solutions or other compensating controls, sensitive authentication data must be deleted from the database. PCI data storage guidelines for merchants are summarized in the following table:

Data Element                        Storage Permitted   Protection Required   PCI DSS Req. 3.4
Cardholder Data
  Primary Account Number (PAN)      Yes                 Yes                   Yes
  Cardholder Name                   Yes                 Yes                   No
  Service Code                      Yes                 Yes                   No
  Expiration Date                   Yes                 Yes                   No
Sensitive Authentication Data
  Full Magnetic Stripe Data         No                  N/A                   N/A
  CAV2/CVC2/CVV2/CID                No                  N/A                   N/A
  PIN/PIN Block                     No                  N/A                   N/A

Source: https://www.pcisecuritystandards.org/pdfs/pci_fs_data_storage.pdf

Practical Data Discovery and Classification

SecureSphere DAS is the first solution that enables organizations to effectively scan their databases and identify the existence of sensitive data. Data discovery and classification policies are easily created to address each customer's needs. With default, purpose-built content, SecureSphere can locate the following data types:
» Financial Transactions
» Credit Card Numbers and Cardholder Information
» System and Application Credentials
» Personal Identification Information (PII)
» User-Defined Account Numbers
» Personal Identification Numbers (PIN)
» Custom Data Types

Custom data types can be added by SecureSphere users to accommodate the unique needs of each organization.
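Once classification identifies which of the elements from the PCI table above are present, the table's rules lend themselves to a simple policy lookup. The sketch below is a hedged illustration: the rules are taken from the table, while the function and its remediation strings are invented for the example.

# Rules from the PCI data storage table:
# (storage permitted, protection required, PCI DSS Req. 3.4 applies)
PCI_STORAGE_RULES = {
    "Primary Account Number (PAN)": (True, True, True),
    "Cardholder Name":              (True, True, False),
    "Service Code":                 (True, True, False),
    "Expiration Date":              (True, True, False),
    "Full Magnetic Stripe Data":    (False, None, None),
    "CAV2/CVC2/CVV2/CID":           (False, None, None),
    "PIN/PIN Block":                (False, None, None),
}

def action_for(element):
    """Return a remediation action for a discovered data element."""
    permitted, protected, req_3_4 = PCI_STORAGE_RULES[element]
    if not permitted:
        return "delete"  # sensitive authentication data must be removed
    if req_3_4:
        return "render unreadable (e.g., encryption per Req. 3.4)"
    return "protect with compensating controls"

print(action_for("CAV2/CVC2/CVV2/CID"))            # -> delete
print(action_for("Primary Account Number (PAN)"))  # -> render unreadable ...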
Data Discovery and Classification can operate together with a network discovery scan to identify sensitive databases on the network. However, unlike the network scan, which doesn't require credentials to discover a server, Data Discovery and Classification requires users to provide database credentials. SecureSphere DAS uses two methods for classifying data as sensitive (a sketch of both methods appears at the end of this section):
1. Dictionary: SecureSphere DAS searches objects and object columns for known keywords that may indicate the existence of sensitive data. For example, if a database object is called "user credentials" or a column is called "password", the object will be tagged as sensitive.
2. Pattern Matching: SecureSphere DAS tries to match data within an object against data patterns that may indicate the data is sensitive. SecureSphere includes a list of known data patterns and allows users to add additional ones. These custom patterns can be easily defined using regular expressions, as outlined in the SecureSphere user guide.

Together these two methods provide rapid and holistic identification of sensitive data within databases.

Data Validation Reduces False Positives

SecureSphere DAS minimizes false positive identification of sensitive data by using validation algorithms. Not every occurrence of a nine-digit string indicates a US Social Security number; a nine-digit string can also be a phone number, a ZIP+4 code, or many other things. If data is falsely identified as sensitive, organizations might spend resources protecting non-sensitive information instead of focusing on the truly sensitive information. An example of a validation algorithm used by SecureSphere is the Luhn algorithm. In order to classify a discovered string of sixteen digits as a credit card number, SecureSphere DAS applies the Luhn algorithm, the same checksum used by credit card providers when creating card numbers. This validates the discovered string as a plausible credit card number as opposed to a random sequence of sixteen digits.

Data Discovery and Classification Analysis

SecureSphere DAS data discovery and classification results include the name of the database, the schema, the object (table, synonym, or view), and the specific columns that hold the sensitive data. The results also identify the data category and indicate whether the table is new or previously seen. Picture 5 shows a tabular view of data discovery and classification details:

Picture 5: Data discovery and classification results

Additional interactive analysis views help organizations analyze the data and understand where different data types reside and how data is distributed across the organization. Users can easily apply different pre-defined views and various filters to analyze discovered data. The PDF button at the top left of these views allows users to convert the views into static reports with a single click. An example of the data discovery and classification analysis views is shown in picture 6. This view analyzes the distribution of classified data within scanned databases.
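To make the dictionary and pattern-matching methods, and the Luhn validation step, concrete, here is a minimal self-contained Python sketch. The keyword dictionary and the patterns are illustrative assumptions; SecureSphere ships its own, much richer, classification content.

import re

# Method 1: dictionary of keywords suggesting sensitive columns (illustrative)
SENSITIVE_KEYWORDS = {"password", "ssn", "credit", "credentials", "pin"}

# Method 2: data patterns (illustrative)
PATTERNS = {
    "US SSN":      re.compile(r"^\d{3}-\d{2}-\d{4}$"),
    "Credit Card": re.compile(r"^\d{16}$"),
}

def luhn_valid(number):
    """Luhn checksum: double every second digit from the right, subtracting 9
    from results over 9; the number is valid if the sum is divisible by 10."""
    digits = [int(d) for d in number]
    for i in range(len(digits) - 2, -1, -2):
        digits[i] *= 2
        if digits[i] > 9:
            digits[i] -= 9
    return sum(digits) % 10 == 0

def classify(column_name, sample_values):
    tags = set()
    if any(k in column_name.lower() for k in SENSITIVE_KEYWORDS):
        tags.add("sensitive (dictionary match)")
    for value in sample_values:
        for label, pattern in PATTERNS.items():
            if pattern.match(value):
                # Validation step: a sixteen-digit string is only tagged as a
                # card number if it also passes the Luhn check.
                if label == "Credit Card" and not luhn_valid(value):
                    continue
                tags.add(label)
    return tags

print(classify("cust_card_number", ["4111111111111111"]))  # tagged: passes Luhn
print(classify("order_id", ["1234567890123456"]))          # not tagged: fails Luhn

The second call shows why validation matters: both values are sixteen digits, but only the first satisfies the checksum that real card numbers are generated to satisfy.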
Picture 6: Pre-defined analysis views for understanding discovered data

Taking the Next Steps to Effective Data Risk Management

Data risk management requires knowledge of data assets: where they are, what is happening to them, what kind of data breach might take place, and, most importantly, the costs associated with a data breach involving each asset. In most organizations the dynamic nature of the business, changing infrastructure, and evolving applications make the management of these data assets a challenge. Comprehensive assessment of platform, software, and configuration vulnerabilities is a critical component of data risk management, as it enables the identification of vulnerabilities that put data at risk. Based on the analysis of identified vulnerabilities and the data at risk, organizations can prioritize remediation efforts. To achieve complete data governance, organizations should also consider implementing audit and security controls, such as database activity monitoring solutions, which provide visibility into the actual usage of sensitive data and enforce better access controls.

Managing Risk to Sensitive Data

SecureSphere delivers a unique approach that centralizes and automates data risk management processes, resulting in improved visibility into risk pertaining to sensitive data. To understand risk, SecureSphere DAS assesses the vulnerability of discovered systems and provides a risk score based on the severity of discovered vulnerabilities and the sensitivity of the data on the specific platform (a hypothetical scoring sketch appears at the end of this section). The results are shown in a graphical risk explorer that provides a centralized view of overall risk to data. Users can navigate the risk explorer to view risk that pertains to a specific data type (e.g., look only at PCI data, PII data, etc.) or analyze risk at different locations containing data assets. Drill-down views provide more details about specific vulnerabilities and mis-configurations associated with the relevant platforms. The graphical Risk Explorer helps organizations effectively understand the areas of risk in the organization, and supports better analysis and decision making. Picture 7 shows the data view in the graphical Risk Explorer; from this dashboard users can analyze risk that pertains to specific data types. Further analysis is done by selecting an area of interest and drilling down to get more details about the vulnerable platforms and the specific vulnerabilities that put the data at risk.
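The exact scoring model is not spelled out here, so the following Python sketch is a hypothetical scheme that captures the stated idea of combining vulnerability severity with the sensitivity of the data on the platform; the weights are invented for illustration.

# Hypothetical weights, not SecureSphere's actual scoring model.
SEVERITY_WEIGHT = {"low": 1, "medium": 2, "high": 3}
SENSITIVITY_WEIGHT = {"operational": 1, "PII": 2, "PCI": 3}

def risk_score(vulnerability_severities, data_categories):
    """Combine discovered vulnerabilities with the most sensitive data class."""
    vuln_factor = sum(SEVERITY_WEIGHT[s] for s in vulnerability_severities)
    data_factor = max((SENSITIVITY_WEIGHT[c] for c in data_categories), default=0)
    return vuln_factor * data_factor

# A server with two high and one medium vulnerability holding PCI and PII data:
print(risk_score(["high", "high", "medium"], ["PCI", "PII"]))  # -> 24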
Picture 7: SecureSphere Risk Explorer

Assessing Vulnerabilities and Mis-Configurations that May Put Data at Risk

SecureSphere DAS includes over 1,000 tests for assessing vulnerabilities and mis-configurations of database servers and their OS platforms. Custom scripts that test specific configurations or vulnerabilities can be easily added to SecureSphere and included in assessment scans (a generic sketch of such a test harness appears at the end of this section). Assessments can be run ad hoc or scheduled to run periodically against any group of servers automatically, without administrator intervention. Running centralized assessments improves the quality and productivity of any security team. SecureSphere assessments are kept up to date through the ADC update mechanism, which automatically updates SecureSphere based on the latest research from the Imperva Application Defense Center (ADC) research team. Picture 8 shows a sample of the predefined assessment policies and the test list available in SecureSphere DAS.

Picture 8: SecureSphere assessment policies and tests

Analyzing and Managing Vulnerabilities

Interactive analysis views and reports help IT manage and mitigate discovered vulnerabilities quickly and efficiently. These views include an overview of open vulnerabilities, trends, and mitigation status. Automated reports can be easily created from any view and scheduled for delivery in PDF or CSV format. Integration with third-party security solutions is supported for streamlining security management processes. Picture 9 shows an example of a vulnerability analysis view: an analysis of vulnerable servers in the environment based on the distribution of vulnerabilities, the number of vulnerabilities per server, and their severity.
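Conceptually, an assessment scan runs a list of tests against each server and records findings. The Python sketch below is a generic illustration of that structure: the test names, the settings, and the configuration dictionary are invented for the example, and SecureSphere's real tests are delivered as product content via ADC updates.

def check_default_accounts(config):
    # Flag accounts that still use well-known default names.
    defaults = {"scott", "sa", "guest"}
    return [a for a in config.get("accounts", []) if a in defaults]

def check_remote_os_auth(config):
    # Remote OS authentication is a commonly flagged risky database setting.
    return ["remote OS authentication is enabled"] if config.get("remote_os_authent") else []

ASSESSMENT_TESTS = [
    ("Default accounts present", "high", check_default_accounts),
    ("Remote OS authentication", "medium", check_remote_os_auth),
]

def run_assessment(server_name, config):
    findings = []
    for name, severity, test in ASSESSMENT_TESTS:
        for detail in test(config):
            findings.append((server_name, name, severity, detail))
    return findings

config = {"accounts": ["app_user", "scott"], "remote_os_authent": True}
for finding in run_assessment("hr-db-02", config):
    print(finding)

Adding a custom script in this model is just appending another (name, severity, check) entry to the test list, which mirrors the extensibility described above.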
Picture 9: Pre-defined views for vulnerability analysis

Picture 10 shows the Vulnerability Distribution Tag Cloud. The interactive tag cloud depicts vulnerabilities using font size and boldness to indicate the severity and number of occurrences of different vulnerabilities. The tags are hyperlinked to analysis views that focus on specific vulnerabilities, allowing users to quickly isolate and analyze them.

Picture 10: Vulnerability Distribution Tag Cloud

Picture 11 shows the vulnerability workbench area, which enables users to track and manage identified vulnerabilities. Discovered vulnerabilities are assigned a severity, a value calculated based on the Common Vulnerability Scoring System (CVSS). They are also mapped to a CVE identifier and the NIST standard, allowing users to search for and learn more about each vulnerability.
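CVSS base scores run from 0.0 to 10.0, and a common convention (the one NVD uses for CVSS v2) buckets them into three severity bands. A minimal sketch assuming that conventional banding:

def cvss_severity(base_score):
    """Map a CVSS v2 base score (0.0-10.0) to the NVD severity bands."""
    if not 0.0 <= base_score <= 10.0:
        raise ValueError("CVSS base scores range from 0.0 to 10.0")
    if base_score < 4.0:
        return "Low"
    if base_score < 7.0:
        return "Medium"
    return "High"

print(cvss_severity(9.3))  # -> High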
Picture 11: The vulnerability workbench

Mitigation and Virtual Patching

Protecting sensitive data from known vulnerabilities and mis-configurations requires organizations either to remediate vulnerabilities by applying patches, or to otherwise neutralize the risk associated with them. When a patch addressing a vulnerability exists, organizations can deploy it after testing; they can also change the configurations of their database servers. However, recent studies have shown that most organizations lag in deploying patches and configuration changes. There are a couple of reasons for this:
» Application and database mission criticality: Enterprise applications and the database systems supporting them can be extremely complex and critical to business operations. It can take years to develop and deploy an enterprise application, and any change or patch may affect the performance of these critical applications or even break them, causing downtime. Patches and configuration changes must be thoroughly tested before being deployed on a critical system, but testing requires time and resources that are not always available. As a result, patches and configuration changes are not deployed in a timely manner. Patches may also introduce new issues: there have been cases where a patch fixed a known vulnerability but exposed the application and database to other vulnerabilities.
» Patches may not exist: Patches need to be created, either by the organization in the case of custom solutions or by the vendors providing the technologies. This requires assigning resources and delaying other priorities. As a result, organizations don't always have patches available to deploy. If a patch is not available, organizations need to find an alternate way to protect their systems.

SecureSphere Data Security Suite enables users to apply a Virtual Patching solution. Virtual Patching is the ability to transparently protect systems from attempts to exploit known vulnerabilities, without making any changes to the current configuration of the server and without deploying patches. This capability is enabled by the SecureSphere Database Firewall, which can block and/or alert on exploit attempts before they reach the database server. This is the most efficient and effective method of addressing known vulnerabilities, and at a minimum it gives the organization time to adequately evaluate patches before deploying them (no more patching fire drills). Virtual Patching is only available with SecureSphere Database Firewall or the complete Data Security Suite.
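As a simplified illustration of the idea behind virtual patching, a gateway can screen statements against signatures of known exploits before they reach the database. The Python sketch below is not SecureSphere's detection engine; the signature, procedure name, and policy modes are invented for the example.

import re

# Hypothetical signature for a known exploit; real products maintain
# research-driven signature sets (e.g., via the ADC update mechanism).
EXPLOIT_SIGNATURES = [
    ("oversized argument to a known-vulnerable stored procedure",
     re.compile(r"exec\s+vulnerable_proc\s*\(\s*'.{512,}'", re.IGNORECASE)),
]

def screen_statement(sql, mode="block"):
    """Return (allowed, alerts) after matching a statement against signatures."""
    alerts = [name for name, sig in EXPLOIT_SIGNATURES if sig.search(sql)]
    allowed = not (alerts and mode == "block")
    return allowed, alerts

allowed, alerts = screen_statement("EXEC vulnerable_proc('" + "A" * 600 + "')")
print(allowed, alerts)  # -> False, with the matching signature name

The mode parameter mirrors the block-and/or-alert choice described above: in "alert" mode the statement would pass through but still be reported.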
Addressing Data Security and Governance Requirements

Regulations such as PCI DSS and the Sarbanes-Oxley Act (SOX) of 2002 set requirements for ensuring the integrity and authorized usage of sensitive information. To verify data security and governance, auditors look at multiple aspects of a database environment, including user management, authentication, separation of duties, access control, and the audit trail. Data discovery, classification, and risk assessment enable the first step in addressing security and governance requirements. Once sensitive data has been located and classified, and risk to the data has been analyzed, appropriate audit and protection policies must be defined. Ongoing monitoring and protection of data-related activities must be implemented, and analysis tools such as static reports and dynamic views are needed to measure the effectiveness of these controls and to support forensic investigations.

SecureSphere Data Security Suite

To audit the usage of sensitive data and enforce better controls over it, Imperva SecureSphere enables the implementation of a data security and compliance lifecycle (shown in picture 12). The first stage of this solution is the discovery and assessment of databases and data within the organization's infrastructure. Following the discovery of data assets, Imperva SecureSphere monitors the usage of the data and uses its unique Dynamic Profiling technology to create and apply security and audit policies. SecureSphere Database Activity Monitoring (DAM) provides intelligent monitoring and analysis of the activity that affects sensitive data, providing detailed audit trails and reporting on all user access to sensitive data. It helps organizations answer the important questions (a sketch at the end of this section shows how these map onto audit record fields):
» Who is accessing the data?
» What specific data is being accessed, and what is the source of the activity?
» Where does the data reside?
» When was the data accessed? (specific date and time for each data-related activity)
» How is the data being accessed (source applications and tools)?

SecureSphere Database Firewall (DBF) adds the ability to enforce access controls and block targeted attacks on databases. All SecureSphere data security solutions include a comprehensive set of value-added compliance reports that demonstrate configuration and usage are within best practice guidelines. Administrators can define custom reports with the necessary level of audit data granularity, and can export them in PDF or CSV format for easy distribution to auditors and executives. This allows risk, security, and compliance executives to easily review the results, validate the integrity of their data, and certify to stakeholders that management has taken appropriate steps and implemented controls. The SecureSphere Data Security Suite also includes an integrated Web Application Firewall, which protects web applications from attacks and provides complete correlation between web user activity and related activity at the database layer.
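The who/what/where/when/how questions above map naturally onto the fields of an audit record. The sketch below is a hedged illustration of such a record; the field names are invented and do not represent SecureSphere's audit schema.

from dataclasses import dataclass
from datetime import datetime

@dataclass
class AuditRecord:
    user: str            # who accessed the data
    query: str           # what data was accessed
    source_ip: str       # where the activity originated
    database: str        # where the data resides
    timestamp: datetime  # when the data was accessed
    application: str     # how it was accessed (source application or tool)

record = AuditRecord(
    user="jsmith",
    query="SELECT ssn FROM employees",
    source_ip="10.0.0.42",
    database="hr-db-02",
    timestamp=datetime(2009, 9, 1, 14, 30, 5),
    application="SQL*Plus",
)
print(record)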
Picture 12: SecureSphere Data Security Lifecycle

Conclusion

Businesses today process more sensitive data than ever before; the amount of digital data available and the number of people who can access it are growing exponentially. Managing confidential and sensitive data creates a data security and compliance challenge, as organizations must ensure that it is protected from theft, abuse, and misuse. Various regulations and privacy acts require governance of sensitive data, and other types of unregulated data, including intellectual property and operational data, must be protected as well. In order to protect sensitive data, organizations must know where data resides and which data types exist. Continuous discovery of databases and classification of the information they store ensures new data can be included in security and compliance efforts. Once organizations understand how sensitive data is distributed across their data repositories, they can better enforce audit and security policies and apply more effective controls. SecureSphere Discovery and Assessment Server (DAS) provides the best database discovery and data classification solution, identifying data assets that need to be protected from unauthorized access and targeted attacks. An integrated data risk management solution enables organizations to manage risk to sensitive data through analysis of discovered databases, classified data, and vulnerability assessments. Visibility into databases on the network, and the sensitive data residing on them, enables organizations to properly scope database security and compliance initiatives and avoid scope creep that may result in overrunning planned budgets and resources. Upgraded to the Imperva SecureSphere Data Security Suite, the solution enables organizations to gain unprecedented control over sensitive data in enterprise databases. From locating and classifying sensitive data and analyzing the vulnerabilities that put it at risk, to providing ongoing visibility into data usage, a complete audit trail, and real-time protection against unauthorized activities and database attacks, SecureSphere is a powerful and highly cost-effective solution for controlling risk to data and ensuring data governance.
Imperva Headquarters
3400 Bridge Parkway, Suite 101
Redwood Shores, CA 94065
Tel: +1-650-345-9000
Fax: +1-650-345-9004
Toll Free (U.S. only): +1-866-926-4678
www.imperva.com

Copyright © 2009, Imperva. All rights reserved. Imperva and SecureSphere are registered trademarks of Imperva. All other brand or product names are trademarks or registered trademarks of their respective holders. #WP-DISCOVERY_ASSESSMENT_SERVER-0909rev1