DATA CENTER IPS COMPARATIVE ANALYSIS




SECURITY

2014
Thomas Skybakmoen, Jason Pappalexis

Tested Products: Fortinet FortiGate 5140B, Juniper SRX 5800, McAfee NS-9300, Sourcefire 8290-2

Overview

Implementation of an intrusion prevention system (IPS) can be a complex process, with multiple factors affecting the overall security effectiveness of the solution. These should be considered over the course of the useful life of the solution, and include:

1. Exploit block rate
2. Anti-evasion capabilities (resistance to common evasion techniques)
3. Device stability and reliability
4. Overall manageability (see the Management Comparative Analysis Report (CAR))

In order to determine the relative security effectiveness of devices on the market and facilitate accurate product comparisons, NSS Labs has developed a unique metric:

    Security Effectiveness = Exploit Block Rate [1] x Anti-Evasion Rating x Stability & Reliability

Figure 1 - Security Effectiveness Formula

[1] Exploit block rate is defined as the number of exploits blocked under test.

By focusing on overall security effectiveness instead of the exploit block rate alone, NSS is able to factor in the ease with which defenses can be bypassed, as well as the reliability of the device.

Product                     Exploit Block Rate   Anti-Evasion Rating   Stability & Reliability   Security Effectiveness
Fortinet FortiGate 5140B    98.2%                100%                  100%                      98.2%
Juniper SRX 5800            86.3%                100%                  100%                      86.3%
McAfee NS-9300              99.6%                100%                  100%                      99.6%
Sourcefire 8290-2           99.4%                100%                  100%                      99.4%

Figure 2 - Security Effectiveness

Because enterprise users consider effective management to be a critical component of any enterprise security deployment, this also should be factored into total cost of ownership (TCO) and overall product selection. This is outside the scope of this report, however; for more information, refer to the TCO and Management CARs. For a complete view of security effectiveness mapped against value, refer to the Security Value Map (SVM) CAR.

NSS research indicates that all enterprises tune their IPS devices when deployed in the data center. Therefore, for NSS testing of IPS products, the devices are deployed with a tuned policy. Every effort is made to deploy policies that ensure the optimal combination of security effectiveness and performance, as would be the aim of a typical customer deploying the device in a live network environment. This provides readers with the most useful information on key IPS security effectiveness and performance capabilities based upon their expected usage.

Evasion techniques are a means of disguising and modifying attacks in order to avoid detection and blocking by security products. Resistance to evasion is a critical component in an IPS. If a single evasion is missed, an attacker can utilize an entire class of exploits to circumvent the IPS, rendering it virtually useless. Many of the techniques used in this test have been widely known for years and should be considered minimum requirements for the IPS product category, while others are more recent. This particular category of tests is critical in the final weighting with regard to product guidance.
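To make the Figure 1 calculation concrete, the following Python snippet is a minimal sketch of the Security Effectiveness metric applied to the ratings in Figure 2. The function and variable names are illustrative only; they are not part of the NSS methodology or tooling.

    # Minimal sketch of the Figure 1 calculation, using the Figure 2 ratings.
    # Names are illustrative only; ratings are expressed as fractions of 1.0.

    def security_effectiveness(exploit_block_rate, anti_evasion_rating, stability_reliability):
        # Security Effectiveness = Exploit Block Rate x Anti-Evasion Rating x Stability & Reliability
        return exploit_block_rate * anti_evasion_rating * stability_reliability

    products = {
        "Fortinet FortiGate 5140B": (0.982, 1.00, 1.00),
        "Juniper SRX 5800":         (0.863, 1.00, 1.00),
        "McAfee NS-9300":           (0.996, 1.00, 1.00),
        "Sourcefire 8290-2":        (0.994, 1.00, 1.00),
    }

    for name, ratings in products.items():
        print(f"{name}: {security_effectiveness(*ratings):.1%}")

Because every tested product achieved a 100% anti-evasion rating and a 100% stability & reliability rating, each security effectiveness score in Figure 2 equals the exploit block rate; a failed evasion class or stability test would pull the combined score below the raw block rate.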

This chart depicts the relationship between protection and performance when tuned policies are used. Farther up indicates better security effectiveness, and farther to the right indicates higher throughput.

[Chart: Security Effectiveness (%) plotted against NSS-Tested Throughput (Mbps) for the Fortinet FortiGate 5140B, Juniper SRX 5800, McAfee NS-9300, and Sourcefire 8290-2]

Figure 3 - Security Effectiveness and Performance

When selecting products, those along the top line of the chart (closer to 100% security effectiveness) should be prioritized. Throughput is a secondary consideration and will be dependent on enterprise-specific deployment requirements.
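As a worked example of that selection guidance, the sketch below ranks candidates by security effectiveness first and treats throughput only as a secondary criterion. The throughput figures are hypothetical placeholders, not values read from Figure 3.

    # Hedged sketch of the selection guidance above: prioritize security
    # effectiveness, then consider throughput. Throughput values here are
    # hypothetical placeholders, not the NSS-tested figures from Figure 3.

    candidates = [
        # (product, security_effectiveness, throughput_mbps_placeholder)
        ("Fortinet FortiGate 5140B", 0.982, 100_000),
        ("Juniper SRX 5800",         0.863,  60_000),
        ("McAfee NS-9300",           0.996,  40_000),
        ("Sourcefire 8290-2",        0.994,  50_000),
    ]

    # Sort by effectiveness first, throughput second (both descending).
    ranked = sorted(candidates, key=lambda c: (c[1], c[2]), reverse=True)

    for product, effectiveness, throughput in ranked:
        print(f"{product}: {effectiveness:.1%} effective, {throughput:,} Mbps (placeholder)")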

Table of Contents

Analysis
Tuning
Exploit Block Rate
Exploit Block Rate by Year
Exploit Block Rate by Attack Vector
Exploit Block Rate by Impact Type
Exploit Block Rate by Target Vendor
Evasions
Stability & Reliability
Security Effectiveness
Managed Security Effectiveness
Test Methodology
Contact Information

Table of Figures

Figure 1 - Security Effectiveness Formula
Figure 2 - Security Effectiveness
Figure 3 - Security Effectiveness and Performance
Figure 4 - Exploit Block Rate by Year (Default Policies)
Figure 5 - Attacker-Initiated Exploit Block Rate
Figure 6 - Exploit Block Rate by Impact Type
Figure 7 - Exploit Block Rate by Target Vendor
Figure 8 - Exploits and Evasions (Attacker-Initiated)
Figure 9 - Evasion Resistance (I)
Figure 10 - Evasion Resistance (II) and Overall Evasion Results
Figure 11 - Stability and Reliability (I)
Figure 12 - Stability and Reliability (II)
Figure 13 - Security Effectiveness

Analysis

The threat landscape is evolving constantly; attackers are refining their strategies and increasing both the volume and sophistication of their attacks. Enterprises now must defend against targeted persistent attacks (TPAs). Although attacks against desktop client applications are mainstream in typical enterprise perimeter deployments, servers will always be the primary target in a data center deployment, and therefore tuning is critical.

Tuning

Security products are often complex, and vendors are responding by simplifying the user interface and security policy selection to meet the usability needs of a broadening user base. Indeed, many organizations accept and deploy the default settings, understanding these to be the best recommendations from the vendor. IPS devices, however, are the exception to this rule. NSS research indicates that all enterprises tune their IPS devices when deployed in the data center. In general, accepting a vendor's defaults is likely to result in the omission of a significant number of deployment-specific signatures, which could leave an organization at risk.

With the shortage of skilled and experienced practitioners, it is important to consider the time and resources required to properly install, maintain, and tune the solution. Failure to do so could result in products not achieving their full security potential. Therefore, all IPS products are tuned prior to testing to eliminate false positives and provide the most appropriate coverage for the systems to be protected. Typically, tuning is carried out by experienced system engineers from the vendor; where this is not possible, NSS engineers perform the necessary tuning. NSS engineers may also amend the configuration of a device under test (DUT) where specific characteristics of the DUT or its configuration interfere with the normal operation of any of the tests, or where the results obtained from those tests would, in the opinion of those engineers, misrepresent the true capabilities of the DUT. Every effort is made to ensure the optimal combination of security effectiveness and performance, as would be the aim of a typical customer deploying the DUT in a live network environment.

Tuning an IPS is a potentially complicated endeavor that must be performed uniquely for each environment. Many factors affect which signatures or rules should be enabled on an IPS, for example, network architecture, target assets, patch levels, and allowed protocols.

Exploit Block Rate

NSS security effectiveness testing leverages the deep expertise of NSS engineers to generate the same types of attacks used by modern cyber criminals, utilizing multiple commercial, open source, and proprietary tools as appropriate. With over 800 live exploits specifically targeted at data center servers and applications, this is the industry's most comprehensive test to date. Most notably, all of the live exploits and payloads in these tests have been validated such that one of the following occurs:

- A reverse shell is returned
- A bind shell is opened on the target, allowing the attacker to execute arbitrary commands
- A malicious payload is installed
- The system is rendered unresponsive

Exploit Block Rate by Year

Contrary to popular belief, the biggest risks are not always driven by the latest Patch Tuesday disclosures. NSS threat research reveals that many older attacks are still in circulation and therefore remain relevant.

Different vendors take different approaches to adding coverage once a vulnerability is disclosed. An attempt to provide rapid coverage for vulnerabilities that are not fully understood can result in multiple exploit-specific signatures that may be inaccurate, ineffective, or prone to false positives. Vendors that have the resources available to fully research a vulnerability will ideally produce vulnerability-oriented signatures that provide coverage for all exploits written to take advantage of that flaw. This approach provides more effective coverage with fewer false positives. Where a product has performance limitations, vendors may retire older signatures in an attempt to alleviate those limitations, resulting in inconsistent coverage for older vulnerabilities and, consequently, varying levels of protection across products.

The following table classifies coverage by disclosure date, as tracked by CVE number. (In the original report, the table is presented as a heat map, with green cells indicating higher coverage for the given year.)

Product                     <=2004   2005     2006     2007     2008     2009     2010     2011     2012     2013     Total
Fortinet FortiGate 5140B    100.0%   99.0%    100.0%   99.2%    98.6%    93.8%    98.4%    100.0%   96.6%    95.7%    98.2%
Juniper SRX 5800            100.0%   87.0%    92.5%    90.0%    83.1%    84.6%    84.9%    85.7%    87.4%    69.6%    86.3%
McAfee NS-9300              100.0%   100.0%   98.8%    100.0%   100.0%   96.9%    100.0%   100.0%   100.0%   100.0%   99.6%
Sourcefire 8290-2           100.0%   100.0%   98.8%    100.0%   98.6%    100.0%   100.0%   97.7%    98.9%    100.0%   99.4%

Figure 4 - Exploit Block Rate by Year (Default Policies)
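Since the exploit block rate is defined as the number of exploits blocked under test, the Total column in Figure 4 reflects the share of the full exploit set that was blocked, and years that contribute more exploits carry more weight than a simple average of the per-year percentages would suggest. The short Python sketch below illustrates this kind of aggregation; the per-year exploit counts are hypothetical placeholders, not NSS data.

    # Sketch of how per-year block rates and a weighted total could be derived
    # from raw results. The (blocked, tested) counts below are hypothetical
    # placeholders used only to illustrate the arithmetic; they are not NSS data.

    results = [
        # (year, exploits_blocked, exploits_tested)
        ("<=2004", 120, 120),
        ("2005",    96, 100),
        ("2013",    44,  46),
    ]

    blocked_total = sum(blocked for _, blocked, _ in results)
    tested_total = sum(tested for _, _, tested in results)

    for year, blocked, tested in results:
        print(f"{year}: {blocked / tested:.1%}")

    # The total is weighted by the number of exploits tested per year, so it is
    # generally not the arithmetic mean of the per-year percentages.
    print(f"Total: {blocked_total / tested_total:.1%}")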

Exploit Block Rate by Attack Vector

Since 2007, NSS researchers have observed a dramatic rise in the number of client-side exploits, because these can be easily launched by an unsuspecting user who visits an infected website. Despite the difficulty of providing extensive coverage for client-side attacks, the IPS industry has attempted to provide more complete client-side coverage. NSS utilizes the following definitions:

Attacker-Initiated: The threat/exploit is executed by the attacker remotely against a vulnerable application and/or operating system. These attacks traditionally target servers (which is why they are often referred to as server-side attacks).

Target-Initiated: The threat/exploit is initiated by the vulnerable target (which is why they are often referred to as client-side attacks). The attacker has little or no control as to when the target user or application will execute the threat. These attacks traditionally target desktop client applications. Target examples include Internet Explorer, Adobe, Firefox, QuickTime, Office applications, etc.

While client-side attacks are on the rise in the enterprise, the typical data center will only be vulnerable to server-side (attacker-initiated) attacks.

[Chart: attacker-initiated exploit block rates of 98.2% (Fortinet FortiGate 5140B), 86.3% (Juniper SRX 5800), 99.6% (McAfee NS-9300), and 99.4% (Sourcefire 8290-2)]

Figure 5 - Attacker-Initiated Exploit Block Rate

Exploit Block Rate by Impact Type

The most serious exploits are those that result in a remote system compromise, providing the attacker with the ability to execute arbitrary system-level commands. Most exploits in this class are weaponized and offer the attacker a fully interactive remote shell on the target client or server.

Slightly less serious are attacks that result in an individual service compromise, but not arbitrary system-level command execution. Typical attacks in this category include service-specific attacks, such as SQL injection, that enable an attacker to execute arbitrary SQL commands within the database service. These attacks are somewhat isolated to the service and do not immediately result in full system-level access to the operating system and all services. However, by using additional localized system attacks, it may be possible for the attacker to escalate from the service level to the system level.

Finally, there are the attacks (often target-initiated) which result in a system- or service-level fault that crashes the targeted service or application and requires administrative action to restart the service or reboot the system. These attacks do not enable the attacker to execute arbitrary commands. Still, the resulting impact to the business could be severe, as the attacker could crash a protected system or service.

Product                     System Exposure   Service Exposure   System or Service Fault
Fortinet FortiGate 5140B    98.7%             98.2%              96.3%
Juniper SRX 5800            84.9%             89.1%              89.7%
McAfee NS-9300              99.6%             100.0%             99.3%
Sourcefire 8290-2           99.5%             99.1%              99.3%

Figure 6 - Exploit Block Rate by Impact Type

See the individual Product Analysis Reports (PARs) for more information.

Exploit Block Rate by Target Vendor

The NSS exploit library covers a wide range of protocols and applications representing a wide range of software vendors. The following table shows coverage for 5 of the top vendor targets (out of more than 70), as determined by the number of vendor-specific data center exploits in the NSS exploit library for this round of testing.

Product                     Microsoft   Oracle   Novell   CA       IBM
Fortinet FortiGate 5140B    94.4%       97.1%    100.0%   97.8%    100.0%
Juniper SRX 5800            84.3%       77.1%    87.0%    87.0%    92.1%
McAfee NS-9300              97.2%       100.0%   100.0%   100.0%   100.0%
Sourcefire 8290-2           99.1%       98.6%    100.0%   97.8%    100.0%

Figure 7 - Exploit Block Rate by Target Vendor

See the individual Product Analysis Reports (PARs) for more information.

Evasions

Evasion techniques are a means of disguising and modifying attacks at the point of delivery in order to avoid detection and blocking by security products. Failure of a security device to correctly handle a particular type of evasion will potentially allow an attacker to use an entire class of exploits for which the device is assumed to have protection, rendering the device virtually useless. Many of the techniques used in this test have been widely known for years and should be considered minimum requirements for the IPS product category.

Providing exploit protection results without fully factoring in evasion can be misleading. The more classes of evasion that are missed (IP fragmentation, TCP segmentation, RPC fragmentation, and FTP evasion), the less effective the device. For example, it is better to miss all techniques in one evasion category (say, FTP evasion) than one technique in each category, which would result in a broader attack surface. Furthermore, evasions operating at the lower layers of the network stack (IP fragmentation or TCP segmentation) will have a greater impact on security effectiveness than those operating at the upper layers (HTTP obfuscation or FTP evasion), because lower-level evasions potentially impact a wider number of exploits. Missing TCP segmentation is therefore a much more serious issue than missing FTP evasions.

A product's effectiveness is significantly handicapped if it fails to detect exploits that employ obfuscation or evasion techniques, and NSS product guidance is adjusted to reflect this. As with exploits, evasions can be employed specifically to obfuscate attacks that are initiated either locally by the target (client-side) or remotely by the attacker against a server (server-side).
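To make the lower-layer evasion classes above concrete, the following Python sketch uses Scapy to split a single placeholder HTTP request into small IP fragments and then into one-byte TCP segments. It illustrates the concept only and is not a reproduction of the NSS evasion test suite; the destination address and payload are placeholders, and the packets are only constructed, not transmitted.

    # Illustration of two lower-layer evasion classes (IP fragmentation and TCP
    # segmentation) using Scapy (pip install scapy). The address and payload are
    # placeholders; packets are only constructed, not sent.

    from scapy.all import IP, TCP, Raw, fragment

    payload = b"GET /cgi-bin/vulnerable?cmd=exploit HTTP/1.1\r\nHost: target\r\n\r\n"

    # IP fragmentation: one request becomes many small fragments. An IPS that does
    # not reassemble fragments before inspection never sees the full payload.
    whole = IP(dst="192.0.2.10") / TCP(dport=80, sport=40000, flags="PA", seq=1000) / Raw(payload)
    ip_fragments = fragment(whole, fragsize=8)  # 8-byte IP fragments
    print(f"IP fragmentation: 1 request -> {len(ip_fragments)} fragments")

    # TCP segmentation: the same payload delivered one byte per segment. An IPS
    # must reassemble the TCP stream correctly before signature matching.
    segments = [
        IP(dst="192.0.2.10")
        / TCP(dport=80, sport=40000, flags="PA", seq=1000 + i)
        / Raw(payload[i:i + 1])
        for i in range(len(payload))
    ]
    print(f"TCP segmentation: 1 request -> {len(segments)} one-byte segments")

A device that fully reassembles IP fragments and TCP streams before inspection sees the original request regardless of how it was delivered, which is consistent with the across-the-board PASS results in Figures 9 and 10.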

The following chart depicts attacker-initiated exploits and evasions combined.

[Chart: combined exploit and evasion block rates of 98.2% (Fortinet FortiGate 5140B), 86.3% (Juniper SRX 5800), 99.6% (McAfee NS-9300), and 99.4% (Sourcefire 8290-2)]

Figure 8 - Exploits and Evasions (Attacker-Initiated)

The following figures provide details on evasion resistance for the tested products.

Product                     IP Packet Fragmentation   TCP Stream Segmentation   RPC Fragmentation   FTP Evasion
Fortinet FortiGate 5140B    PASS                      PASS                      PASS                PASS
Juniper SRX 5800            PASS                      PASS                      PASS                PASS
McAfee NS-9300              PASS                      PASS                      PASS                PASS
Sourcefire 8290-2           PASS                      PASS                      PASS                PASS

Figure 9 - Evasion Resistance (I)

Product                     IP Fragmentation + TCP Segmentation   IP Fragmentation + MSRPC Fragmentation   Overall Evasion Results
Fortinet FortiGate 5140B    PASS                                  PASS                                     PASS
Juniper SRX 5800            PASS                                  PASS                                     PASS
McAfee NS-9300              PASS                                  PASS                                     PASS
Sourcefire 8290-2           PASS                                  PASS                                     PASS

Figure 10 - Evasion Resistance (II) and Overall Evasion Results

All devices proved effective against all evasion techniques tested.

Stability & Reliability

Long-term stability is particularly important for an in-line device, where failure can produce network outages. These tests verify the stability of the DUT along with its ability to maintain security effectiveness while under normal load and while passing malicious traffic. Products that are not able to sustain legitimate traffic (or that crash) while under hostile attack will not pass.

The DUT is required to remain operational and stable throughout these tests, and to block 100% of previously blocked traffic, raising an alert for each attack. If any prohibited traffic passes successfully, whether because of the volume of traffic or because the DUT fails open for any reason, the result is a FAIL.

Test                                                Fortinet FortiGate 5140B   Juniper SRX 5800   McAfee NS-9300   Sourcefire 8290-2
Blocking Under Extended Attack                      PASS                       PASS               PASS             PASS
Passing Legitimate Traffic Under Extended Attack    PASS                       PASS               PASS             PASS
Behavior of the State Engine Under Load             PASS                       PASS               PASS             PASS
Attack Detection/Blocking - Normal Load             PASS                       PASS               PASS             PASS
State Preservation - Normal Load                    PASS                       PASS               PASS             PASS
Pass Legitimate Traffic - Normal Load               PASS                       PASS               PASS             PASS

Figure 11 - Stability and Reliability (I)

Test                                      Fortinet FortiGate 5140B   Juniper SRX 5800   McAfee NS-9300   Sourcefire 8290-2
State Preservation - Maximum Exceeded     PASS                       PASS               PASS             PASS
Drop Traffic - Maximum Exceeded           PASS                       PASS               PASS             PASS
Protocol Fuzzing & Mutation               PASS                       PASS               PASS             PASS
Power Fail                                PASS                       PASS               PASS             PASS
Redundancy                                YES                        YES                YES              YES
Persistence of Data                       PASS                       PASS               PASS             PASS
Stability and Reliability Score           PASS                       PASS               PASS             PASS

Figure 12 - Stability and Reliability (II)

Security Effectiveness

The security effectiveness of a device is determined by factoring the results of evasion testing and stability & reliability testing into the exploit block rate. Figure 13 depicts the security effectiveness of each device.

Product                     Exploit Block Rate   Anti-Evasion Rating   Stability & Reliability   Security Effectiveness
Fortinet FortiGate 5140B    98.2%                100%                  100%                      98.2%
Juniper SRX 5800            86.3%                100%                  100%                      86.3%
McAfee NS-9300              99.6%                100%                  100%                      99.6%
Sourcefire 8290-2           99.4%                100%                  100%                      99.4%

Figure 13 - Security Effectiveness

Managed Security Effectiveness

Security devices are complicated to deploy; essential systems such as centralized management console options, log aggregation, and event correlation/management systems further complicate the purchasing decision. It is vital that enterprise security professionals are able to deploy and manage multiple devices throughout the organization in a secure and effective manner. If a device cannot be managed effectively, the security effectiveness of that device is compromised.

As part of this test, NSS performed in-depth technical evaluations of the main features and capabilities of the enterprise management systems offered by each vendor, covering the following key areas:

- General management and configuration: How easy is it to install and configure devices, and to deploy multiple devices throughout a large enterprise network?
- Policy handling: How easy is it to create, edit, and deploy complicated security policies across an enterprise?
- Alert handling: How accurate and timely is the alerting, and how easy is it to drill down to locate the critical information needed to remediate a security problem?
- Reporting: How effective is the reporting capability, and how readily can it be customized?

The results of these tests are reported, along with detailed cost models, in the Management CAR and the Total Cost of Ownership (TCO) CAR.

Test Methodology

Methodology Version: Data Center IPS Test Methodology v1.1.1

A copy of the test methodology is available on the NSS Labs website at www.nsslabs.com.

Contact Information

NSS Labs, Inc.
206 Wild Basin Rd
Building A, Suite 200
Austin, TX 78746
+1 (512) 961-5300
info@nsslabs.com
www.nsslabs.com

This and other related documents are available at www.nsslabs.com. To receive a licensed copy or report misuse, please contact NSS Labs at +1 (512) 961-5300 or sales@nsslabs.com.

2014 NSS Labs, Inc. All rights reserved. No part of this publication may be reproduced, photocopied, stored on a retrieval system, or transmitted without the express written consent of the authors.

Please note that access to or use of this report is conditioned on the following:

1. The information in this report is subject to change by NSS Labs without notice.

2. The information in this report is believed by NSS Labs to be accurate and reliable at the time of publication, but is not guaranteed. All use of and reliance on this report are at the reader's sole risk. NSS Labs is not liable or responsible for any damages, losses, or expenses arising from any error or omission in this report.

3. NO WARRANTIES, EXPRESS OR IMPLIED, ARE GIVEN BY NSS LABS. ALL IMPLIED WARRANTIES, INCLUDING IMPLIED WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE, AND NON-INFRINGEMENT ARE DISCLAIMED AND EXCLUDED BY NSS LABS. IN NO EVENT SHALL NSS LABS BE LIABLE FOR ANY CONSEQUENTIAL, INCIDENTAL, OR INDIRECT DAMAGES, OR FOR ANY LOSS OF PROFIT, REVENUE, DATA, COMPUTER PROGRAMS, OR OTHER ASSETS, EVEN IF ADVISED OF THE POSSIBILITY THEREOF.

4. This report does not constitute an endorsement, recommendation, or guarantee of any of the products (hardware or software) tested or the hardware and software used in testing the products. The testing does not guarantee that there are no errors or defects in the products or that the products will meet the reader's expectations, requirements, needs, or specifications, or that they will operate without interruption.

5. This report does not imply any endorsement, sponsorship, affiliation, or verification by or with any organizations mentioned in this report.

6. All trademarks, service marks, and trade names used in this report are the trademarks, service marks, and trade names of their respective owners.