The CISO's Guide to the Importance of Testing Security Devices

ANALYST BRIEF

Author: Bob Walder

Overview

Selecting security products is a complex process that carries significant risks if not executed correctly: poorly chosen products can fail to protect against serious threats, cause serious performance problems for enterprise networks, and waste scarce financial resources. CISOs, CIOs, and other security professionals should develop and execute an enterprise-specific in-house testing plan, using outside resources where appropriate, before evaluating and purchasing security products. Security professionals who fail to test security products before buying them risk performance limitations, security failures, and overspending.

Weaknesses in security coverage can often remain undiscovered for long periods of time. Installing in-line security devices such as firewalls, intrusion prevention systems (IPS), and secure web gateways can lead to a false sense of security unless vendor claims are verified. Critical servers often remain unpatched in the belief that they are protected by an IPS, when claimed coverage is actually less effective than promised. In addition, fear of false positives can lead enterprises to run IPS devices in a less secure IDS mode, thereby forfeiting protective properties and increasing operating costs and risk. Selecting the wrong network security device can thus expose a company to serious threats from both inside and outside the network perimeter.

Poor performance from an in-line device, once placed in a live network, can also have serious consequences as latency increases to unacceptable levels. High latency or frequent fail-closed events can result in active devices being redeployed in a passive state or having blocking disabled, significantly reducing their effectiveness.

Cost is also an issue. Without performing relevant tests in-house, organizations could be persuaded to overspend significantly, purchasing devices with performance and coverage levels that are not required.

Use testing procedures designed for your enterprise and your threat environment to determine the best in-line network security products for your enterprise's and IT organization's needs. Once deployed, employ continuous testing initiatives to ensure that the required levels of security effectiveness and performance are maintained.

NSS Labs Findings

- Security vendors' claimed effectiveness and performance frequently turn out to be difficult, if not impossible, to achieve once products are deployed in-line on a live network.
- Vendor tests are often carefully designed to make their own offerings look better (and their competitors' look worse) rather than to demonstrate real-world performance and security effectiveness.
- Value-added resellers (VARs) and system integrators usually do not have the in-house expertise required to test the products they sell and will usually parrot the vendors' marketing statistics.
- While recommendations from peers (along with analyst research based purely on such feedback) can be a useful data point, bear in mind that their environment will be very different from yours, and they may not have the in-house expertise required to discover flaws and weaknesses in the products they have deployed.
- Poor security product selection can cause serious risk exposure, negatively impact network performance, and waste scarce financial resources.
- Installing untested in-line security devices can lead to a false sense of security, leaving critical servers and other network assets dangerously unprotected. Resulting breaches can be costly in terms of reputation, finances, or loss of compliance status.
- Factors such as total cost of ownership (TCO), impact on network performance, and the usability of management systems are just as important as security effectiveness.
- Independent test reports and third-party testing can be very helpful in certain areas of security device selection, but in-house testing, based on clearly defined, enterprise-specific criteria, is crucial.
- Threats have become increasingly sophisticated, and recreating them realistically in a test environment, such that results reflect the real-world protection capabilities of a security product, is correspondingly difficult. Where thorough in-house testing proves difficult, external test labs can be an invaluable resource to help with the most appropriate product selection.

NSS Labs Recommendations

- Do not base the purchase of critical security equipment on test results or opinions from security vendors, VARs, system integrators, or your peers.
- Develop a clear, enterprise-specific test plan based on the most appropriate use case for your needs before beginning the product selection process.
- Make use of independent test reports when developing an initial shortlist of security vendors and products, but be prepared to perform your own testing before final selection.
- Consider the use of an independent third-party testing house, one that specializes in testing security products, to help draw up your test plan, evaluate the products on your shortlist, interpret the results, or even conduct an entire outsourced competitive analysis on your behalf.
- Conduct detailed testing in-house, supplemented by external resources where appropriate, and make benchmarking and testing a continuous process even after implementation.

Table of Contents

Overview
NSS Labs Findings
NSS Labs Recommendations
Analysis
    Create a Test Plan
    Define Evaluation Criteria
        Security Effectiveness
        Performance
        Ease of Administration
    Analysis of Results
    Total Cost of Ownership
    Continuous Testing Is Vital
    Leveraging External Resources
Reading List
Contact Information

Analysis

Performance and security effectiveness claims made by vendors of security products rarely prove accurate once products are deployed in production environments. Vendor claims of 10 Gbps throughput may look impressive on paper, but actual performance will often be significantly lower once all required security features are enabled. Latency can increase to unacceptable levels, leading to active devices being redeployed in a passive state or having blocking disabled, significantly reducing their effectiveness. Vendors may also claim coverage for particular threats, but the quality of that coverage can vary significantly from product to product.

Installing an in-line security device such as an IPS can thus lead to a false sense of security, as CIOs believe they are fully protected from the latest threats. The result may be that critical servers are left unpatched for longer periods, which can expose them to unnecessary risk should security coverage subsequently prove to be ineffective. In addition, fear of false positives can lead enterprises to run IPS devices in a less secure IDS mode, thereby forfeiting protective properties and increasing operating costs and risk.

Testing is not always about proving that the biggest, fastest, and most expensive device is the best choice, either. With a well-constructed test plan, organizations may be able to determine that a lower level of performance is acceptable at certain points on the network, thus reducing overall purchase and deployment costs. Without performing relevant tests in-house, organizations could be persuaded to overspend significantly, purchasing devices with performance and coverage levels that are not required.

If testing is not performed, management must be prepared to explicitly accept the risks involved. If it is decided that no in-house testing can or should be performed, users should ensure that only products that have been thoroughly tested by independent, security-specific third-party test organizations are placed on the shortlist for purchase. This will at least provide a minimum comfort level during product selection. While budget may be difficult to find for this additional step in the purchasing process, it is reasonable to spend a modest percentage of the total project cost on technical due diligence. Some enterprises have been successful at getting vendors to participate in cost sharing to validate their products. Ultimately, management must be prepared to sign off explicitly on accepting the risks involved in not performing in-house testing.

There are several key steps involved in a successful testing project:

1. Create a test plan
2. Define evaluation criteria
3. Test and analyze results
4. Perform continuous testing over the product lifecycle

Create a Test Plan

Testing network security products can be complicated, and never more so than when there is no plan covering exactly how each product should be tested. Creation of the test plan should be the first step in any product evaluation.
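It can help to capture the plan in a machine-readable form so that test harnesses and reports share a single definition of the use case and the pass/fail criteria. The following is a minimal sketch of such a structure; the class, field names, and example values are illustrative assumptions, not part of any published methodology.

```python
from dataclasses import dataclass

@dataclass
class TestPlan:
    """Illustrative, enterprise-specific test plan for one device under test (DUT)."""
    use_case: str                      # e.g., "data center perimeter", "retail store"
    server_platforms: list[str]        # platforms the security policy must cover
    traffic_mix: dict[str, float]      # protocol -> share of bytes on the target segment
    avg_packet_size: int               # bytes, measured from your own network
    required_throughput_mbps: int      # sustained load the DUT must handle
    max_added_latency_ms: float        # pass/fail latency budget
    min_block_rate: float = 0.95       # share of live exploits that must be blocked
    max_false_positive_rate: float = 0.001  # share of legitimate flows wrongly blocked

# A Windows/IIS-only shop would deploy (and test) a Windows-centric policy, so
# Linux-only coverage gaps flagged in a generic lab report may not matter here.
plan = TestPlan(
    use_case="data center perimeter",
    server_platforms=["Windows Server", "IIS"],
    traffic_mix={"HTTP": 0.55, "TLS": 0.30, "SMB": 0.10, "DNS": 0.05},
    avg_packet_size=512,
    required_throughput_mbps=4000,
    max_added_latency_ms=0.5,
)
print(plan)
```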

Users should first seek out sources of good, independent security product testing. Vendors who have never submitted their products for such tests, or who rely purely on magazine reviews to market their wares, should be treated with additional caution. It is possible that they have concerns about the performance of their own product, or simply do not understand the value of independent testing in their own QA process.

Ensure the test plan is realistic. There is little point in attempting to reproduce every test performed by an independent test lab, for example. Instead, use its reports and testing methodologies as a starting point to create a customized test plan that reflects the concerns of your individual organization. This could mean subjecting the device under test (DUT) to your own network traffic (to test for false positives) or to performance testing with a policy customized to your own environment.

Determination of expected use cases is thus imperative in order to test effectively. Are you looking to protect a network core or perimeter, a retail store or a data center? Do you use Apache or IIS web servers? Data throughput and the composition of traffic for inspection will vary greatly between use cases, and some products are more suitable than others in specific environments. For example, a test lab may comment adversely on a lack of coverage for Linux servers, but if your organization is Windows-only, that would be of little concern. Your own test plan should then ensure that a Windows-centric policy is deployed. When selecting independent test reports, be wary of those where the test engineers and methodology designers do not recognize the value of use-case testing.

The next step is to determine where the products are to be tested. A dedicated test environment will allow an organization to evaluate security products under extreme conditions more effectively, but this is not always possible. Where this approach proves unfeasible, it is still important to evaluate potential purchases, either by using external testing resources or by installing test products in a live network. In the latter case, however, it should be recognized that the type of testing available will be more restricted in order to ensure there is no adverse impact on business processes.

Define Evaluation Criteria

Where possible, security products should be evaluated against three main criteria:

- Security effectiveness
- Performance
- Ease of administration

Security Effectiveness

This is often the most difficult area to evaluate, requiring expertise with attack traffic and live exploits to execute effectively. For this reason, it is often preferable to rely mainly on the testing performed by independent test houses (having verified that they, too, use live exploits in their testing). When studying independent test reports, ensure that the breadth of testing in this area is sufficient for your needs. Although it is neither necessary nor desirable to verify the effectiveness of every signature included with an IPS or anti-malware product, some minimal level of in-house verification of fitness for purpose is desirable. A basic security effectiveness test bed can be created at relatively low cost using virtualization technology and commonly available test tools.
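As the next paragraphs discuss, such a test bed should measure both detection breadth and accuracy. A minimal scoring harness might drive a corpus of labeled samples through the DUT and tally the results; everything in the sketch below (the sample file names, the replay_sample() helper, and the way alerts are read back) is a hypothetical placeholder for whatever replay tooling and alert export your own test bed provides.

```python
# Minimal sketch of a detection scoring harness over a labeled corpus.
# replay_sample() and dut_blocked() are hypothetical stand-ins for site-specific
# replay tooling and the DUT's alert/log export; the corpus entries are examples.

CORPUS = [
    ("exploit_smb_overflow.pcap", True),   # malicious: should be blocked
    ("exploit_sql_injection.pcap", True),
    ("legit_http_browsing.pcap", False),   # legitimate: must NOT be blocked
    ("legit_smb_file_copy.pcap", False),
]

def replay_sample(path: str) -> None:
    """Hypothetical: replay one capture through the DUT's in-line segment."""
    raise NotImplementedError("site-specific replay tooling goes here")

def dut_blocked(path: str) -> bool:
    """Hypothetical: query the DUT's logs to see whether the sample was blocked."""
    raise NotImplementedError("site-specific alert/log export goes here")

def score(corpus) -> None:
    blocked_bad = missed_bad = false_positives = 0
    for path, is_malicious in corpus:
        replay_sample(path)
        blocked = dut_blocked(path)
        if is_malicious:
            blocked_bad += int(blocked)
            missed_bad += int(not blocked)
        elif blocked:
            false_positives += 1  # legitimate traffic blocked: critical failure
    print(f"breadth: {blocked_bad}/{blocked_bad + missed_bad} exploits blocked")
    print(f"accuracy: {false_positives} legitimate samples wrongly blocked")
```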

Virtual machines can be used to create an environment that is safe and repeatable, ensuring that live exploits or malware samples do not accidentally infect other machines on the network. This type of testing should verify that the device will not block your own legitimate traffic (accuracy) and that it is capable of accurately detecting and reacting to a wide range of common exploits (breadth). Breadth of protection is extremely important, but detection accuracy is absolutely critical, because an in-line device that blocks legitimate traffic will not stay in line for long.

Testing for false positives is the area where independent testing houses are weakest, for the obvious reason that they cannot reproduce your network traffic. That is why it is important to deploy each device in line on your own network, in blocking mode, and for a reasonable amount of time. This makes it possible to determine how difficult it will be to remove false-positive-prone signatures from your security policy.

Another area where IPS products can be clearly differentiated is evasion protection. This ensures that attackers cannot make minor changes to their attacks in order to bypass poorly written IPS signatures. Some anti-evasion techniques are straightforward to implement, while others are extremely difficult and resource-intensive. Effective testing of anti-evasion methods is extremely difficult without extensive expertise. However, you should never let a vendor convince you that evasion is not important. A product can have the best coverage in the world, but if it is trivial to evade its signatures, then there is no point in deploying them.

Performance

Performance testing is another area where your own bake-off (in-house or outsourced) is critical. Third-party performance tests tend to push devices to the extreme in order to find all the corner cases and ensure that devices can hold up in the widest possible range of deployments. However, your network may consist of a different traffic mix, different average packet size, and different average load levels. Your environment may therefore be considerably less demanding than third-party test environments, and this may allow you to save money by purchasing a device with lower specifications than you originally thought necessary.

Even so, you should pay careful attention to the maximum performance and latency of a device. These factors are very difficult to ascertain accurately on a live network, and vendors may use that fact to sell lower-cost devices that can cause long-term problems. Any in-line security device is expected to be reliable (not crash), to never block legitimate traffic, and to not unduly affect network performance. Testing needs to be performed both under normal traffic loads and under more extreme loads to ensure performance does not degrade.

One of the key aims of performance testing is to identify devices with sufficient capacity to handle not only today's requirements, but also any changes in policy (which might require processing additional signatures) and increases in the size of signature packs that may occur over the projected life of the device. The latency and raw throughput of an in-line device must be on a par with the other equipment in the network on which it is deployed.
An in-line network IPS (NIPS) product, for example, must strive to perform much more like a switch than a typical passive security device, especially when it is necessary to install more than one appliance in the same data path.
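Added latency in particular is worth measuring for yourself. The sketch below times simple TCP round trips through the in-line path and reports percentiles; the target address, port, and sample count are placeholder assumptions, and a hardware load generator is still needed to produce meaningful results at anything approaching line rate.

```python
# Minimal latency probe: times TCP round trips to an echo-style service on the
# far side of the DUT and reports percentiles. Run once with the DUT bypassed
# and once in-line, then compare the two distributions to estimate the latency
# the device adds. Host, port, and sample count are placeholder assumptions.
import socket
import statistics
import time

TARGET = ("192.0.2.10", 7)   # placeholder: echo service behind the DUT
SAMPLES = 1000

def measure() -> list[float]:
    rtts = []
    for _ in range(SAMPLES):
        with socket.create_connection(TARGET, timeout=2.0) as s:
            start = time.perf_counter()
            s.sendall(b"ping")
            s.recv(64)
            rtts.append((time.perf_counter() - start) * 1000.0)  # milliseconds
    return rtts

rtts = measure()
pct = statistics.quantiles(rtts, n=100)
print(f"p50={pct[49]:.3f} ms  p95={pct[94]:.3f} ms  p99={pct[98]:.3f} ms")
```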

Stability and reliability are also of vital importance. Some devices may perform well under high levels of attack traffic for a short period, but may begin to degrade over time, or even fail altogether.

Accurate performance testing of complex networking devices is not straightforward, and security professionals considering purchasing these devices will benefit from expert guidance. Independent labs that specialize in testing security products are well placed to offer this guidance, as are the vendors of the high-performance, hardware-based load generation tools used in this type of testing. Where the purchase of expensive hardware test tools for one-off projects is not an option, investigate the possibility of renting. In the absence of in-house expertise with these testing tools, the tool vendors themselves typically provide excellent on-site support and consulting services. Another alternative is to engage an independent test lab to perform a full product bake-off on your behalf.

Finally, be wary of sharp practices by unscrupulous sales personnel. Almost every security device has some weakness that will not usually manifest itself on a normal network. There is always a trade-off between performance and security, and the real difference between many of the products on the market is where that trade-off has been made in the architecture of the device. However, some vendors have been known to craft special packet captures or custom utilities in order to directly exploit a specific architectural shortcoming of a competitor's device and show it at its worst. Be very skeptical of such demonstrations, and take such a vendor's ethical limitations into consideration when evaluating its products. Remember that what really matters is how a device performs on your network, with your traffic.

Ease of Administration

It is important not to neglect management/UI testing. The effect of a poorly designed management interface is to reduce the effectiveness of a security solution: if a task is too difficult to perform, it will never be executed.

When large enterprises are looking to protect branch office environments, they often do not have the luxury of technical support staff on-site at each branch, so plug-and-play capability when rolling out large numbers of new devices is key. Ideally, non-technical on-site staff should be able to connect power and network cables and have all provisioning of the device performed remotely; this capability is easily verified via on-site testing.

Where both an enterprise console and direct device management options are available, both should be evaluated thoroughly. Where small numbers of devices are being managed, the direct device management option is often adequate, offering the chance to save significantly on deployment costs by omitting the purchase of an enterprise-class management system. Conversely, perceived shortcomings caused by limited direct device management are often eliminated when using a centralized console.

This is one area of testing that is straightforward to perform in-house. Ease of use of devices and UIs is often highly subjective, and your own personnel will often have more valuable input than third parties.

Analysis of Results

While many of the performance testing tools currently available give the impression of plug-and-play capability, just plugging in and pressing the Go button will rarely produce useful results. The output will invariably be an impressive graph, but interpreting it is quite an art.

Part of the process of defining the test plan will be to describe the type of traffic mix (protocols, average packet sizes, and so on) required to stress the DUT in the most appropriate way. Part of the skill in interpreting the results is to determine at what point the DUT begins to fail gracefully, and at what point it fails completely. At what point do you start to notice excessive delays, dropped connections, and a generally unacceptable user experience? It is also important to recognize when your test is not stressful enough: does the device have more than enough headroom, or did you simply not generate sufficient load to tax it? After several successful test runs it soon becomes second nature to extract the critical breaking point from a mass of spreadsheets and line graphs, but initially it can be a challenge. This is another area where you can make use of external resources, either from an independent security test lab or from the test tool vendor, to help with your initial analyses.

Total Cost of Ownership

The purchase price of any product is only the beginning of the expense, and prospective purchasers should calculate TCO over the expected life of the product. Update services can be costly, and TCO calculations should include the cost of such services, both in the first year (when the cost is often rolled into the initial purchase price) and in subsequent years. Where multifunction devices are deployed, it is important to factor in update services for each security module.

The cost of day-to-day management and regular tuning should also be considered. When devices are operating at their design limits and administrators are forced to tune aggressively to achieve the required performance, the ongoing management cost will be higher than with devices that have sufficient headroom built in. Such aggressive tuning is also error-prone, making it more likely that underspecified devices will leave critical network assets unprotected, risking compromise and resulting in loss of customer confidence and possible loss of compliance status. Total cost of ownership, impact on network performance, and the usability of management systems are just as important as security effectiveness.

Continuous Testing Is Vital

It is important to make testing an integral part of the lifecycle of individual security products and of larger, multi-device projects. Limiting testing to the purchasing process alone can leave an organization open to increased security risks as threats evolve and as environments and products change. In practice, this means re-running a fixed benchmark after every change and comparing the results against a stored baseline, as sketched below.
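The following is a minimal sketch of such a baseline comparison, assuming benchmark results are exported as a flat JSON file mapping metric names to values; the file names, metric names, and the 10% tolerance are illustrative assumptions.

```python
# Minimal baseline regression check: compare a fresh benchmark run against a
# stored baseline and flag any metric that regressed beyond a tolerance.
# File names, metric names, and the tolerance are illustrative assumptions.
import json

TOLERANCE = 0.10  # flag regressions worse than 10%

# Metrics where a LOWER value is worse; anything else is treated as
# higher-is-worse (e.g., latency).
HIGHER_IS_BETTER = {"throughput_mbps", "exploit_block_rate"}

def check(baseline_path: str, current_path: str) -> list[str]:
    with open(baseline_path) as f:
        baseline = json.load(f)
    with open(current_path) as f:
        current = json.load(f)
    regressions = []
    for metric, base_value in baseline.items():
        value = current.get(metric)
        if value is None:
            regressions.append(f"{metric}: missing from current run")
            continue
        if metric in HIGHER_IS_BETTER:
            change = (base_value - value) / base_value   # a drop is bad
        else:
            change = (value - base_value) / base_value   # a rise is bad
        if change > TOLERANCE:
            regressions.append(f"{metric}: {base_value} -> {value} ({change:+.0%})")
    return regressions

# Re-run after every firmware upgrade, signature pack update, or policy change:
for problem in check("baseline.json", "after_update.json"):
    print("REGRESSION:", problem)
```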

NSS has seen instances in its labs of a single poorly written signature crippling the performance of an IPS. Firmware updates can also break previously solid inspection processes (anti-evasion techniques appear to be particularly prone to disruption between firmware updates). It is also important to bear in mind that attackers and their threats are evolving constantly. New vulnerabilities and evasion techniques are discovered daily, and security devices that demonstrate superior performance during the purchasing phase may be barely adequate a year later.

It is not always changes to security products that cause problems, either. Internal changes in security policy to accommodate new business processes can lead to misconfigurations of security devices that would remain undetected for long periods of time without an effective continuous testing initiative.

Once tool selection and initial deployment are complete, perform a full benchmark test to establish a baseline for your existing deployments. Every time a new firmware upgrade, signature pack update, or change in security policy is applied, however minor it may seem, the device should be retested and the results compared against the baseline. This process of continuous monitoring makes it possible to identify and correct adverse impacts on performance or security effectiveness. If you cannot do this in-house, make sure you have access to up-to-date tests from external test sources to keep track of which products remain effective against the threats most applicable to your organization as new threats are discovered, signature packs are updated, and new software and hardware versions are released.

Leveraging External Resources

Load-generating test equipment can be very expensive and complicated to operate for one-off projects. One alternative is to rent this equipment from specialist companies. Although the test equipment vendors themselves are not usually open to hiring out equipment (they would rather you purchase outright), they do usually provide excellent consulting services, and dedicated personnel can help you to configure the equipment, run your tests, and analyze the results.

Not all test tools are created equal, however. It is important to verify ahead of time that the type of synthetic traffic you need to simulate your own network can be generated at the speeds necessary to place the device under test under sufficient load to determine its effective real-world performance capabilities. Quis custodiet ipsos custodes? Or in this case, who tests the test tools? An incorrect selection of test tools can lead to mistakes as costly as an incorrect selection of the device under test.

Tools used to verify security effectiveness are equally varied. Beware of those providing only the means to replay packet captures of canned exploits. While useful as a quick-and-dirty method to verify that a network security device is operational and generating alerts, they are not sufficiently flexible to enable a comprehensive evaluation of the real security coverage of a device.
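Such a quick-and-dirty operational check is easy to sketch. The example below replays a canned exploit capture onto the DUT's in-line segment with Scapy; the interface and capture file names are placeholder assumptions. It confirms only that the device is alive and alerting, not how broad or accurate its coverage is.

```python
# Quick-and-dirty operational check: replay a canned exploit capture onto the
# segment in front of the DUT, then confirm (manually or via the DUT's logs)
# that an alert fired. Interface and file names are placeholder assumptions.
# This verifies the device is alive and alerting; it says nothing about real
# coverage, which requires live exploits and multiple variants per threat.
from scapy.all import rdpcap, sendp

PCAP = "canned_exploit.pcap"   # placeholder capture of a known exploit
IFACE = "eth1"                 # placeholder: interface facing the DUT

packets = rdpcap(PCAP)
sendp(packets, iface=IFACE, verbose=False)
print(f"replayed {len(packets)} packets from {PCAP} via {IFACE}; now check the DUT's alert log")
```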

Many such tools will provide false results, either positive or negative, leading to lengthy arguments with vendors and inaccurate or incomplete evaluations. The use of live exploits and malware samples is essential when testing devices expected to detect and block malicious traffic, as is the use of multiple variants of each malicious sample rather than a single proof-of-concept example. The collection and, more importantly, validation of a comprehensive threat library capable of providing a thorough evaluation of the security effectiveness of modern network security devices is not a trivial task, and may be beyond the expertise of many security professionals outside of the testing industry.

A better alternative, therefore, may be to engage the third-party test labs that produce the reports you use to make your shortlist decisions. These organizations can be used for any or all of the following, as your needs dictate:

- Creation of testing methodology
- Selection of products for testing
- Configuration of the complex test/load generation equipment
- Supervision/operation of the tests
- Interpretation of the test results
- A fully outsourced competitive bake-off on your behalf
- RFI creation and contract review
- Continuous testing to ensure that your network remains secure

Where it is not feasible to perform extensive performance and security effectiveness testing on your live corporate network, and where the resources do not exist to create a dedicated test bed in-house, an external test lab may also be able to provide the means to perform a full competitive analysis on its own test network using your custom methodology and test data.

Make sure the test lab you select has a solid pedigree in testing network security devices. This will make for a much more efficient and cost-effective engagement, since the lab will already have all the necessary test equipment and a suitable test bed configuration in place. It will also be much more effective when it comes to analyzing the results.

Reading List

Evaluating Products Based on Appropriate Usage. NSS Labs.
https://www.nsslabs.com/reports/appropriate-usage-product-evaluation

Vulnerability-Based Protection and the Google Operation Aurora Attack. NSS Labs.
https://www.nsslabs.com/reports/vulnerability-based-protection-and-google-operation-aurora-attack

Next Generation Firewall (NGFW) Test Methodology v4.0. NSS Labs.
https://www.nsslabs.com/reports/categories/methodologies

Network Firewall Test Methodology v3.0. NSS Labs.
https://www.nsslabs.com/reports/categories/methodologies

Network Intrusion Prevention Systems Test Methodology v6.1. NSS Labs.
https://www.nsslabs.com/reports/categories/methodologies

Contact Information

NSS Labs, Inc.
206 Wild Basin Rd
Building A, Suite 200
Austin, TX 78746 USA
+1 (512) 961-5300
info@nsslabs.com
www.nsslabs.com

This analyst brief was produced as part of NSS Labs' independent testing information services. Leading products were tested at no cost to the vendor, and NSS Labs received no vendor funding to produce this analyst brief.

© 2011 NSS Labs, Inc. All rights reserved. No part of this publication may be reproduced, photocopied, stored on a retrieval system, or transmitted without the express written consent of the authors.

Please note that access to or use of this report is conditioned on the following:

1. The information in this report is subject to change by NSS Labs without notice.
2. The information in this report is believed by NSS Labs to be accurate and reliable at the time of publication, but is not guaranteed. All use of and reliance on this report are at the reader's sole risk. NSS Labs is not liable or responsible for any damages, losses, or expenses arising from any error or omission in this report.
3. NO WARRANTIES, EXPRESS OR IMPLIED, ARE GIVEN BY NSS LABS. ALL IMPLIED WARRANTIES, INCLUDING IMPLIED WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE, AND NON-INFRINGEMENT ARE DISCLAIMED AND EXCLUDED BY NSS LABS. IN NO EVENT SHALL NSS LABS BE LIABLE FOR ANY CONSEQUENTIAL, INCIDENTAL, OR INDIRECT DAMAGES, OR FOR ANY LOSS OF PROFIT, REVENUE, DATA, COMPUTER PROGRAMS, OR OTHER ASSETS, EVEN IF ADVISED OF THE POSSIBILITY THEREOF.
4. This report does not constitute an endorsement, recommendation, or guarantee of any of the products (hardware or software) tested or the hardware and software used in testing the products. The testing does not guarantee that there are no errors or defects in the products, or that the products will meet the reader's expectations, requirements, needs, or specifications, or that they will operate without interruption.
5. This report does not imply any endorsement, sponsorship, affiliation, or verification by or with any organizations mentioned in this report.
6. All trademarks, service marks, and trade names used in this report are the trademarks, service marks, and trade names of their respective owners.