ENTERPRISE EPP COMPARATIVE REPORT




Security Stack: Socially Engineered Malware

Authors
Bhaarath Venkateswaran, Randy Abrams, Thomas Skybakmoen

Tested Products
Bitdefender Endpoint Security v5.3.15.539
ESET Endpoint Security v6.1.2109.0
Fortinet FortiClient v5.2.3.0633
F-Secure Client Security Premium 11.60 build 284
G DATA Security Client v13.1.0.224
Kaspersky Endpoint Security v10.2.2.10535
McAfee Endpoint Protection v8.0
Panda Endpoint Security v7.20.0
Sophos Endpoint Security v10.3
Symantec Endpoint Security v12.1.4112.4156
Trend Micro Endpoint Security v11.0.1057

Environment
Windows 7 SP1 32-bit
Windows Firewall disabled
Windows automatic updates disabled
Internet Explorer 10

Overview

The endpoint user creates a broad attack surface that is difficult to defend. Socially engineered malware (SEM) and the exploitation of software vulnerabilities are among the most common and effective cyberattacks that enterprises face today. Endpoint protection (EPP) products must provide robust defenses against these threats.

NSS tested 11 enterprise-class EPP products for their ability to protect against exploits and SEM. The results presented in this report were obtained via 24x7 continuous testing over a period of 24 days.

Average Block Rate

Figure 1 depicts the percentage of SEM samples blocked throughout the test. EPP product updates may contain new engine updates, heuristic updates, and malware-specific detection updates, all of which affect the quality of protection against exploits and malware. This test focuses on SEM. Tests for exploit protection are published at www.nsslabs.com.

Product                                            Average Block Rate   NSS Rating
Kaspersky Endpoint Security v10.2.2.10535          100.0%               Recommended
McAfee Endpoint Protection v8.0                    100.0%               Recommended
Symantec Endpoint Security v12.1.4112.4156         100.0%               Recommended
ESET Endpoint Security v6.1.2109.0                 99.9%                Recommended
Fortinet FortiClient v5.2.3.0633                   99.9%                Recommended
Sophos Endpoint Security v10.3                     99.0%                Recommended
Trend Micro Endpoint Security v11.0.1057           99.0%                Recommended
F-Secure Client Security Premium 11.60 build 284   98.1%                Neutral
G DATA Security Client v13.1.0.224                 94.8%                Neutral
Bitdefender Endpoint Security v5.3.15.539          92.8%                Neutral
Panda Endpoint Security v7.20.0                    82.8%                Caution

Figure 1 - Average Block Rate

This test comprised over 1,700 test cases, including 400 unique attacks (URLs) and 304 unique SEM samples (hashes). A unique SEM sample is identified by its unique hash. Distinct attack URLs may deliver duplicate SEM samples; for example, http://a.edu/abc/malware.exe and http://a.edu/malware.exe are counted as two unique URLs even if they serve the same sample.
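The distinction between unique attack URLs and unique SEM samples can be illustrated with a short sketch. This is a minimal illustration only; the test cases below are hypothetical and do not reflect the contents of NSS's harness.

```python
from hashlib import sha256

# Hypothetical test cases: (attack URL, payload bytes).
# Two different URLs may serve byte-identical malware.
test_cases = [
    ("http://a.edu/abc/malware.exe", b"payload-1"),
    ("http://a.edu/malware.exe",     b"payload-1"),  # same sample, new URL
    ("http://b.org/dropper.exe",     b"payload-2"),
]

# A unique attack is a unique URL; a unique sample is a unique hash.
unique_urls = {url for url, _ in test_cases}
unique_samples = {sha256(payload).hexdigest() for _, payload in test_cases}

print(len(unique_urls))     # 3 unique attack URLs
print(len(unique_samples))  # 2 unique SEM samples (hashes)
```

This is why the report can count 400 unique attacks but only 304 unique samples: the same binary is often redistributed from many URLs.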

Response Times

New SEM is continuously added to the harness throughout the test. Most EPP products are able to block many samples at zero hour. Figure 2 depicts the average percentage of new samples blocked during the first seven days after they are discovered. When products achieve equal maximum protection rates, response time becomes a differentiator. The products are ranked in order of zero-hour protection.

Figure 2 - Seven-Day Response Histogram
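A seven-day response histogram of this kind can be produced by bucketing each sample by the time elapsed between its discovery and its first block. The following is a sketch under assumed inputs; the event log below is invented and the bucketing logic is illustrative, not NSS's actual harness code.

```python
from datetime import datetime, timedelta

# Hypothetical log: sample hash -> (first seen, first blocked; None = never blocked).
events = {
    "h1": (datetime(2015, 5, 1, 0, 0), datetime(2015, 5, 1, 0, 30)),
    "h2": (datetime(2015, 5, 1, 0, 0), datetime(2015, 5, 3, 12, 0)),
    "h3": (datetime(2015, 5, 2, 0, 0), None),
}

# Cumulative percentage of samples blocked within N days of discovery, N = 1..7.
histogram = []
for day in range(1, 8):
    cutoff = timedelta(days=day)
    blocked = sum(
        1 for first_seen, blocked_at in events.values()
        if blocked_at is not None and blocked_at - first_seen <= cutoff
    )
    histogram.append(round(100.0 * blocked / len(events), 1))

print(histogram)  # cumulative block percentage for days 1 through 7
```

Because the percentages are cumulative, the curve can only rise over the seven-day window; products that converge to the same maximum are then separated by how early they reach it.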

Average Time to Add Protection

Figure 3 reports the average amount of time required to add protection for all attacks throughout the test.

Vendor           Hours
Fortinet         0.00
Kaspersky        0.00
Intel Security   0.00
Symantec         0.00
ESET             0.03
Bitdefender      0.05
F-Secure         0.09
Trend Micro      0.28
Sophos           0.42
G DATA           0.64
Panda            1.17

Figure 3 - Average Time to Add Protection

Consistency of Protection

It is important for a product to maintain a consistent SEM protection rate throughout the test. Negative fluctuations can indicate decreased protection rates for new samples, erratic protection for new and existing samples, or both. Details on how each vendor performed can be found in the individual test reports, published at www.nsslabs.com.
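The "average time to add protection" metric reduces to a mean of per-attack delays between an attack entering the test and the product first blocking it. The sketch below assumes hypothetical log data and a simple arithmetic mean; the report does not specify how NSS handles attacks that are never blocked, so this example includes only blocked attacks.

```python
from datetime import datetime

# Hypothetical per-attack log for one vendor:
# (attack introduced into the harness, protection first observed).
attacks = [
    (datetime(2015, 5, 1, 0, 0), datetime(2015, 5, 1, 0, 0)),   # blocked at zero hour
    (datetime(2015, 5, 2, 0, 0), datetime(2015, 5, 2, 0, 36)),  # 0.6 hours later
    (datetime(2015, 5, 3, 0, 0), datetime(2015, 5, 3, 1, 12)),  # 1.2 hours later
]

hours_to_protect = [
    (protected - introduced).total_seconds() / 3600.0
    for introduced, protected in attacks
]
average_hours = sum(hours_to_protect) / len(attacks)
print(f"{average_hours:.2f}")
```

Under this arithmetic, a vendor showing 0.00 hours (as four vendors do in Figure 3) blocked every attack effectively at zero hour, to the reported precision.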

Test Methodology

Security Stack: Test Methodology v1.5

A copy of the test methodology is available at www.nsslabs.com.

Contact Information

NSS Labs, Inc.
206 Wild Basin Rd
Building A, Suite 200
Austin, TX 78746
info@nsslabs.com
www.nsslabs.com

This and other related documents are available at www.nsslabs.com. To receive a licensed copy or to report misuse, please contact NSS Labs.

© 2015 NSS Labs, Inc. All rights reserved. No part of this publication may be reproduced, copied/scanned, stored on a retrieval system, e-mailed or otherwise disseminated or transmitted without the express written consent of NSS Labs, Inc. ("us" or "we").

Please read the disclaimer in this box because it contains important information that binds you. If you do not agree to these conditions, you should not read the rest of this report but should instead return the report immediately to us. "You" or "your" means the person who accesses this report and any entity on whose behalf he/she has obtained this report.

1. The information in this report is subject to change by us without notice, and we disclaim any obligation to update it.

2. The information in this report is believed by us to be accurate and reliable at the time of publication, but it is not guaranteed. All use of and reliance on this report are at your sole risk. We are not liable or responsible for any damages, losses, or expenses of any nature whatsoever arising from any error or omission in this report.

3. NO WARRANTIES, EXPRESS OR IMPLIED, ARE GIVEN BY US. ALL IMPLIED WARRANTIES, INCLUDING IMPLIED WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE, AND NON-INFRINGEMENT, ARE HEREBY DISCLAIMED AND EXCLUDED BY US. IN NO EVENT SHALL WE BE LIABLE FOR ANY DIRECT, CONSEQUENTIAL, INCIDENTAL, PUNITIVE, EXEMPLARY, OR INDIRECT DAMAGES, OR FOR ANY LOSS OF PROFIT, REVENUE, DATA, COMPUTER PROGRAMS, OR OTHER ASSETS, EVEN IF ADVISED OF THE POSSIBILITY THEREOF.

4. This report does not constitute an endorsement, recommendation, or guarantee of any of the products (hardware or software) tested or of the hardware and/or software used in testing the products. The testing does not guarantee that there are no errors or defects in the products, that the products will meet your expectations, requirements, needs, or specifications, or that they will operate without interruption.

5. This report does not imply any endorsement, sponsorship, affiliation, or verification by or with any organizations mentioned in this report.

6. All trademarks, service marks, and trade names used in this report are the trademarks, service marks, and trade names of their respective owners.