Comparing Industry-Leading Anti-Spam Services

Results from Twelve Months of Testing
Joel Snyder, Opus One
April 2016

TABLE OF CONTENTS

Introduction and Test Methodology
Test Results
Spam Catch Rate Results
False Positive Results
Summary
Appendix

INTRODUCTION

The following analysis summarizes the spam catch and false positive rates of the leading anti-spam vendors. Compiled by Opus One, an independent research firm, this report provides data to objectively compare the market's most popular anti-spam solutions. Working with Cisco, we identified six enterprise-focused vendors to compare to Cisco's Email Security solution (appliance and cloud-based): Barracuda Networks, Intel Security (formerly McAfee), Microsoft Office 365, Proofpoint, Sophos, and Symantec. The only vendor mentioned by name in the rest of this report is Cisco; the remaining vendor names have been anonymized.

TEST METHODOLOGY

To ensure consistency and reliability, Opus One operated within the following parameters during the 12-month analysis from April 2015 to March 2016:

- Approximately 10,000 messages were selected at random for testing each month, for a total of 145,709 messages in the final evaluation set.
- Messages were drawn from actual corporate production mail streams.
- Messages were received live and tested with less than a one-second delay.
- Tested products were acquired directly from the vendor or through normal distribution channels and were under active support contracts. Cloud-based solutions were used only when an appliance-based solution was not available.
- Tested products were up to date with currently released software and signature updates, and all settings were reviewed by each vendor's own technical support team.
- Messages were hand-classified as spam or not spam to ensure data validity.
- The results for each tested product include the vendor-recommended or integrated reputation service.

The test results reported here are taken from Opus One's continuing anti-spam testing program. With over ten years of monthly results, Opus One is uniquely positioned to provide objective efficacy reporting across all major anti-spam products. While testing occurred in North America, message sources were global. See the appendix at the conclusion of this report for further test methodology details and definitions of terms.

TEST RESULTS

Cisco's email security solution demonstrated the most accurate detection and the best spam capture rate. The results are remarkable given the tradeoff between spam capture and false positive rates. For example, a vendor can catch 100% of spam by simply blocking every message, but then every legitimate message would also be blocked as a false positive, which is obviously unacceptable. In our testing there were months when other vendors outperformed Cisco in catching spam, but this always came with a jump in their false positive rates. We saw only a single month in the last 12 where another product matched Cisco Email Security on both spam catch rate and false positive rate: in June 2015, Vendor E had a 0.1% better catch rate and the same false positive rate as Cisco Email Security.

The false positive rate and spam catch rate results are summarized in the graph below. The data points shown are averages across the entire year, scaled to fit.

[Figure: Comparative Anti-Spam Efficacy. Y-axis: spam accuracy rate (1 - FP rate); X-axis: spam capture rate; both axes run from worst to best. Plotted points: Cisco and Vendors A through F.]
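To make the tradeoff concrete, the sketch below sweeps a blocking threshold over a handful of scored messages. The scores and labels are invented for illustration, not taken from the test data; lowering the threshold catches more spam but also flags more legitimate mail, using the report's positive-predictive-value definition of false positives (see the appendix).

```python
# Invented (score, is_spam) pairs; higher score = more spam-like.
MESSAGES = [
    (0.95, True), (0.90, True), (0.80, True), (0.60, True),
    (0.55, False), (0.40, False), (0.20, False), (0.10, False),
]

def rates(threshold: float):
    """Catch rate and false positive rate when blocking at a given score."""
    flagged = [(s, is_spam) for s, is_spam in MESSAGES if s >= threshold]
    spam_total = sum(1 for _, is_spam in MESSAGES if is_spam)
    caught = sum(1 for _, is_spam in flagged if is_spam)
    catch_rate = caught / spam_total
    # False positive rate as 1 - positive predictive value (appendix definition).
    fp_rate = (len(flagged) - caught) / len(flagged) if flagged else 0.0
    return catch_rate, fp_rate

for t in (0.7, 0.5, 0.0):  # threshold 0.0 blocks every message
    catch, fp = rates(t)
    print(f"threshold {t:.1f}: catch {catch:.0%}, false positives {fp:.0%}")
```

With these toy numbers, dropping the threshold from 0.7 to 0.0 raises the catch rate from 75% to 100%, but the false positive rate climbs from 0% to 50%: the tradeoff the report describes.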

SPAM CATCH RATE RESULTS

The spam catch rate has a direct impact on end-user satisfaction and productivity. With the high daily global volume of spam, even a slight reduction in catch rate can have a major adverse effect. Cisco's own anti-spam engine had the highest catch rate averaged across the year-long period covered by this report. The table below compares the other vendors to Cisco by showing how much spam they missed relative to Cisco. For example, a user protected by Vendor C would have more than twice as much spam in their inbox as a user protected by Cisco.

    Vendor      Missed Spam Rate Relative to Leader
    Cisco       n/a (leader)
    Vendor B    130%
    Vendor F    132%
    Vendor E    141%
    Vendor A    163%
    Vendor C    179%
    Vendor D    209%
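For readers who want to see how such a relative figure is derived, here is a minimal sketch. It assumes the table reports each vendor's missed-spam rate as a percentage of the leader's; the rates and the vendor name are hypothetical, not taken from the test data.

```python
# Hypothetical annual missed-spam rates (fraction of spam reaching inboxes).
missed_rate = {"Leader": 0.008, "Vendor X": 0.0144}  # Vendor X is illustrative

# Missed spam = 1 - catch rate; the relative figure is assumed to be a ratio.
relative = missed_rate["Vendor X"] / missed_rate["Leader"] * 100
print(f"Vendor X missed spam relative to leader: {relative:.0f}%")  # 180%
```

Average annual spam catch rate results by vendor over the testing period are graphed below, showing the average for the year as well as the worst and best monthly values.

[Figure: Anti-Spam Catch Rate, with average, best, and worst monthly values over 12 months for Cisco and Vendors A through F; y-axis from 92.50% to 100.00%.]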

FALSE POSITIVE RESULTS

Because of the mission-critical nature of email, it is essential that an enterprise's anti-spam solution deliver a low false positive rate. Messages incorrectly quarantined or blocked cause a serious loss of time and productivity for system administrators and end-users. In some cases, false positives also have a negative financial impact on the organization. The relative results over the year-long period ending March 2016 are as follows:

    Vendor      False Positive Rate Relative to Leader
    Cisco       n/a (leader)
    Vendor A    143%
    Vendor E    180%
    Vendor C    188%
    Vendor F    358%
    Vendor D    1577%
    Vendor B    4705%

The false positive performance of each vendor is graphed below, showing the average for the entire year along with the worst and best monthly values. In this graph, higher numbers indicate higher accuracy and are better.

[Figure: Anti-Spam False Positive Accuracy, with average, best, and worst monthly values over 12 months for Cisco and Vendors A, E, C, F, D, and B; y-axis from 98.00% to 100.00%.]

SUMMARY

Our testing shows that Cisco offers an industry-leading solution for blocking unwanted spam in enterprises and organizations around the world. Given the essential role of email in the operations of modern enterprises, spam poses a serious threat to their success. When a spam message finds its way into a user's inbox, or a legitimate message is incorrectly identified as spam and quarantined, there is an immediate impact on productivity. While the performance of the solutions evaluated in this analysis may vary by only a few percentage points, it is important to recognize that this difference can translate into hundreds, if not thousands, of unwanted and potentially problematic messages infiltrating a network. With outstanding performance both in catching spam and in avoiding false positives, Cisco's anti-spam technology should be on the short list for anyone considering a new email security gateway.

Over the years, much ground has been gained in the battle against spam. Nevertheless, the number of threat messages continues to rise, demanding increasingly sophisticated and capable defense systems. The productivity of the global marketplace demands it.

ABOUT OPUS ONE

Opus One is an information technology consultancy with experience in messaging, security, and networking. Opus One has provided objective testing results for publication and private use since 1983. This document is copyright 2016 Opus One, Inc.

APPENDIX

DEFINITION OF TERMS

Spam is unsolicited commercial bulk email. We consider messages to be spam if there is no business or personal relationship between sender and receiver and the messages are obviously bulk in nature. Messages that may not have been solicited, but which show a clear business or personal relationship between sender and receiver, or are obviously one-to-one messages, are not considered spam even if unsolicited and unwanted.

Spam catch rate measures how well the spam filter catches spam. We have used the commonly accepted definition of sensitivity: the number of spam messages caught divided by the total number of spam messages received. The missed spam rate is one minus the spam catch rate.

False positive rate measures the share of legitimate email misclassified as spam. Different vendors and testing services define false positive rate in different ways, typically using either specificity or positive predictive value. In this report, false positive rate is defined using positive predictive value:

    false positive rate = 1 - ((messages marked as spam - false positives) / (total messages marked as spam))

The spam accuracy rate is one minus the false positive rate.

TESTING METHODOLOGY

Anti-spam products were evaluated by installing them in a production mail stream environment. The test simultaneously feeds the same production stream to each product, recording the verdict (typically "spam", "not spam", or "suspected spam") for later comparison.

Each product tested was acquired directly from the vendor or through normal distribution channels. Each product was under an active support contract and was believed to be up to date with publicly released software and signature updates. Where multiple versions were available from a vendor, the vendor's technical support team was consulted to determine the recommended platform and version. To minimize confusion, products were not upgraded during the test cycle.

Each product was connected to the Internet to retrieve signature and software updates as often as recommended by the vendor; where a vendor's technical support team recommended a shorter update cycle, that recommendation was implemented. All systems were able to connect to the Internet for updates and DNS lookups. A firewall was placed between each product and the Internet to block inbound connections, while outbound connections were completely unrestricted on all ports.

Each product was configured with the manufacturer's recommended settings. In cases where obviously inappropriate settings were included by default, these were changed to support the production mail stream; the most commonly changed setting was maximum message size, adjusted to accommodate messages of varying sizes.

The tests drew on a real corporate message stream because it contains no artificial content and best represents a normal enterprise stream. No spurious spam or non-spam content was injected into the stream, and no artificial methods to attract spam were employed.
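As a concrete illustration of the parallel-feed arrangement described above, here is a minimal sketch of a harness that hands each live message to every product under test and records the verdicts for later comparison. The product list, the deliver_and_get_verdict helper, and the CSV layout are illustrative assumptions, not details of the actual Opus One harness.

```python
import csv
import time

PRODUCTS = ["cisco", "vendor_a", "vendor_b"]  # anonymized, as in the report

def deliver_and_get_verdict(product: str, raw_message: bytes) -> str:
    """Hypothetical stand-in for submitting one message to one product
    and reading back its verdict: 'spam', 'not spam', or 'suspected spam'."""
    return "not spam"  # placeholder; a real harness would query the product

def record_stream(messages, out_path="verdicts.csv"):
    """Feed the same production stream to every product and log verdicts."""
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["msg_id", "received_at", *PRODUCTS])
        for msg_id, raw_message in messages:
            received_at = time.time()
            # Each live message is tested by all products within ~1 second.
            verdicts = [deliver_and_get_verdict(p, raw_message) for p in PRODUCTS]
            writer.writerow([msg_id, received_at, *verdicts])

record_stream([("m1", b"Subject: hello\r\n\r\nHi!")])
```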

Because products were not receiving email directly from the Internet, the reputation service of each product had to be individually configured to support the multi-hop configuration. In cases where a product could not handle a multi-hop configuration with its reputation service, the reputation service results were gathered at the edge of the network and recombined with the anti-spam results after the test was completed.

Once the messages were received, Opus One manually read through every single message, classifying it as spam, not spam, or unknown according to the definitions above. All mailing lists with legitimate subscriptions were considered not spam, irrespective of the content of any individual message. Messages were classified as unknown if they could not be definitively categorized as spam or not spam based on content, or if they were so malformed that it could not be determined whether they were spam, viruses, or corrupt software. All unknown messages were deleted from the data set and do not factor into the result statistics; the total number of unknown messages in the sample was small, typically less than 0.1% of the total sample size.

Once the manual qualification of messages was completed, all results were placed in a SQL database. Queries were then run to create false positive and false negative (missed spam) lists. False positives and false negatives for each product were evaluated, and any errors in the original manual classification were fixed. Once the data sets were determined to be within acceptable error rates, the databases were reloaded and the queries recreated.

Each anti-spam engine provides a verdict on messages. While this is often internally represented as a number, the verdict in most products is reduced to a categorization of each message as spam or not spam. Many anti-spam products include a third category, typically called "suspected spam". In this test, products were configured at factory-default settings, where possible, to have three verdicts (spam, suspected spam, and not spam). Where products have three verdicts, suspected spam is considered spam; as a result, suspected spam was included in the catch rate and false positive rate calculations. The one exception is Vendor E: in that product, "suspected spam" is actually marketing mail and is not considered spam.

Catch rate refers to the number of spam messages caught out of the total number of spam messages received. When spam is not caught, it is called a false negative: the test said the message was not spam, and it was. A false positive means the test said the message was spam, and it was not. Spam catch rate is calculated and compared using sensitivity; false positive rate is calculated and compared using one minus the positive predictive value.
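Putting these definitions together, the headline metrics can be computed as in the following sketch. The counts are hypothetical, and "suspected spam" verdicts are assumed to have already been folded into "spam", as described above (except for Vendor E).

```python
# Hypothetical monthly counts for one product, after folding
# "suspected spam" verdicts into "spam" as described above.
spam_total = 10_000      # hand-classified spam messages received
spam_caught = 9_700      # spam messages the product marked as spam
marked_spam = 9_750      # all messages the product marked as spam
false_positives = 50     # legitimate messages among those marked as spam

# Catch rate = sensitivity: spam caught / total spam received.
catch_rate = spam_caught / spam_total                        # 97.00%
missed_spam = 1 - catch_rate                                 # 3.00%

# False positive rate = 1 - positive predictive value.
fp_rate = 1 - (marked_spam - false_positives) / marked_spam  # ~0.51%
accuracy = 1 - fp_rate   # the "spam accuracy rate" graphed earlier

print(f"catch {catch_rate:.2%}  missed {missed_spam:.2%}  "
      f"FP {fp_rate:.2%}  accuracy {accuracy:.2%}")
```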