
Anti-Virus Comparative
On-demand Detection of Potentially Unwanted Applications (incl. Adware, Spyware and Rogue Software)
Language: English
December 2010
Last Revision: 22nd December 2010

Table of Contents

Tested Products
Introduction
Test Results
Summary Results
Award levels reached in this test
Copyright and Disclaimer

Tested Products

avast! Free 5.0
AVG Anti-Virus 2011
AVIRA AntiVir Premium 10.0
BitDefender Antivirus 2011
escan AntiVirus 11.0
ESET NOD32 Antivirus 4.2
F-Secure Anti-Virus 2011
G DATA AntiVirus 2011
K7 TotalSecurity 10
Kaspersky Anti-Virus 2011
Kingsoft AntiVirus 2011
McAfee Antivirus Plus 2011
Microsoft Security Essentials 1.0
Norman Antivirus & Anti-Spyware 8.0
Panda Antivirus Pro 2011
PC Tools Spyware Doctor with AV 8.0
Sophos Anti-Virus 9.5
Symantec Norton AntiVirus 2011
Trend Micro Titanium Antivirus 2011
Trustport Antivirus 2011

Introduction

The amount of adware, spyware and especially fraudulent software circulating on the Internet has increased considerably over the past few years. Such applications are not typical malware, and classifying them is not always easy; they are usually described by the term potentially unwanted application (PUA). Under some circumstances, certain potentially unwanted applications are accepted or even wanted in some countries, depending on cultural background or legal system, which is why legal disputes sometimes arise over whether a program can be considered malware or not. The term "potentially unwanted" covers this grey area.

Our malware test sets do not usually include this kind of threat, but users may want to know how well their anti-virus program detects potentially unwanted software. The PUA (Potentially Unwanted Applications) test set used for this test contains 82,036 samples. It includes only executable program files and covers mainly rogue software (e.g. fake anti-virus and other misleading or unwanted/unsafe applications), adware (e.g. Virtumonde, browser hijackers) and spyware (e.g. keyloggers) gathered or re-seen between July 2010 and November 2010. Some products may classify certain PUAs as Trojans, while other products may choose not to add detection for some potentially unwanted applications as a matter of company policy or for possible legal reasons. The test set is relatively small and limited because we tried to remove in advance software types that could be disputed or that do not belong in this set (e.g. casino clients, games, toolbars, utilities, etc.).

Vendors whose products score high in this test typically exclude a detection only when the legal implications become serious; otherwise they continue to detect a sample as long as detection is warranted from a technical perspective. In particular, if a product has a special checkbox that the user must activate manually in order to be informed about potentially unwanted or unsafe software, such software should then be detected; otherwise it should either appear among the normal detections or not be reported at all. If some vendors are more reluctant to inform the user about potentially unwanted files, this may be visible in these test results. If some vendors simply add detection for everything (including files in the grey area), this may be visible in the false-alarm tests 1 that can be found on our website.

The PUA sets were frozen on 6th November 2010. The AV products were last updated on 1st December 2010. We tested all products with the highest settings (except F-Secure and Sophos, at their own request).

AV-Comparatives also publishes other test reports covering different aspects and features of anti-virus products; please see our website for further information. Even though we produce various tests showing different aspects of anti-virus software, users are advised to evaluate the software themselves and form their own opinion of it. Test data and reviews merely provide guidance on aspects that users cannot evaluate by themselves. We encourage readers to also consider independent test results from other well-known and established testing organizations. This will give them a better overview of the detection and protection capabilities of the various products across different test scenarios and test sets.

Because every AV vendor applies its own criteria to define where the PUA bar lies, which makes PUA testing challenging, we are not planning to conduct a separate PUA test next year.
1 http://www.av-comparatives.org/comparativesreviews/false-alarm-tests

Test Results

(Figure: graph of missed samples)

Summary results

Detection rates for potentially unwanted software 2:

1. Panda 99.9%
2. Symantec 99.6%
3. Trustport 99.5%
4. AVIRA 99.4%
5. PC Tools, G DATA 99.3%
6. escan, F-Secure, Bitdefender, McAfee 98.7%
7. ESET 97.7%
8. Kaspersky 97.6%
9. Trend Micro 97.1%
10. Avast 96.9%
11. K7 95.6%
12. Microsoft 92.7%
13. Norman 90.7%
14. Kingsoft 74.9%

2 Users who would like to be informed about potentially unwanted software that they would not want on their PC without their knowledge may prefer the higher-scoring products, while users who prefer that their AV product does not inform them about such potentially unwanted software may prefer products that are more conservative about what they report.
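To relate the detection rates above to the missed-samples graph, the arithmetic is simple: missed = test-set size x (1 - detection rate). The following minimal Python sketch (not part of the original report) reconstructs approximate missed-sample counts from the published percentages and the stated test-set size of 82,036 samples; since the percentages are rounded, the counts are estimates rather than exact measured values.

    # Approximate missed-sample counts implied by the published detection
    # rates. Percentages are rounded in the report, so these are estimates.
    TOTAL_SAMPLES = 82036

    detection_rates = {
        "Panda": 99.9, "Symantec": 99.6, "Trustport": 99.5, "AVIRA": 99.4,
        "PC Tools": 99.3, "G DATA": 99.3, "escan": 98.7, "F-Secure": 98.7,
        "Bitdefender": 98.7, "McAfee": 98.7, "ESET": 97.7, "Kaspersky": 97.6,
        "Trend Micro": 97.1, "Avast": 96.9, "K7": 95.6, "Microsoft": 92.7,
        "Norman": 90.7, "Kingsoft": 74.9,
    }

    for product, rate in detection_rates.items():
        # Convert a detection percentage into an estimated missed count.
        missed = round(TOTAL_SAMPLES * (1 - rate / 100))
        print(f"{product:12s} {rate:5.1f}%  ~{missed} missed")

For example, Panda's 99.9% corresponds to roughly 82 missed samples, while Kingsoft's 74.9% corresponds to roughly 20,591.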

Award levels reached in this test

AV-Comparatives provides a 4-level ranking system: Tested, STANDARD, ADVANCED and ADVANCED+. The groups were defined using the hierarchical clustering method; an illustrative sketch follows this section.

AWARDS (based on detection of potentially unwanted programs); products in no specific order 3:

Panda
Symantec
Trustport
AVIRA
PC Tools
G DATA
escan
F-Secure
Bitdefender
McAfee
ESET
Kaspersky
Trend Micro
Avast
K7
Microsoft
Norman
Kingsoft

NOT INCLUDED 4: Sophos*, AVG*

The above awards are based only on detection rates for potentially unwanted/unsafe programs. For detection rates for malware such as Trojans, backdoors and viruses, as well as for the products' false-alarm rates, please refer to the other test reports available on our website.

3 We suggest considering all products with the same award to be as good as each other.

4 AVG and Sophos chose not to be included in this report and waived their awards. As these products are included in our yearly public test series, they are listed here even though their vendors opted out.
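The report does not specify the exact clustering setup used to derive the award groups. Purely as an illustration, the Python sketch below shows how award tiers could be derived from the one-dimensional detection rates with agglomerative hierarchical clustering (SciPy); the linkage method ("average") and the four-group cut are assumptions for this sketch, not AV-Comparatives' published methodology.

    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster

    # Detection rates from the summary results, as an (n_products, 1) matrix.
    rates = np.array([99.9, 99.6, 99.5, 99.4, 99.3, 99.3, 98.7, 98.7, 98.7,
                      98.7, 97.7, 97.6, 97.1, 96.9, 95.6, 92.7, 90.7,
                      74.9]).reshape(-1, 1)

    # Agglomerative hierarchical clustering; "average" linkage is an
    # assumption here, not the report's documented choice.
    Z = linkage(rates, method="average")

    # Cut the dendrogram into at most four groups, matching the four award
    # levels (ADVANCED+, ADVANCED, STANDARD, Tested).
    labels = fcluster(Z, t=4, criterion="maxclust")
    print(labels)  # products sharing a label fall into the same tier

On one-dimensional data like this, such a cut essentially finds the natural gaps between score groups, which is why clustering, rather than fixed percentage thresholds, can be used to assign awards.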

Copyright and Disclaimer

This publication is Copyright 2010 by AV-Comparatives e.V. Any use of the results, etc., in whole or in part, is ONLY permitted with the explicit written agreement of the management board of AV-Comparatives e.V., given prior to any publication. AV-Comparatives e.V. and its testers cannot be held liable for any damage or loss which might occur as a result of, or in connection with, the use of the information provided in this paper. We take every possible care to ensure the correctness of the basic data, but no representative of AV-Comparatives e.V. can accept liability for the correctness of the test results. We do not give any guarantee of the correctness, completeness, or suitability for a specific purpose of any of the information/content provided at any given time. No one else involved in creating, producing or delivering the test results shall be liable for any indirect, special or consequential damage, or loss of profits, arising out of, or related to, the use or inability to use the services provided by the website, test documents or any related data.

AV-Comparatives e.V. is a registered Austrian non-profit organization.

For more information about AV-Comparatives and our testing methodologies, please visit our website.

AV-Comparatives e.V. (December 2010)