Business Products Testing Report
Performed by AV-Test GmbH
January 2011
Executive Summary

Overview

During November 2010, AV-Test performed a comparative review of six business security products to determine their real-world protection capabilities against threats that businesses of all sizes encounter every day. The test was designed to challenge the products with zero-day attacks from the internet, covering the most common infection vectors in use today: the samples were accessed via direct links to malicious executable files, via drive-by-download websites that utilize exploits, and by opening mail attachments. Additionally, a static detection test of recent samples of two very prevalent malware families, FakeAV and Zbot, was carried out to show the detection capabilities of the products.

Overall, delivered the best detection rates, coming top in both parts of the test: it detected and blocked 96% of the zero-day threats and 99.74% of the prevalent malware samples. reached an equally good result for zero-day threats, also blocking 96%; its on-demand detection of prevalent malware was 97.16%. Trend Micro and were very close to the top two products in blocking zero-day threats; both reached 92%. Trend Micro had the second-best on-demand detection rate for Zbot and FakeAV with 99.59%, while reached 98.83%. Finally, and detected 80% of the zero-day threats, and 91.38% and 98.14% respectively of the two prevalent malware families.

The volume of new samples that anti-malware vendors have to process in order to protect their customers is creating problems: it is not always possible to deploy a signature for a given binary in time. To combat this, security vendors provide additional protection features, ranging from heuristics and generic detections to URL blocking and exploit detection. Each of these features can block threats on its own, but given the massive number of malware samples, sources and behaviors, no single one is sufficient by itself. This test therefore considers all of the protection mechanisms included in today's security software and challenges them with real-world threats in order to determine the actual protection capabilities of the products.

Products Tested

The tests were carried out on the latest versions (at the time of the test) of each of the following six products:

- 6.0
- Total for 4.5
- 1.5
- 9.5
- 11
- Trend Micro OfficeScan 10.5
Test Results

1. Real-Time

The best results for combined detection with all protection features turned on were achieved by and , both detecting 24 out of 25 samples. and Trend Micro were only one threat behind with 23 detected samples. and detected 20 of the 25 tested URLs and e-mails.

[Figure 1: Combined Detection (URL, Static or Dynamic)]

i. URL blocking

The test started by accessing malicious URLs and determining which products blocked access to the URLs and which didn't. Blocking at this layer prevents malicious code from ever reaching the endpoint, minimizing the risk of infection. All vendors other than offer this feature.

The best result was achieved by and Trend Micro, which both blocked access to 17 out of 23 URLs. and were a bit behind with 13 and 12 blocked URLs respectively. blocked access to 2 URLs. However, it is important to note that offers a protection feature called SmartScreen in Internet Explorer which is also able to block access to malicious websites; this wasn't included in the test, as our focus was on the protection features of the security software itself.

[Figure 2: Blocked Access to Malicious URLs]
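The percentages quoted in the Executive Summary follow directly from these raw counts (24 of 25 samples is 96%, for example). The short Python sketch below shows that aggregation; the counts are the report's, but the product labels are placeholders of ours, since the vendor names are not reproduced in this copy.

```python
# Minimal sketch: turning the raw per-product counts from Figure 1 into
# the percentages quoted in the Executive Summary. The counts come from
# the report; the product labels are placeholders, not vendor names.

def rate(detected: int, total: int) -> float:
    """Detection rate as a percentage, rounded to two decimal places."""
    return round(100.0 * detected / total, 2)

# Figure 1: combined detection (URL, static or dynamic) out of 25 samples.
combined = {
    "product_a": 24, "product_b": 24,
    "product_c": 23, "trend_micro": 23,
    "product_e": 20, "product_f": 20,
}

for product, detected in combined.items():
    print(f"{product}: {rate(detected, 25)}%")
# product_a: 96.0% ... product_e: 80.0%
```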
ii. Static detection

In case the URL blocking fails or cannot be used, traditional static malware detection becomes important. We therefore tested the same 23 URLs again, as well as the two malicious e-mail attachments, with the on-demand scanner. All products tested provide this protection feature, as it is the traditional way of detecting and blocking malware. The best results were achieved by , detecting 19 out of 25 cases, closely followed by , which detected 17 cases.

[Figure 3: Static detection of malware components]

iii. Dynamic detection

When both the URL filtering and the static file scanning fail to detect anything, the malware can be executed on the system. At this point the third protection layer becomes important: the dynamic detection of threats, which analyzes the behavior of the threat, blocks suspicious actions and removes related components. However, it is not easy to test the dynamic detection separately (the static detection would have to be artificially disabled), so the overall scores in Figure 1 give an idea of the impact this layer has on the full protection rates; a sketch of this layered aggregation follows below.

2. Detection of prevalent malware

The second type of testing performed was the static detection of FakeAV and ZBot, two malware families that are very prevalent for many businesses today. and Trend Micro achieved the highest results, detecting over 99% of the files in both test sets. , and also had a very good detection rate with over 98% for FakeAV and over 92% for ZBot. was behind in this test, despite its in-the-cloud service, primarily because the sensitivity of this service is set to low in the default configuration to minimize the risk of false positives. This shows that you should always consider configuration settings and whether increasing protection rates will also increase the risk of false positives.

[Figure 4: Static detection results (FakeAV and ZBot)]
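The combined score of Figure 1 treats the three layers as a chain: a sample counts as detected if the URL filter, the on-demand scanner, or the behavioral layer stops it. The following minimal sketch shows that "first layer wins" aggregation; the record structure and field names are illustrative assumptions, not AV-Test's tooling.

```python
# Sketch of the layered evaluation implied by sections i-iii: a sample
# counts toward the combined score (Figure 1) if any one layer stops it.
# The SampleResult structure is an illustrative assumption.

from dataclasses import dataclass

@dataclass
class SampleResult:
    url_blocked: bool       # layer i: access to the URL was blocked
    static_detected: bool   # layer ii: on-demand scanner flagged the file
    dynamic_detected: bool  # layer iii: behavior was blocked at runtime

def first_effective_layer(r: SampleResult) -> str:
    """Return the first protection layer that stopped the sample."""
    if r.url_blocked:
        return "url"
    if r.static_detected:
        return "static"
    if r.dynamic_detected:
        return "dynamic"
    return "missed"

results = [
    SampleResult(True, False, False),   # stopped before download
    SampleResult(False, True, False),   # caught by signature/heuristics
    SampleResult(False, False, True),   # caught only at runtime
    SampleResult(False, False, False),  # infection succeeded
]

combined_detected = sum(first_effective_layer(r) != "missed" for r in results)
print(f"combined: {combined_detected}/{len(results)}")  # combined: 3/4
```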
Methodology

Platform

All tests were performed on identical PCs equipped with the following hardware:

- Intel Xeon Quad-Core X3360 CPU
- 4 GB RAM
- 500 GB HDD (Western Digital)
- Intel Pro/1000 PL (Gigabit Ethernet) NIC

The operating system was Windows XP Service Pack 3 with updates as of 31.10.2010. Additionally, the following applications were installed to provide a vulnerable system for the URLs that use exploits to infect the system:

Developer             Product                                   Version
Adobe                 Flash Player 10 ActiveX                   10.0.12.36
Adobe                 Flash Player 10 Plugin                    10.0.12.36
Adobe                 Acrobat Reader                            v8 or v9
ICQ                   ICQ6                                      6.00.0000
Sun                   Java SE Runtime Environment 6 Update 1    1.6.0.10
Mozilla               Firefox (en-us)                           2.0.0.4
Apple                 QuickTime                                 7.3.0.70
Real Networks         RealPlayer                                10.5
WinZip Computing LP   WinZip                                    10.0 (6667)
Yahoo! Inc            Messenger                                 8.1.0.413

Testing methodology

The general pre-requisites were as follows:

1. Clean system for each sample. The test systems were restored to a clean state before being exposed to each malware sample.
2. Physical machines. The test systems used were actual physical machines; no virtual machines were used.
3. Product configuration. All products were run with their default, out-of-the-box configuration and updated to their latest versions. The internet was available to all tested products so they could use the cloud as part of their protection strategy.

The real-world blocking test was performed according to the methodology explained below (a sketch of the resulting per-sample loop follows this list):

1. Sample introduction vector. Each sample should be introduced to the system in as realistic a manner as possible. This includes sending samples that are collected as e-mail attachments in the real world as attachments to e-mail messages; web-based threats are downloaded to the target systems from an external web server in a repeatable way.
2. Sample cloud/internet accessibility. If the malware uses the cloud/internet connection to reach other sites in order to download further files and infect the system, care should be taken to make sure that this access is available to the malware sample in a safe way, such that the testing network is not at risk of getting infected.
3. Allow time for sample to run. Each sample should be allowed to run on the target system for 10 minutes to exhibit autonomous malicious behavior. This may include initiating connections to systems on the internet, or installing itself to survive a reboot (as may be the case with certain key-logging Trojans that only activate fully when the victim is performing a certain task).
4. Actions of the security software. If the security software prompts the user for a decision, the same action shall always be chosen. The action chosen should be removal or blocking, depending on what is being offered.
5. Recording of the result. After the security software has finished all blocking or removal steps, the final system state is captured, which allows us to determine the success in detecting, blocking and/or removing the threat by looking at the following points:
   a. Which network traffic has been allowed or blocked?
   b. Which modifications to the file system have been allowed or blocked?
   c. Which modifications to the registry have been allowed or blocked?
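Taken together, the pre-requisites and steps above amount to a simple per-sample loop: restore a clean image, deliver the sample, wait ten minutes, answer any product prompts with remove/block, then capture and diff the final system state. The Python sketch below compresses that loop; every helper function in it is a hypothetical stand-in for lab infrastructure (imaging server, mail injector, state-diff tooling), not AV-Test's actual code.

```python
# Hypothetical sketch of the per-sample test loop described above.
# The helper functions are stubs standing in for lab infrastructure;
# they are not AV-Test's actual tooling.

import time

RUN_TIME_SECONDS = 10 * 60  # step 3: let each sample run for 10 minutes

def restore_clean_image() -> None:
    """Pre-requisite 1: reimage the physical test machine (stub)."""

def deliver_sample(sample: str) -> None:
    """Step 1: deliver as e-mail attachment or repeatable web download (stub)."""

def answer_prompts(action: str) -> None:
    """Step 4: always answer product prompts with the same action (stub)."""

def capture_system_state() -> dict:
    """Step 5: snapshot network log, file system and registry (stub)."""
    return {"network": [], "filesystem": [], "registry": []}

def test_sample(sample: str, run_time: int = RUN_TIME_SECONDS) -> dict:
    restore_clean_image()
    deliver_sample(sample)           # internet stays reachable (step 2),
    time.sleep(run_time)             # but the test network is contained
    answer_prompts(action="remove")  # prefer removal, otherwise blocking
    # 5a-5c: allowed/blocked traffic plus file system and registry
    # modifications decide whether the threat was detected, blocked
    # and/or removed.
    return capture_system_state()

# test_sample("sample-001", run_time=1)  # shortened run for a dry test
```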
Appendix

Version information of the tested software

Developer, Distributor   Product name    Program version   Engine/signature version
 Lab                                     6.0.4.1424 (a)    n/a
                         Total for       4.5.0.1270        5400.18 / 6162.0000
                                         1.5.1993.0        1.1.6301.0 / 1.93.1733
                                         9.5.4             3.13.1 / 4.59G
                                         11.0.6100.645     20101.2.0.161 / 121111bd
Trend Micro              OfficeScan      10.5.1083         9.120.1004 / 1.240.00

Samples

The malware test corpus for the real-world protection test consisted of 25 samples, including direct downloads, drive-by-downloads and malicious mail attachments. The samples were collected, analyzed and chosen by AV-Test. All samples were tested on the same day they were discovered by AV-Test to ensure that only the latest threats were used.

The test corpus for the static detection test consisted of 6,059 samples of the FakeAV and ZBot families. The samples were collected during a time frame of six weeks before the start of the test. The analysis of the files and the decision which files should be included in the test set was made by AV-Test.

List of malware samples for the real-world test

URLs:
http://109.235.249.37/
http://174.139.92.41/go/
http://188.65.74.37/
http://74.82.183.52/
http://91.211.117.76/
http://alexastatscounter.info/...
http://askstats.info/...
http://carolinaporn.fileave.com/files/
http://clean-domain.com/...
http://data.fuskbugg.se/dipdip/
http://domainscrawl.info/
http://fedar.net/...
http://feraus.com/
http://fsdfile.ru/fraud_application/directory/
http://ghostbustards.ru/bunghole/
http://googlraiting.info/
http://ilker.org/
http://ipdnsue.ru/v2/out/
http://todohi5.powweb.com/file/
http://tosyahoo.info/...
http://www.cadstock.com/system/
http://www.futuremediagroup.se/
http://eventyline.com/...

Malicious e-mail attachments:
DHL Delivery Problem S.NR47621864
Facebook Service. Your password has been
Copyright © 2010 by AV-Test GmbH, Klewitzstr. 7, 39112 Magdeburg, Germany
Phone: +49 (0) 391 60754-60
Fax: +49 (0) 391 60754-69
Web: http://www.av-test.org