Real World Protection and Remediation Testing Report


A test commissioned by Symantec Corporation and performed by AV-Test GmbH. Date of the report: August 9th, 2012; last update: August 9th, 2012.

Executive Summary

In July 2012, AV-Test performed a comparative review of 13 home user security products to determine their real-world protection and remediation capabilities. In addition to the core products, dedicated removal tools as well as bootable rescue media (which are offered by some of the vendors) were added to the test.

The malware test corpus for the remediation test consisted of 32 samples (16 Fake Antivirus samples and 16 other assorted threats). The false positive corpus consisted of 30 known clean applications. To perform the single test runs, a clean Windows XP image was used on several identical PCs. This image was then infected with one of the malware samples. The next step was to install the security product, scan the PC and remove any threats that had been found. If one of these steps could not be carried out successfully, additional freely available removal tools or rescue media from the respective vendor were used, if available. The false positive testing was performed in the same way; however, the desired result was that none of the 30 clean applications were detected.

The malware test corpus for the real-world test consisted of 54 samples, including direct downloads and drive-by downloads. The false positive corpus consisted of 50 known clean applications. To perform the single test runs, a clean Windows XP image was used on several identical PCs. On this image, the security software was installed and then the infected website or e-mail was accessed. Any detection by the security software was noted. Additionally, the resulting state of the system was compared with the original state before the test in order to determine whether the attack was successfully blocked or not. For the false positive part, 50 known clean applications were installed and any false detection by the security products was noted.

The best result in the described test was achieved by the Symantec product. It reached the highest overall score as well as the highest individual scores in the remediation test, and it shared the top position with a few other vendors in the real-world test. Furthermore, no false positives occurred for this product.

Overview

With the increasing number of threats being released and spread through the Internet these days, the danger of getting infected is increasing as well. A few years back there were new

viruses released every few days; this has grown to several thousand new threats per hour.

Figure 1: New unique samples added per year to AV-Test's malware repository (2000-2010)

In the year 2000, AV-Test received more than 170,000 new samples, and in 2009 the number of new samples grew to over 19,000,000. The numbers continue to grow in the year 2012. The growth of these numbers is displayed in Figure 1.

The volume of new samples that have to be processed by anti-malware vendors in order to protect their customers can create problems. It is not always possible to successfully protect a PC in time. A PC can get infected even if up-to-date anti-malware software is installed, because signatures are provided only every few hours, which sometimes may be too late. Infections create financial loss, either because sensitive data is stolen or because the PC cannot be used for productive work anymore until the malware has been completely removed from the system. Therefore remediation techniques become more important to get an infected PC up and running again. In that process it is imperative that the cleaning process is reliable in two ways:

1. The malware and all of its components have to be removed and any malicious system changes have to be reverted.
2. Neither clean applications nor the system itself may be harmed by the cleaning process.

Fulfilling these two requirements is not easy. In order to handle the high volume of different malware samples and behaviors, it is necessary to apply more generic cleaning techniques, because there is simply no time to deploy a dedicated cleaning routine for every single malware sample.
As soon as generic techniques are used, the risk of false positives (and therefore the risk of harming the system and clean software) increases. On the other hand, malware uses a lot of techniques to avoid successful detection (e.g. rootkit techniques are used to hide files, registry entries and processes) or removal (e.g. the anti-malware software is blocked from starting up). In order to cope with these problems, some vendors provide specific removal tools and rescue media that don't face the problems of the regular anti-malware software.

All these aspects have been considered in this test and the corresponding details are presented on the next few pages.

Products Tested

The testing occurred in July 2012. AV-Test used the latest releases available at the time of the test of the following thirteen products:

Avast Software avast! Internet Security 7.0
AVG Internet Security 2012
Avira Internet Security 2012
Bitdefender Total Security 2013
ESET Smart Security 5
F-Secure Internet Security 2012
Kaspersky Internet Security 2012
McAfee Total Protection 2012
Microsoft Security Essentials 4
Panda Internet Security 2012
Symantec Norton Internet Security 2013
Trend Micro Titanium Maximum Security 2012
Webroot SecureAnywhere Complete 8

Methodology and Scoring

Platform

All tests have been performed on identical PCs equipped with the following hardware:

Intel Xeon Quad-Core X3360 CPU
4 GB RAM
500 GB HDD (Western Digital)
Intel Pro/1000 PL (Gigabit Ethernet) NIC

The operating system was Windows XP Service Pack 3 with only those hotfixes that were part of SP3, as well as all patches that were available on July 1st, 2012. Additionally, the following applications were installed to provide a vulnerable system for the URLs that use exploits to infect the system (developer, product, version):

Adobe Flash Player 10 ActiveX 10.0.12.36
Adobe Flash Player 10 Plugin 10.0.12.36
Adobe Acrobat Reader v8 or v9
ICQ ICQ6 6.00.0000
Sun Java SE Runtime Environment 6 Update 10 1.6.0.10
Mozilla Firefox 2.0.0.4 (en-us)
Apple QuickTime 7.3.0.70
Real Networks RealPlayer 10.5

WinZip Computing LP WinZip 10.0 (6667)
Yahoo! Inc Messenger 8.1.0.413

Testing Methodology

Remediation Test

The remediation test has been performed according to the methodology explained below.

1. Clean system for each sample. The test systems should be restored to a clean state before being exposed to each malware sample.
2. Physical Machines. The test systems used should be actual physical machines. No Virtual Machines should be used.
3. Internet Access. The machines had access to the Internet at all times, in order to use in-the-cloud queries if necessary.
4. Product Configuration. All products and their accompanying remediation tools or bootable recovery tools were run with their default, out-of-the-box configuration.
5. Infect test machine. Infect the native machine with one threat, reboot and make sure that the threat is fully running.
6. Sample Families and Payloads. No two samples should be from the same family or have the same payloads.
7. Remediate using all available product capabilities.
   a. Try to install the security product with default settings. Follow the complete product instructions for removal.
   b. If a. doesn't work, try a standalone fixtool/rescue tool solution (if available).
   c. If b. doesn't work, boot a standalone boot solution (if available) and use it to remediate.
8. Validate removal. Manually inspect the PC to validate proper removal and artifact presence.
9. Score removal performance. Score the effectiveness of the tool and the security solution as a whole using the agreed-upon scoring system.
10. Overly Aggressive Remediation. The test should also measure how aggressive a product is at remediating. For example, some products will completely remove the hosts file or remove an entire directory when it is not necessary to do so for successful remediation. This type of behavior should count against the product.
11. False Positive Testing.
The test should also run clean programs and applications to make sure that products do not mistakenly remove such legitimate software.

In addition to the above, the following items had to be considered:

Fixtools: No threat-specific fixtools should be used for any product's remediation. Only generic standalone remediation tools/fixtools and bootable tools should be used.

Licensed vs. unlicensed bootable or remediation tools: Only licensed bootable or other generic remediation tools offered by vendors as part of their security product, or pointed to by their infection UI workflow, should be included in the test. No unlicensed tools should be used in the test.

Microsoft's Malicious Software Removal Tool: This is part of Windows Update and as such a part of the Windows OS. This tool should not be used as a second layer of protection for any participating vendor's products.

Real-World Test

The real-world test has been performed according to the methodology explained below.

1. Clean system for each sample. The test systems should be restored to a clean state before being exposed to each malware sample.
2. Physical Machines. The test systems used should be actual physical machines. No Virtual Machines should be used.
3. Product Cloud/Internet Connection. The Internet should be available to all tested products that use the cloud as part of their protection strategy.
4. Product Configuration. All products were run with their default, out-of-the-box configuration.
5. Sample variety. In order to simulate real-world infection techniques, malware samples should be weighted heavily (~80 per cent) towards web-based threats; of these, half should be manual downloads like Fake AV and half should be downloads that leverage some type of exploited vulnerability, i.e. a drive-by download. A small set of the samples (5-10%) may include threats attached to e-mails.
6. Unique Domains per sample. No two URLs used as samples for this test should be from the same domain (e.g. xyz.com).
7. Sample introduction vector. Each sample should be introduced to the system in as realistic a method as possible. This includes sending samples that are collected as e-mail attachments in the real world as attachments to e-mail messages. Web-based threats are downloaded to the target systems from an external web server in a repeatable way.
8. Real World Web-based Sample User Flow. Web-based threats are usually accessed by unsuspecting users by following a chain of URLs.
For instance, a Google search on some trending words may return URLs that, when clicked, redirect to another link, and so on, until the user arrives at the final URL which hosts the malicious sample file. This test should simulate such real-world user URL flows before the final malicious file download happens. This ensures that the test exercises the layers of protection that products provide during this real-world user URL flow.
9. Sample Cloud/Internet Accessibility. If the malware uses the cloud/Internet connection to reach other sites in order to download other files and infect the system, care should be taken to make sure that cloud access is available to the malware sample in a safe way, such that the testing network is not under threat of getting infected.
10. Allow time for sample to run. Each sample should be allowed to run on the target system for 10 minutes to exhibit autonomous malicious behavior. This may include initiating connections to systems on the Internet, or installing itself to survive a reboot (as may be the case with certain key-logging Trojans that only activate fully when the victim is performing a certain task).
11. Measuring the effect. A consistent and systematic method of measuring the impact of malicious threats and the ability of the products to detect them shall be implemented. The following should be observed for each tested sample:

a. Successful Blocking of each threat. The method of notification or alert should be noted, including any request for user intervention. If user intervention is required, the prompted default behavior should always be chosen. Any additional downloads should be noted. The product should be able to block the malware from causing any infection on the system. This could mean that the malware executes on the system but is taken out by the product before it can perform any malicious action.
b. Successful Neutralization of each threat. The notification/alert should be noted. If user intervention is required, the prompted default behavior should always be chosen. Successful neutralization should also include any additional downloads. Additionally, it should be indicated whether all aspects of the threat were completely removed, or just all active aspects of the threat.
c. Threat compromises the machine. Information on which threat aspects were found on the system and missed by the product should be provided.

Efficacy Rating

Remediation Test

For each sample tested, points are applied according to the following schedule:

a. Malware completely removed (5)
b. Malware removed, some unimportant traces left (4)
c. Malware removed, but annoying or potentially dangerous problems remaining (2)
d. Malware not removed (0)
e. Product is overly aggressive, e.g. takes out the entire hosts file or an entire directory containing the threat file (-2)
f. Product's remediation renders the machine unbootable or unusable (-5)

The scoring should not take into consideration which of the available techniques were needed to remove the malware; all techniques should, however, be applied. When a product cleans out the entries in the hosts file that relate to that very product and leaves the machine uninfected and the product functional and updateable, it should be given full credit for remediation, even if entries for other security vendors remain in the hosts file.
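As a rough illustration, the remediation schedule above can be expressed as a lookup table. This is our own sketch, not part of the test methodology; the outcome labels are our shorthand for categories a-f:

```python
# Sketch of the remediation scoring schedule; the keys are our own
# shorthand for categories a-f above, not AV-Test's terminology.
REMEDIATION_POINTS = {
    "completely_removed": 5,       # a.
    "unimportant_traces_left": 4,  # b.
    "problems_remaining": 2,       # c.
    "not_removed": 0,              # d.
    "overly_aggressive": -2,       # e.
    "machine_unusable": -5,        # f.
}

def remediation_score(outcomes):
    """Sum the per-sample points for one product across all tested samples."""
    return sum(REMEDIATION_POINTS[o] for o in outcomes)

# With the 32 samples used in this test, cleaning every sample perfectly
# yields the maximum of 160 points quoted in the results section.
print(remediation_score(["completely_removed"] * 32))  # 160
```

Note how the maximum of 160 points reported later (32 samples at 5 points each) falls directly out of this schedule.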
Real-World Test

For each sample tested, points are applied according to the following schedule:

a. Malware is blocked by the product from causing any infection on the system (+2)
b. Malware infects the system but is neutralized by the product such that the malware remnants cannot execute any more (+1)
c. Malware infects the system and the product is unable to stop it (-2)

The scoring should not depend on which of the available protection technologies were needed to block or neutralize the malware. All technologies and the alerts seen should, however, be noted as part of the report.
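The real-world schedule can be sketched the same way (again, the outcome labels are our own shorthand, not AV-Test's terms):

```python
# Sketch of the real-world scoring schedule; labels are our shorthand.
REAL_WORLD_POINTS = {
    "blocked": 2,       # a. no infection occurs at all
    "neutralized": 1,   # b. infected, then rendered inert
    "compromised": -2,  # c. product unable to stop the threat
}

def real_world_score(outcomes):
    """Sum the per-sample points for one product."""
    return sum(REAL_WORLD_POINTS[o] for o in outcomes)

# With the 54 samples used in this test, blocking everything yields the
# maximum of 108 points that appears in the results section.
print(real_world_score(["blocked"] * 54))  # 108
```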

Samples

Remediation Test

Two distinct sets of malware were used for the testing. The first set contained 16 Fake Antivirus programs and the second set contained 16 other assorted threats. In addition, 30 known clean programs were used for the false positive testing. The details of the samples used can be found in the appendix.

Real-World Test

The malware set contains 54 samples, split into 52 web-based threats and 2 e-mails with malicious attachments. In addition, 50 known clean programs were used for the false positive testing. The details of the samples used can be found in the appendix.

Test Results

To calculate an overall score that shows how well a product protects a user, whether they are already infected or are proactive and already have a product installed, a normalized sum of the overall scores of the two individual tests has been created. The maximum score that could be achieved was 50 for each test, 100 in total.

Figure 2: Overall Protection and Remediation Score (normalized per-product protection and remediation scores)

The best overall result has been achieved by Symantec with 96 out of 100, closely followed by Kaspersky with 93, Bitdefender (92) and F-Secure (90). Avast, AVG, Avira, ESET and Webroot also score over 80. The remaining products score between 70 and 79 (McAfee, Panda and Trend Micro), except Microsoft, which comes in last with 68 points. The individual scores for the two tests, remediation as well as real-world, can be found below.

Remediation Test

Symantec achieved the best overall removal score, as can be seen in Figure 3. It should be kept in mind that the numbers shown here are the result of the combined effort of the core product and additional removal tools and rescue media, where available.

Figure 3: Overall Remediation Score (per product)

The maximum score that could be reached was 160. The best score was 147, achieved by Symantec, closely followed by Webroot with 142. The worst score was 112. The average score was 134 and the median score 134, which means that eight products were better than or equal to the average and five products were worse. The third-best product (ESET) is very close with 140 points, as is the fourth (Bitdefender) with 138.

When looking at the individual scores, similar observations can be made. In the removal of other malware, Bitdefender (72) and Norton (74) again gained the highest scores of all products. Out of a maximum achievable score of 80, the worst result was 58, while the average was 67 and the median 69. Eight products scored better than the average and five were worse. ESET and Kaspersky took third place with 71 points, and Microsoft, Panda and Webroot share fourth place with 69.

The scores for the removal of Fake AV show the same picture. Out of the maximum score of 80 in the Fake AV category, Norton and Webroot share first place with 73 points each. ESET, F-Secure, Trend Micro and McAfee, with 69 each, are very close behind.

In the false positive testing section, only two products reported anything. Avira as well as McAfee warned about network traffic from CyberLink 12; additionally, Avira warned about network traffic of Google Desktop 5.9. Neither warning affected the usage of the programs.

A few observations can be made when looking at the individual results. Symantec, ESET and Webroot perform well on both test sets and therefore achieve the first three spots in the test. What is especially interesting is the very good result for the remediation of Fake AV software. This category

shows better results than the other malware set. It seems the vendors have recognized that this is one of the most prevalent threats to users.

Real-World Test

F-Secure, Kaspersky and Symantec achieved the best overall score. This is the combined result of the individual test sets that the products were tested against.

Figure 4: Overall Protection Score (per product)

In Figure 4 the overall result is given. Out of 108 possible points, F-Secure, Kaspersky and Symantec achieved 108, the best result in the test. These products are closely followed by Avast, AVG and Bitdefender. Avira and ESET are the only other products with a score over 90. The average was 91 and the median 98. All in all, eight products scored at or above the average.

Besides the detection and blocking of malware, it is important to have a well-balanced product, so that no clean applications are blocked or detected as malware. Therefore, 50 widely known applications were used to determine whether any product would report them as suspicious or malicious. AVG, Avira, F-Secure, Kaspersky, McAfee and Panda presented warnings or even blocked the execution of certain clean software; the other products did not trigger any false positives. F-Secure, McAfee and Panda warned only, but did not block anything. AVG warned about six applications and blocked two. Avira warned about six and blocked seven. Kaspersky blocked one application.

Figure 5: False positive results (warning messages and blocked programs/installations per product; both negative)

The individual scores clearly show that there are differences between the tested products. While the results are encouraging, since all products showed a good performance, some products are clearly at the top. A few products successfully combine static and dynamic detection with URL blocking or exploit detection. These achieve, not surprisingly, the best scores in the test and provide the most reliable protection: Avast, AVG, Bitdefender, F-Secure, Kaspersky and Norton. However, the other products are introducing similar features and are therefore close to the top products.
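As a closing note on the scoring, the overall 100-point figures in this section are presumably obtained by scaling each test's raw score to a 50-point half before summing. The function below is our sketch of that normalization, not AV-Test's published formula:

```python
def overall_score(remediation_raw, real_world_raw,
                  remediation_max=160, real_world_max=108):
    """Normalize each raw test score to a 50-point maximum and add the halves.

    Sketch of the normalization described in the report; the maxima are the
    160 (remediation) and 108 (real-world) points stated in the results.
    """
    return (remediation_raw / remediation_max * 50 +
            real_world_raw / real_world_max * 50)

# Symantec's raw results (147 of 160 remediation, 108 of 108 real-world)
# round to the overall 96 out of 100 given in the text.
print(round(overall_score(147, 108)))  # 96
```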

Appendix

Version information of the tested software (developer/distributor | product name | program version | engine/signature version):

Avast Software | avast! Internet Security 7.0 | 7.0.1456 | 120719-1
AVG | AVG Internet Security 2012 | 2012.0.2197 | 2437/5141
Avira | Avira Internet Security 2012 | 12.0.0.1088 | 8.02.10.114 / 7.11.36.218
Bitdefender | Bitdefender Internet Security 2013 | 16.16.0.1349 | 7.42929
ESET | ESET Smart Security 5 | 5.2.9.1 | 1363/7311
F-Secure | F-Secure Internet Security 2012 | 12.56 build 100 | Aquarius 11.00.01 (2012-07-19), Hydra 5.07.7855 (2012-07-19), Gemini 3.02.101 (2012-06-25), Online 11.00.18240 (unknown), BlackLight 2.4.1099 (2009-09-22) / 2012-07-19_04
Kaspersky Lab | Kaspersky Internet Security 2012 | 12.0.0.374 (i) | 16.5.0.1
McAfee | McAfee Total Protection 2012 | 15.0.302 | 5400.1158/6776
Microsoft | Microsoft Security Essentials | 4.0.1526.0 | 1.1.8601.0/1.131.220.0
Panda Security | Panda Internet Security 2012 | 17.01.00 | 2.3.1511.0
Symantec | Norton Internet Security 2013 | 20.0.0.132 | n/a
Trend Micro | Trend Micro Titanium Maximum Security 2012 | 5.2.1035 | 9.500.1008/9.267.95
Webroot | Webroot SecureAnywhere Complete | 8.0.1.203 | n/a

Rescue media and removal tools (developer/distributor | removal tool | rescue media | comment):

Avast | - | - | Boot Time scan has been used when possible
AVG | - | Boot CD 12.0.1786 |
Avira | Avira Removal Tool 3.0.1.17 | Boot CD 3.7.16 |
Bitdefender | - | Boot CD 2.1 |
ESET | - | Boot CD 5.2.91 |
F-Secure | Fseasyclean 1.2.18070.17 | Boot CD 3.14-44905 |
Kaspersky | Virus Removal Tool 11.0 | Boot CD 10.0 |
McAfee | Stinger 10.2.0.686 | - |
Microsoft | - | Microsoft System Sweeper/Windows Defender Offline Rescue Disk 4.0.1517 |
Panda | - | Boot CD |
Symantec | Norton Power Eraser 3.0.0.21 | Boot CD 3.0.0.21 |
Trend Micro | SysClean 1.2.2.1001 | Boot CD 9.2.1012 |
Webroot | - | - |

List of used malware samples (Remediation Test)

Other malware (SHA256):

0x1111a635e29502f1569c6c4b5f94efda0127c5908cc945ea962876c0af7efeaf
0x2a41085c032ba5c81f625ad9a816c5c08151aa450f71a97179803708e8e229c2
0x45269a2adcc6763a7aa521b6606812408054f5b02a0238a7f9d22ec6bb045256
0x5cc55e2def37e7d967d74cbe6ef27fe788be556c7965cd1bc80f9615de472b42
0x7f30bb0efa6b2cdda287461730e0170feb3dd450a8c246d6ad74d3b9ea012590
0x8433c63a8e7b8ca589e88f008c2904a4fc48fab32d973afefa38ebe73c2656f9
0x9ddcf44082189da2d046dacf4c590d9f727a51d7a211b1747d35b4e3921a4a94
0xac805be314ba5f3a10d75ded8b4a639f9f146bb8fc359390bc1334db751b7331
0xb8d03d6c8a160038ac6ed6b73d08bb3d68bb6c39f659b3be7ff61727a995e568
0xbbd87b28c777ff9906da6ccec53cab0dd4a246a5923090b524506a70c7019285
0xc6cc99d32fb926d961dc2452fc4d804c8f9e7da9418c2b03cc8e1e7d8a5f8309
0xd5ebd68de7eda3eb5e7c22505a4cbbe4648aa3427d5fe2db625d686230591ee1
0xdadbdaebbdf15bf75ff19dc5c9340d8d339b6b352d38093ec24302cf0d3135db
0xdf8d9266ba0eafd857b0d91c377a1b7e82b79fe7860ca58b5089effc9c846018
0xed9d76569987e2a48a13e16a1057e7ccf6761cf47587694b7b9a6669c4db3845
0xf2adf4ea2487399002da8ccf3d7996272345e17efc4048abd1c1bc9f0317962a

Fake AV (SHA256):

0x178759085f5f14ffc7cc68b99a48fe0df536ffa1877d26d9879cc64fbb2910c1
0x21a51341a8852076f807aeaea4a99ae562b60b5967086dd8cc1796485d954c53
0x24940ca59d0b61267b3939efae57b959e7d44b9d1835b3e5228c5df2ec225979
0x43ebf32ff184df87ecfd68de0347bb6e8179f32c07207d088037aa234dd83a14
0x4850e1acaaa981ef8d4410e1416299a02cec4e699c167617c4aad9e7498b13fb
0x56aca05bb4159920398e001ae88ea4dc507ece34765fe16d31034ad07d8090ad
0x6552027294e8e340c8a7ac7d3bf6e155a262bb08c9967c351cbdfd89c3a4dc08
0x72aa3ca948675328eed80a3086474128cd955d0a05b23ec543d1ca723507fef3
0x9d677cbcfaaad541ef391d4c2c7ebeb57dca0b55648c55f200933e995f40fec7
0xa2cb3a7920f570dfe8870344713b906c443e0ffc817ed1e9a3a4c8592172cdc9
0xb06bd0c214b924f5376d151f1573c6234a2a2cedb5170c46035dc3dbba9ef856
0xb8017feea7157b02b3a7fbda54ce4b297f5a0a95965a08f0c68eb6d68d095d85
0xc3d8eb8dab13b016d90b938cc3f1416a9ecc043971c8bab2f272e9f8c45ab338
0xcd1781ebf5e5bc7171d97183fa6d8e517b1081d5294a99c73d02e6825e0aee6d
0xe1ff7b005441eaa9de27111a3e50eec667d600c7b3c456e1f1ad77e280349dfd
0xf98ec230ff36738a3fa9588adf9b35f3024dff3b2ce94c9eccd6fe294e1741e8

List of used clean samples (Remediation Test)

DVR Studio HD 2.23.5
RealPlayer 15.0.5.109
Ashampoo Photo Optimizer 5
MediaMonkey 4.0.5.1496
MetroTwit 10.0.30319.1
K-Meleon 1.5.4
FreeRingtoneMaker 2.0.1.3
Alcohol52 2.0.2.3931
Windows Media Player 11
CPU-Z 1.60.1
TrueCrypt 7.1a
PowerDVD 12.0.9026.1417
XAMPP 1.7.7-VC9
Google Drive 1.2.3101.4994
Audiograbber 1.83
Google SketchUp 8.0.14346
DeepBurner 1.9.0.228
VirtualDJ Home 7.0.5
Network Stumbler 0.4.0
TUGZip 3.5.0.0
7-Zip 9.20
Adobe Reader 10.1.3
Adobe Flash Player 11.3.300.262
iTunes 10.6.3.25
Firefox 13.0.1
Picasa 3.9
Daemon Tools Lite 4.45.4

Google Desktop 5.9.1005.12335
Google Earth 6.1.0.5001
.NET Framework 4
(Distribution of the listed clean applications: millions of users)

List of used malware samples (Real-World Test)

Direct Downloads and Drive-By-Downloads:

(2269) http://www.pincodesofindia.in/pincodes-of-india/tamilnadu/theni/theni%20.exe
(2272) http://pluglist.mybutt.net/lurker/attach/3@20050703.235444.d0bfc87f.attach
(2296) http://creativeeagle.com/images/images/headerze.exe
(2300) http://79.137.214.18/fff/219.exe
(2308) http://cewekcamfrog.altervista.org/windows_update.exe
(2317) http://146.185.246.113/p98a.exe
(2355) http://kodolanyiapartman.hu/qq9mq.exe
(2448) http://64.89.16.188/xprt_fyn125_1002.exe?r=3d5002718
(2458) http://live.fai.novacon.net/iil92a0o/rjghppk0.exe
(2816) http://www.1seslisohbet.com/wpcontent/themes/rt_affinity_wp/js/rokbox/themes/sample/sample.exe
(2823) http://wdxdownloadmanager.org/downloadfast_0.6.2.exe
(2824) http://fbt.co.cc/value.exe
(2825) http://www.turkiyesohbet.gen.tr/turkiye-sohbetscript.exe
(2828) http://kavox.net/servidor.exe
(2857) http://racon.com.au/logs/rapport.pdf.exe
(2927) http://www.ipsi.com.br/c4j5uu.exe
(2977) http://173.201.98.138/xepf.exe
(2994) http://www.moviestvmusic.com/wpcontent/uploads/galleries/images/images.exe
(3039) http://millanta.com/kywjsdk7/kanx.exe
(3043) http://zondersnoer.nl/rapport.pdf.exe
(3053) http://td-zone.org/b4smix.exe
(3054) http://bountyxtubetjq.co.cc/latest/xxxvideo.avi.exe
(3056) http://sadsadasdasd1.googlecode.com/files/clickfifa2.exe
(3080) http://146.185.246.65/bren.exe
(3088) http://www.sovaya.com/938942-br.youtube.com
(3089) http://146.185.246.110/dqs.exe
(3133) http://filesabout.aboutisrael.co.il/zebyf7az/htxcrlay.exe
(3158) http://bountyxtuberzt.co.cc/latest/amateur_dog_sex_01.avi.exe
(3163) http://ateneaconsultora.com.ar/ctty7v1.exe
(3169) http://tempo-www.asepta.com/leq.exe
(3222) http://thecoffin.ghostdesign.com/nhk44jzg.exe
(3276) http://dedetizadoraalexandrelopes.com.br/7aen.exe
(3288)
http://42up.com/browser/rundll.exe
(3295) http://acessorest3.dominiotemporario.com/fotos/fotinhas.exe
(3304) http://piramidemusic.com.br/ebx1plkt.exe
(3317) http://stat.forcedlo.in/tr/bl/clc.exe
(3324) http://viewncr.com/users_content/users_images/users_logos/users_logos%20.exe
(3352) http://210.71.66.36/download/97-11-10_10/97-11-10_10.exe
(3353) http://boostdeeming.org/template/fastrunintel.exe
(3372) http://fallspecialsvacation.com/auto/functions/emad11.exe
(3393) http://denicola.com/coliposte_tracking_number.exe
(3410) http://8ui.org/mess/after-buying-my-notebookincomplete/www%3b%20get%20bookscanner%20(.com)%20free%20in%20exchange%20for%20book%20scans/www%3B%20Get%20bookscanner%20(.com)%20free%20in%20exchange%20for%20book%20scans%60.exe
(3414) http://207.204.5.234/yq563jw.exe
(3432) http://veluwegroen.nl/pics/rapport.pdf.exe
(3476) http://agilxsoftware.com/ua7cyjvh/w6ps4m.exe
(3486) http://www.moneymakersigns.com/8kefmrlw/1xwnem.exe
(3600) http://kukurus.ru/exp/load2.exe
(3631) http://vl-auto-volam-mienphi-2012.googlecode.com/files/VLAuto.exe
(3660) http://www.lecfib.org/joomla/images/youtube.exe
(3710) http://correiotelegrama.com/correios/produtos_servicos/telegrama/telegrama_915204678.br.2012.exe
(3726) http://theempiregame.net/ihig7.exe
(3746) http://marianx.altervista.org/test/systr.exe

Malicious E-Mails:

(02) "here is a pic of my pussy attached"
(04) "United Parcel Service notification #88372610"

List of used clean samples (Real-World Test)

AnyPassword 1.44

DVR Studio HD 2.23.5
RealPlayer 15.0.5.109
Ashampoo Photo Optimizer 5.0.2
Songbird 2.0.0-2311
TeamViewer 7.0.13852
Thunderbird 13.0.1
CDBurnerXP 4.4.1.3243
Free YouTube to MP3 Converter 3.11.24.608
VLC Player 2.0.2
MediaMonkey 4.0.5.1496
MetroTwit 10.0.30319.1
K-Meleon 1.5.4
FreeRingtoneMaker 2.0.1.3
Alcohol52 2.0.2.3931
Java RE 7 Update 5
ImgBurn 2.5.7.0
IrfanView 4.33
Winamp 5.6.23
Opera 12
Skype 5.9.0.123
Recuva 1.42.544
FastStone Image Viewer 4.6
Defraggler 2.10.424
FrostWire 3.7.0
Windows Media Player 11
CPU-Z 1.60.1
TrueCrypt 7.1a
PowerDVD 12.0.9026.1417
XAMPP 1.7.7-VC9
Google Drive 1.2.3101.4994
Audiograbber 1.83
Google SketchUp 8.0.14346
DeepBurner 1.9.0.228
VirtualDJ Home 7.0.5b
Network Stumbler 0.4.0
TUGZip 3.5.0.0
7-Zip 9.20
Adobe Reader 10.1.3
Adobe Flash Player 11.3.300.262
iTunes 10.6.3.25
Firefox 13.0.1
Picasa 3.9
CCleaner 3.20
CloneDVD 2.9.3.0
RocketDock 1.3.5
Daemon Tools Lite 4.45.4
Google Desktop 5.9.1005.12335
Google Earth 6.1.0.5001
.NET Framework 4
(Distribution of the listed clean applications: millions of users)

Copyright 2012 by AV-Test GmbH, Klewitzstr. 7, 39112 Magdeburg, Germany
Phone +49 (0) 391 60754-60, Fax +49 (0) 391 60754-69, Web http://www.av-test.org