Real World Protection and Remediation Testing Report


A test commissioned by Symantec Corporation and performed by AV-Test GmbH
Date of the report: August 9th, 2012, last update: August 9th, 2012

Executive Summary

In July 2012, AV-Test performed a comparative review of 13 home user security products to determine their real-world protection and remediation capabilities. In addition to the core products, dedicated removal tools as well as bootable rescue media (which are offered by some of the vendors) were included in the test.

The malware test corpus for the remediation test consisted of 32 samples (16 Fake Antivirus samples and 16 other assorted threats). The false positive corpus consisted of 30 known clean applications. To perform the single test runs, a clean Windows XP image was used on several identical PCs. This image was then infected with one of the malware samples. The next step was to install the security product, scan the PC and remove any threats that were found. If one of these steps could not be carried out successfully, additional freely available removal tools or rescue media from the respective vendor were used, if available. The false positive testing was performed in the same way; however, the desired result was to not detect any of the 30 clean applications.

The malware test corpus for the real-world test consisted of 54 samples, including direct downloads and drive-by downloads. The false positive corpus consisted of 50 known clean applications. To perform the single test runs, a clean Windows XP image was used on several identical PCs. On this image, the security software was installed and then the infected website or e-mail was accessed. Any detection by the security software was noted. Additionally, the resulting state of the system was compared with the original state before the test in order to determine whether the attack was successfully blocked or not.
For the false positive part, 50 known clean applications were installed and any false detection by the security products was noted.

The best result in the described test was achieved by the Symantec product. It reached the highest overall score as well as the highest individual scores for the remediation test, and it shared the top position with a few other vendors in the real-world test. Furthermore, no false positives occurred for this product.

Overview

With the increasing number of threats being released and spread through the Internet these days, the danger of getting infected is increasing as well. A few years back there were new viruses released every few days; this has grown to several thousand new threats per hour.

Figure 1: New unique samples added to AV-Test's malware repository per year

In the year 2000, AV-Test received more than 170,000 new samples, and in 2009 the number of new samples grew to over 19,000,000. The numbers continue to grow; this growth is displayed in Figure 1.

The volume of new samples that have to be processed by anti-malware vendors in order to protect their customers can create problems. It is not always possible to successfully protect a PC in time: a PC can get infected even if up-to-date anti-malware software is installed, because signatures are provided only every few hours, which sometimes may be too late. Infections create financial loss, either because sensitive data is stolen or because the PC cannot be used for productive work anymore until the malware has been completely removed from the system. Therefore remediation techniques become more important to get an infected PC up and running again. In that process it is imperative that the cleaning process is reliable in two ways:

1. The malware and all of its components have to be removed and any malicious system changes have to be reverted.
2. Neither clean applications nor the system itself must be harmed by the cleaning process.

Fulfilling these two requirements is not easy. In order to handle the high volume of different malware samples and behaviors, it is necessary to apply more generic cleaning techniques, because there is simply no time to deploy a dedicated cleaning routine for every single malware sample. As soon as generic techniques are used, the risk of false positives (and therefore the risk of harming the system and clean software) increases.
On the other hand, malware uses a lot of techniques to avoid successful detection (e.g. rootkit techniques are used to hide files, registry entries and processes) or removal (e.g. the anti-malware software is blocked from starting up). In order to cope with these problems, some vendors provide specific removal tools and rescue media that don't face the problems of the regular anti-malware software.

All these aspects have been considered in this test and the corresponding details are presented on the next few pages.

Products Tested

The testing occurred in July 2012. AV-Test used the latest releases available at the time of the test of the following thirteen products:

Avast Software avast! Internet Security 7.0
AVG Internet Security 2012
Avira Internet Security 2012
Bitdefender Total Security 2013
ESET Smart Security 5
F-Secure Internet Security 2012
Kaspersky Internet Security 2012
McAfee Total Protection 2012
Microsoft Security Essentials 4
Panda Internet Security 2012
Symantec Norton Internet Security 2013
Trend Micro Titanium Maximum Security 2012
Webroot SecureAnywhere Complete 8

Methodology and Scoring

Platform

All tests have been performed on identical PCs equipped with the following hardware:

Intel Xeon Quad-Core X3360 CPU
4 GB RAM
500 GB HDD (Western Digital)
Intel Pro/1000 PL (Gigabit Ethernet) NIC

The operating system was Windows XP Service Pack 3 with only those hotfixes that were part of SP3, as well as all patches that were available on July 1st, 2012. Additionally, the following applications have been installed to provide a vulnerable system for the URLs that use exploits to infect the system.

Developer, Product, Version:
Adobe Flash Player 10 ActiveX
Adobe Flash Player 10 Plugin
Adobe Acrobat Reader v8 or v9
ICQ ICQ
Sun Java SE Runtime Environment 6 Update
Mozilla Firefox ( ) (en-us)
Apple QuickTime
Real Networks RealPlayer

WinZip Computing LP WinZip 10.0 (6667)
Yahoo! Inc Messenger

Testing methodology

Remediation Test

The remediation test has been performed according to the methodology explained below.

1. Clean system for each sample. The test systems should be restored to a clean state before being exposed to each malware sample.
2. Physical Machines. The test systems used should be actual physical machines. No virtual machines should be used.
3. Internet Access. The machines had access to the Internet at all times, in order to use in-the-cloud queries if necessary.
4. Product Configuration. All products and their accompanying remediation tools or bootable recovery tools were run with their default, out-of-the-box configuration.
5. Infect test machine. Infect the native machine with one threat, reboot and make sure that the threat is fully running.
6. Sample Families and Payloads. No two samples should be from the same family or have the same payloads.
7. Remediate using all available product capabilities.
   a. Try to install the security product with default settings. Follow complete product instructions for removal.
   b. If a. doesn't work, try a standalone fixtool/rescue tool solution (if available).
   c. If b. doesn't work, boot a standalone boot solution (if available) and use it to remediate.
8. Validate removal. Manually inspect the PC to validate proper removal and artifact presence.
9. Score removal performance. Score the effectiveness of the tool and the security solution as a whole using the agreed-upon scoring system.
10. Overly Aggressive Remediation. The test should also measure how aggressive a product is at remediating. For example, some products will completely remove the hosts file or remove an entire directory when it is not necessary to do so for successful remediation. This type of behavior should count against the product.
11. False Positive Testing. The test should also run clean programs and applications to make sure that products do not mistakenly remove such legitimate software.
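The escalation in step 7 (core product first, then a standalone fixtool, then bootable rescue media) can be sketched as a loop over the available layers. The helper callables below are hypothetical; the actual test was carried out manually on physical machines.

```python
# Sketch of the layered remediation workflow in step 7. Each layer is a
# callable that takes a sample and returns True on successful removal;
# vendors without a given tool are represented by None.

def remediate(sample, layers):
    """Apply each available remediation layer in order (7a, 7b, 7c)
    until one reports success."""
    for layer in layers:
        if layer is None:
            continue  # this vendor offers no such tool
        if layer(sample):
            return True  # threat remediated at this layer
    return False  # all available layers failed
```

Per the scoring rules, the result counts the same regardless of which layer finally succeeded.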
In addition to the above, the following items had to be considered:

Fixtools: No threat-specific fixtools should be used for any product's remediation. Only generic remediation standalone tools/fixtools and bootable tools should be used.

Licensed vs. unlicensed bootable or remediation tools: Only licensed bootable or other generic remediation tools offered by vendors as part of their security product, or pointed to by their infection UI workflow, should be included in the test. No unlicensed tools should be used in the test.

Microsoft's Malicious Software Removal Tool: This is part of Windows Update and as such a part of the Windows OS. This tool should not be used as a second layer of protection for any participating vendor's products.

Real-World Test

The real-world test has been performed according to the methodology explained below.

1. Clean system for each sample. The test systems should be restored to a clean state before being exposed to each malware sample.
2. Physical Machines. The test systems used should be actual physical machines. No virtual machines should be used.
3. Product Cloud/Internet Connection. The Internet should be available to all tested products that use the cloud as part of their protection strategy.
4. Product Configuration. All products were run with their default, out-of-the-box configuration.
5. Sample variety. In order to simulate real-world infection techniques, malware samples should be weighted heavily (~80 per cent) towards web-based threats (of these, half should be manual downloads like Fake AV and half should be downloads that leverage some type of exploited vulnerability, i.e. a drive-by download). A small set of the samples (5-10%) may include threats attached to e-mails.
6. Unique Domains per sample. No two URLs used as samples for this test should be from the same domain (e.g. xyz.com).
7. Sample introduction vector. Each sample should be introduced to the system in as realistic a method as possible. This includes sending samples that are collected as attachments in the real world as attachments to e-mail messages. Web-based threats are downloaded to the target systems from an external web server in a repeatable way.
8. Real World Web-based Sample User Flow. Web-based threats are usually accessed by unsuspecting users by following a chain of URLs.
For instance, a Google search on some high-trend words may give URLs in the results that, when clicked, could redirect to another link and so on, until the user arrives at the final URL which hosts the malicious sample file. This test should simulate such real-world user URL flows before the final malicious file download happens. This ensures that the test exercises the layers of protection that products provide during this real-world user URL flow.
9. Sample Cloud/Internet Accessibility. If the malware uses the cloud/Internet connection to reach other sites in order to download other files and infect the system, care should be taken to make sure that cloud access is available to the malware sample in a safe way, such that the testing network is not under the threat of getting infected.
10. Allow time for sample to run. Each sample should be allowed to run on the target system for 10 minutes to exhibit autonomous malicious behavior. This may include initiating connections to systems on the Internet, or installing itself to survive a reboot (as may be the case with certain key-logging Trojans that only activate fully when the victim is performing a certain task).
11. Measuring the effect. A consistent and systematic method of measuring the impact of malicious threats and the ability of the products to detect them shall be implemented. The following should be observed for each tested sample:

a. Successful Blocking of each threat. The method of notification or alert should be noted, including any request for user intervention. If user intervention is required, the prompted default behavior should always be chosen. Any additional downloads should be noted. The product should be able to block the malware from causing any infection on the system. This could mean that the malware executes on the system but is taken out by the product before it tries to perform any malicious action.
b. Successful Neutralization of each threat. The notification/alert should be noted. If user intervention is required, the prompted default behavior should always be chosen. Successful neutralization should also include any additional downloads. Additionally, indicate whether all aspects of the threat were completely removed or just all active aspects of the threat.
c. Threat compromises the machine. Information on what threat aspects were found on the system and were missed by the product should be provided.

Efficacy Rating

Remediation Test

For each sample tested, apply points according to the following schedule:

a. Malware completely removed (5)
b. Malware removed, some unimportant traces left (4)
c. Malware removed, but annoying or potentially dangerous problems remaining (2)
d. Malware not removed (0)
e. Product is overly aggressive (e.g. takes out the entire hosts file, the entire directory containing the threat file, etc.) (-2)
f. Product's remediation renders the machine unbootable or unusable (-5)

The scoring should not take into consideration which of the available techniques were needed to remove the malware. All techniques should, however, be applied. When a product cleans out the entries in the hosts file that relate to that very product and leaves the machine uninfected and the product functional and updateable, it should be given full credit for remediation, even if entries for other security vendors remain in the hosts file.
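As a sketch, the schedule above maps each per-sample outcome to points and sums them over all samples. The outcome labels here are illustrative, not AV-Test's terminology.

```python
# Point values from the remediation scoring schedule (a-f above);
# the dictionary keys are illustrative labels.
REMEDIATION_POINTS = {
    "completely_removed": 5,    # a
    "unimportant_traces": 4,    # b
    "dangerous_remnants": 2,    # c
    "not_removed": 0,           # d
    "overly_aggressive": -2,    # e
    "machine_unusable": -5,     # f
}

def remediation_score(outcomes):
    """Sum the per-sample points for one product."""
    return sum(REMEDIATION_POINTS[o] for o in outcomes)
```

With 32 samples, a perfect run reaches 32 x 5 = 160 points, the maximum cited in the results section.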
Real-World Test

For each sample tested, apply points according to the following schedule:

a. Malware is Blocked from causing any infection on the system by the product (+2)
b. Malware infects the system but is Neutralized by the product such that the malware remnants cannot execute any more (+1)
c. Malware infects the system and the product is unable to stop it (-2)

The scoring should not depend on which of the available protection technologies were needed to block/neutralize the malware. All technologies and the alerts seen should, however, be noted as part of the report.
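The real-world schedule can be sketched the same way (again with illustrative outcome labels):

```python
# Point values from the real-world scoring schedule (a-c above).
REALWORLD_POINTS = {
    "blocked": 2,       # a
    "neutralized": 1,   # b
    "compromised": -2,  # c
}

def protection_score(outcomes):
    """Sum the per-sample points for one product."""
    return sum(REALWORLD_POINTS[o] for o in outcomes)
```

With 54 samples the maximum is 54 x 2 = 108 points, matching the overall protection results in Figure 4.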

Samples

Remediation Test

Two distinct sets of malware were used for the testing. The first set contained 16 Fake Antivirus programs and the second set contained 16 other assorted threats. In addition to this, 30 known clean programs were used for the false positive testing. The details of the samples used can be found in the appendix.

Real-World Test

The malware set contains 54 samples, split into 52 web-based threats and 2 e-mails with malicious attachments. In addition to this, 50 known clean programs were used for the false positive testing. The details of the samples used can be found in the appendix.

Test Results

To calculate an overall score that shows how well a product protects a user, whether they are already infected or are more proactive and already have a product installed, a normalized sum of the overall scores of the two individual tests has been created. The maximum score that could be achieved was 50 for each test, 100 in total.

Figure 2: Overall Protection and Remediation Score

The best overall result has been achieved by Symantec with 96 out of 100, closely followed by Kaspersky with 93, Bitdefender (92) and F-Secure (90). Avast, AVG, Avira, ESET and Webroot also score over 80. The remaining products score between 70 and 79 (McAfee, Panda and Trend Micro), except Microsoft, which comes in last with 68 points. The individual scores for the two tests, remediation as well as real-world, can be found below.

Remediation Test

Symantec achieved the best overall removal score, as can be seen in Figure 3. It should be kept in mind that the numbers shown here are the result of the combined effort of the core product and additional removal tools and rescue media, if available.

Figure 3: Overall Remediation Score

The maximum score that could be reached was 160. The best score was 147, achieved by Symantec, closely followed by Webroot with 142. The worst score was 112. The average score was 134 and the median score 134. This means that eight products were better than or equal to the average and five products were worse. The third-best product (ESET) is very close with 140 points, as is the fourth (Bitdefender) with 138.

When looking at the individual scores, similar observations can be made. In the case of the removal of other malware, Bitdefender (72) and Norton (74) again gained the highest scores of all products. Out of a maximum achievable score of 80, the worst result was 58, while the average was 67 and the median 69. Eight products scored better than the average and five were worse. ESET and Kaspersky achieved third place with 71 points, and Microsoft, Panda and Webroot share fourth place with 69.

The scores for the removal of Fake AV show the same picture. Out of the maximum score of 80 in the Fake AV category, Norton and Webroot share first place with 73 points each. ESET, F-Secure, Trend Micro and McAfee, with 69 each, are very close behind.

In the false positive testing section, there were only two products that reported something. Avira as well as McAfee warned about network traffic from CyberLink 12. Additionally, Avira warned about network traffic of Google Desktop 5.9. Neither warning affected the usage of the programs.

A few observations can be made when looking at the individual results. Symantec, ESET and Webroot perform well on both test sets and therefore achieve the first three spots in the test. What is especially interesting is the very good result for the remediation of Fake AV software. This section

shows better results than the other malware category. It seems the vendors have recognized that this is one of the most prevalent threats to users.

Real-World Test

F-Secure, Kaspersky and Symantec achieved the best overall score. This is the combined result of the individual test sets that the products were tested against.

Figure 4: Overall Protection Score

In Figure 4 the overall result is given. Out of 108 possible points, F-Secure, Kaspersky and Symantec achieved 108, which was the best result in the test. Those products are closely followed by Avast, AVG and Bitdefender. Avira and ESET are the only other products with a score of over 90. The average was 91 and the median 98. All in all, eight products scored equal to or above the average.

Besides the detection and blocking of malware, it is important to have a well-balanced product, so that no clean applications will be blocked or detected as malware. Therefore, 50 widely known applications were used to determine whether any product would report them as being suspicious or malicious. AVG, Avira, F-Secure, Kaspersky, McAfee and Panda did present warnings or even blocked the execution of certain clean software. The other products did not trigger any false positives. F-Secure, McAfee and Panda warned only, but did not block anything. AVG warned about six applications and blocked two. Avira warned about six and blocked seven. Kaspersky blocked one application.

Figure 5: False positive results (warning messages and blocked programs/installations, both counted negatively)

The individual scores clearly show that there are differences between the tested products. While the results are actually encouraging, since all products showed a good performance, there are still products that are clearly at the top. There are a few products that successfully combine static and dynamic detection with URL blocking or exploit detection. These achieve, not surprisingly, the best scores in the test and provide the most reliable protection: Avast, AVG, Bitdefender, F-Secure, Kaspersky and Norton. However, the other products are introducing similar features and are therefore close to the top products.
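The overall scores in Figure 2 combine the two tests by scaling each to 50 points before summing. A minimal sketch, assuming the per-test maxima of 160 (remediation) and 108 (real-world protection) given in the sections above:

```python
# Normalize each test to 50 points and sum, as described for Figure 2.
# The default maxima come from the individual result sections.

def overall_score(remediation, protection,
                  remediation_max=160, protection_max=108):
    """Return the 0-100 overall score from the two raw test scores."""
    return round(50 * remediation / remediation_max
                 + 50 * protection / protection_max)
```

For example, Symantec's remediation score of 147 and protection score of 108 give overall_score(147, 108) == 96, the best overall result reported above.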

Appendix

Version information of the tested software

Developer/Distributor, Product name, Program version, Engine/signature version:
Avast Software, avast! Internet Security
AVG, AVG Internet Security, /5141
Avira, Avira Internet Security, /
Bitdefender, Bitdefender Internet Security
ESET, ESET Smart Security, /7311
F-Secure, F-Secure Internet Security, build 100, Aquarius, Hydra, Gemini, Online (unknown), BlackLight, / _04
Kaspersky Lab, Kaspersky Internet Security, (i)
McAfee, McAfee Total Protection, / 6776
Microsoft, Microsoft Security Essentials, /
Panda Security, Panda Internet Security
Symantec, Norton Internet Security, n/a
Trend Micro, Trend Micro Titanium Maximum Security, /
Webroot, Webroot SecureAnywhere Complete, n/a

Table of rescue media and removal tools

Developer/Distributor, Removal Tool, Rescue Media, Comment:
Avast: -, -, Boot Time scan has been used when possible
AVG: -, Boot CD
Avira: Avira Removal Tool, Boot CD
Bitdefender: -, Boot CD 2.1
ESET: -, Boot CD
F-Secure: Fseasyclean, Boot CD
Kaspersky: Virus Removal Tool 11.0, Boot CD 10.0
McAfee: Stinger, -
Microsoft: -, Microsoft System Sweeper/Windows Defender Offline Rescue Disk
Panda: -, Boot CD
Symantec: Norton Power Eraser, Boot CD
Trend Micro: SysClean, Boot CD
Webroot: -, -

List of used malware samples (Remediation Test)

Other malware (SHA256)

0x1111a635e29502f1569c6c4b5f94efda0127c5908cc945ea962876c 0af7efeaf
0x2a41085c032ba5c81f625ad9a816c5c08151aa450f71a e8e229c2
0x45269a2adcc6763a7aa521b f5b02a0238a7f9d22e c6bb x5cc55e2def37e7d967d74cbe6ef27fe788be556c7965cd1bc80f961 5de472b42
0x7f30bb0efa6b2cdda e0170feb3dd450a8c246d6ad74d3 b9ea
0x8433c63a8e7b8ca589e88f008c2904a4fc48fab32d973afefa38ebe 73c2656f9
0x9ddcf da2d046dacf4c590d9f727a51d7a211b1747d35b4 e3921a4a94
0xac805be314ba5f3a10d75ded8b4a639f9f146bb8fc359390bc1334 db751b7331
0xb8d03d6c8a160038ac6ed6b73d08bb3d68bb6c39f659b3be7ff617 27a995e568
0xbbd87b28c777ff9906da6ccec53cab0dd4a246a b524506a 70c
0xc6cc99d32fb926d961dc2452fc4d804c8f9e7da9418c2b03cc8e1e7 d8a5f8309
0xd5ebd68de7eda3eb5e7c22505a4cbbe4648aa3427d5fe2db625d ee1
0xdadbdaebbdf15bf75ff19dc5c9340d8d339b6b352d38093ec24302 cf0d3135db
0xdf8d9266ba0eafd857b0d91c377a1b7e82b79fe7860ca58b5089ef fc9c
0xed9d e2a48a13e16a1057e7ccf6761cf b7b9a66 69c4db3845
0xf2adf4ea da8ccf3d e17efc4048abd1c1bc9 f a

Fake AV (SHA256)

0x f5f14ffc7cc68b99a48fe0df536ffa1877d26d9879cc64f bb2910c1
0x21a51341a f807aeaea4a99ae562b60b dd8cc d954c53
0x24940ca59d0b61267b3939efae57b959e7d44b9d1835b3e5228c 5df2ec
0x43ebf32ff184df87ecfd68de0347bb6e8179f32c07207d088037aa 234dd83a14
0x4850e1acaaa981ef8d4410e a02cec4e699c167617c4aad 9e7498b13fb
0x56aca05bb e001ae88ea4dc507ece34765fe16d31034 ad07d8090ad
0x e8e340c8a7ac7d3bf6e155a262bb08c9967c351cbdf d89c3a4dc08
0x72aa3ca eed80a cd955d0a05b23ec543d1c a723507fef3
0x9d677cbcfaaad541ef391d4c2c7ebeb57dca0b55648c55f200933e 995f40fec7
0xa2cb3a7920f570dfe b906c443e0ffc817ed1e9a3a4c cdc9
0xb06bd0c214b924f5376d151f1573c6234a2a2cedb5170c46035dc 3dbba9ef856
0xb8017feea7157b02b3a7fbda54ce4b297f5a0a95965a08f0c68eb6 d68d095d85
0xc3d8eb8dab13b016d90b938cc3f1416a9ecc043971c8bab2f272e9 f8c45ab338
0xcd1781ebf5e5bc7171d97183fa6d8e517b1081d5294a99c73d02e 6825e0aee6d
0xe1ff7b005441eaa9de27111a3e50eec667d600c7b3c456e1f1ad77 e280349dfd
0xf98ec230ff36738a3fa9588adf9b35f3024dff3b2ce94c9eccd6fe29
4e1741e8

List of used clean samples (Remediation Test)

Program name (Distribution: millions of users):
DVR Studio HD
RealPlayer
Ashampoo Photo Optimizer 5
MediaMonkey
MetroTwit
K-Meleon 1.5.4
FreeRingtoneMaker
Alcohol
Windows Media Player 11
CPU-Z
TrueCrypt 7.1a
PowerDVD
XAMPP VC9
Google Drive
Audiograbber 1.83
Google SketchUp
DeepBurner
VirtualDJ Home v7.0.5
Network Stumbler
TUGZip
Zip 9.20
Adobe Reader
Adobe Flash Player
iTunes
Firefox
Picasa 3.9
Daemon Tools Lite

Google Desktop
Google Earth
.NET Framework 4

List of used malware samples (Real-World Test)

Direct Downloads and Drive-By-Downloads

(2269) (2272) c87f.attach (2296) (2300) (2308) (2317) (2355) (2448) (2458) (2816) exe (2823) (2824) (2825) (2828) (2857) (2927) (2977) (2994) (3039) (3043) (3053) (3054) (3056) (3080) (3088) (3089) (3133) (3158) (3163) (3169) (3222) (3276) (3288) (3295) (3304) (3317) (3324) ers_logos%20.exe (3352) 10_10.exe (3353) (3372) (3393) (3410) 20in%20exchange%20for%20book%20scans/www%3B%20Get%20 bookscanner%20(.com)%20free%20in%20exchange%20for%20book %20scans%60.exe (3414) (3432) (3476) (3486) (3600) (3631) googlecode.com/files/VLAuto.exe (3660) (3710) ma/telegrama_ br.2012.exe (3726) (3746)

Malicious e-mails

(02) "here is a pic of my pussy attached
(04) "United Parcel Service notification # "

List of used clean samples (Real-World Test)

Program name (Distribution: millions of users):
AnyPassword 1.44

DVR Studio HD
RealPlayer
Ashampoo Photo Optimizer
Songbird
TeamViewer
Thunderbird
CDBurnerXP
Free YouTube to MP3 Converter
VLC Player
MediaMonkey
MetroTwit
K-Meleon 1.5.4
FreeRingtoneMaker
Alcohol
Java RE 7 Update 5
ImgBurn
IrfanView 4.33
Winamp
Opera 12
Skype
Recuva
FastStone Image Viewer 4.6
Defraggler
Frostwire
Windows Media Player 11
CPU-Z
TrueCrypt 7.1a
PowerDVD
XAMPP VC9
Google Drive
Audiograbber 1.83
Google SketchUp
DeepBurner
VirtualDJ Home v7.0.5b
Network Stumbler
TUGZip
Zip 9.20
Adobe Reader
Adobe Flash Player
iTunes
Firefox
Picasa 3.9
CCleaner 3.20
CloneDVD
RocketDock
Daemon Tools Lite
Google Desktop
Google Earth
.NET Framework 4

Copyright 2012 by AV-Test GmbH, Klewitzstr. 7, Magdeburg, Germany
Phone +49 (0) , Fax +49 (0) , Web

Real World and Vulnerability Protection, Performance and Remediation Report

Real World and Vulnerability Protection, Performance and Remediation Report Real World and Vulnerability Protection, Performance and Remediation Report A test commissioned by Symantec Corporation and performed by AV-Test GmbH Date of the report: September 17 th, 2014, last update:

More information

Endpoint Business Products Testing Report. Performed by AV-Test GmbH

Endpoint Business Products Testing Report. Performed by AV-Test GmbH Business Products Testing Report Performed by AV-Test GmbH January 2011 1 Business Products Testing Report - Performed by AV-Test GmbH Executive Summary Overview During November 2010, AV-Test performed

More information

Windows 8 Malware Protection Test Report

Windows 8 Malware Protection Test Report Windows 8 Malware Protection Test Report A test commissioned by Kaspersky Lab and performed by AV-Test GmbH Date of the report: January 11 th, 2013, last update: January 11 th, 2013 Executive Summary In

More information

Virtual Desktops Security Test Report

Virtual Desktops Security Test Report Virtual Desktops Security Test Report A test commissioned by Kaspersky Lab and performed by AV-TEST GmbH Date of the report: May 19 th, 214 Executive Summary AV-TEST performed a comparative review (January

More information

Proactive Rootkit Protection Comparison Test

Proactive Rootkit Protection Comparison Test Proactive Rootkit Protection Comparison Test A test commissioned by McAfee and performed by AV-TEST GmbH Date of the report: February 2 th, 213 Executive Summary In January 213, AV-TEST performed a comparative

More information

Patch Management Solutions Test

Patch Management Solutions Test Patch Management Solutions Test A test commissioned by Kaspersky Lab and performed by AV-TEST GmbH Date of the report: 5 th June, 2013, last update: 19 th July, 2013 Executive Summary From May to July

More information

Tracking Anti-Malware Protection 2015

Tracking Anti-Malware Protection 2015 Tracking Anti-Malware Protection 2015 A TIME-TO-PROTECT ANTI-MALWARE COMPARISON TEST Dennis Technology Labs www.dennistechnologylabs.com Follow @DennisTechLabs on Twitter.com This report aims to measure

More information

Zscaler Cloud Web Gateway Test

Zscaler Cloud Web Gateway Test Zscaler Cloud Web Gateway Test A test commissioned by Zscaler, Inc. and performed by AV-TEST GmbH. Date of the report: April15 th, 2016 Executive Summary In March 2016, AV-TEST performed a review of the

More information

Anti-Virus Protection and Performance

Anti-Virus Protection and Performance Anti-Virus Protection and Performance ANNUAL REPORT 2015 Dennis Technology Labs www.dennistechnologylabs.com Follow @DennisTechLabs on Twitter.com CONTENTS Annual Report 2015... 1 Contents... 2 Introduction...

More information

Home Anti-Virus Protection

Home Anti-Virus Protection Home Anti-Virus Protection APRIL - JUNE 2013 Dennis Technology Labs www.dennistechnologylabs.com This report aims to compare the effectiveness of anti-malware products provided by well-known security companies.

More information

26 Protection Programs Undergo Our First Test Using Windows 8

26 Protection Programs Undergo Our First Test Using Windows 8 Test: Internet Security Packages 1/2013 26 Protection Programs Undergo Our First Test Using Windows 8 Windows 8 is considered to be a secure system thanks to its internal protection package containing

More information

Endurance Test: Does antivirus software slow

Endurance Test: Does antivirus software slow 23rd April 2015 created by Markus Selinger Endurance Test: Does antivirus software slow down PCs? Critics maintain that protection software for Windows really puts the brakes on PCs. In a 14-month, extremely

More information

Online Payments Threats

Online Payments Threats July 3, 2012 Introduction...2 Tested Products...2 Used Configuration...3 Real Malware Inspiration...3 Total Scores Chart...4 Conclusion...4 About matousec.com...4 Detailed Descriptions of Tests...5 Detailed

More information

æ æœ 語 English Deutsch Español Franà ais Italiano Polski æ æœ 語 æ± è - OS Windows OS Windows OS Mac App Manager ã ムã ンãƒ-ームã ã OS Windows

æ æœ 語 English Deutsch Español Franà ais Italiano Polski æ æœ 語 æ± è - OS Windows OS Windows OS Mac App Manager ã ムã ンãƒ-ームã ã OS Windows æ æœ 語 English Deutsch Español Franà ais Italiano Polski æ æœ 語 æ± è - OS Windows OS Windows OS Mac App Manager ã ムã ンãƒ-ームã ã OS Windows OS Mac TechBeatムムー㠹 TechBeatã ã æœ æ ã æš

More information

Enterprise Anti-Virus Protection

Enterprise Anti-Virus Protection Enterprise Anti-Virus APRIL - JUNE 2013 Dennis Technology Labs www.dennistechnologylabs.com This report aims to compare the effectiveness of anti-malware products provided by well-known security companies.

More information

MRG Effitas Online Banking / Browser Security Assessment Project Q2 2013 Results

MRG Effitas Online Banking / Browser Security Assessment Project Q2 2013 Results MRG Effitas Online Banking / Browser Security Assessment Project Q2 2013 Results 1 Contents: Introduction 3 The Purpose of this Project 3 Tests employed 3 Security Applications Tested 4 Methodology Used

More information

Enterprise Anti-Virus Protection

Enterprise Anti-Virus Protection Enterprise Anti-Virus Protection JAN - MAR 2015 Dennis Technology Labs www.dennistechnologylabs.com Follow @DennisTechLabs on Twitter.com This report aims to compare the effectiveness of anti-malware products

More information

Performance Test of Free Antivirus Software and Internet Security Suites (November 2014)
www.avlab.pl. Performed test date: October - November 2014. Contents: Introduction, Tested Program Versions, What and How We Tested, Other Principles…

Anti-Virus Comparative: Performance Test (Suite Products)
Impact of Internet Security Software on System Performance. Language: English. October 2013, last revision: 19th November 2013.

Anti-Virus Comparative: Performance Test
Impact of Security Software on System Performance. Language: English. May 2015, last revision: 30th June 2015.

Enterprise Anti-Virus Protection (July - September 2013)
Dennis Technology Labs, www.dennistechnologylabs.com. Follow @DennisTechLabs on Twitter.com. This report aims to compare the effectiveness of anti-malware products…

Small Business Anti-Virus Protection (October - December 2014)
Dennis Technology Labs, www.dennistechnologylabs.com. Follow @DennisTechLabs on Twitter.com. This report aims to compare the effectiveness of anti-malware…

Anti-Virus Comparative: Performance Test (AV Products)
Impact of Anti-Virus Software on System Performance. Language: English. May 2014, last revision: 10th June 2014.

A Best Practice Approach to Third Party Patching
Mike Grueber, Senior Product Manager. Effective patch management is essential: 90% of successful attacks occurred against previously known vulnerabilities…

Small Business Anti-Virus Protection (January - March 2014)
Dennis Technology Labs, www.dennistechnologylabs.com. Follow @DennisTechLabs on Twitter.com. This report aims to compare the effectiveness of anti-malware…

Security Industry Market Share Analysis (September 2011)
OPSWAT releases quarterly market share reports for several sectors of the security industry. This quarter's report includes…

Small Business Anti-Virus Protection (July - September 2013)
Dennis Technology Labs, www.dennistechnologylabs.com. Follow @DennisTechLabs on Twitter.com. This report aims to compare the effectiveness of anti-malware…

Deep Security Vulnerability Protection Summary
Trend Micro, Incorporated. This document outlines the process behind rules creation and answers common questions about vulnerability coverage for Deep Security…

Enterprise Anti-Virus Protection (April - June 2014)
Dennis Technology Labs, www.dennistechnologylabs.com. Follow @DennisTechLabs on Twitter.com. This report aims to compare the effectiveness of anti-malware…

Nessus and Antivirus (January 31, 2014, Revision 4)
Contents: Introduction, Standards and Conventions, Overview, A Note on SCAP Audits, Microsoft Windows Defender, Kaspersky…

Analyst Brief: Can Consumer AV Products Protect Against Critical Microsoft Vulnerabilities?
Author: Randy Abrams. Tested products: Avast Internet Security 7, AVG Internet Security 2012, Avira Internet Security…

Corporate AV / EPP Comparative Analysis: Exploit Evasion Defenses (2013)
Randy Abrams, Dipti Ghimire, Joshua Smith. Tested vendors: AVG, ESET, F-Secure, Kaspersky, McAfee, Microsoft, Norman, Panda, Sophos…

Virtual Desktop Anti-malware Protection
A comparative test between Symantec Endpoint Protection and Trend Micro Deep Security. Dennis Technology Labs, 05/04/2012, www.dennistechnologylabs.com…

Internet Explorer Exploit Protection: Enterprise Briefing Report
Tested products: AVG Internet Security Network Edition v8.0, Kaspersky Total Space Security v6.0, McAfee Total Protection for Endpoint, Sophos…

Malicious Software (ENISA)
About ENISA: The European Network and Information Security Agency (ENISA) is an EU agency created to advance the functioning of the internal market. ENISA is a centre of excellence for…

Anti-Virus Comparative: File Detection Test of Malicious Software, including false alarm test
Language: English. March 2015, last revision: 30th April 2015.

Anti-Virus Comparative: File Detection Test of Malicious Software, including false alarm test
Language: English. September 2015, last revision: 15th October 2015.

Security Industry Market Share Analysis (December)
OPSWAT releases quarterly market share reports for several sectors of the security industry. This report includes both worldwide…

Online Banking and Endpoint Security Report (October 2012)
Contents: Introduction, The Purpose of this Report, Security Applications Tested, Methodology Used in the Test, Test Results, Analysis of…

Computer Viruses: How to Avoid Infection
From viruses to worms to Trojan horses, the catchall term virus describes a threat that's been around almost as long as computers. These rogue programs exist for the simple reason to cause you…

ESET Smart Security 9: Quick Start Guide
Microsoft Windows 10 / 8.1 / 8 / 7 / Vista / XP. ESET Smart Security is all-in-one Internet security…

Supported Anti-Virus from ESAP 2-6-1
avast! Antivirus (4.8.x), avast! Antivirus (4.x), avast! Antivirus (managed) (4.x), avast! Antivirus Professional (4.8.x), avast! Antivirus Professional (4.x), avast! Business…

Get Started Guide - PC Tools Internet Security
Contents: Getting Started with PC Tools Internet Security, Installing, Getting Started…

Fully Supported Antivirus Software (Managed Antivirus)
Antivirus (AV) vendors often release software updates. We hard-code the update into our RMM agent…

Comprehensive Malware Detection with SecurityCenter Continuous View and Nessus (February 3, 2015, Revision 4)
Contents: Overview; Malware, Botnet Detection, and Anti-Virus Auditing…

Kaseya White Paper: What Do You Mean My Cloud Data Isn't Secure?
Understanding Your Level of Data Protection. www.kaseya.com. As today's businesses transition more critical applications to the cloud, there…

Miradore Management Suite: Application Support for Patch Management
This is a list of supported applications in Q1/2016. New software and software versions are added continuously…

Symantec Endpoint Protection 12.1.5 Datasheet (Data Sheet: Endpoint Security)
Malware has evolved from large-scale massive attacks to include Targeted Attacks and Advanced Persistent Threats that…

Spine Warranted Environment Specification (2015)
Richard Trusson, June 2015. Copyright 2015, Health and Social Care Information Centre. Contents: Introduction, Scope, Intended Audience, Approach, Changes…

Norton 360
Our ultimate protection, now even more so. Introducing the new Norton 360: our ultimate Internet and antivirus protection for all you do online. Provides proactive protection, so you can do what…

ESET Smart Security 6: Quick Start Guide
Microsoft Windows 8 / 7 / Vista / XP / Home Server. ESET Smart Security provides state-of-the-art…

ESET Cyber Security Pro for Mac: Quick Start Guide
ESET Cyber Security Pro provides state-of-the-art protection for your computer against malicious code. Based on…

Trend Micro OfficeScan Datasheet: Endpoint Protection for Physical and Virtual Desktops
In the bring-your-own-device (BYOD) environment, protecting your endpoints against ever-evolving threats has become…

Patch Management Overview Report
Date generated: 24 Jul 2014 09:41 AM. G Lighting, site: Stl Office, device: VAULT. Missing: Security Update for Windows Server 2003 (KB2982792), Windows Malicious Software Removal…

Using Windows Update for Windows XP
This document provides instructions on updating Windows XP with the necessary patches. It is very important to update your operating system software in…

Best Practice Configurations for OfficeScan (OSCE) 10.6
Applying latest patch(es) for OSCE 10.6. Enable Smart Clients: 1. Ensure that OfficeScan…