
Virtual Environment Protection Test Report

A test commissioned by Kaspersky Lab and performed by AV-Test GmbH
Date of the report: May 10th, 2012; last update: May 14th, 2012

Executive Summary

In March and April 2012, AV-Test performed a comparative review of two security solutions for virtual environments to analyze their capabilities to protect against malware. The products under test were Kaspersky Security for Virtualization and Trend Micro Deep Security.

Five individual tests were performed. The first was a real-world test of malicious URLs with 36 samples, the second a dynamic detection test with five samples, the third a static detection test with 141,290 samples, the fourth a false positive test, and the final test determined the impact of the products on system performance.

To perform the test runs, two VMware ESXi environments were set up on identical servers. The security software used VMware vShield to protect the virtual machines. The virtual machines used for the tests ran Windows XP with the latest Service Packs and updates. In the real-world and dynamic tests, the samples were executed and any detection by the security software was noted. Additionally, the resulting state of the system was compared with the original state before the test in order to determine whether the attack was successfully blocked or not. In the static detection test, the products had to scan a set of 141,290 malicious files. The Trend Micro product was tested with and without file and web reputation (in-the-cloud), which showed a significant difference in its detection rates.

It is generally accepted in IT security that anti-virus protection is a must. The usual agent-based anti-virus software includes several layers such as static detection, dynamic detection, a firewall and more. The agentless anti-virus solutions designed for virtual environments and tested in this review have a narrower scope: they provide traditional anti-virus protection only, in order to avoid an excessive performance impact. There may also be circumstances in which critical systems require agent-based anti-virus applications with additional protection layers. This can result in a mixture of agent-based and agentless anti-virus protection methods that must be administered and maintained.

Overview

With the increasing number of threats being released and spread through the Internet these days, the danger of getting infected is increasing as well. A few years back, new viruses were released every few days. This has grown to several thousand new threats per hour.

Figure 1: New samples added per year (new unique samples added to AV-Test's malware repository, 2005-2012)

In the year 2000, AV-Test received more than 170,000 new samples, and in 2010 and 2011 the number of new samples grew to nearly 20,000,000 each year. The numbers continue to grow in 2012, with already over 5 million new samples in the first quarter. The growth of these numbers is displayed in Figure 1.

Since virtual infrastructures are an important topic for the enterprise, security vendors provide new products which are optimized for those environments.

Products Tested

The testing occurred in March and April 2012. AV-Test used the latest releases of the following products available at the time of the test:

- Kaspersky Security for Virtualization 1.1
- Trend Micro Deep Security 8

Methodology and Scoring

Platform

All tests have been performed on identical servers equipped with the following hardware:

- Dell PowerEdge T310
- Intel Xeon Quad-Core X3450 CPU
- 16 GB RAM
- 500 GB HDD

The hypervisor was VMware ESXi 5 (Build 623860) with vShield 5 (Build 473791). The protected virtual machines were configured as follows:

- Windows XP Professional (32-bit), SP3 + VMware Tools
- 1 CPU
- 2 GB RAM
- 50 GB HDD

Testing methodology

General

1. Clean system for each sample. The test virtual machines should be restored to a clean state before being exposed to each malware sample.
2. Product Cloud/Internet Connection. The Internet should be available to all tested products that use the cloud as part of their protection strategy.
3. Product Configuration. All products were run with their default, out-of-the-box configuration. For Trend Micro Deep Security, reputation services are disabled by default for agentless setups. The tests were run with and without reputation services.
4. Sample Cloud/Internet Accessibility. If the malware uses the cloud/Internet connection to reach other sites in order to download further files and infect the system, care should be taken to make sure that this cloud access is available to the malware sample in a safe way, such that the testing network is not at risk of getting infected.
5. Allow time for sample to run. Each sample should be allowed to run on the target system for 10 minutes to exhibit autonomous malicious behavior. This may include initiating connections to systems on the Internet, or installing itself to survive a reboot (as may be the case with certain key-logging Trojans that only activate fully when the victim is performing a certain task).

The procedures below are carried out on all tested programs and all test cases at the same time in order to ensure that all protection programs have exactly the same test conditions. If a test case is no longer working, or its behavior varies between the protection programs (which can be clearly determined using the Sunshine analyses), the test case is deleted. This ensures that all products were tested in exactly the same test scenarios. All test cases are solely obtained from internal AV-TEST sources and are always fully analyzed by AV-TEST. We never resort to using test cases or analyses provided by manufacturers or other external sources.

Dynamic Test/Real-World Test

1. The products are installed, updated and started up using standard/default settings. The protection program has complete Internet access at all times.
2. AV-TEST uses the analysis program Sunshine, which it developed itself, to produce a map of the non-infected system.
3. It then attempts to access the website or execute the malicious file, respectively.
4. If access to or execution of the sample is blocked by the program with static or dynamic detection mechanisms, this is documented.
5. Given that the detection of malicious components or actions is not always synonymous with successful blocking, Sunshine constantly monitors all actions on the computer in order to determine whether the attack was completely blocked, partially blocked or not blocked at all.
6. A result for the test case is then determined based on the detection documented by the protection program and the actions on the system recorded by Sunshine.
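As an illustration only, the per-sample procedure above can be summarized as a simple harness loop. The sketch below uses hypothetical helper methods (revert_to_clean_snapshot, execute_or_browse, capture_state and so on) as stand-ins for AV-TEST's internal tooling, such as the Sunshine analysis system, and for the VMware management interface; none of them are real product or VMware APIs.

    import time

    SAMPLE_RUNTIME = 10 * 60  # each sample may run for 10 minutes (step 5 above)

    def run_test_case(vm, sample, clean_baseline):
        # Hypothetical helpers standing in for the lab's internal tooling.
        vm.revert_to_clean_snapshot()            # 1. clean system for each sample
        vm.ensure_internet_access()              # 2./4. cloud access for product and sample

        detected = vm.execute_or_browse(sample)  # 3. open the URL or execute the file
        time.sleep(SAMPLE_RUNTIME)               # 5. allow autonomous malicious behavior

        changes = vm.capture_state().diff(clean_baseline)  # compare with the clean map

        # 6. verdict combines the product's reported detection with the
        #    changes actually observed on the system
        if not changes:
            return "blocked" if detected else "nothing happened"
        return "partially blocked" if detected else "not blocked"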

Static Scanning Test

1. 141,290 malware samples have been scanned with the products, with recent updates and a connection to the cloud. (Note: Trend Micro has been tested with the default setting "File Reputation" off and additionally with "File Reputation" on.)
2. A rescan of all remaining samples has been performed 7 days later to determine the final detection rate. The calculation behind the final detection rate is sketched after the Samples section below.

Samples

The malware set for the dynamic test contains 41 samples. The set is separated into 36 URLs with malicious downloads and 5 executable files from other sources such as mail attachments or removable storage. These files were collected between March 29th and April 12th, 2012. Every sample was tested on the day of its appearance in AV-TEST's analysis systems.

The malware set for the static scanning test contains 141,290 samples of zoo malware. This includes files that were spread on the Internet in the preceding few weeks and that were collected by AV-TEST during February and March 2012.
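The two-pass procedure described under "Static Scanning Test" boils down to a simple rate calculation: whatever survives the first scan is rescanned seven days later with updated signatures, and the final detection rate is taken over the full set. A minimal sketch, assuming a hypothetical scan_and_remove helper that returns the files a product did not detect:

    def static_detection_rate(product, samples):
        total = len(samples)                            # 141,290 files in this test

        remaining = product.scan_and_remove(samples)    # first scan, cloud access enabled
        # ... seven days later, after signature updates ...
        remaining = product.scan_and_remove(remaining)  # rescan of the leftover files

        detected = total - len(remaining)
        return detected / total                         # final detection rate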

Test Results

Real World Attacks

The real-world tests showed that Kaspersky uses signature-based detection only, while Trend Micro is also able to block entire URLs with its Web Reputation technique. Due to its high static detection rates and its short response times to new malware, Kaspersky was able to block 30 out of the 36 malicious files that were downloaded from the web. One sample was partially blocked.

Trend Micro was able to block 24 URLs with its Web Reputation engine, which means the user was not able to download those malicious files at all. However, where a file was successfully downloaded, detection depended on the file guard, which detected only 15 files. All in all, Trend Micro blocked 27 samples.

Figure 2: Real World Detection Results (URLs blocked, downloaded files blocked and partially blocked attacks for Kaspersky Security for Virtualization and Trend Micro Deep Security)

Figure 2 shows only three files blocked by the Trend Micro file guard. The remaining 12 files detected by the file guard were also detected by the Web Reputation engine and were therefore counted as blocked URLs.
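Restating the reported counts makes the composition of these totals explicit. The numbers below are taken directly from the results above; this is only a bookkeeping sketch, not additional test data.

    TOTAL_SAMPLES = 36          # malicious download URLs in the real-world set

    # Trend Micro Deep Security
    urls_blocked = 24           # blocked outright by the Web Reputation engine
    file_guard_detections = 15  # files detected by the file guard overall
    overlap_with_urls = 12      # file-guard detections whose URLs were already blocked

    files_blocked_only = file_guard_detections - overlap_with_urls  # 3, the files-blocked portion in Figure 2
    trend_micro_blocked = urls_blocked + files_blocked_only         # 27 samples blocked in total
    assert trend_micro_blocked == 27 and trend_micro_blocked <= TOTAL_SAMPLES

    # Kaspersky Security for Virtualization
    kaspersky_blocked = 30      # downloaded files blocked by static detection
    kaspersky_partial = 1       # partially blocked attack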

Static Detection

The static detection tests showed that Kaspersky and Trend Micro are on a similar level as long as file reputation is enabled for Trend Micro. The default option is to turn file reputation on for agents only. Because our setup did not use the agent, we tested both options to show the differences.

Figure 4: Static Detection of Malware (Kaspersky Security for Virtualization: 98.60%; Trend Micro Deep Security with default settings: 74.22%; Trend Micro Deep Security with File Reputation: 98.84%)

Figure 4 shows that the file reputation settings have a big impact on the detection results of Trend Micro.
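Expressed in absolute numbers, the gap becomes clearer. The counts below are approximations derived from the rounded percentages in Figure 4 and the 141,290-sample set size; they are illustrative only and were not reported by the test itself.

    TOTAL = 141_290  # size of the static test set

    rates = {
        "Kaspersky Security for Virtualization": 0.9860,
        "Trend Micro Deep Security (default settings)": 0.7422,
        "Trend Micro Deep Security (File Reputation on)": 0.9884,
    }

    for product, rate in rates.items():
        detected = round(TOTAL * rate)   # approx. 139,312 / 104,865 / 139,651
        missed = TOTAL - detected        # approx.   1,978 /  36,425 /   1,639
        print(f"{product}: ~{detected:,} detected, ~{missed:,} missed")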

Dynamic Detection

The next test determined the detection of new, unknown malware through dynamic detection methods. These are files that are not detected by static detection. Given the very good static detection rates above, it is obvious that only a small number of files have to be caught by dynamic detection.

Kaspersky was able to block one out of five samples, while Trend Micro detected two out of five samples but was not able to block them. It became clear that neither Kaspersky nor Trend Micro uses dynamic detection methods in the tested configuration. The detections in this test were all based on signatures.

The products use the VMware vShield Endpoint driver to access files on the protected virtual machine, which are then scanned by an additional virtual appliance. The virtual machine itself therefore does not have an anti-virus agent installed. Because it would cost too much performance to pass all events to the virtual appliance for behavior-based analysis, such methods are not supported. The products can be configured with an anti-virus agent on each virtual machine; in the case of Kaspersky, the agent would be a normal Kaspersky Endpoint Security client. If an agent is installed, it is able to perform behavior-based analysis of malware and the detection rates would increase. It depends on the company's needs whether to use a setup with or without an agent. The setup without an agent requires fewer resources per virtual machine, and therefore more machines can run on a single host.

Figure 3: Dynamic Detection of Malware (overall detection (warning) rate and overall detection and blocking rate, out of 5 samples, for Kaspersky Security for Virtualization and Trend Micro Deep Security)

As Figure 3 shows, dynamic detection methods do not work with an agentless setup.

False Positives

The false positive tests include the scan of two sets of files (static) and the installation of 20 clean applications (dynamic). The first set includes 11,604 files from several Windows and Office installations; detections in this set are therefore critical. Both products had no false positive detections in this set.

The second set contains all kinds of files from popular programs, which were downloaded from major download sites. The total number of these less critical files is 231,872. Kaspersky had no false positive detections and Trend Micro detected two files. Given the size of the set, these numbers are very good.

Figure 5: False Positive Detections of less critical Files (Kaspersky Security for Virtualization: 0; Trend Micro Deep Security: 2)

During the dynamic false positive tests, Kaspersky had no false alarms. Trend Micro removed one language DLL of an IrfanView installation; however, the program could still start properly.

Performance

The performance test measured several synthetic I/O operations, such as creating and opening files, as well as real usage scenarios such as downloading files and running applications. The cycle was repeated 7 times. Figure 6 shows the total average time for specific real-world, non-synthetic operations.

Figure 6: Total Average Time of Specific Operations in seconds (copy files, download files, install applications, load websites, run applications opening specific documents; measured on the reference system, Kaspersky Lab and Trend Micro)

Kaspersky provides the better overall performance, but compared to the reference even Kaspersky needs more than twice the time to copy a set of files 3.4 GB in size. The biggest impact can be seen with Trend Micro when installing applications. The test shows a noticeable impact on performance by both security solutions for the operations of copying files within one virtual machine and installing applications. In a real virtual environment, however, such operations should be rare.
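As a rough illustration of the measurement approach, the sketch below times a set of operations over seven cycles and averages the wall-clock time per operation. The workload callables are hypothetical placeholders for the real copy, download, install, browse and application-launch scenarios used in the test.

    import time

    CYCLES = 7  # each operation cycle was repeated 7 times in the test

    def average_times(operations, cycles=CYCLES):
        # `operations` maps an operation name to a callable performing it,
        # e.g. copying a 3.4 GB file set or installing an application.
        totals = {name: 0.0 for name in operations}
        for _ in range(cycles):
            for name, run in operations.items():
                start = time.perf_counter()
                run()
                totals[name] += time.perf_counter() - start
        return {name: total / cycles for name, total in totals.items()}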

Summary

The above findings show that protecting virtual environments is different from protecting the usual desktop PC. The security vendors have to use different approaches to protect the systems and to minimize the performance impact. It is obvious that a careful configuration has to be made to tailor the security solution to the specific environment.

Appendix

Version information of the tested software

Developer, Distributor | Product name                          | Virtual appliance version | Management console version
Kaspersky Lab          | Kaspersky Security for Virtualization | 1.1.0.49                  | 9.2.61
Trend Micro            | Deep Security 8                       | 8.0.0.1199                | 8.0.1448

List of used malware samples

Real World Attacks

hxxp://fotolog12.beepworld.it/files/slide-orkut85.exe
hxxp://www.haoxs.net/tools/file/c/q.exe
hxxp://tenda.infosapobla.com/temp/syl-dc5.exe
hxxp://www.romanhitechinstitute.com/newsimages/svchost.exe
hxxp://www.clickplaystream.com/dl/java.exe
hxxp://swordsoul.110mb.com/onepiece.com
hxxp://schokoweiss.de/uploads/media/media.exe
hxxp://heart-station.org/blog/f2.exe
hxxp://down.nurungzi.co.kr/main/t5/hinnrz.exe
hxxp://ceraxon.com/iemctsec/mvxrf0.exe
hxxp://ahaliaexchange.com/java.exe
hxxp://75.147.219.202/aspnet_client/system_web/receitanet_malha114001.exe
hxxp://bot.iamsoninja.com/downloads/server.exe
hxxp://uppdate.sytes.net/_u/stub.exe
hxxp://clickplaystream.com/dl/camfrog.exe
hxxp://facerboolksion.biz/fotoviews.php?=
hxxp://adest.com.au/readers/adest/adest.exe
hxxp://www.s3odicol.net/x5.exe
hxxp://alias1.adobedownloadcentre.selfip.biz/data.php
hxxp://test.ceuta-pesca.es/update.exe
hxxp://gonadee.com/media/files/np.exe
hxxp://www.dvornalipa.sk/foto_files/photos/1.exe
hxxp://78.111.51.123/files/6f82c
hxxp://smscrack.narod.ru/install_sms_cracker.exe
hxxp://www.veterinary-management.com/tmp/install_4a405aa301785/css/correios-telegrama586655.exe
hxxp://newserial.net/ah-istanbul-cengiz-ozkan.exe
hxxp://exehost.net/uploads/adobe-udpate.exe
hxxp://petrojobsearch.com/java/settings.exe
hxxp://lebleb2011.com/install.exe
hxxp://colegiowz.com.br/d&//
hxxp://91.217.153.35/upeksvr.exe
hxxp://www.motivity.com.tw/lib/thumb.php?redir=
hxxp://thecaswellhouse.com/caswellhouse.exe
hxxp://meinv.tv/5/steup.exe
hxxp://www.inews365.com/xml/xml.exe
hxxp://188.116.32.144/mgfugh/update.php?ver=1&type=movie

Dynamic Detection

0x06d8fe2fa094401e0c06c9d26dc274c8
0x166e6e813478f8c92ec245ef3bac1f83
0x328d9ef6c3d8770c0b144a7bff99a530
0x329428230075cb168a5aaa33c6df1cb3
0x3f53ea54adceec86de26f9a23b7ec90d
0x54115b1ceb020baf7402e24da33f2a67
0x787806ddd76b6e2caf25ae0e1be82641
0x95349dc075008283fb832f8fca2b6e08
0xad05c3c63d5b50cd820b9c43aa4cd489
0xceaece2b59a512c1d8344a2ea051a6c1

Static Detection

The list of the 141,290 samples is not given here because of its size, but it is available on request.

Copyright 2012 by AV-Test GmbH, Klewitzstr. 7, 39112 Magdeburg, Germany
Phone +49 (0) 391 60754-60, Fax +49 (0) 391 60754-69, Web http://www.av-test.org