
DETERMINATION OF THE PERFORMANCE OF ANDROID ANTI-MALWARE SCANNERS

AV-TEST GmbH
Klewitzstr. 7
39112 Magdeburg
Germany
www.av-test.org

CONTENT

Determination of the Performance of Android Anti-Malware Scanners
Abstract
Determination of the Malware Detection Rate
    Sample Set
    On-Demand Test
    On-Access Test
    False Positive Test
Measuring the Performance Impact
    Concept
    Proof of Concept
    Implementation
    Implementation of Real World Cases
Certification
    Protection Score
    Usability Score
    Additional Feature Point
Frequently Asked Questions (FAQ)

ABSTRACT

Today's mobile devices are small computers whose functionality is similar to that of desktop computers. With their large number of different sensors they can even be used for tasks that a desktop computer cannot perform. But because they are mobile and have no permanent power line, their resources are deliberately limited to save battery power and extend battery life. With the rise of malware for Android-based mobile devices there is also a need for an Anti-Malware scanner on the device, to protect the device itself and the personal data stored on it. There are many products on the market, but do they all detect malware properly, and do they use the limited resources economically? This paper shows how the malware detection rate and the resource usage of Anti-Malware products can be measured. With the data obtained by our method we can determine which Anti-Malware products provide good protection and whether they may lead to a shorter battery life.

DETERMINATION OF THE MALWARE DETECTION RATE

SAMPLE SET

For a malware detection test it is very important how the sample set is composed. We use only recent malware that is no older than 4 weeks. The samples should also cover several relevant malware families. Our test set is usually composed of more than 20 different families, with each family contributing at most 60 samples. Families with fewer than 10 samples are classified as 'Other'. The total sample set consists of about 850 to 1,000 samples.

ON-DEMAND TEST

The first approach to determining the malware detection rate is an on-demand scan, which most Anti-Malware products provide. To test a high number of malicious samples, we copy them to the SD card. If the tested Anti-Malware product offers several scan options, we choose the most comprehensive one (e.g. Full System Scan), which usually includes a scan of the SD card. All detected samples are removed afterwards; all undetected samples are then tested on-access.

ON-ACCESS TEST

An on-access test uses the real-time protection feature of the Anti-Malware product to scan the malware samples. It is used to determine the complete malware detection rate in case a product does not handle all samples with the on-demand scan or does not offer that feature at all. In the end, only the combined on-demand and on-access score is reported by AV-TEST. The samples are installed over the Android Debug Bridge. The Anti-Malware product has to provide visible feedback to the user when a sample is detected.

FALSE POSITIVE TEST

A malware detection rate is not meaningful without an associated false positive rate. For false positive tests we use the same methodology as for malware, performing on-demand as well as on-access tests. The samples are known clean apps obtained from Google Play.
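As an illustration, the combined on-demand plus on-access result described above can be computed as follows. This is a minimal sketch with hypothetical sample counts; the paper does not describe AV-TEST's internal tooling.

```python
def combined_detection_rate(total_samples, on_demand_hits, on_access_hits):
    """Combined rate: samples removed by the on-demand scan plus the
    initially missed samples caught later by real-time protection."""
    detected = on_demand_hits + on_access_hits
    return detected / total_samples * 100.0

# Hypothetical run: 900 samples, 850 detected on demand,
# 30 of the remaining 50 detected on access.
print(round(combined_detection_rate(900, 850, 30), 1))  # 97.8
```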

MEASURING THE PERFORMANCE IMPACT

CONCEPT

Usually an Anti-Malware testing lab is an ordinary office with desks, PCs, network hardware and mobile devices. This environment does not meet the laboratory conditions required to reliably test the influence of different products on battery life in a comparable manner. Such laboratory conditions would include a constant room temperature, an equal number of charge cycles of the test device's battery for each product, and batteries of equal age when several test devices are used. Today's batteries are highly optimized and may survive thousands of charge cycles before an impact on battery life becomes measurable. Since we cannot eliminate such effects through constant laboratory conditions, we looked for another approach.

Our approach is to measure the CPU time used by the Anti-Malware product while known clean apps from Google Play are installed on an Android device. The product's real-time protection feature checks the newly installed apps for malware and therefore consumes CPU time. To compare different Anti-Malware products we calculate the CPU usage in %. We also monitor the memory usage and the amount of network traffic of each running process.

In parallel to the process monitoring, the battery level is monitored and logged every time it changes. With this data we can verify the results of the process monitoring, because higher CPU usage should lead to faster battery discharge. The discharge may of course vary due to the external factors described above. Battery discharge is therefore not reported as a measured value; it merely serves as verification that our approach of measuring the consumed CPU time is consistent and useful.
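The two measurements described above reduce to simple computations. The sketch below derives the CPU share of a process from two snapshots of per-process and total CPU time (on Linux/Android these would typically come from /proc/<pid>/stat and /proc/stat; the jiffy values here are hypothetical) and projects a measured discharge rate onto a full battery charge:

```python
def cpu_usage_percent(proc_t0, proc_t1, total_t0, total_t1):
    """CPU share of one process between two snapshots: process CPU time
    (e.g. utime + stime jiffies) over total CPU time in the same window."""
    delta_total = total_t1 - total_t0
    if delta_total <= 0:
        return 0.0
    return (proc_t1 - proc_t0) / delta_total * 100.0

def estimated_battery_life_minutes(discharge_percent_per_minute):
    """Project a full 100%-to-0% discharge from the measured rate,
    useful for cross-checking against the logged battery states."""
    return 100.0 / discharge_percent_per_minute

# Hypothetical scanner process consuming 360 of 1000 total jiffies:
print(cpu_usage_percent(1200, 1560, 50000, 51000))   # 36.0
# A discharge rate of 0.25 %/min corresponds to 400 minutes:
print(estimated_battery_life_minutes(0.25))          # 400.0
```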
PROOF OF CONCEPT

To check whether our concept is feasible, we implemented an Android app that does the process monitoring and a Java program that automatically installs applications from the Google Play website on a specific device, using the Google account the device is registered with. Details are given in the implementation section. As test device we used a Samsung Galaxy Nexus with Android 4.0.4. One test cycle includes the installation of 35 apps from Google Play; we ran five test cycles per product with the same 35 apps in each cycle, uninstalling the apps after each cycle. The products under test include several free and evaluation versions of popular Anti-Malware products. The following figures show the results of our test. Figure 1 compares the average CPU usage of the Anti-Malware products while the 35 apps were installed from Google Play.

FIGURE 1: AVERAGE CPU USAGE DURING THE INSTALLATION OF 35 APPS FROM GOOGLE PLAY (bar chart, Products A-E, 0% to 80%)

Figure 2 shows the discharge rate of the battery in percent per minute.

FIGURE 2: HOW FAST WAS THE BATTERY DISCHARGED ON AVERAGE? (bar chart, discharge rate in % per minute, Products A-E, 0 to 0.4)

Figure 3 projects the values of figure 2 onto an estimated battery life, assuming the test ran indefinitely and the battery discharged from 100% to 0%.

FIGURE 3: HOW LONG WOULD THE BATTERY LAST IF THE TEST RAN INDEFINITELY? (bar chart, estimated battery life in minutes, Products A-E, 0 to 400)

Figures 2 and 3 show a clear difference between product A, which had the lowest CPU usage, and products D and E, which had the highest (figure 1). There may be other influences on battery life, such as I/O operations or network access, which were not part of this analysis, but it became clear that CPU-efficient programming can increase battery life.

IMPLEMENTATION

The test environment is set up as a client-server infrastructure in which the mobile device is the client, running a monitoring Android app, and a Java program is the server. Communication is performed via HTTP over Wi-Fi. The client has a list of Android apps to be installed from Google Play. It sends a request to the server containing the next app, identified by its Android package name, and the Google account the device is registered with. Using the Selenium browser automation framework, the server logs into the Google Play website and starts the installation of the requested app.
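The paper does not specify the request format between client and server. A hypothetical client request carrying the package name and account might be built like this; the endpoint and parameter names are our invention, used only to illustrate the protocol:

```python
from urllib.parse import urlencode

def install_request_url(server, package, account):
    """Build the hypothetical HTTP request the monitoring client sends
    to ask the server to install the next app from Google Play."""
    query = urlencode({"package": package, "account": account})
    return "http://%s/install?%s" % (server, query)

print(install_request_url("192.168.0.10:8080",
                          "com.example.app", "tester@gmail.com"))
# http://192.168.0.10:8080/install?package=com.example.app&account=tester%40gmail.com
```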

FIGURE 4: TEST ENVIRONMENT SCHEME

The device installs the app automatically via Wi-Fi. A broadcast receiver listens for the PACKAGE_ADDED action and saves the current state of all running processes to a report file. Then the main activity is notified to send the next request to the server. The broadcast receiver also listens for the BATTERY_CHANGED action and saves the battery state to another report file every time the battery discharges by one percent.

IMPLEMENTATION OF REAL WORLD CASES

Basics

In addition to the app installation, we have implemented several real-world use cases to measure the performance impact during more common activities. The basic concept of the implementation of the performance test cases is always the same. The controller program (the test device is controlled over the Android Debug Bridge) starts an activity using the am command (Activity Manager), then waits a suitable time for the started activity to finish loading the given data. The activity is then stopped, and a custom broadcast is sent to the performance monitoring app on the device. The monitoring app saves CPU and memory usage as well as the amount of network traffic for each process. The am command for starting an activity usually has the following scheme:

$: am start -a android.intent.action.VIEW -d <data> --activity-clear-task <package>

<data> contains a URL, e.g. http://www.google.com/ or file:///mnt/sdcard/download/document.pdf. <package> is optional; if it is omitted, the system uses the default app associated with the data type, e.g. the browser displays http:// data. Sometimes multiple apps can handle the data, which leads to a popup where the user has to select a preferred app. For automatic tests it is therefore better to declare a specific package to display the data. The am command for closing an activity looks like this:

$: am force-stop <package>
$: am kill <package>

The second line is optional, as force-stop should "stop everything associated with <package>".

Browsing Websites

To browse websites we use the Android default browser, which lives in the package com.android.browser (for Android before 4.1.x) or com.google.android.browser. The wait time is set to 10 seconds.

Watching Video

For video playback we use videos hosted on YouTube and the preinstalled YouTube app with the package com.google.android.youtube. The wait time is set to the playback time of the video rounded up to the next minute, plus one extra minute. E.g. when the video playback time is 2:48, the wait time is 4 minutes.

Reading PDF Documents

To open PDF documents we use the Adobe Reader app (package com.adobe.reader). This app is part of our installation test cases, which usually run before the real-world cases. The wait time is set to 10 seconds.

Idling

We have also included three test cases for idling; no specific commands are required here. The test cases cover idle times of 10, 20 and 30 minutes.

CERTIFICATION

To receive AV-TEST certification, a mobile security app has to achieve more than 8.0 points in our scoring scheme. The score is composed of the 'Protection' category, which includes the malware detection results (combined on-demand and on-access), the 'Usability' category, which includes performance and false positives, and an extra point for additional features.
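The wait-time rule for the video playback case above can be sketched as follows. The function name is ours, and for playback times that are exact multiples of a minute the text leaves the rounding open; here we assume no additional rounding step in that case.

```python
import math

def video_wait_minutes(play_min, play_sec):
    """Wait time for the video case: playback length rounded up to the
    next full minute, plus one extra minute (2:48 -> 3 + 1 = 4)."""
    return math.ceil(play_min + play_sec / 60.0) + 1

print(video_wait_minutes(2, 48))  # 4
```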
PROTECTION SCORE

The best detection rate in the test corresponds to 6.0 points; the average detection rate corresponds to 3.5 points. The detection rate interval for the 0.5-point steps between the average and the maximum is calculated as the difference between the maximum and the average, divided by 5. The 0.5-point steps between 1 and 3.5 are based on the same interval times 4.

USABILITY SCORE

The usability score is composed of the performance score and the false positive score. The performance score is 6.0 points minus 2.0 points for each performance impact. Performance impacts may be 'impact on the battery life', 'high traffic usage' and 'slow down'. The false positive score is 6.0 points minus 1.0 point for each false positive detection. The usability score is the sum of the performance and false positive scores, divided by two.

ADDITIONAL FEATURE POINT

If the mobile security app provides at least two additional security features besides the malware protection, it receives an extra point. Additional security features may include Anti-Theft, Safe Browsing, Call Blocker, Message Filtering, Parental Control, Backup and Data Encryption.

FREQUENTLY ASKED QUESTIONS (FAQ)

Q: There are two products with a 100% detection rate, but one got a protection score of 6.0 points and the other just 5.5 points. Why?

A: The reason is the rounding of the detection rates. The product with 6.0 points detected 100%, while the product with 5.5 points detected only 99.6%, which is rounded to 100%.
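Taken together, the scoring rules above can be sketched as follows. The protection score is one plausible reading of the step scheme (how partial steps are rounded is our assumption, not stated in the paper); the usability score and feature point follow the text directly, with an assumed floor of 0 points.

```python
def protection_score(rate, avg_rate, best_rate):
    """3.5 points at the average rate, 6.0 at the best; each 0.5-point
    step above average spans (best - avg) / 5 percentage points, each
    step below average spans four times that interval. Nearest-step
    rounding is our assumption."""
    if best_rate == avg_rate:
        return 6.0
    step = (best_rate - avg_rate) / 5.0
    if rate >= avg_rate:
        n = min(5, round((rate - avg_rate) / step))
        return 3.5 + 0.5 * n
    n = min(5, round((avg_rate - rate) / (4.0 * step)))
    return 3.5 - 0.5 * n          # bottoms out at 1.0

def usability_score(performance_impacts, false_positives):
    """6.0 minus 2.0 per performance impact, 6.0 minus 1.0 per false
    positive, averaged; flooring at 0 is our assumption."""
    perf = max(0.0, 6.0 - 2.0 * performance_impacts)
    fp = max(0.0, 6.0 - 1.0 * false_positives)
    return (perf + fp) / 2.0

def is_certified(protection, usability, feature_point):
    """More than 8.0 points overall are required for certification."""
    return protection + usability + feature_point > 8.0

# Hypothetical product: 99% detection (test average 95%, best 100%),
# one performance impact, one false positive, two extra features.
p = protection_score(99.0, 95.0, 100.0)   # 5.5
u = usability_score(1, 1)                 # 4.5
print(p, u, is_certified(p, u, 1.0))      # 5.5 4.5 True
```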