Modeling Mobile Application Test Platform and Environment: Testing Criteria and Complexity Analysis




Chuanqi Tao
School of Computer Science and Engineering, Nanjing University of Science and Technology, Nanjing, Jiangsu, China
taochuanqi@njust.edu.cn

ABSTRACT
With the rapid advance of mobile computing technology and wireless networking, there has been a significant increase in mobile subscriptions. This drives a strong demand for mobile application testing on mobile devices. Since mobile APPs are native to mobile devices, the underlying mobile platform becomes the basic foundation of their test environments. To achieve effective test automation, test solutions must be compatible, deployable, and executable on different mobile platforms, devices, networks, and appliance APIs. This paper provides an approach to modeling mobile test environments based on a Mobile Test Environment Semantic Tree (MTE_ST). Based on this model, the paper discusses test complexity evaluation methods for test environments. Furthermore, case study results are reported to demonstrate and analyze the proposed testing models.

Categories and Subject Descriptors: D.2.5 [Software Engineering]: Testing and Debugging
General Terms: Languages, Experimentation
Keywords: test modeling and analysis, mobile testing, mobile test environment, mobile APP testing

1. INTRODUCTION
With the recent fast increase in the number of mobile users, more mobile devices are shipped daily and more mobile APPs and applications are deployed on mobile devices to meet their needs. According to a Clearwater Technology Team report in 2011 [1], the mobile computing industry is expected to be worth almost US$330 billion by 2015.
According to ABI Research, the smartphone market is expected to grow at a CAGR of 24% over the period 2011-15, mainly due to rising demand from emerging markets in Asia Pacific and Latin America [2]. The fast-growing market and expected increase in revenues drive a strong demand for developing and testing more mobile APPs and mobile web applications.

Jerry Gao
School of Computer Engineering, San Jose State University, San Jose, CA 95112, USA
jerry.gao@sjsu.edu

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from Permissions@acm.org.
JAMAICA '14, July 21, 2014, San Jose, CA, USA
Copyright 2014 ACM 978-1-4503-2933-0/14/07...$15.00.
http://dx.doi.org/10.1145/2631890.2631896
Up to now, most published research papers have primarily focused on specific technical issues and solutions, such as white-box and unit testing of mobile programs, black-box and GUI testing of mobile applications, mobile application QoS requirements, mobile usability testing, mobile test automation and frameworks, and testing intelligent mobile terminals. Nowadays, diverse mobile test environments and technology tools cause tedious operations and high costs in mobile test environment set-up and test automation control. According to our survey, the existing test models seldom address test modeling and analysis for mobile environment contexts (such as mobile platforms, web browsers, mobile technologies, different native APIs, device-specific gestures, and related configurations on different devices), diverse network connectivity and related contexts, scalability and mobility, or usability and security. Thus, engineers lack well-defined test models and criteria to address the special features of mobile APPs and mobile web applications, as well as test automation solutions for a variety of mobile devices, mobile functions, and mobile environment set-ups; in other words, there is a lack of well-defined test models addressing the distinct needs of mobile testing. This paper focuses on those needs. The paper uses a model-based approach to address the testing issues. A new model, known as the Mobile Test Environment Semantic Tree (MTE_ST), is used to assist engineers in performing test modeling and analysis for mobile test environments. Based on the given model, a test complexity evaluation method is provided. The paper makes three primary contributions to mobile environment testing. First, it uses a model-based approach to model, present, and analyze diverse mobile test environments. Second, it provides a systematic method to evaluate the test complexity of diverse mobile environment deployments. Third, two realistic mobile apps are studied using the proposed models.
The paper is structured as follows. The next section presents the model-based approach to mobile test environments. Test complexity analysis is discussed in Section 3. Section 4 reports the results of the case studies. Related work is reviewed in Section 5. Conclusions and future work are summarized in Section 6.

Figure 1: Mobile Test Environments
Figure 2: Test Space for Mobile Applications

Table 1: The Notations of Semantic Relations in MTE_ST (P is a parent node, and Ci is a child node)

EOR(P, C1, C2): P-Node must be deployed with its two child nodes C1 and C2 exclusively (exactly one of the two)
AND(P, C1, ..., Cn): P-Node must be deployed with all of its child nodes C1, ..., Cn
SELECT-1(P, C1, ..., Cn): P-Node must be deployed with one of its selective child nodes C1, ..., Cn
SELECT-M(P, C1, ..., Cn): P-Node must be deployed with M selective nodes from its child nodes C1, ..., Cn

2. MODELING TEST ENVIRONMENT IN MOBILE TESTING
Since mobile applications are expected to be deployed and executed on diverse mobile platforms, they must be validated on different mobile platforms and devices. In addition, since most mobile devices today support diverse wireless network connectivity (such as 2G/3G/4G/Wi-Fi/WiMAX), mobile applications must be validated under different network connectivity and related contexts. Windows, Linux, and Mac are the three popular platforms for hosting mobile testing tools, and some tools are limited to particular execution platforms; for example, Keynote's MITE only supports the Windows platform. As shown in Figure 1, there are four different mobile testing approaches and supporting environments [7]: emulation-based testing, simulation-based testing, device-based testing, and remote device-based testing. According to recent feedback from test engineers, there are some major issues and needs in mobile test environments [7]. For example, there is a lack of well-defined test models and coverage criteria to address the distinct features and needs in mobile application testing, since the diversity of mobile app operation environments on mobile devices requires a special test model to address test coverage for complicated mobile app (or mobile web app) operation contexts.
Similar needs can be found in testing mobile scalability, mobility, usability, and security. Therefore, well-defined test models are needed to address the special features of mobile APPs and mobile web applications. However, mobile test environments bring many issues and challenges, such as how to solve the high cost and complexity of building a mobile test environment, how to meet the demands of diverse mobile platforms with different device-based gestures, and how to cope with incompatible mobile platforms with limited computing resources. In this paper, we focus only on modeling the test environment. Although there are numerous useful models in software testing, very few are suitable to model and present diverse deployments or configurations and their mappings in the 3-dimensional configuration space for a mobile test environment. As shown in Figure 2, we need a well-defined test model to assist engineers in analyzing and presenting each device (say Dj) under a specified configurable environment (say Ek) as well as the corresponding configuration function (say Fi), so that test generation methods and test complexity analysis techniques can be developed.

2.1 Test Model
In previous work [8], we introduced a semantic tree model to test configurable component-based software. Here, the semantic tree model is used as a basis to address the needs of the mobile test environment in the 3-dimensional space. The tree nodes represent configurable parts (or elements), such as a deployable platform, network, or API. The links represent different semantic relations between nodes. A mobile test environment semantic tree model MTE_ST can be formally defined as a 3-tuple (N, E, R), where N is a set of tree nodes. There are three types of nodes: a) a single root node, b) intermediate nodes (or parent nodes), and c) leaf nodes. E is a set of links between nodes. Each link connects a parent node and one of its child nodes in the tree.
Each link shows part of a semantic relation between a parent node and its child nodes. R is a set of relations, and each item in R carries a semantic label that presents a semantic relation between a parent node and its child nodes. There are four types of semantic relations, with labels EOR, AND, SELECT-1, and SELECT-M; their detailed semantics are given in Table 1. To support model-based analysis, we introduce the concept of semantic spanning trees, based on the semantic tree model, to present the various configurations. A semantic spanning tree MTE_SPT is a sub-tree of a given semantic tree MTE_ST.
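To make the 3-tuple definition concrete, the tree can be sketched as nested data. The sketch below is an illustrative reading, not the authors' implementation: leaves are strings naming concrete configurable elements, and an internal node pairs a semantic relation label with its child list. All element names (platforms, networks, APIs) are assumed examples.

```python
# Illustrative encoding of an MTE_ST: a leaf is a string (a concrete
# configurable element); an internal node is (relation, children), where
# relation is "AND", "EOR", "SELECT1", or ("SELECTM", m).
# The element names below are hypothetical, not taken from the paper's figures.

smartphone_env = (
    "AND",                                                         # root node
    [
        ("SELECT1", ["Android 4.3", "iOS 7", "Windows Phone 8"]),  # platform
        ("EOR", ["Wi-Fi", "4G"]),                                  # network
        (("SELECTM", 2), ["Camera", "GPS", "Speech"]),             # native APIs
    ],
)

def count_nodes(tree):
    """Size of the node set N; since the model is a tree, the link set E
    always has count_nodes(tree) - 1 edges."""
    if isinstance(tree, str):
        return 1
    _, children = tree
    return 1 + sum(count_nodes(c) for c in children)
```

Here `count_nodes(smartphone_env)` is 12 (one root, three intermediate nodes, eight leaves), so the tree has 11 links, consistent with the tree-shaped (N, E, R) definition.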

Figure 3: A Sample Semantic Tree and Its Selected Semantic Spanning Trees

Unlike regular spanning trees, a semantic spanning tree MTE_SPT for MTE_ST can only be derived based on the given configuration semantic properties:
-For each parent node (Np) with an AND relation in MTE_SPT, it must include all of its child nodes and their links.
-For Np with an EOR relation in MTE_SPT, it must include only one of its child nodes and the corresponding link.
-For Np with a SELECT-1 relation in MTE_SPT, it must include only one of its child nodes and the corresponding link.
-For Np with a SELECT-M relation in MTE_SPT, it must include exactly M child nodes and the related links.
As shown in Figure 3, (a) is a semantic tree model, and two of its sample spanning trees are shown in (b) and (c). The detailed algorithm for generating spanning trees can be found in [8]. Compared to the traditional classification tree method by Grochtmann and Grimm [9], the proposed MTE_ST has the following new features for modeling mobile app test environments: a) MTE_ST presents semantic relations such as EOR, AND, and SELECT-M between parent nodes and their child nodes; b) a semantic sub-tree MTE_SPT can be derived from MTE_ST based on the defined semantic properties; c) test complexity and criteria can be analyzed effectively based on MTE_ST. Therefore, the proposed semantic tree is more suitable for configuration testing of mobile apps or component-based software due to its rich semantic relations. In addition, the semantic tree can be used to model a test environment, function, or architecture, while the traditional classification tree approach primarily focuses on feature-wise function partition testing.
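The four derivation properties above can be sketched as a small generator. This is an illustrative implementation under an assumed node encoding (leaves as strings, internal nodes as (relation, children) pairs), not the algorithm published in [8]; the platform and network names are hypothetical.

```python
from itertools import combinations, product

# Derive semantic spanning trees per the four properties: AND keeps all
# children, EOR/SELECT-1 keep exactly one child, SELECT-M keeps m children.
# Relation labels: "AND", "EOR", "SELECT1", or ("SELECTM", m).

def spanning_trees(node):
    """Yield every semantic spanning tree derivable from `node`."""
    if isinstance(node, str):              # a leaf spans only itself
        yield node
        return
    rel, children = node
    if rel == "AND":                       # all children: cross product of choices
        for picked in product(*(spanning_trees(c) for c in children)):
            yield (rel, list(picked))
    elif rel in ("EOR", "SELECT1"):        # exactly one child kept
        for child in children:
            for sub in spanning_trees(child):
                yield (rel, [sub])
    else:                                  # ("SELECTM", m): m-subsets of children
        for combo in combinations(children, rel[1]):
            for picked in product(*(spanning_trees(c) for c in combo)):
                yield (rel, list(picked))

# Example: a SELECT-1 over two platforms and an EOR over two networks
# gives 2 x 2 = 4 possible configurations.
env = ("AND", [("SELECT1", ["Android", "iOS"]), ("EOR", ["Wi-Fi", "4G"])])
```

Each yielded sub-tree corresponds to one concrete, deployable test environment configuration, which is what makes the semantic spanning tree useful for enumerating set-up scripts.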
Furthermore, we propose the spanning tree concept, which can present a possible configuration in practice.

2.2 Model Identification and Generation for Mobile Test Environment
All commercial mobile applications must be executed in a certain operation environment. Figure 4 shows a simple semantic model example which presents different configurable operating environments for a smartphone A. In the real world, we can use this model to consider all required configurable hardware and software elements (or entities) in a product's operation environment. These include different configuration selections in network protocols, device drivers, diverse operating systems and their versions, and multimedia and third-party dependent technologies. Mobile app test engineers need a semantic tree model to perform test modeling and test complexity analysis of these diverse configuration environments, since each configurable environment usually requires a set of environment-oriented test scripts to set it up so that system function and performance testing can be conducted properly. It is important to have a systematic way to specify diverse configurations to support test modeling, complexity, and coverage analysis. Although there are well-established software analysis and design models, such as UML, they are not suitable for presenting the diverse environment configurations of mobile applications. The proposed semantic tree model provides an effective modeling tool to help engineers perform software configuration analysis and specification for configurable mobile applications. The first approach is a static specification-based approach, in which engineers use the semantic tree model to specify and model the configurations. Clearly, when the given software supports complicated configurations, this approach becomes tedious.
Therefore, the second approach is a more dynamic and systematic one, in which built-in dynamic configuration discovery and tracking capabilities are provided in the configurable software. With these capabilities, dynamic configuration decisions in environments can be tracked and analyzed for the purposes of test modeling, test complexity analysis, and test coverage measurement.

3. TEST COMPLEXITY ANALYSIS FOR MOBILE TEST ENVIRONMENT
Existing research indicates that complexity can be used to estimate the cost or effort required to design, code, test, and maintain software, as well as to predict errors or faults that might be encountered during testing [12]. In addition, complexity measurement provides a guideline and cost indicator for software maintenance. High cost and complexity exist in building a mobile test environment for mobile APPs. Fast-upgrading mobile platforms and diverse native appliances bring higher costs and complexity in building and setting up a desirable test environment for mobile APPs, due to the diversity of mobile platforms and native APP interfaces with different device-based gestures, the incompatibility of mobile platforms with limited computing resources, and the lack of reusable test tools for mobile APPs on different platforms. For any configurable node N_Ci in the node set N of an MTE_ST semantic tree model, its test complexity can be computed based on its semantic relation with its child nodes. Let Tcomplexity(N_Ci) be the configuration complexity of node N_Ci. To support the evaluation of the test complexity of diverse configurable environments, we provide a detailed computation method below.

Suppose node N_Ci has an EOR semantic relation with its child nodes; then it has two different architectures, so its configuration complexity will be 2:

Tcomplexity(N_Ci) = 2    (1)

Suppose node N_Ci has an AND semantic relation with its child nodes; then its configuration complexity will be 1:

Tcomplexity(N_Ci) = 1    (2)

Suppose node N_Ci has a SELECT-1 semantic relation with its child nodes; then its configuration complexity will be n, assuming a total of n environments are available for selection:

Tcomplexity(N_Ci) = n    (3)

Suppose node N_Ci has a SELECT-M semantic relation with its child nodes. Assuming each environment is configured by selecting m nodes from a total of n nodes, its configuration complexity will be:

Tcomplexity(N_Ci) = n!/(m!(n-m)!)    (4)

Figure 4: Mobile Test Environment Semantic Tree Model

4. CASE STUDY
We report our case study by applying the proposed testing approach and complexity analysis to several realistic mobile applications. We selected the Yelp and justwink applications for mobile testing. Yelp is an online guide that searches for businesses near you. The current release of Yelp is a hybrid mobile application; a hybrid application can be defined as a combination of a native and a mobile web application. In this study we selected some deployable environments for the Yelp application and tested those features to verify their correctness. Configuration testing of the native mobile application Yelp here involves testing the application on various mobile operating systems with various connectivity options. The operating systems on which we tested the application are Android, iOS, and Windows. The networks considered are wide area networks (2G, 3G, and 4G) and Wi-Fi wireless internet. The network connection depends on the carrier the user has chosen; in our case we considered the T-Mobile network and Wi-Fi internet access.
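Equations (1)-(4) in Section 3 give the per-node complexity when the child nodes are simple. One natural reading (an assumption on our part, not stated explicitly in the paper) folds them into a recursive count over a whole tree, where each child's own complexity multiplies in:

```python
from itertools import combinations
from math import prod

# Recursive configuration-complexity count, folding Equations (1)-(4) into
# one function. Assumed node encoding: leaves are strings; internal nodes
# are (relation, children) with relation "AND", "EOR", "SELECT1", or
# ("SELECTM", m). Illustrative sketch only, not the authors' tool.

def t_complexity(node):
    """Number of distinct configured environments rooted at `node`."""
    if isinstance(node, str):
        return 1                           # a concrete element has one form
    rel, children = node
    counts = [t_complexity(c) for c in children]
    if rel == "AND":                       # deploy all children together
        return prod(counts)                # reduces to Eq. (2) for leaf children
    if rel in ("EOR", "SELECT1"):          # deploy exactly one child
        return sum(counts)                 # Eq. (1) gives 2; Eq. (3) gives n
    return sum(prod(c) for c in combinations(counts, rel[1]))  # Eq. (4)
```

With leaf children this reproduces the paper's values: an EOR node counts 2, an AND node 1, a SELECT-1 node over n children counts n, and a SELECT-M node counts n!/(m!(n-m)!).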
The third-party APIs used by the app are the Android API, the Google Maps API, and the Hardware Management API. The semantic tree model also provided us with an effective tool to analyze the justwink application. We considered four features in our project test environment: mobile devices, platform, connectivity, and API. We used two software testing classes and three master's project teams to conduct the related experiments at San Jose State University, California, USA. In this study, we primarily focus on the following items: modeling mobile applications using the proposed approach based on the mobile test environment semantic tree model (MTE_ST), and identifying and analyzing the test complexity of the test environment in mobile testing.

4.1 Study Results and Discussion
Figure 5 shows the sample semantic tree model for the justwink mobile app. The tree model has 18 nodes, among which 11 are leaves. In addition, there are different types of semantic relations, including AND, EOR, and SELECT-1, and the test complexity value is presented on top of each node. Figure 6 presents one spanning tree derived from the semantic tree model shown in Figure 5. The spanning tree represents one mobile test environment configuration, in which the justwink app is configured with the Android 4.3 platform, Wi-Fi network connectivity, and both the Camera and Speech APIs. Table 2 presents the detailed complexity of the semantic tree models of the studied mobile apps Yelp and justwink. For example, the total test complexity is 32 for the Yelp app and 18 for justwink, which represents the total number of different configured and deployed environments for Yelp and justwink. Hence, while validating this software, a vendor's engineers must test its deployed instances to cover its configurable environments. In practice, they can achieve adequate test criteria in an incremental approach. For example, whenever a customer deploys one instance, its configured environment (or platform) will be recorded.

Table 2: The Semantic Tree and Its Complexity Results

Semantic Tree for Testing Environment | No. of Nodes | No. of Leaves | No. of Links | Max Height | No. of AND | No. of Select-1 | No. of Spanning Trees
Yelp Application                      | 17           | 12            | 18           | 4          | 1          | 6               | 32
justwink                              | 18           | 11            | 17           | 3          | 2          | 2               | 18

Figure 5: An MTE_ST Sample for justwink and Test Complexity Analysis
Figure 6: A Spanning Tree Derived from the Semantic Tree

The complexity analysis enables engineers to figure out the required number of pre-test scripts for environment configuration and set-up, which is useful for test planning in test cost and complexity analysis. For these two applications in the study, we found that we need to develop 32 scripts for the Yelp app and 18 scripts for the justwink app to set up and cover the different environments, so that the deployed system instances can be tested with an adequate test set using existing test methods.

4.2 Threats to Validity
There are several potential threats to the case study. We selected only two apps, which is not adequate for large-scale empirical validation; more apps are needed to demonstrate the effectiveness of our approach. The proposed measurement here is not the only possible measurement of testing complexity; factors such as human cognitive complexity and manual analysis complexity are not considered in this paper. In addition, in real-life projects, testing on all mobile platforms and all system versions is not feasible: testing an Android app on all system versions and hardware (smartphones and tablets from all vendors) is very time-consuming. Clearly, test complexity grows much higher as more choices are given in the semantic tree. This suggests that the mobile app test environment can be very complicated, and more test automation research for mobile testing is needed.

5. RELATED WORK
To date, many papers have been published to address different testing areas in mobile applications.
White-Box Testing Techniques: Existing white-box testing methods are still applicable to mobile applications. For example, the authors in [11] present an Android application verification tool built on Java Pathfinder to perform white-box testing of mobile Java programs, so that race conditions and deadlocks can be detected using UML state charts and symbolic execution. Mahmood et al. [15] used a white-box approach to generate test cases based on two program-based models (a call graph model and an architectural model) to achieve mobile code test coverage. Black-Box Testing Techniques: Many black-box testing techniques are useful in mobile application testing. Random testing is one example; scenario-based testing is another [13]. In addition, some papers discuss how to use GUI-based testing techniques for mobile applications. For instance, Saswat Anand et al. [17] discussed an automated concolic testing approach to validating mobile GUI event sequences in smartphone applications. Similarly, D. Amalfitano et al. [5] presented AndroidRipper, which uses an automated GUI-based technique to test Android apps in a structured manner. In addition, some researchers have focused on usability testing [14], testing quality of service (QoS) [16], and wireless connectivity testing [3, 18]. Mobile Test Automation and Frameworks: Some research efforts are dedicated to developing tools (or frameworks) to address the limitations of current tools. For example, JPF-Android [11] is a verification tool supporting white-box testing for mobile applications, and JaBUTi/ME, discussed in [15], is a tool supporting white-box test coverage analysis based on a conventional CFG-based test model. A few recent research papers focus on GUI-based testing using test scripts and GUI event-flow models [15, 17]. In addition, a few research tools at the system level have been proposed; one of them is an integrated test automation framework [10].
With this framework, high-level test cases can be executed on different mobile platforms (such as Android and iPhone). Another example is MoViT [6], a distributed software suite for the emulation of mobile wireless networks. Existing black-box and white-box test models and coverage criteria can be used on mobile applications to address mobile program structures, dynamic behaviors, and GUI

operation flows. However, engineers still need new test models to address the special needs of testing mobile applications. The existing test models seldom address test modeling and criteria for mobile environment contexts (such as mobile platforms, web browsers, mobile technologies, different native APIs, device-specific gestures, and related configurations on different devices), diverse network connectivity and related contexts, scalability and mobility, or usability and security. Unlike the existing research, this paper provides a configuration model to represent the various APP deployment platforms and environments. In addition, we use a model-based approach to address the testing issues in modeling mobile applications, including test modeling and test complexity analysis.

6. CONCLUSIONS
According to a study from Juniper Research, the market for cloud-based mobile applications will grow 88% from 2009 to 2014. We believe this brings a strong demand for new research results and mobile test automation solutions to cope with the discussed issues and challenges. Although there are numerous papers addressing how to construct configurable software and components [4, 19], few papers have discussed how to test configuration features in mobile applications, especially the mobile test environment. This paper uses a model-based approach to discuss the related issues, challenges, and test process. It applies a semantic tree model as a test model to present and analyze the diverse configurable and deployable environments of mobile applications. In addition, detailed test criteria analysis and complexity computation are presented. Furthermore, case study results are reported to demonstrate its effectiveness and application in test modeling and test complexity analysis. Currently, we are developing a test automation solution to support automatic mobile test environment deployment.
The future extension of this research is to study how to use a model-based approach to address testing issues and challenges for mobile applications in cloud-based and service-based settings.

7. ACKNOWLEDGMENTS
This work is supported partially by the National Natural Science Foundation of China No. 61202003, partially by the Specialized Research Fund for the Doctoral Program of Higher Education No. 20113219120021, and partially by Fujitsu Labs. We also thank the students of SJSU's CMPE 287 course who participated in our study, and the support of the Computer Engineering Department at San Jose State University, California.

8. REFERENCES
[1] http://www.clearwatercf.com/documents/library.
[2] https://www.abiresearch.com/press/200 million mobile application testing market boos.
[3] T. P. Akka and M. Palola. Towards automating testing of communicational B3G applications. In International Conference on Mobile Technology, Applications and Systems, 2006.
[4] D. M. Cohen, S. R. Dalal, M. L. Fredman, and G. C. Patton. The AETG system: An approach to testing based on combinatorial design. IEEE Transactions on Software Engineering, 23(7):437-444, 1997.
[5] D. Amalfitano, et al. Using GUI ripping for automated testing of Android applications. In IEEE International Conference on Automated Software Engineering, 2012.
[6] E. Giordano, et al. MoViT: the mobile network virtualized testbed. In ACM International Workshop on Vehicular Inter-networking, Systems, and Applications, 2012.
[7] J. Gao, X. Bai, W. T. Tsai, and T. Uehara. Mobile application testing: a tutorial. IEEE Computer Special Issue on Software Validation, pages 26-35, 2014.
[8] J. Gao, J. Guan, A. Ma, C. Q. Tao, X. Y. Bai, and D. C. Kung. Testing configurable component-based software: configuration test modeling and complexity analysis. In International Conference on Software Engineering and Knowledge Engineering, pages 495-502, 2011.
[9] M. Grochtmann and K. Grimm. Classification trees for partition testing.
Software Testing, Verification and Reliability, 3(2):63-82, 1993.
[10] H. Song, et al. An integrated test automation framework for testing on heterogeneous mobile platforms. In International Symposium on Software and Network Engineering, 2011.
[11] H. V. D. Merwe, et al. Verifying Android applications using Java Pathfinder. ACM SIGSOFT Software Engineering Notes, 37(6):1-5, 2012.
[12] A. E. Hassan. Predicting faults using the complexity of code changes. In International Conference on Software Engineering, pages 78-88, 2009.
[13] J. Bo, et al. MobileTest: a tool supporting automatic black box test for software on smart mobile devices. In International Workshop on Automation of Software Test, 2007.
[14] T. Kallio and A. Kaikkonen. Usability testing of mobile applications: A comparison between laboratory and field testing. Journal of Usability Studies, 1(1):4-16, 2005.
[15] R. Mahmood, et al. A white-box approach for automated security testing of Android applications on the cloud. In International Workshop on Automation of Software Test, 2012.
[16] R. Mizouni, et al. Performance evaluation of mobile web services. In IEEE European Conference on Web Services, 2011.
[17] S. Anand, et al. Automated concolic testing of smartphone apps. In ACM SIGSOFT International Symposium on the Foundations of Software Engineering, 2012.
[18] I. Satoh. Software testing for wireless mobile computing. IEEE Wireless Communications, 11(5):58-64, 2004.
[19] D. B. Stewart, R. A. Volpe, and P. K. Khosla. Design of dynamically reconfigurable real-time software using port-based objects. IEEE Transactions on Software Engineering, 23(12):759-776, 1997.