On the Edge of Mobility: Building a Bridge to Quality
October 22, 2013
Copyright 2013 Vivit Worldwide
Brought to you by Vivit Worldwide
Hosted by Stephanie Konkoy, Americas Chapter/SIG Liaison and Marketing Coordinator, Vivit
Today's Presenter: Shamim Ahmed, Global CTO, HP Software Professional Services
Housekeeping: This LIVE session is being recorded. Recordings are available to all Vivit members. Session Q&A: please type questions in the Questions pane.
Webinar control panel: toggle the View Window between full-screen and window mode; type questions in the Questions pane.
A bridge to an effective enterprise mobile apps test program HP Software Professional Services
The world is going mobile
56% say mobile access is now a significant factor when they consider new business applications and vendors
4:1 by 2016, mobile app projects will outnumber traditional app projects
52% believe that mobile applications make most users more productive
Examples: location-aware, frictionless payment for taxis and private cars; accelerated check deposits; checking in, changing, and monitoring flights
Quality and performance matter. Could better mobile testing have prevented this? A very public launch with very public quality issues led to public apologies and executive turnover.
Poll: What are the biggest challenges you face in mobile apps testing?
a) Don't know where/how to start
b) Don't have the right tools
c) Don't have expertise
d) Don't have a test environment
e) Don't have enough resources
The challenge is getting worse (% of respondents, prior survey vs. 2013):
Not enough time to test: 18% vs. 33%
Don't have an in-house test environment: 19% vs. 38%
Don't have the right tools to test: 37% vs. 65%
Don't have the devices readily available: 52% vs. 52%
Don't have the right testing process/method: 34% vs. 56%
No mobile testing experts available: 29% vs. 48%
Source: The 2013-2014 World Quality Report, based on 1,500 detailed interviews with senior executives from a range of IT and business-related functions in medium and large companies, government, and public-sector organizations across 25 countries.
Why is mobile testing so difficult?
1. User expectations are exceedingly high: an incredibly competitive market; it must work right the first time.
2. Diverse platforms: test results can depend on device, OS, and network provider.
3. Speed and velocity are essential: the pace of change in mobile platforms and devices is incredibly fast; mobile development projects must be agile, with fast and frequent sprints.
4. Mobile testing cannot succeed as an isolated effort: bolting Waterfall testing onto Agile development does not work; testing must be integrated into a full lifecycle approach to drive quality outcomes.
5. New performance demands: more users, interacting every few minutes.
6. False sense of security: outsourcing app development is no guarantee of quality.
The extreme agile nature of mobile apps development: BYOD, extreme agile, accelerated delivery, DevOps, cloud.
Key HP best practices for mobile testing
HP recommended practices for mobile apps testing: mobile test framework, agile automation, hybrid mobile test environment, lifecycle approach.
HP mobile testing framework. [Architecture diagram: the Mobile Test Accelerator (MTA) framework sits on top of HP UFT, driving keyword-based GUI tests with a pre-built component library and API tests against the backend system, via UFT Mobile and the platform provider.]
Poll: What testing tools do you use for mobile apps testing? (Check all that apply.)
a) HP Application Lifecycle Management/Quality Center
b) HP UFT Mobile
c) HP LoadRunner/Performance Center
d) Shunra
e) Other tools
Where does the MTA fit? [Architecture diagram: dev/build tools integrate via ALI dev integrations with HP ALM + Mobile Test Accelerator, which holds mobile application requirements, test cases, and defects; HP UFT with add-ins (SEV, ST) drives mobile test automation on real devices; HP Performance Center/LoadRunner with TruClient handles load testing.]
Key features of MTA: lifecycle approach; support for hybrid apps/environments; keyword + function library; NFR checklist.
Features:
- Keyword/data-driven functional test automation integrated with the UFT Mobile solution framework
- A set of pre-built libraries that hide the complexity of scripting; extensible to multilingual support; device- and platform-independent
- Integrates with the ALM solution framework and third-party tools such as build systems
- Intuitive and flexible framework (script once, test many)
Test Functions Library in MTA. [Diagram: a test plan in ALM defines test flows; an Excel driver maps each flow to business functions, each business function to components, and each component step to a keyword in the Mobile Test Functions Library.]
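The keyword-driven pattern behind the Test Functions Library can be reduced to a dispatcher that maps keyword names to functions and replays rows from a data table (the role the Excel driver plays above). The sketch below is illustrative only: the keyword names, the app object, and the table format are invented for the example and are not MTA's actual API.

```python
# Minimal keyword-driven test runner. A registry maps keyword names
# to functions; a data table (a list of rows standing in for the
# Excel driver) supplies the step sequence and parameters.
KEYWORDS = {}

def keyword(name):
    """Register a function under a keyword name."""
    def wrap(fn):
        KEYWORDS[name] = fn
        return fn
    return wrap

@keyword("Launch")
def launch(ctx, app):
    ctx["app"] = app
    ctx["log"].append(f"launched {app}")

@keyword("Tap")
def tap(ctx, control):
    ctx["log"].append(f"tapped {control}")

@keyword("Verify")
def verify(ctx, expected):
    assert ctx.get("app"), "app not launched"
    ctx["log"].append(f"verified {expected}")

def run_flow(rows):
    """Execute a flow: each row is (keyword, argument)."""
    ctx = {"log": []}
    for kw, arg in rows:
        KEYWORDS[kw](ctx, arg)  # an unknown keyword raises KeyError
    return ctx["log"]

# A flow as the Excel driver might express it (hypothetical steps):
flow = [("Launch", "BankApp"), ("Tap", "LoginButton"), ("Verify", "HomeScreen")]
```

Because test designers only edit the data table, new test cases need no scripting once the keyword library is in place, which is the "script once, test many" idea.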
Examples of mobile NFRs: screen rotation, incoming phone call, incoming SMS, system pop-ups.
Dynamic scripting. [Diagram: a scripting engine combines NFRs with functional test steps to generate UFT test scripts.]
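The dynamic-scripting idea, a scripting engine weaving NFR interruptions into functional steps, can be sketched as a generator that produces one script variant per interruption. The step names below are hypothetical, and real tooling would replay these against a device rather than build string lists.

```python
# Sketch: generate script variants by injecting an NFR interruption
# (screen rotation, incoming call, ...) after every functional step,
# then verifying the app resumes correctly.
NFR_STEPS = ["rotate_screen", "incoming_call", "incoming_sms", "system_popup"]

def inject_nfrs(base_steps, nfrs=NFR_STEPS):
    """Yield (nfr, script) pairs: each script fires one interruption
    after each functional step and checks recovery afterwards."""
    for nfr in nfrs:
        script = []
        for step in base_steps:
            script.append(step)
            script.append(nfr)                # fire the interruption
            script.append(f"resume:{step}")   # verify the app recovers
        yield nfr, script
```

This keeps the NFR checklist out of individual test cases: every functional flow automatically gets an interruption-handling variant per NFR.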
Hybrid mobile test lab. [Diagram: at each location, a Windows 7 host runs HP UFT with add-ins for local devices (USB, ad hoc), WAN-connected devices, and cloud devices; an on-premise lab and a mobile device cloud are reached over the WAN; VMware images host ALM with Sprinter, which provides scheduling and orchestration.]
Example of multi-platform tests. [Diagram: ALM/QC with MTA holds the test cases and libraries; dev/build management systems trigger runs; UFT (with SEV, ST) drives native mobile devices, the Chrome browser on Android, the Safari browser on iOS, and the Chrome and IE browsers on a Windows 7 PC, against back-end servers; UFT runs on a Windows host.]
Typical agile testing cycle for mobile. [Diagram: high-priority tasks flow from the product backlog into sprint backlogs/user stories; each ~2-week sprint (N-1, N, ...) includes daily scrums, dev testing (unit, sanity, security), and test activities (regression, interoperability), ending in a sprint demo; check-out proceeds if there are no show-stoppers; pre-production covers performance and usability testing plus app store certification (~2 weeks); final deploy to the app store takes ~1 week; in production, OPS monitors the app.]
Poll: What % of your current mobile test cases are automated?
a) <20%
b) 20-40%
c) 40-60%
d) 60-80%
e) >80%
The s+1 rule of test automation: automation never lags development by more than one sprint. (C = component (non-GUI), G = GUI element, B = business process.) [Diagram: as artifacts stabilize sprint by sprint, they enter the automated suite: C1 and G1 from Sprint 1 onward, C2 and G2 from Sprints 2-3, C3 from Sprint 3, and business process B1 by Sprints 4-5.]
Shift left your testing.
Traditional: developers design and code across Sprints 1-3; testers design tests in parallel but execute them only at the end.
Test-driven: testers design tests up front, before coding starts, and execute them continuously from the first sprint.
s+0 parallel automation for mobile apps (iteration N+1). [Diagram: developers perform change analysis, design wireframes, develop code, and build and unit test; in parallel, testers update the library, create the keyword library, create scripts, and run build verification and regression tests.]
Continuous integration to support agile testing. [Diagram: developers use an IDE with HP ALI to implement requirements defined in HP ALM, which plans, tracks, tests, and reports on requirements, tasks, and defects; source code is checked in and out of the SCM system; the build system performs continuous integration on every build.]
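The feedback loop above, where a check-in triggers a build and a green build triggers automated tests, can be sketched as a simple post-build policy. The stage names and the build-record format are assumptions for illustration, not an ALI or ALM API.

```python
def post_build_hook(build):
    """Decide which test stages to run after a CI build.
    `build` is a dict like {"ok": True, "changed_areas": {"ui"}};
    the format and stage names are illustrative only."""
    if not build["ok"]:
        return ["notify_developers"]        # broken build: fix first, no tests
    stages = ["build_verification"]         # always smoke-test a good build
    if "ui" in build["changed_areas"]:
        stages.append("gui_regression")     # UI changes: run the GUI suite
    # Nightly builds get the full regression suite; intra-day builds
    # only re-test the areas that changed.
    stages.append("nightly_full_regression" if build.get("nightly")
                  else "changed_area_regression")
    return stages
```

Scoping intra-day runs to changed areas keeps feedback fast enough for two-week sprints, while the nightly full regression preserves overall coverage.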
How we measure quality of automation (KPI: description; target):
- Reusability: % of test components re-used within scripts; target > 30%
- Maintainability: effort required to maintain a script, relative to the effort of script creation; target comparable to the maintainability cost of the mobile application
- Reliability: test failures due to errors in scripting; target < 5%
- Robustness: ability to recover from exception conditions and continue unattended; target > 95%
- Scalability: enables addition of new scripted components across multiple releases with minimal effort; target < 10%
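Two of these KPIs, reliability (failures caused by scripting errors rather than product defects) and robustness (runs that recover and finish unattended), can be computed directly from run records. The record format below is invented for the sketch.

```python
def automation_kpis(runs):
    """Compute two automation-quality KPIs from run records.
    Each run is a dict: {"failed": bool, "script_error": bool,
    "recovered": bool} (an illustrative format, not a tool's schema).
    Targets from the slide: script_failure_rate < 5%, robustness > 95%."""
    total = len(runs)
    script_failures = sum(1 for r in runs if r["failed"] and r["script_error"])
    recovered = sum(1 for r in runs if r["recovered"])
    return {
        "script_failure_rate": script_failures / total,  # reliability KPI
        "robustness": recovered / total,                 # unattended recovery
    }
```

Separating script-error failures from genuine product failures matters: only the former count against the automation suite itself.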
Savings from HP Mobile Test Accelerator (effort in hours, without MTA vs. using MTA):
- Scripting an average-size test case: 1.5 vs. 1 (33.3% savings)
- Debugging, exception and error handling for robustness: 2 vs. 1 (50% savings)
- Updating an average-size test script: 1 vs. 0.3 (70% savings)
- Multi-language support: 4 vs. 0.5 (87.5% savings)
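The savings percentages follow directly from the effort figures: savings = (effort without MTA - effort with MTA) / effort without MTA. A quick check against the table:

```python
def pct_savings(without_hrs, with_hrs):
    """% effort saved: (without - with) / without * 100, to one decimal."""
    return round((without_hrs - with_hrs) / without_hrs * 100, 1)

# Effort figures from the slide's table (hours without MTA, hours with MTA):
tasks = {
    "script average-size test case": (1.5, 1),    # 33.3%
    "debugging/exception handling": (2, 1),       # 50.0%
    "update average-size test script": (1, 0.3),  # 70.0%
    "multi-language support": (4, 0.5),           # 87.5%
}
```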
Don't neglect performance and security: your users are everyone, everywhere.
Security:
- Analyze security threats: you cannot rely on securing the perimeter; app security is the biggest hole
- Scan code for security vulnerabilities: static and dynamic scans
- Conduct security penetration tests: viruses and trojans, data protection, network vulnerability, ID theft
Performance:
- Test under real conditions: emulated devices, real devices, WAN emulation
- More users, more frequently
- Test at each and every point: client, network, backend; monitor the backend
Typical mobile app performance testing lifecycle. [Diagram: mobile applications reach the backend (web server, app server, DB server) over wireless and wired networks.]
- Layer 1 (user experience): isolate latency at the device or browser end, i.e. device/browser processing or rendering time. Tools: LoadRunner (Mobile App protocol / TruClient) + real devices, simulators, cloud, Gomez.
- Layer 2 (network latency): load test with network bandwidth and latency simulation; isolate network bottlenecks. Tools: LoadRunner + Shunra.
- Layer 3 (application response time): simulate business scenarios as in production, with realistic usage patterns and load scenarios; measure application response times and server-side performance metrics (throughput, memory, CPU utilization). Tool: LoadRunner.
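The three layers decompose an end-to-end response time into device rendering, network latency, and server-side processing. A simple timing breakdown illustrates the model; the figures are invented for the example, and this is not a LoadRunner script.

```python
def latency_breakdown(total_ms, render_ms, network_ms):
    """Split a measured end-to-end response time into the three layers
    from the slide: device rendering (layer 1), network latency
    (layer 2), and server-side processing (layer 3, the remainder).
    All figures are in milliseconds; the inputs are illustrative."""
    server_ms = total_ms - render_ms - network_ms
    if server_ms < 0:
        raise ValueError("layer measurements exceed the total")
    return {"device": render_ms, "network": network_ms, "server": server_ms}

# e.g. a hypothetical 1200 ms transaction with 250 ms of rendering and
# 350 ms of network latency leaves 600 ms of server-side work to tune.
```

Attributing the remainder to the server is only valid once layers 1 and 2 have been measured in isolation, which is why the lifecycle works through the layers in order.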
Summary: HP overall approach to mobile apps testing (quality/agility):
1. Test framework: built-in non-functional test cases; keyword-based; multi-language support; agnostic to the UFT plug-in
2. Agile automation: dynamic script generation; parallel automation
3. Hybrid test lab: local for unit and ad-hoc testing; cloud-based for regression/scheduled tests; virtualized back-end systems
4. Lifecycle approach: leverage enterprise ALM tools; full traceability; shared requirements/defects
HP services can help you (increasing value/ROI over time):
- Assessment and planning services (1-3 days)
- Foundation services (2 weeks)
- Testing and consulting: QA best-practices consulting, functional test automation, performance testing, Testing-as-a-Service (2-6 months)
- Mobile ALM Center of Excellence: processes and methods, integrated tools, governance (3-12+ months)
Poll: How can HP help you with your mobile testing effort? (Tick all that apply.)
a) Help me develop a strategy
b) Help me assess where I am and build a roadmap
c) Help me use HP Software better
d) Help me with technical resources
e) Help me improve my methods and processes
Connect with HP experts: contact your HP sales rep to schedule a Mobile ALM Solution Discovery Workshop or to meet with one of our experts.
Hear about ALM for Mobile: http://h30499.www3.hp.com/t5/apps-for-mobile/bgp/apps-for-mobile
HP ALM services for Mobile: http://www.hp.com/go/almservices
Q&A
Thank you! Complete the short survey and opt in for more information from HP Software, and you will be entered in five random drawings for a $50 USD Amazon gift card.
www.hp.com
www.vivit-worldwide.org