Automating Performance Tests: Tips to Maximize Value and Minimize Effort. Test Automation Day. Scott Barber


Automating Performance Tests: Tips to Maximize Value and Minimize Effort
Created for: Test Automation Day, Zeist, NL, 23 June 2011
Scott Barber, Chief Technologist, PerfTestPlus, Inc.
Automating Performance Tests Page 1

Scott Barber, CTO, PerfTestPlus, Inc.
sbarber@perftestplus.com
www.perftestplus.com
Co-Author: Beautiful Testing
Co-Founder: Workshop On Performance and Reliability (www.performance-workshop.org)
Performance Testing Guidance for Web Applications:
oreilly.com/catalog/9780596159825
www.codeplex.com/perftestingguide
www.amazon.com/gp/product/0735625700

More than other test automation

Bad performance test automation leads to:
- Undetectably incorrect results
- Good release decisions based on bad data
- Surprising, catastrophic failures in production
- Incorrect hardware purchases
- Extended downtime
- Significant media coverage and brand erosion

More than other test automation

Performance test automation demands:
- Clear objectives (not pass/fail requirements)
- Valid application usage models
- Detailed knowledge of the system and the business
- External test monitoring
- Cross-team collaboration

Avoid bad performance test automation

Unfortunately, bad performance test automation is:
- Very easy to create,
- Difficult to detect, and
- More difficult to correct.

The following 10 tips will help you avoid creating bad performance test automation in the first place.

Tip #10: Data Design

- *Lots* of test data is essential (at least 3 sets per user to be simulated; 10 is not uncommon)
- Test data should be unique and minimally overlapping (updating the same row in a database 1,000 times has a different performance profile than updating 1,000 different rows)
- Consider changed/consumed data (a search will return different results, and an item to be purchased may be out of stock, without careful planning)
- Don't share your data environment (see above)
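The uniqueness and no-overlap points above can be sketched as a small data-allocation helper. This is a minimal illustration, not something from the talk; the account IDs, user count, and set size are all hypothetical.

```python
import random

def allocate_test_data(records, num_users, sets_per_user=3):
    """Give each simulated user its own non-overlapping slice of test data.

    The slide suggests at least 3 data sets per user. Raising an error when
    the pool is too small prevents users from silently sharing rows, which
    would skew the performance profile (repeated updates to one row behave
    very differently from updates to many distinct rows).
    """
    needed = num_users * sets_per_user
    if len(records) < needed:
        raise ValueError(f"need {needed} records, have {len(records)}")
    pool = list(records)
    random.shuffle(pool)  # avoid accidental ordering effects in the data
    return {u: pool[u * sets_per_user:(u + 1) * sets_per_user]
            for u in range(num_users)}

# Hypothetical example: 100 account IDs shared out to 25 virtual users.
accounts = [f"acct-{i:04d}" for i in range(100)]
assignments = allocate_test_data(accounts, num_users=25)
```

Because the slices never overlap, every virtual user updates its own rows, keeping the database access pattern honest.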

Tip #9: Variance

- Static delays yield unrealistic results (a range of +/- 50% is typically adequate)
- Delays between pages should differ (users do not spend the same amount of time on every page)
- Script multiple paths to the same result (not every user takes a direct path to their desired result)
- Don't let every path run to completion (not every user finishes what they started)
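All four points above can be combined in one small sketch: randomized think times, weighted path selection, and mid-path abandonment. The paths, weights, and abandonment rate below are invented for illustration.

```python
import random

def think_time(base_seconds, variance=0.5):
    """Randomized delay in [base*(1-variance), base*(1+variance)].

    variance=0.5 matches the +/-50% rule of thumb from the slide.
    """
    return random.uniform(base_seconds * (1 - variance),
                          base_seconds * (1 + variance))

def run_user(paths, abandon_rate=0.15, rng=random):
    """Simulate one user: pick a weighted path, randomize every page delay,
    and possibly abandon mid-path (not every user finishes)."""
    names, weights = zip(*paths)
    path = rng.choices(names, weights=weights)[0]
    visited = []
    for page in path:
        visited.append((page, think_time(5)))  # base think time of 5s
        if rng.random() < abandon_rate:
            break  # user gave up before completing the path
    return visited

# Two hypothetical paths to the same purchase: direct vs. browsing first.
paths = [(("home", "product", "cart", "pay"), 0.4),
         (("home", "search", "product", "product", "cart", "pay"), 0.6)]
session = run_user(paths)
```

In a real load tool the delay would be a sleep between requests; here it is just recorded so the variance is visible.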

Tip #8: Object-Orientation

- A separate script for every path is unrealistic (this can lead to a 1:1 ratio of scripts to simulated users)
- Many paths have overlapping activities (without OO, a change to a single web page can lead to dozens of script edits)
- Script maintenance is difficult enough (building OO scripts can make maintenance up to 10x simpler)
- OO makes custom functions viable (code once, reuse over and over, even on future projects)
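One common way to apply this advice is a page-object layer: each page action is defined exactly once, and scripts compose those actions. A sketch under assumptions; the client interface, page paths, and recording stub are hypothetical, not any particular tool's API.

```python
class PageActions:
    """Hypothetical page-object layer: each page request is defined once,
    so a change to one page means one edit, not dozens across scripts."""

    def __init__(self, client):
        self.client = client  # anything with a .get(path) method

    def login(self, user):
        return self.client.get(f"/login?user={user}")

    def search(self, term):
        return self.client.get(f"/search?q={term}")

    def view_item(self, item_id):
        return self.client.get(f"/item/{item_id}")

# Two distinct user paths composed from the same shared actions.
def browse_path(pages, user):
    pages.login(user)
    pages.search("widgets")
    pages.view_item(42)

def direct_path(pages, user):
    pages.login(user)
    pages.view_item(42)

class RecordingClient:
    """Stand-in client that records requested paths, for illustration."""
    def __init__(self):
        self.requests = []
    def get(self, path):
        self.requests.append(path)

client = RecordingClient()
browse_path(PageActions(client), "alice")
```

If the login page changes, only `PageActions.login` needs editing; every path that uses it picks up the fix.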

Tip #7: Iterative/Agile

- Writing performance scripts is development (if you don't treat it as such, you'll regret it at execution time)
- Code some, test some (formal development practices are not generally necessary; applying sound principles is)
- The application will change, and so will your scripts (it's more efficient to keep up with changes build-to-build than all at once)
- Use configuration management (when scripts work against a build, check them into the CM system with that build; roll-backs happen)

Tip #6: Error Detection

- Tools have weak error detection (particularly if your site has custom error pages/messages)
- Error pages tend to load *very* quickly (a test with 50% undetected page-not-found errors will report fantastic performance results)
- Custom validation functions are often needed (yes, this means writing real code; get help if you need it)
- Don't believe your performance results until you check the logs (see above)
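A custom validation function of the kind the slide describes might look like the following. The error markers and minimum page size are hypothetical and would need tuning to the application under test.

```python
# Hypothetical fragments of this site's custom error pages.
ERROR_MARKERS = ("page not found", "an error occurred", "we're sorry")

def looks_successful(status_code, body, min_bytes=500):
    """Flag responses a load tool might happily count as fast 'successes':
    custom error pages served with HTTP 200, suspiciously tiny bodies, or
    known error text in the page content."""
    if status_code != 200:
        return False
    if len(body) < min_bytes:
        return False  # real pages on this (hypothetical) site are larger
    lowered = body.lower()
    return not any(marker in lowered for marker in ERROR_MARKERS)
```

Running every response through a check like this is what keeps a 50%-error test run from reporting spectacular response times.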

Tip #5: Human Validation

- Building scripts that *seem* to work is easy (building scripts that *do* work can be hard; check the logs and use the application manually while tests are running)
- Performance test results can be misleading (reported response times aren't always similar to what users see; get humans on the system while it's under load)
- Numbers don't tell the whole story (4 seconds may sound good, but users may experience 8 seconds outside your firewall)
- Users like consistent performance (get users on the system, then inject load, and pay attention to their responses)

Tip #4: Model Production

- Results are only as accurate as your models (focus on how the system *will* be used, not how someone *hopes* it will be used)
- Use multiple profiles/models (usage patterns can vary dramatically over time; the same volume of traffic in a different pattern can change performance remarkably)
- Don't extrapolate results (when the environments don't match, don't guess what production will be)
- Limited beta releases are the best way to validate models before it's too late
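The "multiple profiles" point can be sketched as weighted scenario selection: the same total load, distributed two different ways. The scenario names and percentages below are invented for illustration.

```python
import random

# Two hypothetical usage profiles: the same traffic volume distributed
# differently can change performance results remarkably.
WEEKDAY_MIX  = {"browse": 0.60, "search": 0.30, "checkout": 0.10}
SALE_DAY_MIX = {"browse": 0.30, "search": 0.20, "checkout": 0.50}

def pick_scenario(mix, rng=random):
    """Choose which scenario the next virtual user runs, weighted by the
    active usage model rather than by what is easiest to script."""
    scenarios, weights = zip(*mix.items())
    return rng.choices(scenarios, weights=weights)[0]

# Drive the same total load under one model; re-run under the other
# and compare results before trusting either as "the" performance.
rng = random.Random(1)
weekday_users = [pick_scenario(WEEKDAY_MIX, rng) for _ in range(1000)]
```

Swapping `WEEKDAY_MIX` for `SALE_DAY_MIX` changes nothing about the scripts, only the mix, which is exactly what makes maintaining several models cheap.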

Tip #3: Reverse Validate

- Released does not mean done (almost everyone pushes a patch shortly after release 1; plan on it)
- Check your model against production usage (the end of week 1 and the end of month 1 are typically good checkpoints)
- Re-run in the test environment with revised models (you may be surprised at how much the performance results differ)
- Compare results from the re-run against previous runs *and* production (this is the only way to validate your predictions and/or improve future predictions)

Tip #2: Tool-Driven Design

- The tool was not made to test your application (expect to need to accomplish some things the tool doesn't make easy)
- Do not limit your tests to what is easy in the tool (it is frequently the things the tool doesn't handle that cause performance problems)
- Don't be afraid to use multiple tools (sometimes it's simply easier to launch two tests from different tools than to get one tool to do everything)
- Tools are supposed to make your job easier (if yours doesn't, get a new tool)

Tip #1: Value First

- Sometimes the best automation is no automation (spending a week scripting a difficult, rare activity is not a good use of time; perform that activity manually during test runs)
- Don't fall in love with your scripts (applications change; treat your scripts as disposable, since it's often more efficient to re-record than to debug)
- Make custom code reusable (taking the time to make custom functions reusable will save time later)
- Before choosing to build a complicated script, ask: "Is this the most valuable use of my time?"

Summary

10. Design Data Carefully
9. Build in Variance
8. Employ Object-Orientation
7. Apply Iterative/Agile Approaches
6. Incorporate Error Detection
5. Include Human Validation
4. Model Production Usage Patterns
3. Reverse Validate with Production
2. Avoid Tool-Driven Test Design
1. Value First; What *not* to Automate

Questions

Contact Info

Scott Barber, Chief Technologist, PerfTestPlus, Inc.
E-mail: sbarber@perftestplus.com
Web Site: www.perftestplus.com