How to Optimize Automated Testing with Everyone's Favorite Butler
Viktor Clerc, Product Manager & Jenkins Fan, XebiaLabs
Agenda
- The World of Testing is Changing
- Testing = Automation
- Test Automation and CD: Execution and Analysis
- Focus on the Basics
- Best Practices for Test Execution using Jenkins
- Supporting Test Analysis
But first, a bit about me
- Product Manager XL TestView for XebiaLabs
- Traversed through all phases of the software development lifecycle
- Supported major organizations in setting up a test strategy and test automation strategy
- Eager to flip the way (most) organizations do testing
And about XebiaLabs
We build tools to solve problems around DevOps and Continuous Delivery at scale
The World of Testing is Changing
Introducing Test Automation, For Real
Lifecycle phases: specify, design, build, test, integrate, regression, user acceptance, release
- Development = test, test = development
- Automate ALL
- Acceptance-driven testing
- User acceptance test effort
Testing = Automation
Testing = Automation: Implications
- Developers are becoming testers
- Maintain test code as source code
- Need to set up on-demand pipelines and environments: infrastructure as code
- Cross-browser tests, Selenium grids, dedicated performance environments, mobile, etc.
- Hosted services
Testing = Automation: Challenges
- Many test tools for each of the test levels, but no single place to answer "Good enough to go live?"
- Requirements coverage is not available: did we test enough?
- Minimize the mean time to repair: support for failure analysis
- JUnit, FitNesse, JMeter, YSlow, Vanity Check, WireShark, SOAP-UI, Jasmine, Karma, Speedtrace, Selenium, WebScarab, TTA, DynaTrace, HP Diagnostics, ALM stack, AppDynamics, Code Tester for Oracle, Arachnid, Fortify, Sonar, …
Testing = Automation: Challenges
- Thousands of tests make test sets hard to manage: where is my subset?
- What tests add most value, what tests are superfluous?
- When to run what tests? Running all tests all the time takes too long; feedback comes too late
- Quality control of the tests themselves and maintenance of testware
Testing = Automation: Challenges
Tooling overstretch. Poor butler!
Test Automation and CD: Execution and Analysis
The Two Faces of CD
A lot of focus right now is on pipeline execution, but there's no point delivering at light speed if everything starts breaking. Testing (= quality/risk) needs to be a first-class citizen of your CD initiative!
The Two Faces of CD
CD = Execution + Analysis = Speed + Quality = Pipeline orchestration + …?
Focus on the Basics
Quick Review
1. Cohn's pyramid: unit tests, service tests (under the GUI), (graphical) user interface tests
2. And even further downstream: integration tests, performance tests
Modern Testing 101
1. Testers are developers
2. Test code equals production code: Conway's Law, measure quality
3. Linking tests to use cases
4. Slice and dice: labeling
5. Radical parallelization: fail FASTer! Kill the nightlies
Dealing With Growing Tests
- Conway's Law for test code: let the test code mimic the production code
- Organize tests under the project/system under test: Suite.App.UseCase.TestCase
- Cut the suite at UseCase: now you have independent chunks which you can run massively in parallel
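A minimal scripted Jenkins Workflow sketch of that cut: one branch per use-case suite, all run in parallel, each on its own executor. The suite names and the run-suite.sh wrapper are hypothetical placeholders for whatever test runner you use.

```groovy
// Hypothetical suites cut at the UseCase level; each chunk is independent.
def useCaseSuites = [
    'WebshopSuite.BusinessAccountSuite',
    'WebshopSuite.CheckoutSuite',
    'WebshopSuite.SearchSuite'
]

def branches = [:]
for (s in useCaseSuites) {
    def suite = s                          // capture the loop variable for the closure
    branches[suite] = {
        node {                             // each chunk gets its own executor and workspace
            checkout scm
            sh "./run-suite.sh ${suite}"   // hypothetical wrapper around the actual test tool
            junit 'results/**/*.xml'
        }
    }
}
parallel branches                          // fail FASTer: all chunks run side by side
```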
Dealing With Growing Tests
- Tests should not depend on other tests
- Setup and tear-down of test data is done within each test
- Share test components (as you would do with real production code)
- Trade-off: no code duplication but somewhat more complex fixtures, versus easy-to-grab simple fixtures but a lot of them (and duplication)
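A minimal sketch of a test that owns its data, written as a Groovy JUnit 4 test. The in-memory map is a stand-in for whatever store a real fixture would touch; the point is only that setup and tear-down live inside the test class, so tests can run in any order and in parallel.

```groovy
import org.junit.After
import org.junit.Before
import org.junit.Test

class BusinessAccountTest {
    // Stand-in for the system under test; a real fixture would talk to the application.
    static Map<String, BigDecimal> accounts = [:]
    String accountId

    @Before
    void setUp() {
        accountId = "acct-${UUID.randomUUID()}"   // unique data per test, no ordering assumptions
        accounts[accountId] = 0.0G
    }

    @After
    void tearDown() {
        accounts.remove(accountId)                // leave nothing behind for other tests
    }

    @Test
    void depositIncreasesBalance() {
        accounts[accountId] += 100.0G
        assert accounts[accountId] == 100.0G
    }
}
```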
Keep It Manageable
- Focus on functional coverage, not technical coverage: say 40 user stories and 400 tests, do I have relatively more tests for the more important user stories?
- How do I link tests to user stories/features/fixes?
- Metrics: number of tests, number of tests that have not passed in <time>, flaky tests, duration
Slice and Dice
Use appropriate labels in your test code:
- Responsible team
- Topic
- Functional area
- Flaky
- Known issue
- etc.
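One way to express such labels, sketched with JUnit 4 categories in Groovy; the marker interfaces and test names are invented for illustration, and other tools (Cucumber tags, FitNesse suite tags) have their own label mechanisms.

```groovy
import org.junit.Test

// Marker interfaces acting as labels (hypothetical names)
interface Flaky {}
interface BusinessAccounts {}
interface KnownIssue {}

class TransferLimitTest {
    // Fully qualified to avoid clashing with Groovy's own groovy.lang.Category annotation
    @org.junit.experimental.categories.Category([BusinessAccounts, Flaky])
    @Test
    void raisesTransferLimitForBusinessAccount() {
        // ... the actual check against the webshop would go here ...
    }
}
```

The same labels can then drive which slice runs in which job, for example by passing a category or tag filter to the build tool from the Jenkins job.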
Best Practices for Test Execution in Jenkins
Jenkins Testing Basics
Tilt the pyramid and use it as the guiding principle to set up your Jenkins test jobs, left to right
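In scripted Workflow/Pipeline terms, "left to right" simply means the cheap, fast layers gate the slower ones. A sketch, assuming Gradle for the unit tests and hypothetical wrapper scripts for the rest (older Workflow versions use the block-less `stage 'name'` syntax):

```groovy
node {
    checkout scm

    stage('Unit tests') {                      // bottom of the pyramid: fast, runs first
        sh './gradlew test'
        junit 'build/test-results/**/*.xml'
    }
    stage('Service tests') {                   // under-the-GUI tests next
        sh './scripts/run-service-tests.sh'
        junit 'results/service/**/*.xml'
    }
    stage('UI tests') {                        // slowest, most brittle layer last
        sh './scripts/run-ui-tests.sh'
        junit 'results/ui/**/*.xml'
    }
}
```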
Organizing Test Jobs in Jenkins
1. Create unique artifacts and fingerprints to monitor what you are pushing across your pipeline
2. Treat different platforms (e.g. browsers) as different tests, handled by different jobs
3. Well-known plugins: Multijob, Copy Artifact, Workflow
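A hedged sketch of point 1 in Pipeline steps: the build job fingerprints the artifact, and a downstream test job pulls in exactly that artifact via the Copy Artifact plugin (assuming a version that exposes a `copyArtifacts` step). Job names, file paths and script names are assumptions.

```groovy
// Upstream build job: create the artifact once and fingerprint it,
// so every later job that touches it is traceable to this exact build.
node {
    checkout scm
    sh './gradlew build'
    archiveArtifacts artifacts: 'build/libs/webshop.war', fingerprint: true
}

// Downstream test job: copy the fingerprinted artifact instead of rebuilding it.
node {
    copyArtifacts projectName: 'webshop-build', filter: 'build/libs/webshop.war'
    sh './scripts/deploy-and-smoke-test.sh build/libs/webshop.war'
}
```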
Organizing Test Jobs in Jenkins
4. Keep Jenkins jobs sane and simple; ergo: execute shell scripts from your Jenkins jobs
5. Shell scripts are parameterized
6. Parameters are fed to individual test tools: FitNesse labels, Cucumber labels, etc.
7. Shell scripts are placed under version control and managed by the team like any other source code
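A minimal sketch of points 4 to 7: the Jenkins job stays a thin, parameterized wrapper around a script that lives in version control next to the tests. BROWSER and TEST_LABEL are assumed job parameters, and the script name is hypothetical.

```groovy
node {
    checkout scm                                   // the script is versioned with the test code
    // Pass job parameters straight through to the test tooling (FitNesse/Cucumber labels, browser, ...)
    sh "./scripts/run-tests.sh --browser '${params.BROWSER}' --label '${params.TEST_LABEL}'"
    junit 'results/**/*.xml'
}
```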
Example Job Distribution
Two pipelines side by side: Build, Deploy, Integration Tests, several Test jobs and Performance Tests. Beware of scattered result qualification.
Distributing Tests Across Jobs
- Radical parallelization using cheap and cheerful throw-away environments, especially when environments (e.g. containers) lie at your fingertips
- Jobs should not depend on other jobs
- Test jobs are your eyes and ears: optimize for them!
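A sketch of what "cheap and cheerful throw-away environments" can look like when Docker is at hand: each parallel branch starts its own instance of the system under test and removes it again, whatever the test outcome. The image name, node label and scripts are assumptions.

```groovy
def targets = ['chrome', 'firefox']
def branches = [:]
for (t in targets) {
    def browser = t
    branches[browser] = {
        node('docker') {                                    // any agent with Docker available
            checkout scm
            // Start a disposable instance of the system under test (random host port via -P)
            def id = sh(script: 'docker run -d -P webshop:latest', returnStdout: true).trim()
            try {
                sh "./scripts/run-ui-tests.sh --browser ${browser} --container ${id}"
                junit 'results/**/*.xml'
            } finally {
                sh "docker rm -f ${id}"                     // throw the environment away
            }
        }
    }
}
parallel branches
```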
Example Job Distribution
Alternative layout: Build and Deploy feeding Integration Tests, Performance Tests and further test jobs in parallel.
Challenge: Scattered Results
Supporting Test Analysis
Making Sense of Test Results
Real go/no-go decisions are non-trivial:
- No failing tests? No more than 5% failing tests?
- No regression (tests that currently fail but passed previously)
- A list of tests-that-should-not-fail
- Need historical context
- One integrated view
- Data to guide improvement
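A plain Groovy sketch of what such a decision could look like once results are aggregated in one place; the result records, field names and thresholds are invented for illustration.

```groovy
// Hypothetical aggregated results: current and previous status per test.
def results = [
    [name: 'UseCase1500.PayByInvoice', status: 'PASSED', previous: 'PASSED'],
    [name: 'UseCase2300.BulkOrder',    status: 'FAILED', previous: 'PASSED'],
    [name: 'UseCase1200.Login',        status: 'FAILED', previous: 'FAILED'],  // known failure
]
def mustNotFail = ['UseCase1500.PayByInvoice']                // tests-that-should-not-fail list

def failed      = results.findAll { it.status == 'FAILED' }
def failureRate = failed.size() / (double) results.size()
def regressions = failed.findAll { it.previous == 'PASSED' }  // failed now, passed before
def blockers    = failed.findAll { it.name in mustNotFail }

boolean goLive = failureRate <= 0.05 && regressions.isEmpty() && blockers.isEmpty()
println "Go for live: ${goLive} (failure rate ${Math.round(failureRate * 100)}%, " +
        "regressions: ${regressions*.name}, blockers: ${blockers*.name})"
```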
Making Sense of Test Results
Executing tests from Jenkins is great, but:
- Different testing jobs have their share of Jenkins plugins
- A historical view is only available per job, not across jobs
- Pass/Unstable/Fail is too coarse: how do you express "passed, but with known failures"?
Making Sense of Test Results
The ultimate analysis question ("are we good to go live?") is difficult to answer. There is no obvious solution for now, unless all your tests are running through one service.
Example Case Study
FitNesse Implementation
- Started with one project containing all tests: sharing knowledge
- Structured the same as our use cases, i.e. WebshopSuite.BusinessAccountSuite.UseCase1500
- Nightly runs from the beginning, indicated by labels ("nightly")
- First sequential per application (WebshopSuite), later parallel, split by functional area (WebshopSuite.BusinessAccountSuite.*)
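A sketch of what the later, parallel variant of that nightly run could look like from Jenkins: one branch per functional area, each invoking fitnesse-standalone.jar filtered on the "nightly" tag. The exact command-line flags, port and area names vary per FitNesse setup, so treat them as assumptions (use distinct ports if branches land on the same machine).

```groovy
def areas = ['BusinessAccountSuite', 'ConsumerSuite', 'CheckoutSuite']   // hypothetical areas
def branches = [:]
for (a in areas) {
    def area = a
    branches[area] = {
        node {
            checkout scm
            // Run one functional area, filtered on the "nightly" tag, writing XML results
            sh "java -jar fitnesse-standalone.jar -d . -p 9123 -c 'WebshopSuite.${area}?suite&suiteFilter=nightly&format=xml' > results-${area}.xml"
        }
    }
}
parallel branches
```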
Example Pipeline
Build: code review, check-in, build, unit tests, build EAR (tools: git server, Jenkins server; environment: dedicated team server)
Deploy: deploy, smoke test (environment: dedicated team server)
Testing End to End
System test / production acceptance test: deploy to chain, smoke test, chain test (environment: Chain {1-5})
Security test (environment: dedicated team server)
Source code quality test (tools: Jenkins server and Sonar server)
Test Analysis: Homebrew
Test Analysis: Custom Reporting
Summary
- Testing = Automation: testers are developers
- Structure and annotate tests: Conway's Law for tests, link to functions/features/use cases
- Radical parallelization: throwaway environments
Summary
- Keep Jenkins jobs simple
- Keep Jenkins jobs independent
- Track the SUT with fingerprints
- Invoke test tools via plugins or version-controlled scripts: parameterization!
- Parallelize & optimize
Summary
- CD = Speed + Quality = Execution + Analysis
- Making sense of scattered test results is still a challenge
- We need to figure out how to address real-world go/no-go decisions
What's Next?
- Visit http://tiny.cc/webinar-xebialabs for a webinar by CloudBees and XebiaLabs demonstrating the key value of CD and go-live decisions
- Read more on the testing challenges in CD: http://tiny.cc/ta-and-cd
- Try the XebiaLabs XL TestView solution to bring quality into the heart of your CD initiative: http://tiny.cc/xl-testview
Please Share Your Feedback
Did you find this session valuable? Please share your thoughts in the Jenkins User Conference Mobile App. Find the session in the app and click on the feedback area.