Testing and Quality in Agile Development Speaker: Allan Watty Company: ABB Inc Website: www.abb.com/enterprise-software




Welcome to the PMI Houston Conference & Expo 2015. Please put your phone on silent mode. Q&A will be taken at the close of this presentation. There will also be time at the end to complete the session survey; we value your feedback, which allows us to improve this annual event.

Key Objectives - Will cover:
o Challenges to expect with Agile test practices in large-scale, multi-team projects
o The Power of Three
o Team dynamics
o Organizational strategies & considerations
o Test automation considerations
o Customer, business impacts

Key Objectives - Will not cover:
o Automation details
o In-depth tool details
o Code samples

Speaker Bio
o ABB Inc., 2012-present: Senior Development Manager; new product development with 2 Scrum teams
o PROS, 2005-2012: Development Manager, QA Manager
o Aspen Technology, 2003-2005: Development Manager
o BMC Software, 1995-2003: Senior Development Manager
Education:
o B.S. Math & Biology, Lehman College, CUNY
o M.S. Computer Science, NYU Polytechnic
o M.B.A., University of Houston Bauer College
Certifications: PMP, CSM
President, Agile Leadership Network Houston Chapter
Former life: musician, math teacher

Agenda
o Agile Testing Challenges
o The whole-team approach
o Designing for Testability
o Definitions of Ready and Done
o Risk-based testing
o Using the Power of Three
o Making testing a first-class citizen
o Acceptance Test Automation
o Core Testing Practices
o Use Metrics to reflect and adapt

Key Challenges on Agile Teams
o Mini-waterfalls
o Testers are always behind
o Short-cuts
o Cone of Uncertainty
o Clarity on requirements
o Huge manual regression backlog
o Poor estimates
o What should be automated?
o Late changes to the design
o Applications not designed for testability
o Testing at the end does not work
o Moving / unclear definition of DONE

Trouble Ahead?
o Lengthy code, test, and fix cycles
o High percentage of unfinished stories
o Debate on requirements after a story is implemented
o Stories accepted with defects
o Broken builds are treated as no big deal
o Numerous build warnings
o Code coverage is lowered by new code
o Stories started without clarity on success criteria
o Inability to automate acceptance tests within the sprint

Cone of Uncertainty* *Steve McConnell, Software Estimation: Demystifying the Black Art

The Whole Team Approach
o Everyone is committed to delivering the highest quality software possible
o Daily collaboration
o Testing is a central practice
o Embedded testers and developers
o Task and knowledge sharing
o Release & sprint planning
o Start and finish together
o Common definition of READY
o Common definition of DONE

Foundation of a Great Team Patrick Lencioni: The Five Dysfunctions of a Team

Design for Testability
Ensure that any new features/code can be tested.
o SOLID design principles
o Layered architecture
o Interface programming
o User behavior
o API contract
How can you know when I am done?

Using the Power of Three
o Developers, QE, and the Product Owner share ownership of the story
o Shepherd a story/feature from inception to completion
o Transparency and teamwork
o Shared understanding of what is being built and the conditions of success
o Nothing is thrown over the wall
o Work at a sustainable pace

Make Testing a First-Class Citizen
o Test scripts get the same care and feeding as production code
o Failed test results get actioned
o Use smoke tests during development to detect DOA builds
o The test framework must be maintainable by DEV and QE
o Test code is stored in the CM repository
o Broken automated tests will slow new feature work

Definitions of Ready and Done
Story readiness:
o What are the goals of pre-grooming and grooming?
o When is a story ready to be brought into a sprint?
Story completeness:
o When is a story done?
o What are all the things that should be completed before we accept a story?
Definitions must be common across teams.
o A story is not done until all its acceptance tests pass

Definition of Done
Key Practice: Teams should agree on DONE at the start of the release. Without a consistent meaning of done, velocity cannot be estimated. A common definition ensures that we deliver a potentially shippable product. You are not finished with a feature until it meets the benchmark defined by done.
A simple verifiable definition:
o Coding is complete
o Testing is complete
o Related user documentation is complete
o Feature is accepted by the Product Owner

Why High Performance Agile Testing?
o Quality as a competitive advantage
o Overcome challenges of Agile testing
o Build capability to handle disruptive technology
o Be ready to scale as the company grows

Mind Map - Key Success Factors
o Look at the big picture
o Risk-based testing
o Use the whole-team approach
o Adopt an Agile testing mindset
o Core testing practices
o Automate regression testing
o Collaborate with the customer
o Handle technical debt
o Regular, constant feedback

What is Agile Testing?
Before Agile, testing begins at the end of a predictive schedule. Agile uses an adaptive schedule: each story is developed and tested within an iteration. [Diagram: features A-F flowing through Sprints 1-4 to GA]

What is Agile Testing? [Diagram: three-week sprints, each with implementation, testing, and sprint demo/retro/planning]
Testing starts early and is continuous:
o Each story slice is tested as soon as it's finished
o The application is exercised constantly, so no surprises later
QE is part of, and embedded in, the product team:
o Co-located product teams
o Power of Three: developers, testers, product owners
Team focus:
o Test automation
o Exploratory testing
o Design for testability
o Business value

QA/QE Involvement
o Release planning: story mapping, feature/epic review
o Sprint planning and reviews
o User story conditions of success
o Test case maintenance
o Customer usage
o Risk assessment
o Metrics
o Design reviews
o Root cause analysis
o Test frameworks

Look at the Big Picture
o Use business-facing tests and examples to drive development
o Use real-world test data
o Be aware of impacts on other areas: software dimensions, feature usage, estimation
o Focus on business value: user stories
o Understand your context: customers, business, product/process, software, operations, support
o Value of testing: shipping a product customers love
o Value of bug reports: describing the actual problem

Five Dimensions of a Software Release: Features, Staff, Quality, Schedule, Cost. Each of these five dimensions can take one of three roles on any given project: a driver, a constraint, or a degree of freedom.

Actual Use of Software Features
o Always: 7%
o Often: 13%
o Sometimes: 16%
o Rarely: 19%
o Never: 45%
Source: The Standish Group CHAOS Research

Agile Testing Mindset
o Testing is not a phase!
o Early feedback before a feature is complete
o Key testing tasks: automation, exploratory
o One process, no handoffs
o Focus on conditions of satisfaction
o No Quality Police
o Proactive approach
o Eliminate waste
o Adding value

Adopt Regular Release Cadence

Handle Technical Debt
Technical debt: using a design strategy that isn't sustainable in the longer term, but yields a short-term benefit. Taking shortcuts on coding and testing leads to:
o Design flaws
o Messy code
o Poor functional test coverage
o Quality debt


Handle Technical Debt
Key Practice: "We will ship on time and revise the quick fix as our top priority in the next release." This requires commitment to the schedule: prioritize, and fix soon.
"Take my advice: go well, not fast. Care about your code. Take the time to do things right. In software, slow and steady wins the race; and speed kills." - Robert C. Martin (Uncle Bob)

Risk-Based Testing
o Utilize support incidents and root cause analysis output as additional test ideas
o Review the likely use of each feature by customers
o Analyze the likelihood and the impact of failure of key features
o Prioritize regression testing based on risk
o Surround key functionality with unit and system tests
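The prioritization above can be sketched as a simple likelihood-times-impact score. This is an illustrative sketch only; the feature names and scores are hypothetical, not from the talk.

```python
# Illustrative risk-based test prioritization sketch.
# Feature names and scores below are hypothetical examples.

def risk_score(likelihood, impact):
    """Risk = likelihood of failure (1-5) x impact of failure (1-5)."""
    return likelihood * impact

features = [
    {"name": "checkout", "likelihood": 4, "impact": 5},
    {"name": "reporting", "likelihood": 2, "impact": 2},
    {"name": "login", "likelihood": 2, "impact": 5},
]

# Run regression tests on the riskiest features first.
regression_order = sorted(
    features,
    key=lambda f: risk_score(f["likelihood"], f["impact"]),
    reverse=True,
)
```

With these sample numbers, checkout (20) outranks login (10) and reporting (4), so it gets regression coverage first.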

Why Automate?
o Free up time for other important work
o Capture repeatable tasks
o Safety net
o Quick feedback
o More time for exploratory testing
o Reduction of regression test time
o Tests provide living documentation: you know what your system does

Regression Test Automation Issues
o Initial hump of pain
o Fear of the tool or code
o Test automation is software development!
o UI-based testing vs. constant UI behavior changes
o Behind-the-UI tests require development collaboration

Agile Testing Quadrants

Test Automation Pyramid
o UI tests: QTP, eggPlant, Sikuli, Selenium, Watir
o Acceptance tests: FitNesse, Cucumber, SpecFlow, Concordion, shell scripts
o Unit tests: JUnit, NUnit, FlexUnit
o Static analysis: Klocwork, PMD, FindBugs

Story Workflow
1. Story start
2. Story review (Dev, QE, PM) until agreement on acceptance criteria is reached
3. Story implementation - the whole team handles implementation:
o Create feature files
o Write failing SpecFlow tests
o Lightweight design reviews
o Incremental testing
o Exploratory testing
o Implement the story using TDD
o Refactor
4. Acceptance tests passing; exploratory and manual tests
5. Story review and acceptance
6. Story complete

Testing the Windows Calculator
Types of testing:
1. User behavior
2. Calculator engine
How many times do you need to test the UI?

Behind-the-UI Testing
Test cases are written by customers, PM, Dev, and QE; fixtures are written by Development. Test all the services of an application separately from its user interface. [Diagram: test cases drive fixtures, which drive the system]

Automated Acceptance Tests
o ATDD, BDD, Specification by Example
o ATDD is about communication: a practice for collaboratively discussing acceptance criteria with examples
o Like TDD, it involves creating tests before code
o Tests represent expectations of behavior the software should have
o Acceptance tests provide feedback to the team about how close we are to the Definition of Done
o Provides living documentation of your application

Gherkin Language
As a [Shopper]
I want [to put items in my shopping cart]
So that [I can manage items before I check out]
Acceptance Criteria:
Given I'm a logged-in User
When I go to the Item page
And I click "Add item to cart"
Then the quantity of items in my cart should go up
And my subtotal should increment
And the warehouse inventory should decrement

Unit Tests Using TDD
Test Driven Development (TDD) steps:
1. Write a unit test for a new piece of code
2. Execute the unit test
3. It should fail
4. Write the code that implements the function
5. Execute the unit test; it should now pass
6. Refactor
7. Repeat at step 1
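The steps above can be sketched as a minimal red-green cycle using Python's unittest module. The `add()` function is a hypothetical example for illustration, not from the talk.

```python
import unittest

# Steps 4-5: write just enough code to make the failing test pass.
# (add() is a hypothetical example function.)
def add(a, b):
    return a + b

# Step 1: the unit test is written first. Before add() exists it
# fails (red); once the implementation above is in place it passes
# (green), and we are free to refactor with the test as a safety net.
class AddTest(unittest.TestCase):
    def test_add_two_numbers(self):
        self.assertEqual(add(2, 3), 5)
```

Running the test suite after each change gives the fast feedback loop that TDD depends on.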

ATDD Cycle

ATDD Cycle
1. Write Automated Acceptance Tests (AATs) using the acceptance criteria
2. The tests will fail initially; to make the AATs pass, start providing the functionality
3. Now apply TDD concepts: write a failing unit test
4. Write enough code to pass the failing unit test
5. Repeat at step 2, continuing to follow UTDD
6. When a story is complete and its AATs and unit tests are passing, continue at step 1 with the next story
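The double loop above can be sketched with plain asserts: an outer acceptance test, written from the story's acceptance criteria, drives inner unit-level TDD. The ShoppingCart class is a hypothetical example echoing the shopping-cart story, not code from the talk.

```python
# Inner loop (TDD): a unit-tested building block, grown test-first.
class ShoppingCart:
    def __init__(self):
        self.items = []

    def add_item(self, name, price):
        self.items.append((name, price))

    def subtotal(self):
        return sum(price for _, price in self.items)

# Outer loop (ATDD): an acceptance test expressing the story's
# conditions of success. It stays red until the feature is complete.
def acceptance_add_item_to_cart():
    cart = ShoppingCart()
    cart.add_item("widget", 9.99)
    assert len(cart.items) == 1      # quantity of items went up
    assert cart.subtotal() == 9.99   # subtotal incremented
    return True
```

When the acceptance test passes alongside all unit tests, the story meets its Definition of Done and the cycle starts over on the next story.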

Behind-the-UI and API-Level Automation
Many tools are available to help with automation:
o SpecFlow, JBehave, FitNesse, Concordion, Cucumber
These tools bridge the communication gap between domain experts, testers, and developers. They bind business-readable behavior specifications and examples to the underlying implementation. Specifications/user stories can be written in Gherkin format. The tools enable separation of:
o Input data and test scenarios (feature files)
o Test methods or fixtures that drive the application
o Reporting of the results in HTML format

Specflow: API-Level Automation
1. SpecFlow binds business-readable behavior specifications and examples to the underlying implementation
2. It lets us write specifications/user stories in plain, readable Gherkin format
3. Feature file: SpecFlow uses feature files to store the acceptance criteria of the features (user stories)
4. Every scenario consists of a list of steps, which must start with one of the keywords Given, When, Then, But, or And
5. A test method is generated for each scenario step; the method name is derived from the title of the scenario
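A feature file along these lines might look like the following. This is a hypothetical sketch; the step wording echoes the earlier shopping-cart example, and the feature and scenario names are invented for illustration.

```gherkin
Feature: Shopping cart
  As a Shopper
  I want to put items in my shopping cart
  So that I can manage items before I check out

  Scenario: Add an item to the cart
    Given I'm a logged-in User
    When I go to the Item page
    And I click "Add item to cart"
    Then the quantity of items in my cart should go up
    And my subtotal should increment
```

SpecFlow would generate one binding method per step (Given/When/Then), which developers implement against the application's API.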

Selenium: UI-Level Automation
o A set of tools that supports rapid development of test automation for web-based applications
o Support for a number of programming languages: Java, C#, Perl, PHP, Python, Ruby
o Cross-browser support: IE, Firefox, Opera, Safari, and Google Chrome
o Cross-platform support: Windows, Linux, and Macintosh

Test Automation
1. Multi-layered approach
2. Test Driven Development delivers unit and component tests
3. Acceptance Test Driven Development delivers API-level and GUI tests
4. GUI smoke tests as a regression safety net

Adopt Test Design Principles
o Single purpose
o DRY
o Domain-specific language
o Abstract code out of tests
o Setup and teardown
o Test independence
o Tests must run green all the time
o Common test standards (including naming conventions)
o Test code is production code

UI Tests
o Minimal and purposeful
o Workflow tests
o UI smoke test
o Page Object Pattern
o Push for API/service-level tests

Page Object Pattern
o Separates the test specification from the test implementation
o Wraps an HTML page, or fragment, with an application-specific API
o Enables easy manipulation of page elements
o UI smoke test
o Encapsulates the mechanics required to find and manipulate the data in the GUI controls
o Utilize data-driven testing
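A minimal page-object sketch in Python, to show the separation the slide describes. LoginPage, HomePage, the element ids, and the FakeDriver stub are all hypothetical; a real implementation would wrap a Selenium WebDriver instead of the stub.

```python
# Page Object Pattern sketch: the page class hides element lookup and
# manipulation behind an application-specific API, so tests read as intent.

class FakeDriver:
    """Stand-in for a Selenium WebDriver, to keep the sketch self-contained."""
    def __init__(self):
        self.fields = {}

    def type_into(self, element_id, text):
        self.fields[element_id] = text

class HomePage:
    def __init__(self, driver):
        self.driver = driver

class LoginPage:
    USERNAME_ID = "user"   # hypothetical element ids
    PASSWORD_ID = "pass"

    def __init__(self, driver):
        self.driver = driver

    def login(self, username, password):
        # Encapsulates the mechanics of finding and filling the controls.
        self.driver.type_into(self.USERNAME_ID, username)
        self.driver.type_into(self.PASSWORD_ID, password)
        return HomePage(self.driver)  # navigation yields the next page object

# The test specification stays free of HTML/control details:
driver = FakeDriver()
home = LoginPage(driver).login("alice", "secret")
```

If the login markup changes, only LoginPage changes; every test that logs in stays untouched.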


Development Practices
Key practices we have embraced:
o Before any code is written, the acceptance criteria for a story are reviewed together by the Product Owner, Developer, and QE
o For new functionality, the developer creates a lightweight design and a design review is done
o Development starts with writing unit tests
o SpecFlow acceptance test scenarios are written by QE
o SpecFlow acceptance test code is written by developers, based on scenarios from QE
o Developers strive to give QEs testable functionality early and often
o Code review before source code check-in; apply any necessary changes due to code reviews
o Continuous Integration builds are started after every check-in
o QE modifies Selenium UI tests if needed
o Changes are tested and approved by QEs and Product Owners
o The story is accepted by Product Owners

Continuous Integration Practices
Key practices we have embraced:
o Continuous Integration builds are started after every check-in
o All unit tests are run at the beginning of a build
o All automated acceptance tests are run during the build
o Any unit or acceptance test failure stops the build
o Selenium UI smoke tests are executed daily
o Selenium UI regression tests are executed twice a week
o Any Selenium test failures are reported as defects and fixed within the sprint
o Installation packages are built as part of CI
o The new installation is deployed to the QA Development site as part of CI
o The new installation is manually deployed to the QA Test site

Build Quality In
o Find and fix defects the moment they occur
o Mistake-proof the process
o Think of tests as specifications; use them to establish confidence in the correctness of the system at any time
o Integrate early and often
o Don't tolerate defects: if you expect to find defects during final verification, your development process is defective

Stop the Line
o Broken builds
o Warnings threshold exceeded
o Defect threshold exceeded

Core Testing Practices
o Review stories and acceptance tests with PM and developers
o Test estimates are part of the sprint backlog
o Design test cases: acceptance tests; system tests (negative, boundary, i18n, equivalence)
o Test case reviews with PM and developers
o Work with developers to get testable chunks delivered early
o Test case automation

Core Testing Practices
o Regular root cause analysis sessions
o Developers execute acceptance tests before final story testing
o Exploratory testing
o Performance and scalability testing
o Test case maintenance: remove dead tests; update and add new tests
o Automate acceptance tests during the sprint
o UI automated smoke test
o UI workflow automated tests

Use Metrics to Reflect and Adapt
1. Percentage of story points claimed
2. Percentage of test cases automated
3. Percentage of code coverage
4. Current number of build warnings
5. Number of defects found in the sprint
6. Number of defects found during regression
7. Number of defects fixed
8. Number of open defects
Display metrics in each sprint review.

Team Metrics

Collaborate with Customers
"Customer involvement is the most critical factor in software quality." *
o Learn how customers use the product
o Formal beta releases (Early Access Program); formal means tested features, feedback expected, partnership
o Include key customer workflows in testing
o Dev/QE/PO customer communication and visits
o Ask customers for concrete examples and scenarios
* Karl Wiegers, author of Software Requirements

Regular, Constant Feedback
o Use feedback to improve; make process/strategy corrections as needed
o Immediate feedback: Continuous Integration & Deployment
o Learn from the tests
o Retrospectives
o Regular root cause analysis of customer incidents
o Use support incidents to drive defect prevention: create a taxonomy of defects; automate test cases from support incidents

Agile Testing: Key Success Factors
o Use the whole-team approach
o Adopt an Agile testing mindset
o Automate regression testing
o Provide and obtain feedback
o Core testing practices
o Manage technical debt
o Collaborate with customers
o Look at the big picture
o Regularly reflect and adapt

Products That Delight Our Customers!
Under-commit:
o Plan less work than you think you can do
o Focus on finishing one user story at a time
o 100% full means we do not go anywhere
Budget time to:
o Learn new skills, make improvements
o Refactor automated tests
Focus on mission:
o Finding defects as early as possible in every release
o Quality becomes a competitive advantage
o One team culture of developers, testers, and product managers

Key References
o Agile Testing, by Lisa Crispin
o Essential Scrum, by Kenneth Rubin
o ATDD by Example: A Practical Guide to Acceptance Test-Driven Development, by Markus Gärtner
o Clean Code, by Robert C. Martin
o Lean-Agile Acceptance Test-Driven Development, by Ken Pugh
o Specification by Example, by Gojko Adzic
o BDD in Action, by John Ferguson Smart
o The Five Dysfunctions of a Team, by Patrick Lencioni
o Managing Software Debt, by Chris Sterling

Agile Leadership Network Houston Chapter
Venue: Sysco Corporation, 1390 Enclave Parkway. Meetings are held the 3rd Thursday of each month unless otherwise noted on the chapter's website.
Meeting format:
o 6:00pm-6:30pm: Network with your fellow Agile practitioners
o 6:30pm-8:00pm: Announcements and introductions, followed by our Featured Program
Admission is free. Registration: http://alnhouston.org

Questions