Continuous Integration, Sprint 6
Agneta Nilsson, Jan Bosch, & Christian Berger
How can we identify, prioritize, and select suitable improvement initiatives more efficiently in order to support the advancement towards continuous integration?
Evolution path: Traditional Development -> R&D Organization All Agile -> Continuous Integration -> Continuous Deployment -> R&D as an Innovation System
The CIVIT model, updated. (Figure: scope of testing on the vertical axis: Component, Subsystem, Partial Product, Full Product, Release, Customer; periodicity of testing on the horizontal axis: Once per release, Month, Week, Day, Hour, Immediate/Minutes; arrows indicate improvement directions.)
CItIM: Continuous Integration Improvement Method
1. Visualization (CIVIT)
2. Identification, prioritization, and selection
3. Implementation
4. Evaluation
A systematic way of making the change? Low-hanging fruit. Apply the CItIM model to the situation to see how we make the selection. But before making the selection, we should identify: what can we do?
Identified themes:
- Link between requirements and testing
- Maturity of test environment
- System level testing
- The product family perspective
- Suppliers and the 25x problem
- Variability in testing
- Cost of maintaining a CI environment vs. taking the normal hit in unpredictability at the end
The role of requirements in relation to tests: waste or value? (Link between requirements and testing)
We need to learn how to test with unstable requirements, and how to deal with the increasing workload and complexity. (Link between requirements and testing)
Instabilities of tests due to:
- Managing trust in test results (see the sketch below)
- Managing variance in target and test equipment
- Maintenance of tests and test platforms
(Maturity of test environment)
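One concrete way to work on trust in test results is to separate genuinely failing tests from flaky ones before they erode confidence in the CI signal. Below is a minimal, self-contained sketch in plain Python (hypothetical helper names, not part of the CItIM material) that re-runs a test and classifies it as stable or flaky:

    import random

    def safe_run(test_fn):
        # Return True if the test passes, False if its assertion fails.
        try:
            test_fn()
            return True
        except AssertionError:
            return False

    def classify_test(test_fn, runs=5):
        # Re-run the test several times; mixed outcomes indicate a flaky test
        # that should be quarantined and investigated rather than trusted.
        results = [safe_run(test_fn) for _ in range(runs)]
        if all(results):
            return "stable-pass"
        if not any(results):
            return "stable-fail"
        return "flaky"

    def unstable_test():
        # Deliberately unstable example test to illustrate the classification.
        assert random.random() > 0.3

    print(classify_test(unstable_test))  # most likely prints "flaky"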
System level testing:
- Very complex
- Very unpredictable behavior
- Unreliable testing tools
- Scarce resources and competent personnel
- Attitudes/mindsets of personnel and organization
- Need to prioritize tests to support CI
Integration & verification: it is fundamental to first understand what we are actually trying to do! "We don't have the time to jump on the bike because we're so stressed about getting moving." (System level testing)
Documents vs. structured information: "I like to think of it as moving the organization from a script where you have all these go-tos in the code to an object-oriented program where you just have instances that communicate with each other on a very simple level, and you hide all the complexity away." (System level testing)
"We could sell a truck that is actually not verified, but we can't sell a truck that is not integrated." (System level testing)
Data-driven instead of processes: an organization thinking in data instead of processes has an enormous advantage, because it focuses on what is actually being done instead of on guarding business processes, protecting budgets, etc., which become extremely hard to change because of all these boundaries.
From requirements to test: during the course of a project, requirements transition to test cases. The formalization of requirements into test cases is a form of requirements elicitation and specification. (Figure: requirements and test cases over time.)
Depending on the level in the CIVIT model, requirements might transition faster and more easily to test cases. Depending on the test environment (unit test, simulation environment, partial test environment), requirements also transition faster and more easily to test cases.
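As an illustration of this transition (a sketch with a hypothetical speed-limiter requirement and function name, not an example taken from the case companies), a textual requirement can be formalized directly as an executable unit test; writing the assertion forces the requirement to become precise, which is why the transition also works as elicitation and specification:

    # Hypothetical requirement: "The vehicle shall limit cruise speed to 89 km/h."
    import unittest

    def limit_cruise_speed(requested_kmh, max_kmh=89):
        # Return the speed the cruise controller is allowed to hold.
        return min(requested_kmh, max_kmh)

    class SpeedLimitRequirement(unittest.TestCase):
        def test_requested_speed_above_limit_is_capped(self):
            self.assertEqual(limit_cruise_speed(100), 89)

        def test_requested_speed_below_limit_is_unchanged(self):
            self.assertEqual(limit_cruise_speed(80), 80)

    if __name__ == "__main__":
        unittest.main()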
Reuse of requirements and tests: companies (e.g., Axis) often define a new product in terms of the delta to an existing product. That allows for reusing existing requirements and test cases. (Figure: requirements, new test cases, and reused test cases over time.)
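In a test suite, such delta-based reuse could look like the following sketch (hypothetical product data and parameters, not Axis's actual products): shared requirements are parameterized over both products, and only the delta gets new test cases.

    import unittest

    # Hypothetical product definitions: the new product is described as a delta
    # to the existing one, so most test parameters can be reused unchanged.
    BASE_PRODUCT = {"name": "cam-base", "max_resolution": (1920, 1080), "has_ir": False}
    NEW_PRODUCT = {**BASE_PRODUCT, "name": "cam-ir", "has_ir": True}  # the delta

    class ReusedRequirementTests(unittest.TestCase):
        # Requirements shared by both products: reused test case, run per product.
        def test_supports_full_hd_on_all_products(self):
            for product in (BASE_PRODUCT, NEW_PRODUCT):
                with self.subTest(product=product["name"]):
                    self.assertGreaterEqual(product["max_resolution"], (1920, 1080))

    class NewRequirementTests(unittest.TestCase):
        # Only the delta needs new test cases.
        def test_ir_mode_available_on_new_product(self):
            self.assertTrue(NEW_PRODUCT["has_ir"])

    if __name__ == "__main__":
        unittest.main()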
Different approaches: companies can formalize requirements through test cases early or late in the development process. Different forces influence the curve that companies in different domains use. (Figure: early, linear, and late transition from requirements to test cases over time.)
Sprint 7:
- Further analyses of organizational & technical impediments from CI
- Risk-based test selection (see the sketch below)
- Link between requirements and testing
- Maturity of test environment
- System level testing
- The product family perspective
- Suppliers and the 25x problem
- Variability in testing
- Cost of maintaining a CI environment vs. taking the normal hit in unpredictability at the end
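As a starting point for the risk-based test selection topic, here is a minimal sketch (hypothetical risk model: estimated failure probability times impact, under a time budget; not a method defined in the project):

    from dataclasses import dataclass

    @dataclass
    class TestCase:
        name: str
        failure_probability: float  # estimated from recent CI history
        impact: float               # consequence of missing this defect (1-10)
        duration_min: float         # execution time in minutes

    def select_tests(tests, budget_min):
        # Rank tests by risk (failure probability * impact) and greedily pick
        # the highest-risk tests that fit within the time budget.
        ranked = sorted(tests, key=lambda t: t.failure_probability * t.impact, reverse=True)
        selected, used = [], 0.0
        for test in ranked:
            if used + test.duration_min <= budget_min:
                selected.append(test)
                used += test.duration_min
        return selected

    suite = [
        TestCase("brake_system_integration", 0.10, 10, 45),
        TestCase("infotainment_ui_smoke", 0.30, 3, 10),
        TestCase("engine_ecu_regression", 0.05, 9, 60),
    ]
    for t in select_tests(suite, budget_min=60):
        print(t.name)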
Thank You!
Nilsson, A., Bosch, J., & Berger, C. (2014). Visualizing Testing Activities to Support Continuous Integration: A Multiple Case Study. In Proceedings of XP 2014, 15th International Conference on Agile Software Development, May 26-30, 2014, Rome, Italy.
Legend
F: functional requirements; Q: quality requirements; L: legacy functionality; E: edge cases.
Coverage for each type of testing (thresholds sketched below):
- Complete coverage
- Significant testing, 70% < coverage < 95%
- Partial testing, 30% < coverage < 70%
- Some testing, but less than 30% coverage
- No testing of this type at all
Level of test automation:
- Fully automated
- Significant automation, between 70% and 95%
- Partial automation, between 30% and 70%
- Some automation, less than 30%
- No automation at all
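For completeness, the coverage thresholds in the legend can be read as a simple classification. The sketch below maps a measured coverage percentage to the corresponding CIVIT category; how the boundary values fall is an assumption, since the legend only states open intervals.

    def coverage_category(percent):
        # Map a coverage percentage to the CIVIT legend category.
        if percent == 0:
            return "No testing of this type at all"
        if percent < 30:
            return "Some testing, less than 30% coverage"
        if percent < 70:
            return "Partial testing, 30% < coverage < 70%"
        if percent < 95:
            return "Significant testing, 70% < coverage < 95%"
        return "Complete coverage"

    print(coverage_category(82))  # Significant testing, 70% < coverage < 95%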
Axis Communications, Firmware Platform. (Figure: CIVIT diagram with F/Q/L/E coverage squares for each scope level (Component, Subsystem, Partial Product, Full Product, Release, Customer) and each periodicity (Once per release, Month, Week, Day, Hour, Immediate/Minutes).)
Sprint 6:
- What different aspects should be taken into consideration when selecting an improvement initiative?
- How can we assess the advantages, disadvantages, and various tradeoffs involved in the identified initiatives?
- How can we more systematically prioritize and select suitable initiatives in a certain context?
Picture toolbox (backup). (Figure: requirements and test cases over time.)