Jidoka in Software Development Emanuele Danovaro, Andrea Janes, Giancarlo Succi Center for Applied Software Engineering Free University of Bolzano/Bozen, Italy {emanuele.danovaro, andrea.janes, giancarlo.succi}@unibz.it http://www.case.unibz.it
Agenda
- Lean Thinking
- A Jidoka Framework: Overview, Components
- Conclusions
Lean Thinking
- Initiated by Toyota when Ford was dominating the market using mass production.
- Initial idea: produce cars of different types on the same production line, with costs and quality similar to those obtained with mass production.
The starting point
(figure: two timelines)
Lean Thinking Strategy 1: Removing Muda
(figure: set-up, JIT delivery, stock over time)
Lean Thinking Strategy 2: Use Jidoka to ensure quality
- Machines that can detect a problem with the produced output and interrupt production automatically ("quality at the source")
- http://sese.hpu.edu.cn/ie/mirror/leanword.htm
Examples for Lean Thinking in Software Engineering
- Product: Muda (refactoring), Jidoka (automatic testing)
- Process: agility?
Jidoka Framework
(figure: Plan, Do, Check, Act cycle)
Part 1: Notifying stakeholders
Interrupting software production
- Alerts shown in the development environment
- Alerts shown in the tray area
- E-mail
- Deny check-in
- The level of importance determines which alert is used
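A minimal sketch, in Java, of how the level of importance could select the interruption channel. All type and method names here (AlertLevel, AlertDispatcher, and the stubbed notification methods) are illustrative assumptions, not part of the actual tool:

    // Hypothetical sketch: the importance level decides how production is interrupted.
    enum AlertLevel { NONE, IDE_WARNING, TRAY_NOTIFICATION, EMAIL, DENY_CHECKIN }

    class AlertDispatcher {
        void dispatch(AlertLevel level, String message) {
            switch (level) {
                case IDE_WARNING:       showInIde(message); break;    // marker inside the development environment
                case TRAY_NOTIFICATION: showInTray(message); break;   // balloon in the tray area
                case EMAIL:             sendEmail(message); break;    // notify stakeholders by e-mail
                case DENY_CHECKIN:      denyCheckIn(message); break;  // reject the check-in
                default: break;                                       // NONE: nothing to do
            }
        }

        // Stubs standing in for the concrete notification mechanisms.
        private void showInIde(String msg)   { System.out.println("[IDE] " + msg); }
        private void showInTray(String msg)  { System.out.println("[Tray] " + msg); }
        private void sendEmail(String msg)   { System.out.println("[E-mail] " + msg); }
        private void denyCheckIn(String msg) { System.out.println("[Check-in denied] " + msg); }
    }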
Part 2: Identifying problems
Stopping software production
- We need rules to decide whether a software artifact is ready to proceed to the next production step
- Rules define the data collection needs, use the collected measurements to evaluate whether a situation is critical or not, and determine which kind of alert to fire
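A minimal sketch, in Java, of what such a rule could look like, reusing the hypothetical AlertLevel from the dispatcher sketch above; the Rule interface and its method names are assumptions made for illustration:

    // Hypothetical sketch: a rule states which data it needs and maps measurements to an alert.
    interface Rule {
        // Defines the data collection needs of this rule.
        java.util.Set<String> requiredMeasurements();

        // Evaluates the collected measurements and decides which kind of alert to fire.
        AlertLevel evaluate(java.util.Map<String, Double> measurements);
    }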
Finding good rules: GQM
- Align rules to business goals
- Create rules that stakeholders perceive as useful
- Alignment, i.e., involving stakeholders in setting the goals before the actual implementation, has been found to improve organizational performance
Examples for rules
- Bad smells in code: shotgun surgery (on a development level or also across the team)
- Workflow conformance: verify that software is developed test-first; verify that software is released within a maximum time frame
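As an illustration, the "released within a maximum time frame" rule could be expressed against the hypothetical Rule interface sketched above; the measurement name and thresholds are invented for the example:

    // Hypothetical example: fire increasingly severe alerts the longer a release is overdue.
    class MaxReleaseTimeRule implements Rule {
        private final double maxDays;

        MaxReleaseTimeRule(double maxDays) { this.maxDays = maxDays; }

        public java.util.Set<String> requiredMeasurements() {
            return java.util.Set.of("daysSinceLastRelease");
        }

        public AlertLevel evaluate(java.util.Map<String, Double> m) {
            double days = m.getOrDefault("daysSinceLastRelease", 0.0);
            if (days <= maxDays)       return AlertLevel.NONE;
            if (days <= maxDays * 1.5) return AlertLevel.TRAY_NOTIFICATION;
            return AlertLevel.EMAIL;
        }
    }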
Part 3: Collecting data
Measurement
- Probes cover inputs, activities, and outputs
Inputs
- The main cost driver is typically effort: time spent editing artifacts like code or documentation
- We built probes that track the time spent within IDEs like Eclipse, Microsoft Visual Studio, Sun NetBeans, etc., and personal productivity suites like OpenOffice and Microsoft Office
- We also allow users to enter effort manually
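A simplified sketch, in Java, of what such an effort probe could record whenever the developer switches to another artifact inside the IDE; the class and method names are assumptions, and the real probes may work at a different granularity:

    // Hypothetical sketch of an effort probe: accumulate editing time per artifact.
    class EffortProbe {
        private final java.util.Map<String, Long> effortMillis = new java.util.HashMap<>();
        private String currentArtifact;
        private long sinceMillis;

        // Called by an IDE plug-in whenever another artifact gets the focus.
        void artifactFocused(String artifactName, long nowMillis) {
            if (currentArtifact != null) {
                effortMillis.merge(currentArtifact, nowMillis - sinceMillis, Long::sum);
            }
            currentArtifact = artifactName;
            sinceMillis = nowMillis;
        }

        // Effort collected so far, e.g. to be reported to the measurement server.
        java.util.Map<String, Long> collectedEffort() {
            return java.util.Collections.unmodifiableMap(effortMillis);
        }
    }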
Outputs
- Output consists of artifacts like code and documentation
- Properties can be observed to determine quality either dynamically or statically
- Dynamically: execute the software and report problems (e.g., with CruiseControl)
- Statically: verify source code quality using size, complexity, and object-oriented metrics (source code metric tools) and report problems
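As a toy illustration of a static check in Java: a probe could compute a simple size metric and flag source files above a threshold (the metric choice and threshold are assumptions; the actual tools compute richer complexity and object-oriented metrics):

    // Hypothetical sketch of a static output check: flag oversized source files.
    class SizeMetricCheck {
        private final long maxLines;

        SizeMetricCheck(long maxLines) { this.maxLines = maxLines; }

        // Returns true if the artifact violates the size rule and should raise an alert.
        boolean violates(java.nio.file.Path sourceFile) throws java.io.IOException {
            try (java.util.stream.Stream<String> lines = java.nio.file.Files.lines(sourceFile)) {
                return lines.count() > maxLines;
            }
        }
    }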
Activities
- Identifying activities is not straightforward; it depends on the specific problem
- Examples:
  - Testing: editing of files with a filename like "test*.java"
  - Test first: editing of a file with a filename like "TestX.java" AND subsequent editing of the file "X.java"
Activities
- We adopt a rule-based approach to decide whether an activity took place or not
- Rules can be updated "a posteriori" and all events can then be re-evaluated
- The output of this component is the sequence of identified activities, with the respective timestamps and users
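A rough sketch, in Java, of how the "test first" identification rule from the previous slide could be evaluated over the recorded edit events; the event model and the TestX.java naming convention are assumptions made for the example:

    // Hypothetical sketch: detect "test first" from a chronological list of edit events.
    class TestFirstDetector {

        record EditEvent(String fileName, long timestamp, String user) {}

        // Returns the production files whose test (TestX.java) was edited before X.java.
        java.util.Set<String> detect(java.util.List<EditEvent> eventsInChronologicalOrder) {
            java.util.Set<String> testEditedFor = new java.util.HashSet<>();
            java.util.Set<String> testFirst = new java.util.HashSet<>();
            for (EditEvent e : eventsInChronologicalOrder) {
                if (e.fileName().matches("Test\\w+\\.java")) {
                    testEditedFor.add(e.fileName().substring(4)); // "TestX.java" -> "X.java"
                } else if (testEditedFor.contains(e.fileName())) {
                    testFirst.add(e.fileName());                  // test was edited before the production file
                }
            }
            return testFirst;
        }
    }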
Jidoka Framework
Conclusions
- The idea to continuously monitor software artifacts and to alert the developer to possible mistakes or problems is not new
- What are the novelties?
Conclusions: Novelties
- We measure not only the artifacts produced, but also the resources used and the activities performed
- We use rules to build quality into the process, i.e., to enforce these rules without the need for a person to constantly check them
Conclusions: Novelties (continued)
- We see a dominance of the "pull" paradigm (a stakeholder uses a tool to analyze the software development process)
- We want to promote a "push" paradigm (the tool autonomously analyzes the software development process according to predefined rules and notifies stakeholders)
Limitations
- Probes should not slow down the machine
- Amount of data that is collected
- Not every analysis can be automated
- Finding purposeful rules is hard: it involves different stakeholders with different priorities
Further research
- Which types of knowledge can be built into the process in the Jidoka way?
- What are the best ways to interrupt software production?
- Validation: does Jidoka work in software production? What is its impact?
Thank you for your attention! andrea.janes@unibz.it