Measure for Measure: A Practical Quality Management Program
Robert Altizer and Don Downin
Motorola Semiconductor Products Sector, Tempe, AZ
Attempts to Manage With Metrics Often Fail
- Organizations collect and report standard metrics with no relationship to projects or business goals
- Data are sporadic or incomplete; reports are ignored by management
- Amounts to no more than feeding the corporate gorilla
Can We Do It Right?
What we have:
- Strong measurement culture
- Maturity Level 3 process capability
What we want:
- Statistical process management capability
- Management demand for quality metrics reports
- Team member understanding of metrics use
- Firm foundation for measuring performance and driving continuous product and process improvement
The Motorola SPS Quality Management Program
- Business-oriented quality management plans with proper scope and rational goals
- Quality attributes with significant business value and the commitment of a named owner
- Review performance against quality goals
- Use metrics to manage activities and make product and process change decisions
- Establish a stable and capable development and support environment
All Products, All Phases of the Product Life Cycle
- New system and product development
- Integration of purchased components
- Updates or enhancements to existing systems or products
- Manufacturing
- Sustaining engineering and support of released products
- Infrastructure support
- Process improvement activities
The Management Process
[Process diagram: three interacting activity groups connected by metrics and adjustment flows]
- Team Project Management Activities: create the project development plan; execute the project development plan
- QMP Team Activities: develop the quality management plan; search for common defects; conduct reviews; collect, report and analyze metrics; control the project; perform defect causal analysis; report quality status
- QMP Organization and Owner Activities: maintain baselines; collect, analyze and report organization metrics; review and evaluate quality management activities; conduct quality management process audits
- Flows between the groups: metrics, QMP adjustments, and product and process adjustments
Management Process (detail)
[Detailed process-flow diagram; numbered steps and principal data flows:]
1. Team Project Management Activities
  1.1 Create the Project Development Plan (inputs: business success factors, business goals)
  1.2 Execute the Project Development Plan
2. Team Quality Management Activities
  2.1 Develop the QM Plan (project-specific performance and improvement goals and activities, with identified metrics)
  2.2 Search for Common Defects (outputs: common defects, defect injection causes, defect prevention tasks)
  2.3/2.4 Conduct Product and Process Reviews (outputs: review data)
  2.5 Collect Product and Process Metrics (inputs: defect data, inspections database, various other process step outputs)
  2.6 Analyze Data; Control the Project (outputs: product/process adjustments, QM plan adjustments, trends and spikes)
  2.7 Perform Defect Causal Analysis (root cause analysis; outputs: product/process adjustments, lessons learned)
  2.8 Report Status (management status to area management)
3. Organization and Owner Activities
  3.1 Maintain Product and Process Capability Baselines and Control Limits (outputs: performance baselines, control limits, and baseline and control-limit changes)
  3.2 Collect and Analyze Organizational Metrics (inputs: project metrics; outputs: organizational metrics)
  3.3 Review Organizational QM Activities (inputs: project metrics, audit reports; outputs: organizational impact, QM plan adjustments)
  3.4 Conduct Process Audits (inputs: documented process, QM plan; outputs: audit reports)
Quality Management Program Objectives and Success Criteria
Objective: Provide engineering teams with the ability to set realistic product and process quality goals, based upon business needs.
- Satisfied when process capability baselines are established by all teams.
Objective: Improve the ability of teams to manage the quality of their deliverables throughout the life cycle.
- Satisfied when all teams are using the Management Program with goal-oriented metrics, all teams have had adequate training on the Management Program, and a support structure exists to capture, report, and analyze metrics and data.
Objective: Institutionalize quality management processes, standards, and procedures that have been approved by the EPG.
- Satisfied when assessments of the QMP program and its execution indicate this is true.
Objective: Systematically eliminate the introduction of defects into the development environment.
- Satisfied when defect reduction plans are in place and shown to be working, and the trend of in-process faults and post-release defects is toward zero.
Quality Management Policy
- The General Manager has ultimate responsibility for the organization's quality
- Every development, sustaining and service team shall use a quality management plan
- Hold periodic quality reviews of plans, goals, performance, actions, defect prevention, and continuous improvement
- Track and report quality review process effectiveness
The Quality Management Plan
- Defined at the project/product, department and organization level
- Contains a set of scope-appropriate quality attribute management plans
- Essential requirements for quality attributes:
  - Must support an identified product or process business goal
  - Must have an identified owner with responsibility and authority to allocate resources and direct actions toward achieving the goal
Quality Management Plans
[Diagram: plans at three levels, each a set of attribute management plans plus a Defect Prevention & Strategic Action Plan]
- Organization level: Attribute Management Plans #1, #2, #6; Defect Prevention & Strategic Action Plan
- Department level: Attribute Management Plans #1, #2, #3, #5; Defect Prevention & Strategic Action Plan
- Project level: Attribute Management Plans #1, #2, #3, #4; Defect Prevention & Strategic Action Plan
Building a Quality Attribute Management Plan
- Propose several quality attributes and make the business case for managing them
- Business success factors make the quality attribute relevant
- Business goals define a performance level for each of the defined business success factors
- The quality attribute owner is its champion: typically the project leader, department manager or organization general manager
Quality Attribute Identification
Attribute Title: Post-Release Software Defects: the internal view of delivered defects
Business Success Factors:
- Establishment of Six-Sigma (or better) quality in delivered software products enhances customer acceptance
- Reducing the time spent in rework allows more time for value-added development
Business Goal: Minimize the customer's probability of encountering defects in released software
Scope: Organization level
Owner: General Manager
Use Operational Questions to Characterize Goals (GQM)
Example questions in support of the goal "minimize the customer's discovery of defects in released product":
- For what products are released defects measured?
- What's the baseline defect rate for each product?
- What's the current defect density in each product?
- What's the distribution of defect types?
- Where were defects injected?
- How quickly are defects fixed?
- How many are open right now?
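The goal-question-metric refinement above can be captured as a small data structure, which makes the traceability from goal to metric explicit. A minimal sketch; the mapping and metric names other than DTRC and DMRC are illustrative assumptions, not from the original plan:

```python
# Minimal GQM (Goal-Question-Metric) sketch: a goal is refined into
# operational questions, and each question is answered by named metrics.
# Question-to-metric assignments here are illustrative assumptions.
gqm = {
    "goal": "Minimize the customer's discovery of defects in released product",
    "questions": {
        "What's the baseline defect rate for each product?": ["baseline_fault_density"],
        "What's the current defect density in each product?": ["DTRC", "DMRC"],
        "How quickly are defects fixed?": ["mean_time_to_fix"],
    },
}

def metrics_for_goal(model):
    """Collect the distinct metrics needed to answer every question for a goal."""
    return sorted({m for metrics in model["questions"].values() for m in metrics})

print(metrics_for_goal(gqm))
# -> ['DMRC', 'DTRC', 'baseline_fault_density', 'mean_time_to_fix']
```

Structuring the plan this way also makes the "every metric must have an owner and a business connection" rule checkable: any metric that cannot be reached from a goal has no business justification.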
Define Metrics
- Derive metrics from operational questions
- Define metric details:
  - Algorithm and data
  - Where reviewed
  - Data source
  - Responsibility for compiling, computing and publishing
  - Management review questions
  - Sample chart or graph
[Chart 1: Post-Release Defects, log scale, 1995-1998. DTRC = Defects/Total KAELOC; DMRC = Defects/Delta KAELOC]
Metrics Data Definition
M# | Data Element | Data Source | Frequency of Collection | Responsible
1, 2, 3 | Baseline fault density for total and modified released code | Organization process database | Once; update as needed | Department EPG reps; Chief SW Engr.; SQA Manager
1, 2, 3 | Sizes of base, new, and total released software for each product or project measured | Project configuration management systems | Quarterly | Department EPG reps; SQA Manager
1, 2, 3 | Number of faults reported in new and delta software released | Project defect tracking systems | Quarterly | Department EPG reps; SQA Manager
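The two density metrics can be computed directly from the collected data elements. A sketch, assuming KAELOC means thousands of assembly-equivalent lines of code, so that defects per million lines (dpm) equals defects/KAELOC × 1000 (consistent with DTRC = 4 dpm being written as 0.004 defects/KAELOC elsewhere in the plan); the sizes and fault counts below are illustrative, not Motorola data:

```python
def density_dpm(defects, kaeloc):
    """Defect density in defects per million assembly-equivalent lines (dpm).

    defects: number of faults reported against the code body
    kaeloc:  size of that code body in thousands of assembly-equivalent lines
    """
    if kaeloc <= 0:
        raise ValueError("code size must be positive")
    return defects / kaeloc * 1000.0

# DTRC is measured against all released code; DMRC against only the
# new/changed (delta) code in the release. Values below are illustrative.
total_kaeloc = 2500.0
delta_kaeloc = 400.0
defects_total = 10   # faults reported in all released software
defects_delta = 8    # faults reported in new/delta software

dtrc = density_dpm(defects_total, total_kaeloc)
dmrc = density_dpm(defects_delta, delta_kaeloc)
print(dtrc, dmrc)  # -> 4.0 20.0
```

Note that the same fault can count against both metrics: DMRC isolates the quality of the current release's new work, while DTRC reflects the customer's view of the whole supported code base.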
Every Metric Must Have
- A connection to a business success indicator
- An owner who will use it to manage the business at the specified scope
Two-way responsibility:
- The owner needs it to run the business
- The metric producer knows it will be used
Performance Baselines, Goals and Action Triggers
For each metric, we define:
- The baseline (nominal expected) performance
- The performance goal to be achieved within the scope of the plan
- The control limits (if known)
- The preventative action trigger
- The corrective action trigger
Goals can be specific target values, trends, tightening of control limits, etc.
Baselines, Control Limits, Goals and Triggers
M# 1: Baseline and control limits: DTRC = 4 dpm (0.004 defects/KAELOC). Goal: DTRC = 1 dpm by 1Q1998. Preventative action trigger: DTRC greater than previous report. Corrective action trigger: DTRC > 8 dpm for two consecutive reports.
M# 2: Baseline and control limits: DMRC = 20 dpm (0.020 defects/KAELOC). Goal: DMRC = 8 dpm by 1Q1998. Preventative action trigger: DMRC greater than previous report. Corrective action trigger: DMRC > 30 dpm for two consecutive reports.
M# 3: Baseline: 8X (60% per year). Goal: 10X every two years (68% per year). Preventative action trigger: DMRC X-Factor below goal rate. Corrective action trigger: DMRC X-Factor below goal rate for more than two consecutive reports.
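The two trigger types above are simple to mechanize: the preventative trigger fires on any degradation from the previous report, and the corrective trigger fires when the metric has exceeded its limit for two consecutive reports. A sketch under those readings of the table (function names are mine; the 30 dpm limit is the DMRC corrective threshold from the table, and the report series is illustrative):

```python
def preventative_trigger(series):
    """Fires when the latest report is worse (higher) than the previous one."""
    return len(series) >= 2 and series[-1] > series[-2]

def corrective_trigger(series, limit, runs=2):
    """Fires when the metric exceeds `limit` for `runs` consecutive reports
    (two consecutive reports, for DTRC and DMRC)."""
    return len(series) >= runs and all(x > limit for x in series[-runs:])

def annual_improvement_rate(x_factor, years):
    """Annual fractional reduction implied by an X-Factor improvement over
    `years` (e.g. 10X in two years is about a 68% reduction per year)."""
    return 1.0 - (1.0 / x_factor) ** (1.0 / years)

# Quarterly DMRC reports in dpm (illustrative values, not Motorola data)
dmrc_reports = [20.0, 18.0, 31.0, 33.0]
print(preventative_trigger(dmrc_reports))        # rose since last report: True
print(corrective_trigger(dmrc_reports, 30.0))    # two reports above 30 dpm: True
print(round(annual_improvement_rate(10, 2), 2))  # 10X in 2 years -> 0.68
```

The `annual_improvement_rate` conversion reproduces the table's "(68% per year)" figure for the 10X-every-two-years goal, since a 10X reduction over two years compounds to a factor of about 3.16 per year.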
Management Review Plan
The Plan includes a description of how its performance will be reviewed:
- Type and frequency of review meetings
- Metrics and goals to be reviewed
- Essential attendees
Management review questions included in each Plan probe beneath the reported metrics for greater understanding.
Management Review Questions
Q# | M# | Post-Release Defects Questions
1 | 1, 2 | How much software did we release in the reporting period? How much was new?
2 | 1, 2 | How much total software is in release and currently supported?
3 | 1, 2 | What types and severities of defects were reported in post-release software? Do we have plans to control them?
4 | 1, 2 | What software had the most problems? Why?
5 | 3 | What are the root causes? What are we doing about them?
6 | 1, 2 | How many of the released defects were found by our users?
7 | 3 | How many problems did we fix last quarter? How long does a fix take?
8 | 3 | How many remain unfixed? Why?
Other Plan Components
- Metric References: detailed reference material, including all data definitions, algorithms, formulas, etc.
- Defect Prevention Plan: activities that will result in a lower defect injection rate (not relevant to all quality attributes)
- Strategic Improvement Plan: long-term activities to effect improvement in several quality attributes (may not be relevant at project-level scope)
Quality Reviews
- The performance of a project, department or organization, shown by its quality metrics, is reviewed by management
- Conducted as a regular part of our monthly organization operations review
- Roles defined for: manager, team member, attribute owner, SQA manager
Review Process
- A quorum is required (manager, owner, team members, quality representative)
- Projects present:
  - Current quality metrics charts with appropriate backup data and context descriptions
  - Content and status (triggers) of preventative and corrective action plans
  - Defect prevention and strategic improvement plans
- Managers are responsible for the review questions
An Organization Level Metrics Report
[Five quarterly trend charts (1995-1998) plus a process-maturity status panel:]
- Chart 1: Post-Release Defects (log scale; DTRC = Defects/Total KAELOC, DMRC = Defects/Delta KAELOC)
- Chart 2: Customer View of Defects (CRUD = Customer Reported Unique Defects; X-Factor rate of improvement)
- Chart 3: Cost of Poor Quality (dollar costs of poor quality ($K); COPQ as % of total SW development cost)
- Chart 4: In-Process Faults (IPF = In-Process Faults/Delta KAELOC; IPF measurement coverage)
- Chart 5: Cycle Time Rate of Improvement (X-Factor)
- SEI / UA KPA Goal Satisfaction Status (per-KPA status: satisfied, unsatisfied, start, target)
- Participation: 58% in each quarter from 3Q96 through 2Q97
Quality Management Program Assessment
- Key issues for success: commitment, resources, training
- Self-evaluations of the program look at:
  - Definition and approval of policy and procedures
  - Metrics and review support environment
  - Training of participants
  - Execution: reviews are held, and action plans are developed and worked
Rating criteria:
1 - No work done.
2 - Some work done, but still significant work to do.
3 - Almost all work done; small additional tasks remain.
4 - All work accomplished.
Conclusions
Our Quality Management Program:
- Aligns quality management with business success
- Keeps activities within a tractable scope
- Ensures relevance through ownership and regular management quality reviews
- Uses a simple, tailorable, statistically based quality plan template
- Provides a firm foundation for measuring performance and driving continuous product and process improvement in any engineering discipline
- Provides a solid way to satisfy SEI Level 4
Contact Information
Bob Altizer
Chief Software Engineer, Motorola SPS
2100 East Elliot Road, M/S EL303
Tempe, AZ 85284-1801 USA
602-413-4158 / r10908@email.sps.mot.com

Don Downin
Manager, Software Process and Quality, Motorola SPS
2100 East Elliot Road, M/S EL503
Tempe, AZ 85284-1801 USA
602-413-5616 / rd3000@email.sps.mot.com