Subject: Defense Software: Review of Defense Report on Software Development Best Practices



United States General Accounting Office
Washington, DC 20548

Accounting and Information Management Division

B-285626

June 15, 2000

The Honorable John Warner
Chairman
The Honorable Carl Levin
Ranking Minority Member
Committee on Armed Services
United States Senate

Subject: Defense Software: Review of Defense Report on Software Development Best Practices

On March 20, 2000, the Department of Defense provided your Committee with a report on its efforts to adopt management best practices for software development and acquisition. Defense was directed to provide this report by language in the Committee's report to accompany the National Defense Authorization Act for fiscal year 2000, Senate Report 106-50. The requirement was established because the Committee was concerned that DOD had not taken sufficient actions to address costly and long-standing software development and acquisition problems, which have been documented in many GAO, Inspector General, and department studies. Senate Report 106-50 also required GAO to review and comment on Defense's response.[1] This letter provides our comments.

Our objective in reviewing Defense's response was limited to evaluating the response to determine whether (1) it satisfied the Committee's directive and (2) the information included was accurate and complete. Because we did not evaluate the effectiveness of Defense's efforts to identify and adopt best practices in software development, our comments do not constitute an assessment of whether Defense's efforts are improving software development. However, we are conducting another review of Defense's management of its software process improvement efforts, and we will provide the Committee with a copy of our report based on that review. Our review of Defense's response was conducted in April and May 2000 in accordance with generally accepted government auditing standards.

[1] The Senate required that Defense deliver a report on this issue by February 1, 2000, and that GAO provide its comments on the report within 60 days.

RESULTS IN BRIEF

In responding to the Committee's directive, Defense was required to report on its efforts to identify and adopt best practices in software development and on its efforts to address six additional issues--ranging from the employment of risk management in software development, to the management of requirements changes, to the development of metrics to help assess performance and pinpoint potential problems, to tracking software development rework expenditures. Defense's response addressed the Committee's basic question and all six additional issues. However, the responses on some issues were incomplete and others contained inaccurate or outdated information. Specifically, of the seven total issues, Defense's response did not fully address two issues and was inaccurate or outdated on three. Table 1 summarizes our evaluation of Defense's response.

Table 1: Evaluation of Defense's Responses to Issues in Senate Report 106-50

Issue | Response Addresses Issue? | Response Accurate/Complete?
Defense efforts to identify and adopt best practices in software development | Yes | Yes
How risk management is used in a project or program's software development process | Yes | No
The process used to control and manage requirements changes during the software development process | Yes | Yes
The metrics required to serve as an early warning of evolving problems, measure the quality of the software product, and measure the effectiveness of the software development or acquisition process | Yes | Yes
Measures used to determine successful fielding of a software product | Partially | Yes
How Defense ensures that duplication of ongoing software development efforts is minimized; and how commercial software and previously developed software solutions are used to the maximum extent practicable | Partially | No
The portion of defense software expenditures used for rework | Yes | No

Defense's response also did not offer additional relevant information that could have improved the report. For example, Defense has begun some initiatives under its goal of building a coherent global network that focus on developing an effective Defense software management program. This information would have provided insight into current and future Defense efforts.

EVALUATION OF INDIVIDUAL SEGMENTS OF THE DEFENSE RESPONSE

The following sections provide our comments on the individual issues.

Defense efforts to identify and adopt best practices in software development

Defense's response addresses the Committee's basic question. In particular, Defense noted that it has established a policy to adopt best practices in software development by requiring software to be managed and engineered using best practices. Also, Defense is currently surveying the industry about best practices in software management, and, as a result of that survey and subsequent analysis, may implement new policy and guidance on software best practices. In addition, Defense noted that in the future it may increase emphasis on this area by promoting the use of commercial software or other proven software and by establishing a clearinghouse to store existing and proven software components for reuse. Each of these potential initiatives would facilitate the identification and adoption of best practices.

Although Defense's response addresses its policy to adopt best practices, implementation of this policy has yet to be formalized. For example, Defense has not provided guidance to software program managers on how to identify or adopt such practices. Also, even though some Defense units have information available on best practices, managers' use of data from such sources is not mandatory. In particular, although the Software Program Managers Network has developed a 16-point plan of critical software practices, Defense has no formal program implementing the plan. Instead, Defense encourages program managers to take advantage of the network's support.

How risk management is used in a project or program's software development process

Defense's response provides information on both the reporting of risks and the risk management process. The portion of the response dealing with reporting of risks is adequate. Both the Defense Acquisition Executive Summary and the Major Automated Information System reports have the potential to provide system overseers with information on program risk, as determined by the program manager. However, the portion of the response dealing with the risk management process is not accurate. Specifically, it reflects the use of risk management in the systems engineering and system design processes but not in software development. This is problematic because experience has shown that the software component of major acquisitions (versus hardware or firmware) is the source of most system risk, and the component most frequently associated with late deliveries, cost increases, and performance shortfalls.[2] Private industry and government organizations have widely recognized this risk by endorsing and accepting the models and methods that define and determine an organization's software process maturity, developed by Carnegie Mellon University's Software Engineering Institute (SEI).

[2] Air Traffic Control: Immature Software Acquisition Processes Increase FAA System Acquisition Risks (GAO/AIMD-97-47, March 21, 1997).

The process used to control and manage requirements changes during the software development process

Defense's response addresses the issue. In its comments, Defense notes that individual system acquisition management offices are responsible for change control during software development and that the program's software configuration control board is responsible for managing this process. Using such a board is a standard practice in software development. Defense's response did not, however, provide any details about this process or describe the interaction that should take place between acquisition managers and the board.

The metrics required to serve as an early warning of evolving problems, measure the quality of the software product, and measure the effectiveness of the software development or acquisition process

Defense's response addresses the issue. The response discusses current Defense-sponsored software metrics support programs, such as the Practical Software Measurement program and SEI's Software Engineering Measurement and Analysis team. Defense also notes that both the Army and Navy have implemented voluntary metrics programs that identify and collect appropriate metrics to monitor key risk areas.

Defense also has other efforts underway to increase the use of metrics in software-intensive major automated information system programs. For example, Defense is sponsoring independent software assessments of selected major programs, which will provide software development baselines with recommended risk management metrics and strategies for tracking and controlling software development. Also, Defense intends to collect data from completed major programs and use it both to help develop the initial cost and schedule estimates for new programs and to mitigate risk in managing acquisition programs. Defense officials told us these efforts will be used to develop a core set of software metrics to assist in the measurement and tracking of software process improvements. They also plan to implement automated metrics collection and the use of software analysis tools, depending on future funding and acceptance by Defense components.
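To make the preceding discussion of early-warning metrics more concrete, the sketch below shows the kind of simple indicator a core metrics set might track: requirements volatility and open-defect growth per reporting period. It is an illustration only; the metric names, thresholds, and data are hypothetical and are not drawn from Defense's metrics programs or the Practical Software Measurement program.

```python
# Hypothetical illustration of "early warning" software metrics: requirements
# volatility and open-defect growth per reporting period. All names,
# thresholds, and figures are invented examples.
from dataclasses import dataclass

@dataclass
class PeriodReport:
    period: str
    baseline_reqs: int   # requirements in the approved baseline
    changed_reqs: int    # requirements added, deleted, or modified this period
    defects_open: int    # open software problem reports at period end

def requirements_volatility(report: PeriodReport) -> float:
    """Fraction of the baseline that changed during the period."""
    return report.changed_reqs / report.baseline_reqs

def flag_early_warnings(history: list[PeriodReport],
                        volatility_threshold: float = 0.15,
                        defect_growth_threshold: float = 1.25) -> list[str]:
    """Return human-readable warnings when notional thresholds are exceeded."""
    warnings = []
    for prev, curr in zip(history, history[1:]):
        vol = requirements_volatility(curr)
        if vol > volatility_threshold:
            warnings.append(f"{curr.period}: requirements volatility {vol:.0%} "
                            f"exceeds {volatility_threshold:.0%}")
        if prev.defects_open and curr.defects_open / prev.defects_open > defect_growth_threshold:
            warnings.append(f"{curr.period}: open defect count grew more than "
                            f"{defect_growth_threshold - 1:.0%}")
    return warnings

history = [
    PeriodReport("2000-Q1", baseline_reqs=400, changed_reqs=30, defects_open=120),
    PeriodReport("2000-Q2", baseline_reqs=400, changed_reqs=90, defects_open=180),
]
for warning in flag_early_warnings(history):
    print(warning)
```

Program offices would tune the thresholds to the system's size and complexity; the value of such metrics lies in flagging adverse trends before they appear as cost or schedule breaches.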

Measures used to determine successful fielding of a software product

Defense's response partially addresses the issue but does not provide a complete answer. The response places responsibility for successful fielding of a software product on (1) the contractor developing the product and (2) the individual system maintenance process and the postdeployment software support process. However, Defense's response does not explain how the contracting unit measures the success of the contractor's effort or what requirements, if any, Defense has for maintenance or support. Defense's response also identifies several generic measures that could be used to evaluate success, such as maintenance costs or the number of software problems reported. However, Defense does not specify whether these measures are required to be developed and/or approved. It also does not discuss what parameters or thresholds might be attached to ensure the effectiveness of the measures (e.g., what maintenance costs are appropriate for systems given functionality, changes in requirements, complexity, and sophistication; what number of software problems is appropriate given similar factors). Finally, we found that Defense policy requires the use of a software measurement process to assess and improve the software development process and associated software products. While Defense's response does not discuss this policy, information on it seems to be more germane to the Committee's question as to how Defense determines successful fielding of a software product.

How Defense ensures that duplication of ongoing software development efforts is minimized; how commercial software and previously developed software solutions are used to the maximum extent practicable

Defense's response partially addresses the issue. Defense discusses two clearinghouse and analysis centers for software that are available to DOD program managers: the Data and Analysis Center for Software and the Defense Technical Information Center. However, there is no mention of any Defense policy or guidance relating to the use of these centers to reduce duplicative software development, or any indication that Defense even promotes their use. Defense's response also identifies a set of 14 software reuse information sources, which provide guidance and a number of plans and strategies on software reuse. However, none of them provides a source of existing and proven software components that would allow managers to act on this guidance. Defense noted in its opening remarks that it may establish a clearinghouse to store existing and proven software components that are ready for reuse. Should Defense establish this clearinghouse, Defense managers would have such a source. In addition, part of Defense's response is either inaccurate or outdated. It discusses two Defense entities that we were informed are no longer in operation--the Software Reuse Initiative Program Management Office and the Army Software Reuse Center.
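As background on the clearinghouse idea discussed above, the sketch below illustrates, in the simplest terms, how a registry of proven components could be queried before new development is funded. The registry interface, component names, and capability keywords are invented for illustration and do not represent any actual Defense system such as the Data and Analysis Center for Software.

```python
# Hypothetical sketch of a reuse clearinghouse lookup: before funding new
# development, a program office searches a registry of proven components.
# The catalog contents and API below are invented for illustration.

class ReuseClearinghouse:
    def __init__(self):
        # component name -> set of capability keywords it provides
        self._catalog: dict[str, set[str]] = {}

    def register(self, component: str, capabilities: set[str]) -> None:
        """Add a proven, reusable component to the catalog."""
        self._catalog[component] = capabilities

    def find_candidates(self, needed: set[str]) -> list[str]:
        """Return components covering every needed capability, smallest first."""
        matches = [name for name, caps in self._catalog.items() if needed <= caps]
        return sorted(matches, key=lambda name: len(self._catalog[name]))

registry = ReuseClearinghouse()
registry.register("msg-router-2.1", {"message routing", "audit logging"})
registry.register("pay-calc-lib", {"pay computation", "audit logging"})

# A new program needing message routing would find an existing component
# rather than duplicating the development effort.
print(registry.find_candidates({"message routing"}))   # ['msg-router-2.1']
```

The value of such a registry depends on policy: unless program managers are directed (or at least encouraged) to query it before starting development, it cannot reduce duplicative effort.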

The portion of defense software expenditures used for rework

Defense's response addresses the issue, but the information provided may not be fully supported by the available data. Defense states that it does not know the amount of money it spends on software maintenance annually or the cost segments that make up this total, such as costs for planned enhancements and costs for problem fixes. Defense also indicates that there is no practical way to separate the amount expended for planned enhancements from that expended for product fixes. However, an approved Department of Defense journal published a July 1997 article[3] that discussed a Defense unit's effort to track costs associated with product fixes and enhancements. The article discussed why software maintenance planning and management should be formalized and quantified, and it provided a methodology for tracking costs associated with both fixes and modifications for a large Defense system, including the total staff days expended.

[3] Measurements to Manage Software Maintenance, CrossTalk: The Journal of Defense Software Engineering, Vol. 10, No. 7, The Software Technology Support Center, Hill Air Force Base, Utah, www.stsc.hill.af.mil/crosstalk/1997/jul/maintenance.asp.

ADDITIONAL INFORMATION NOT INCLUDED IN DEFENSE'S RESPONSE

Defense officials separately provided us with additional information on ongoing projects related to software development. By including this information in its response to the Committee, Defense could have demonstrated that it is taking positive actions to ensure that software development is more effectively managed. For example, Defense has two projects underway that may affect how requirements are managed, but it did not provide any details about them in the report to the Committee. Under one project, Defense is developing a software product development report that will provide additional information on software requirements early in the contracting process, before development begins. Under the second project, the Defense Science Board is expected to make formal recommendations in midyear 2000 to strengthen the requirements collection process.

Defense has also begun some initiatives under its goal of building a coherent global network that include an objective of developing an effective Defense software management program. If these initiatives receive sufficient funding and support, Defense plans to issue and implement new software policies and guidance to improve software management. But several key actions under this goal were not included in Defense's response. For example, one ongoing activity focuses on developing and adopting new software concepts and best practices across Defense; another aims to renovate, revise, and/or augment existing standards; and still another seeks to develop a methodology to determine Defense-wide compliance with the SEI's Capability Maturity Model.
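Returning to the rework issue discussed above: the CrossTalk article cited there describes tracking maintenance effort separately for problem fixes and planned enhancements. The sketch below illustrates that kind of breakdown with invented staff-day figures; it is not Defense data and is not the article's actual methodology.

```python
# Hypothetical breakdown of annual software maintenance effort into problem
# fixes (rework) and other maintenance categories. All figures are invented.
maintenance_effort_staff_days = {
    "problem fixes": 1200,         # correcting defects in fielded software (rework)
    "planned enhancements": 2800,  # new or modified functionality
    "adaptive changes": 600,       # e.g., compiler, operating system, or interface upgrades
}

total = sum(maintenance_effort_staff_days.values())
rework_share = maintenance_effort_staff_days["problem fixes"] / total

print(f"Total maintenance effort: {total} staff-days")
print(f"Rework (problem fixes): {rework_share:.1%} of maintenance effort")
```

Even a coarse categorization of this kind, applied consistently, would allow the rework portion of software expenditures to be estimated rather than reported as unknowable.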

The official who produced Defense's response told us that he was unable to include information on these ongoing projects because of a lack of time. He added that, if he now had an opportunity to do so, he would include this additional information. In addition, Defense officials noted that the program overseeing some of these initiatives was not created until after Defense's response was prepared, although briefing charts on this program show start dates for these initiatives in late 1999.

AGENCY COMMENTS

In providing oral comments on a draft of this letter, Defense generally agreed with our findings. We have incorporated Defense's specific comments where appropriate.

-- -- -- --

We are sending copies of this report to Representatives Floyd Spence, Chairman, and Ike Skelton, Ranking Minority Member, Committee on Armed Services, House of Representatives, and to Arthur Money, Assistant Secretary of Defense for Command, Control, Communications, and Intelligence. Copies will also be made available to others upon request.

Please contact me at (202) 512-6240 if you have any questions concerning this letter. Carl Higginbotham and Tonia Brown were key contributors to this letter.

Jack L. Brock, Jr.
Director, Governmentwide and Defense Information Systems

(511973)