DEPARTMENT OF DEFENSE

Technology Readiness Assessment (TRA) Guidance

April 2011

Prepared by the Assistant Secretary of Defense for Research and Engineering (ASD(R&E))

Revision posted 13 May 2011
Contents

1. Summary
2. Initiating and Conducting a TRA
   2.1 Key Players
   2.2 Roles and Responsibilities
   2.3 Process for Conducting a TRA
       2.3.1 Establish a TRA Plan and Schedule
       2.3.2 Form a SME Team
       2.3.3 Identify Technologies To Be Assessed
       2.3.4 Collect Evidence of Maturity
       2.3.5 Assess Technology Maturity (SME Team Assessment)
       2.3.6 Prepare, Coordinate, and Submit the TRA Report
       2.3.7 ASD(R&E) Review and Evaluation
   2.4 Submitting a TRA
       2.4.1 Skeletal Template for a TRA
       2.4.2 Annotated Template for a TRA (Recommended or Nominal Length of Section)
   2.5 TRL Definitions, Descriptions, and Supporting Information
List of Acronyms
Section 1. Summary

A Technology Readiness Assessment (TRA) is a systematic, metrics-based process that assesses the maturity of, and the risk associated with, critical technologies to be used in Major Defense Acquisition Programs (MDAPs). It is conducted by the Program Manager (PM) with the assistance of an independent team of subject matter experts (SMEs). It is provided to the Assistant Secretary of Defense for Research and Engineering (ASD(R&E)) and forms part of the basis upon which the ASD(R&E) advises the Milestone Decision Authority (MDA), at Milestone (MS) B or at other events designated by the MDA, to assist in determining whether the technologies of the program have acceptable levels of risk, based in part on the degree to which they have been demonstrated (including demonstration in a relevant environment), and to support the risk-mitigation plans prepared by the PM. The plan for conducting a TRA is provided to the ASD(R&E) by the PM upon approval by the Component Acquisition Executive (CAE).

A TRA is required by Department of Defense Instruction (DoDI) 5000.02 for MDAPs at MS B (or at a subsequent Milestone if there is no MS B). It is also conducted whenever otherwise required by the MDA. It is required for space systems by the Under Secretary of Defense for Acquisition, Technology, and Logistics (USD(AT&L)) memorandum "Transition of the Defense Space Acquisition Board (DSAB) Into the Defense Acquisition Board," dated March 23, 2009. The TRA final report for MDAPs must be submitted to ASD(R&E) for review to support the requirement that ASD(R&E) provide an independent assessment to the MDA.

A TRA focuses on the program's critical technologies (i.e., those that may pose major technological risk during development, particularly during the Engineering and Manufacturing Development (EMD) phase of acquisition). Technology Readiness Levels (TRLs) can serve as a helpful knowledge-based standard and shorthand for evaluating technology maturity, but they must be supplemented with expert professional judgment.

To reduce the risk associated with entering EMD, DoDI 5000.02 requires Requests for Proposals (RFPs) to incorporate language that prevents the award of an EMD contract if it includes technologies that have not been demonstrated adequately. Adequate demonstration in a relevant environment (TRL 6) is one benchmark that is evaluated, but it is not the only consideration, nor is it necessarily dispositive. As such, a generic TRA not based on the planned specific technical solution is insufficient. Since the TRA must be based on the technologies of the program that entail some element of risk, TRAs may have to be performed on all the competitors' proposals in a source selection.

In accordance with the USD(AT&L) memorandum "Better Buying Power: Guidance for Obtaining Greater Efficiency and Productivity in Defense Spending," dated September 14, 2010, the TRAs described in this Guidance replace the former TRAs described in the Technology Readiness Assessment (TRA) Deskbook, dated July 2009. TRAs that must be submitted to ASD(R&E) are required only for MDAPs that require certification under 10 U.S.C. 2366b or other provisions of law, or when otherwise directed by the MDA. Generally, TRAs are not required for MDAPs at MS C. Independent of the elimination of the formal requirement to conduct a TRA for Major Automated Information System (MAIS), MS C, and Acquisition Category (ACAT) II-IV programs, all PMs and their chains of command retain complete responsibility for assessing, managing, and mitigating acquisition program technology risk. MDAs for non-ACAT I programs should consider requiring TRAs for those programs when technological risk is present.
Section 2. Initiating and Conducting a TRA

2.1 Key Players

Key players in the TRA process are as follows:

- The Milestone Decision Authority (MDA)/Defense Acquisition Executive (DAE),
- The Component Acquisition Executive (CAE)/Program Executive Officer (PEO) and Science and Technology (S&T) Executive,
- The Program Manager (PM),
- The Assistant Secretary of Defense for Research and Engineering (ASD(R&E)), and
- The team of independent subject matter experts (SMEs).

2.2 Roles and Responsibilities

Key player roles and responsibilities are as follows:

The MDA
- Determines whether to approve the Milestone decision or to defer until technology matures.
- Determines whether or not the technologies of the program can be certified under 10 U.S.C. 2366b, based on independent review and assessment by ASD(R&E), which review and assessment are informed, in part, by the program TRA.
- In the case of technologies not demonstrated in a relevant environment, determines whether the PM's proposed risk-mitigation plans are adequate and, in turn, determines whether to issue a waiver under 10 U.S.C. 2366b.
- With the user, determines if risk can be reduced to an acceptable level by relaxing program requirements.

The CAE/PEO and S&T Executive
- Approves the PM's TRA plan and assigns additional participants as desired.
- Reviews and approves the list of critical technologies that pose potential risk to program success and that are to be assessed in the TRA.
- Reviews and approves the TRA final report and cover memorandum and includes any additional material desired.
- Transmits the completed TRA to ASD(R&E).
- Raises any issues that cannot be resolved with the ASD(R&E) to the MDA.

The CAE may choose to make the Service S&T Executive a key participant in the TRA process. For example, the CAE may direct the S&T Executive to take responsibility for TRA management and execution. The CAE may assign the S&T Executive as a reviewer/signatory on MDAP Technology Development Strategies (TDSs) to support identification and management of critical technologies leading up to MS B.

The PM
- Assesses the technological risk in his/her program.
- Plans and ensures funding of the program's risk-reduction activities to ensure that technologies reach the appropriate maturity levels prior to being incorporated into the program baseline design. A key benchmark is that the technologies of the program be demonstrated in a relevant environment at MS B (or at a subsequent Milestone if there is no MS B for the program). If this benchmark is not achieved, a waiver by the MDA is possible, but this waiver must be based on acceptable means of risk mitigation, such as inclusion of an alternative, more mature technology as a funded option.
- Prepares a plan for conduct of the TRA. Receives approval of the plan through the CAE and PEO and provides the plan to ASD(R&E) upon approval by the CAE.
- Funds the TRA and ensures that it is properly carried out.
- Prepares a draft TRA schedule and incorporates the approved version in the program's Integrated Master Plan (IMP) and Integrated Master Schedule (IMS). A draft TRA should be completed prior to the MDA Defense Acquisition Board (DAB) Pre-MS B Program Review that precedes EMD RFP release and MS B. After Preliminary Design Review (PDR) and prior to MS B or another certification decision event, the TRA will be updated as needed based on the PDR and source-selection results to ensure that knowledge obtained at PDR and in the proposals is available to inform the ASD(R&E).
- In consultation with ASD(R&E) and with PEO and CAE approval, identifies the subject matter expertise needed to perform the TRA. Assigns members of the SME team and informs the CAE, PEO, ASD(R&E), and S&T Executive of the final membership.
- Familiarizes the SME team with the program, the performance and technical requirements, and the designs under consideration.
- Identifies possible critical technologies for consideration by the SME team.
- Provides evidence of technology demonstration in relevant environments to the SME team for assessment, including contractor data as needed.
- Provides proposed risk-mitigation plans to address remaining technological risk associated with critical technologies to the SME team, independent of levels of demonstration.
- Provides technical expertise to the SME team as needed.
- Prepares the TRA report, which will include findings, conclusions, and other pertinent material prepared by the SMEs.
- Prepares the TRA report cover memorandum, which may include additional technical information deemed appropriate to support or disagree with SME team findings.
- Sends the completed TRA, with SME team comments unaltered, through the PEO to the CAE for review and transmittal to ASD(R&E), together with any additional information the CAE chooses to provide.
- Determines whether a waiver to the 2366b certification requirement may be appropriate and, if so, requests PEO and CAE approval to request such a waiver.

The ASD(R&E)
- Reviews the TRA plan provided by the PM and provides comments regarding TRA execution strategy as appropriate.
- In conjunction with the PM and SME team, reviews the PM-provided list of critical technologies to assess and recommends additions or deletions.
- Based on the TRA final report, inter alia, provides the MDA an independent assessment and review concerning whether the technology in the program has been demonstrated in a relevant environment.
- If a 2366b waiver has been requested, provides a recommendation to the MDA, with supporting rationale, as to whether a waiver should be granted.
- Recommends technology maturity language for an Acquisition Decision Memorandum (ADM), noting, in particular, conditions under which new technology can be inserted into the program.

The SME Team
- Works closely with the PM throughout the TRA process.
- Reviews the performance, technical requirements, and designs being considered for inclusion in the program.
- In conjunction with the PM and ASD(R&E), reviews the PM-provided list of critical technologies to assess and recommends additions or deletions. The SME team should make recommendations to the PM (with associated rationale) on the candidate technologies that should be assessed in the TRA.
- Assesses whether adequate risk reduction to enter EMD (or other contemplated acquisition phase) has been achieved for all technologies under consideration, including, specifically, demonstration in a relevant environment. The assessment should be based on objective evidence gathered during events such as tests, demonstrations, pilots, or physics-based simulations. Based on the requirements, identified capabilities, system architecture, software architecture, concept of operations (CONOPS), and/or the concept of employment, the SME team will evaluate whether performance in relevant environments and technology maturity have been demonstrated by the objective evidence.
- If demonstration in a relevant environment has not been achieved, the SMEs will review the risk-mitigation steps intended by the PM and make a determination as to their sufficiency to reduce risk to an acceptable level.
- Uses TRLs as a knowledge-based standard or benchmark, but does not substitute them for professional judgment tailored to the specific circumstances of the program.
- Prepares the SME comments in the TRA report, including (1) the SME team credentials and (2) SME team findings, conclusions, and supporting evidence.

2.3 Process for Conducting a TRA

2.3.1 Establish a TRA Plan and Schedule

The TRA planning process begins when the PM establishes a plan for conducting the TRA, typically after MS A. After the TRA plan is approved by the PEO and CAE, it is provided to ASD(R&E) by the PM. The TRA plan should include a schedule that aligns with the Acquisition Strategy (AS), and it should be incorporated into the program's IMS. When a pre-MS B DAB Program Review is conducted prior to the release of the EMD RFP, a draft TRA will be reviewed and approved by the PEO and CAE and provided to the ASD(R&E) 30 days before the pre-MS B DAB Program Review. The TRA should be finalized after PDR and at least 30 days before MS B.

2.3.2 Form a SME Team

Once a TRA schedule has been established, a team of SMEs should be formed. Subject matter expertise and independence from the program are the two principal qualifications for SME team membership. Members should be experts who have demonstrated, current experience in the relevant fields. It is the PM's responsibility to guide SME team members on their role in the TRA process, as provided for in the TRA plan. The PM should include an overview of the system, an overview of the TRA process, criteria for identifying critical technologies, and examples and instructions for determining whether technologies have been demonstrated in a relevant environment. The PM should exploit planned demonstration events and tests to provide the data needed by the SME team. SME team members might be required to sign non-disclosure agreements and declare that they have no conflicts of interest.

2.3.3 Identify Technologies To Be Assessed

The fundamental purposes of the TRA are (1) to provide the PM with a comprehensive assessment of technical risk, and (2) to support the ASD(R&E)'s independent assessment of the risk associated with the technologies incorporated in the program, including whether the technologies of the program have been demonstrated in a relevant environment, so that the MDA is informed as to whether certification under 10 U.S.C. 2366b can be accomplished, whether a waiver is appropriate, and whether risk-mitigation plans are adequate. Thus, it is important to identify all appropriate technologies that bear on that determination. These technologies should be identified in the context of the program's systems engineering process, based on a comprehensive review of the most current system performance and technical requirements and design and the program's established technical work breakdown structure (WBS). Technology risk identification should start well before the formal TRA process. In fact, potential critical technology identification begins during the Materiel Solution Analysis (MSA) phase, which precedes MS A. An early evaluation of technology maturity, conducted shortly after MS A, may be helpful to refine further the potential critical technologies to be assessed. It may be appropriate to include high-leverage and/or high-impact manufacturing technologies and life-cycle-related technologies if there are questions of maturity and risk associated with those technologies.

The PM should prepare an initial list of potential technologies to be assessed. When competing designs exist, the PM should identify possible technologies separately for each design. The PM should make key technical people available to the SME team to clarify information about the program. The SME team should recommend changes to the list of critical technologies to assess to the PM. Inputs to this process include the list of technologies developed by the PM and specific technical planning performed by existing or previous contractors or government agencies. The SME team should be given full access to these data.

2.3.4 Collect Evidence of Maturity

Appropriate data and information are needed to assess whether the technologies of the program have been demonstrated in a relevant environment. The process of collecting and organizing the material for each technology should begin as early as possible. The PM should compile component or subsystem test descriptions, environments, and results in the context of the system's functional needs, as needed to conduct his/her own assessment of technology maturity and as needed by the SME team to complete its work. Any other analyses and information necessary to assess and rationalize the maturity of the technologies should also be included.

2.3.5 Assess Technology Maturity (SME Team Assessment)

The PM must make key data, test results, and technical people available to the SME team to clarify information about the program. The SME team should assess critical technologies to determine whether these technologies have been demonstrated in a relevant environment and whether risk has been reduced or can be reduced to an acceptable level for inclusion in an EMD program. Before the assessment process begins, the SME team must ensure a sufficient understanding of the requirements, identified capabilities, system and software architectures, CONOPS, and/or the concept of employment to define the relevant environments. The SME team must also ensure that its understanding of design details is sufficient to evaluate how the technologies will function and interface.

2.3.6 Prepare, Coordinate, and Submit the TRA Report

The CAE will submit a draft TRA report to ASD(R&E) 30 days prior to the Pre-MS B DAB Program Review. An update will be submitted after PDR and source selection and before the formal MS B or other certification decision event. Generally, the TRA report should consist of (1) a short description of the program; (2) a list of critical technologies that pose a potential risk to program execution success, with the PM's assessment of the maturity of those technologies as demonstrated in a relevant environment and a description of any risk-mitigation plans; (3) the SME team membership and credentials; (4) SME team findings, conclusions, supporting evidence, and major dissenting opinions; and (5) a cover letter signed by the CAE approving the report, forwarding any requests for waivers of the 2366b certification requirement with supporting rationale, and providing other technical information deemed pertinent by the CAE and PM. The CAE and PM can provide any supplemental material as desired.

The TRA report should present the evidence and rationale for the final assessment. Evidence could include records of tests or applications of the technology, technical papers, reports, presentations, and so forth. It should explain how the material was used or interpreted to make the assessment. The report should reference the sources, and the pages in those sources, for the evidence presented in the report to determine whether a technology has been demonstrated in a relevant environment. The material should explain the function of each technology at the component, subsystem, and system levels. The report should also contain an explicit description of the program increments or spirals covered, if appropriate and relevant to the Milestone decision.
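The 30-day lead times described in Sections 2.3.1 and 2.3.6 lend themselves to a simple schedule check. The sketch below is illustrative only and is not part of the Guidance; the function name and the example dates are assumptions introduced here for demonstration.

```python
# Minimal sketch: latest TRA submission dates implied by the 30-day lead times
# in Sections 2.3.1 and 2.3.6. Milestone dates below are hypothetical examples.
from datetime import date, timedelta

LEAD_TIME = timedelta(days=30)  # "at least 30 days before" per the Guidance

def tra_due_dates(pre_ms_b_dab_review: date, ms_b: date) -> dict:
    """Return the latest dates for providing the draft and final TRA to ASD(R&E)."""
    return {
        "draft_tra_to_asdre": pre_ms_b_dab_review - LEAD_TIME,  # before the DAB Program Review
        "final_tra_to_asdre": ms_b - LEAD_TIME,                 # after PDR, before MS B
    }

# Example with assumed milestone dates:
print(tra_due_dates(pre_ms_b_dab_review=date(2012, 6, 1), ms_b=date(2012, 9, 1)))
```

Dates computed this way belong in the program's IMS alongside the approved TRA schedule; they do not replace the coordination steps described above.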
2.3.7 ASD(R&E) Review and Evaluation

ASD(R&E) will evaluate the TRA in consultation with the CAE and the PM. ASD(R&E) will provide the MDA an independent assessment of technology maturity based on this process.

ASD(R&E) will prepare a memorandum that contains the evaluation results of the TRA. The memorandum will summarize ASD(R&E)'s determination as to whether the technologies of the program have been demonstrated in a relevant environment; if not, whether or not a waiver is acceptable; and a recommendation on the adequacy of risk-mitigation plans and the readiness of the program to proceed to the next stage of the acquisition process. The memorandum is sent to the MDA, with copies to the Overarching Integrated Product Team (OIPT), the CAE, and the PM.

2.4 Submitting a TRA

The TRA report should consist of (1) a short description of the program; (2) a list of critical technologies that pose a potential risk to program execution success, with the PM's assessment of the maturity of those technologies as demonstrated in a relevant environment and a description of any risk-mitigation plans; (3) the SME team membership and credentials; (4) SME team findings, conclusions, supporting evidence, and major dissenting opinions; and (5) a cover letter signed by the CAE approving the report, forwarding any requests for waivers of the 2366b certification requirement with supporting rationale, and providing other technical information deemed pertinent by the CAE and PM. The CAE and PM can provide any supplemental material as desired.

2.4.1 Skeletal Template for a TRA

The following outline is a skeletal template for TRA submissions:

1.0 Purpose of This Document
2.0 Executive Summary
3.0 Program Overview
    3.1 Program Objective
    3.2 Program Description
    3.3 System Description
4.0 Program Technology Risks Summary and Readiness Assessment
    4.1 Process Description
    4.2 Identification of Technologies Assessed
    4.3 PM's and SME Team's Assessments of Technology Risk and Technology Demonstration in a Relevant Environment
        4.3.1 First Technology
        4.3.2 Next Technology
5.0 Summary
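Read together with the outline above, the required report elements from Sections 2.3.6 and 2.4 can double as a completeness check before the CAE transmits the report. The sketch below is illustrative only and is not prescribed by the Guidance; the element wording is paraphrased and the function name is an assumption.

```python
# Minimal sketch: check a draft TRA report against the five required elements
# listed in Sections 2.3.6 and 2.4 (paraphrased; not official wording).
REQUIRED_ELEMENTS = [
    "Short description of the program",
    "Critical technologies, PM maturity assessment, and risk-mitigation plans",
    "SME team membership and credentials",
    "SME team findings, conclusions, supporting evidence, and dissenting opinions",
    "CAE-signed cover letter (including any 2366b waiver requests and rationale)",
]

def missing_elements(report_sections: set) -> list:
    """Return required elements not yet present in the draft report."""
    return [e for e in REQUIRED_ELEMENTS if e not in report_sections]

# Example: a draft still lacking SME findings and the cover letter.
draft = {REQUIRED_ELEMENTS[0], REQUIRED_ELEMENTS[1], REQUIRED_ELEMENTS[2]}
print(missing_elements(draft))
```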
2.4.2 Annotated Template for a TRA (Recommended or Nominal Length of Section)

The following outline is an annotated version of the TRA template.

1.0 Purpose of This Document (One Paragraph)
Provides a short introduction that includes the program name, the system name if different from the program name, and the Milestone or other decision point for which the TRA was performed. For example, "This document presents an independent TRA for the UH-60M helicopter program in support of the MS B decision. The TRA was performed at the direction of the UH-60M Program Manager."

2.0 Executive Summary (One Page)

3.0 Program Overview

3.1 Program Objective (One Paragraph)
States what the program is trying to achieve (e.g., new capability, improved capability, lower procurement cost, reduced maintenance or manning, and so forth). For MS B, refers to the Capability Development Document (CDD) that details the program objectives.

3.2 Program Description (One Page or Less)
Briefly describes the program or program approach, not the system. Does the program provide a new system or a modification to an existing operational system? Is it an evolutionary acquisition program? If so, what capabilities will be realized by increment? When is the Initial Operational Capability (IOC)? Does it have multiple competing prime contractors? Into what architecture does it fit? Does its success depend on the success of other acquisition programs? Also, explicitly identifies the program increments or spirals covered by the TRA, if relevant.

3.3 System Description (Nominally 5 Pages)
Describes the overall system, the major subsystems, and components to give an understanding of what is being developed and to show what is new, unique, or special about them. This information should include the systems, components, and technologies to be assessed. Describes how the system works (if this is not obvious).

4.0 Program Technology Risks Summary and Readiness Assessment

4.1 Process Description (Nominally 2 Pages)
Tells the composition of the SME team and what organizations or individuals were included. Identifies the special expertise of these participating organizations or individuals. This information should establish the subject matter expertise and the independence of the SME team. Members should be experts in relevant fields. Usually, the PM will provide most of the data and other information that form the basis of a TRA. Tells how technologies to be assessed were identified (i.e., the process and criteria used and who identified them). States what analyses and investigations were performed when making the assessment.

4.2 Identification of Technologies Assessed (as Needed)
Lists the technologies included in the TRA and why they were selected as critical. Describes the relevant environment in which each technology was assessed. Normally, this would be the operational environment in which the system is intended to perform; however, this can be adjusted if the technology's environment will be controlled while it operates in the system in question. Includes a table that lists the technology name and includes a few words that describe the technology, its function, and the environment in which it will operate. The names of these technologies should be used consistently throughout the document. Includes any technologies that the SME team considers critical and that have not been included in previously fielded systems that will operate in similar environments. Note that the technologies of interest here are not routine engineering or integration risk elements. They are items that require more than the normal engineering development that would occur in design for production, as opposed to technology maturation programs.

4.3 PM's and SME Team's Assessments of Technology Risk and Technology Demonstration in a Relevant Environment (as Needed)

4.3.1 First Technology
Describes the technology. Describes the function it performs and, if needed, how it relates to other parts of the system. Provides a synopsis of development history and status. If necessary, this synopsis can include facts about related uses of the same or similar technology, numbers of hours breadboards were tested, numbers of prototypes built and tested, relevance of the test conditions, and results achieved. Describes the environment in which the technology has been demonstrated. Provides a brief analysis of the similarities between the demonstrated environment and the intended operational environment. States whether the assessed technology has been demonstrated in a relevant environment or not. Provides data, including references to papers, presentations, data tables, and facts that support the assessments as needed. These references/tables/graphs can be included as an appendix. Provides a summary of planned risk-mitigation activities showing how those activities will reduce the risk of the technology to acceptable levels. Provides the SME team's concurrence or non-concurrence and the rationale therefor, and the SME team's assessment of the adequacy of proposed risk-mitigation plans.

4.3.2 Next Technology
For the other technologies assessed, this paragraph and the following paragraphs (e.g., 4.3.3, 4.3.4, and so forth) present the same type of information that was presented in paragraph 4.3.1.

5.0 Summary (One Page)
Includes a table that lists the technologies that were assessed, the degree of risk associated with each, recommended mitigation measures if any, and whether each was demonstrated in a relevant environment. Summarizes any technologies for which the PM and the SME team are in disagreement as to the degree of risk or whether the technology has been demonstrated in a relevant environment.
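A per-technology record that captures the fields called for in template Sections 4.2, 4.3, and 5.0 can help keep the report's technology names and assessments internally consistent. The following dataclass is a minimal, illustrative sketch; the field names are assumptions introduced here, not terms defined by the Guidance.

```python
# Minimal sketch: one record per critical technology, mirroring the information
# the template asks for in Sections 4.2, 4.3, and 5.0. Field names are assumed.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class CriticalTechnology:
    name: str                                 # used consistently throughout the report
    description: str                          # what the technology is and the function it performs
    relevant_environment: str                 # environment in which maturity is judged
    demonstrated_in_relevant_environment: bool
    risk_level: str                           # e.g., "low", "moderate", "high"
    mitigation_plan: Optional[str] = None     # expected if not demonstrated in a relevant environment
    sme_concurrence: Optional[bool] = None    # SME team concurrence with the PM's assessment
    evidence: list = field(default_factory=list)  # references to tests, papers, presentations, etc.
```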
2.5 TRL Definitions, Descriptions, and Supporting Information

TRL 1
Definition: Basic principles observed and reported.
Description: Lowest level of technology readiness. Scientific research begins to be translated into applied research and development (R&D). Examples might include paper studies of a technology's basic properties.
Supporting Information: Published research that identifies the principles that underlie this technology. References to who, where, when.

TRL 2
Definition: Technology concept and/or application formulated.
Description: Invention begins. Once basic principles are observed, practical applications can be invented. Applications are speculative, and there may be no proof or detailed analysis to support the assumptions. Examples are limited to analytic studies.
Supporting Information: Publications or other references that outline the application being considered and that provide analysis to support the concept.

TRL 3
Definition: Analytical and experimental critical function and/or characteristic proof of concept.
Description: Active R&D is initiated. This includes analytical studies and laboratory studies to physically validate the analytical predictions of separate elements of the technology. Examples include components that are not yet integrated or representative.
Supporting Information: Results of laboratory tests performed to measure parameters of interest and comparison to analytical predictions for critical subsystems. References to who, where, and when these tests and comparisons were performed.

TRL 4
Definition: Component and/or breadboard validation in a laboratory environment.
Description: Basic technological components are integrated to establish that they will work together. This is relatively low fidelity compared with the eventual system. Examples include integration of ad hoc hardware in the laboratory.
Supporting Information: System concepts that have been considered and results from testing laboratory-scale breadboard(s). References to who did this work and when. Provide an estimate of how breadboard hardware and test results differ from the expected system goals.

TRL 5
Definition: Component and/or breadboard validation in a relevant environment.
Description: Fidelity of breadboard technology increases significantly. The basic technological components are integrated with reasonably realistic supporting elements so they can be tested in a simulated environment. Examples include high-fidelity laboratory integration of components.
Supporting Information: Results from testing a laboratory breadboard system integrated with other supporting elements in a simulated operational environment. How does the relevant environment differ from the expected operational environment? How do the test results compare with expectations? What problems, if any, were encountered? Was the breadboard system refined to more nearly match the expected system goals?

TRL 6
Definition: System/subsystem model or prototype demonstration in a relevant environment.
Description: Representative model or prototype system, which is well beyond that of TRL 5, is tested in a relevant environment. Represents a major step up in a technology's demonstrated readiness. Examples include testing a prototype in a high-fidelity laboratory environment or in a simulated operational environment.
Supporting Information: Results from laboratory testing of a prototype system that is near the desired configuration in terms of performance, weight, and volume. How did the test environment differ from the operational environment? Who performed the tests? How did the test compare with expectations? What problems, if any, were encountered? What are/were the plans, options, or actions to resolve problems before moving to the next level?

TRL 7
Definition: System prototype demonstration in an operational environment.
Description: Prototype near or at planned operational system. Represents a major step up from TRL 6 by requiring demonstration of an actual system prototype in an operational environment (e.g., in an aircraft, in a vehicle, or in space).
Supporting Information: Results from testing a prototype system in an operational environment. Who performed the tests? How did the test compare with expectations? What problems, if any, were encountered? What are/were the plans, options, or actions to resolve problems before moving to the next level?

TRL 8
Definition: Actual system completed and qualified through test and demonstration.
Description: Technology has been proven to work in its final form and under expected conditions. In almost all cases, this TRL represents the end of true system development. Examples include developmental test and evaluation (DT&E) of the system in its intended weapon system to determine if it meets design specifications.
Supporting Information: Results of testing the system in its final configuration under the expected range of environmental conditions in which it will be expected to operate. Assessment of whether it will meet its operational requirements. What problems, if any, were encountered? What are/were the plans, options, or actions to resolve problems before finalizing the design?

TRL 9
Definition: Actual system proven through successful mission operations.
Description: Actual application of the technology in its final form and under mission conditions, such as those encountered in operational test and evaluation (OT&E). Examples include using the system under operational mission conditions.
Supporting Information: OT&E reports.
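For tooling that tracks assessments against this scale, the definitions above can be encoded as a simple lookup table. The sketch below is illustrative only; per Section 1, the TRL 6 benchmark informs, but does not replace, the SME team's professional judgment, and the helper name is an assumption.

```python
# Minimal sketch: the nine TRL definitions from Section 2.5 as a lookup table,
# with a helper reflecting the TRL 6 "relevant environment" benchmark for MS B.
TRL_DEFINITIONS = {
    1: "Basic principles observed and reported.",
    2: "Technology concept and/or application formulated.",
    3: "Analytical and experimental critical function and/or characteristic proof of concept.",
    4: "Component and/or breadboard validation in a laboratory environment.",
    5: "Component and/or breadboard validation in a relevant environment.",
    6: "System/subsystem model or prototype demonstration in a relevant environment.",
    7: "System prototype demonstration in an operational environment.",
    8: "Actual system completed and qualified through test and demonstration.",
    9: "Actual system proven through successful mission operations.",
}

def meets_ms_b_benchmark(trl: int) -> bool:
    """True if the assessed TRL meets the TRL 6 benchmark; not a substitute for SME judgment."""
    return trl >= 6

print(TRL_DEFINITIONS[6], meets_ms_b_benchmark(6))
```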
List of Acronyms

ACAT        Acquisition Category
AS          Acquisition Strategy
ASD(R&E)    Assistant Secretary of Defense for Research and Engineering
ADM         Acquisition Decision Memorandum
CAE         Component Acquisition Executive
CDD         Capability Development Document
CONOPS      Concept of Operations
DAE         Defense Acquisition Executive
DAB         Defense Acquisition Board
DoDI        Department of Defense Instruction
DSAB        Defense Space Acquisition Board
DT&E        Developmental Test and Evaluation
EMD         Engineering and Manufacturing Development
IMP         Integrated Master Plan
IMS         Integrated Master Schedule
IOC         Initial Operational Capability
MDA         Milestone Decision Authority
MDAP        Major Defense Acquisition Program
MAIS        Major Automated Information System
MS          Milestone
MSA         Materiel Solution Analysis
OIPT        Overarching Integrated Product Team
OT&E        Operational Test and Evaluation
PEO         Program Executive Officer
PDR         Preliminary Design Review
PM          Program Manager
R&D         Research and Development
RFP         Request for Proposal
S&T         Science and Technology
SME         Subject Matter Expert
TDS         Technology Development Strategy
TRA         Technology Readiness Assessment
TRL         Technology Readiness Level
U.S.C.      United States Code
USD(AT&L)   Under Secretary of Defense for Acquisition, Technology, and Logistics
WBS         Work Breakdown Structure
WSARA       Weapon Systems Acquisition Reform Act