HOMELAND SECURITY STUDIES AND ANALYSIS INSTITUTE


The Homeland Security Act of 2002 (Section 305 of P.L. 107-296, as codified in 6 U.S.C. 185), herein referred to as the Act, authorizes the Secretary of the Department of Homeland Security (DHS), acting through the Under Secretary for Science and Technology, to establish one or more federally funded research and development centers (FFRDCs) to provide independent analysis of homeland security issues. Analytic Services Inc. operates the Homeland Security Studies and Analysis Institute (HSSAI) as an FFRDC for DHS under contract HSHQDC-09-D. HSSAI provides the government with the necessary expertise to conduct: cross-cutting mission analysis, strategic studies and assessments, development of models that baseline current capabilities, development of simulations and technical evaluations to evaluate mission trade-offs, creation and evolution of high-level operational and system concepts, development of top-level system and operational requirements and performance metrics, operational analysis across the homeland security enterprise, and analytic support for operational test and evaluation in tandem with the government's acquisition process. HSSAI also works with and supports other federal, state, local, tribal, public, and private sector organizations that make up the homeland security enterprise. HSSAI's research is undertaken by mutual consent with DHS and is organized by Tasks in the annual HSSAI Research Plan. This report presents the results of research and analysis conducted under the System Readiness Assessment task of HSSAI's Fiscal Year 2009 Research Plan. The purpose of the task is to (i) update and document a technology readiness level calculator; (ii) document a technology readiness assessment methodology; and (iii) explore how HSSAI can apply system readiness assessment approaches to support DHS in the future. This report presents the results of research and analysis conducted in fulfillment of (iii). The results presented in this report do not necessarily reflect official DHS opinion or policy.

HOMELAND SECURITY STUDIES AND ANALYSIS INSTITUTE

Dr. David McGarvey
Mr. Jim Olson
Dr. Scott Savitz, Task Lead
Dr. Gerald Diaz, Division Manager
Mr. George Thompson, Deputy Director

DEPARTMENT OF HOMELAND SECURITY SCIENCE AND TECHNOLOGY READINESS LEVEL CALCULATOR (VER 1.1)
Final Report and User's Manual
September 30, 2009

Prepared for the Department of Homeland Security Science and Technology Directorate

ACKNOWLEDGMENTS

This is a minor revision to a report published in 2008 by the Homeland Security Institute, the predecessor organization to HSSAI. We wish to acknowledge, with thanks, the very considerable contributions of Tavis Steenbeke, the co-leader of the task that led to that report, and of Megan McHugh and Laura Parker, who were co-authors of that report. We would like to extend our gratitude to former HSSAI member Ric Blacksten and to HSSAI member Melanie Cummings for their talent, support, and dedication to various aspects of this project. We would also like to thank William Nolte and James Bilbro for sharing their insight and information about Technology Readiness Levels and the AFRL TRL Calculator (ver 2.2). It was an invaluable education for us, and critical to the development of the DHS S&T RL Calculator (Ver 1.1). Finally, we would like to thank Randy Long, Doug Drabkowski, Angela Ervin, and Samuel Francis at the Department of Homeland Security, and Christopher Smith and his staff at the Transportation Security Laboratory, for working with us to develop user requirements for the modified calculator and providing feedback on the content of the DHS S&T RL Calculator.

For information about this publication or other HSSAI research, contact:
Homeland Security Studies and Analysis Institute
Analytic Services Incorporated
Quincy Street, Arlington, VA
Tel (703) Fax (703)

This publication updates and supersedes the corresponding HSI publication.

TABLE OF CONTENTS

Executive Summary
Introduction
Objective
Methodology and Scope
Background
  Technology Readiness Levels (TRLs)
  Modification of the TRL Scale
AFRL TRL Calculator (ver 2.2)
Overview of the DHS S&T RL Calculator (Ver 1.1)
Summary and Recommendations
Appendix A: User's Manual for the DHS S&T RL Calculator (Ver 1.1)
  Starting Up
    For Users with Excel 2003 and Compatible Versions of Excel
    For Users with Excel 2007
  Quick Start Instructions
  Detailed Operating Instructions
  DHS S&T RL Calculator Worksheets
    START HERE Worksheet
    RL Calculator Worksheet
    Summary Report Worksheets
Appendix B: TRL, MRL, and PRL Definitions and Descriptions Used in the DHS S&T RL Calculator (Ver 1.1)
Appendix C: Glossary of Terms for the DHS S&T RL Calculator (Ver 1.1) and Acronyms
  Acronyms
  Glossary
  References
Appendix D: Decision Algorithms, Change Log for the DHS S&T RL Calculator (Ver 1.1), and Explanation of Administrative Functions
  Decision Algorithms
  Log of Modifications
  Administrative Functions
Appendix E: Readiness Level Questions by Category

LIST OF TABLES

Table 1: NASA TRL Scale
Table 2: DHS S&T RL Calculator (Ver 1.1) High-Level TRL Definitions
Table 3: DHS S&T RL Calculator (Ver 1.1) High-Level MRL Definitions
Table 4: DHS S&T RL Calculator (Ver 1.1) High-Level PRL Definitions
Table 5: Technology Readiness Level Definitions and Descriptions
Table 6: Manufacturing Readiness Level Definitions and Descriptions
Table 7: Program Readiness Level Definitions and Descriptions*
Table 8: DHS S&T Technology Readiness Level Questions for the Modified Calculator
Table 9: DHS S&T Manufacturing Readiness Level Questions for the Modified Calculator
Table 10: DHS S&T Programmatic Readiness Level Questions for the Modified Calculator

LIST OF FIGURES

Figure 1: DHS S&T RL Calculator START HERE Worksheet
Figure 2: Color-Code Key for RL Scales in RL Calculator and Summary Report Worksheets
Figure 3: Green and Yellow Set Points
Figure 4: Example of TRL Questions for RL 1
Figure 5: Example of RL Layout
Figure 6: Example of Real-Time Color-Coded Readiness Levels

EXECUTIVE SUMMARY

Technology Readiness Assessments (TRAs) are important procedures for organizations managing resource allocation for technology development programs. A metric commonly employed in TRAs for approximating the degree of maturity of a technology is the Technology Readiness Level (TRL) scale, first developed by the National Aeronautics and Space Administration (NASA). This scale has been implemented and modified in government programs since the early 1990s. Ultimately, this work resulted in a calculator, the Air Force Research Laboratory (AFRL) TRL Calculator, that helps a user assess the TRL, Programmatic Readiness Level (PRL), and Manufacturing Readiness Level (MRL) of a given technology or system. This calculator is oriented in its terminology and structure to the Department of Defense (DoD) research, development, and acquisition process.

In 2008 the Homeland Security Institute (HSI), the predecessor to HSSAI, modified the existing AFRL TRL Calculator for use within the Department of Homeland Security Science and Technology Directorate. That new calculator, the Department of Homeland Security (DHS) Science and Technology (S&T) Readiness Level (RL) Calculator Version 1.0, allowed users to assess the RL of technology, as well as the RLs of manufacturing and programmatics, independently of one another. This document presents Version 1.1, an updated version of the calculator that is compatible with either Excel 2003 or Excel 2007. (Version 1.0 was developed using Excel 2003.) It revises and supersedes the previous Final Report and User's Manual.

Although we believe the calculator is a useful tool for S&T Program Managers, its full potential is yet to be realized. Based on our research and analysis to date, HSSAI recommends that DHS S&T take appropriate steps to:

- Develop entrance and exit criteria for each RL category.
- Develop guidelines for specific applications of the calculator.
- Continue refining concepts and terms specific to DHS S&T.
- Develop a DHS S&T-specific TRA methodology.
- Validate DHS S&T RL Calculator results.
- Apply lessons learned by other organizations to modification of RL methodologies.

As described in the body of the report, these efforts will entail a combination of research and analysis (e.g., identifying new questions and methods) and management actions (e.g., developing a management directive to govern the use of TRAs in S&T processes).


INTRODUCTION

Technology Readiness Assessments (TRAs) are established tools used to qualify technology development and help make investment decisions within those programs in order to deploy systems or elements of technology to an end user in a timely fashion. The National Aeronautics and Space Administration (NASA) and the Department of Defense (DoD) have taken the lead among government agencies in incorporating TRAs into their technology development programs and refining the TRA process specifically to their organizations in order to produce operational systems on schedule and within budget. TRAs are vital to the process of maturing technologies to the point where they can be operationally produced and deployed. They support that process by:

- Providing metrics for technology maturity. TRAs help guide evaluation and tracking of technology maturity levels and program milestones. [1, 2]
- Identifying risk associated with technologies and investment requirements. TRAs help inform decisions associated with allocating resources and funds for a given technology development.
- Identifying potential problems early in a development process, when solutions are less expensive and easier to execute. TRAs provide a systematic method for ensuring the success of a project by tracking completion of various steps as a project develops.
- Identifying gaps in testing, demonstration, and knowledge of a technology's current readiness level, and the information and steps necessary to reach the required technology readiness level. [2]

In the DoD, TRAs are a requirement of all acquisition programs. The DoD Technology Readiness Assessment Deskbook defines TRAs as "a systematic, metric-based process and accompanying report that assesses the maturity of certain technologies." [1] Although the Department of Homeland Security (DHS) Science and Technology Directorate (S&T) does not have acquisition programs per se, it does have the requirement to identify and develop technologies for use in homeland security.

A metric commonly employed in TRAs for approximating and summarizing the degree of maturity of a technology is the Technology Readiness Level (TRL) scale, first developed by NASA and since adapted and adopted by the DoD and other agencies. The TRL scale begins at 1 ("basic principles observed and reported") and goes through 9 ("actual system flight proven through successful mission operations"). While a TRA does much more than assign a TRL, the TRL provides a useful summary index. Technology Readiness Levels (TRLs) can facilitate a cost-effective, systematic process of transitioning technology from development to an operational environment by identifying a program's technical risk in areas such as design, architecture, cost, schedule, and manufacturing. As a result, TRLs can be a useful tool for DHS Program Managers (PMs) and other decision makers in assessing technology development programs.

[1] Deputy Under Secretary of Defense for Science and Technology (DUSD(S&T)). (May 2005). Technology Readiness Assessment (TRA) Deskbook.
[2] U.S. Department of Energy, Office of Environmental Management. (March 2008). Technology Readiness Assessment (TRA)/Technology Maturation Plan (TMP) Process Guide.

In 2001, William Nolte and coworkers at the Air Force Research Laboratory (AFRL) developed a TRL Calculator tool to aid in assigning and documenting TRLs in technology development programs. This calculator was used to help create a standard, repeatable method for determining TRLs. Initially released in January 2002, it is a Microsoft Excel workbook that presents a user with a series of questions to assess the maturity or readiness level (RL) of the technology. Because TRLs by themselves do not provide a full picture of the risk associated with a program/project or the difficulty required to advance a program/project, Nolte further refined his calculator by May 2004 to include Manufacturing Readiness Levels (MRLs) and Programmatic Readiness Levels (PRLs) (AFRL TRL Calculator (ver 2.2)).

Objective

In 2008, the Homeland Security Institute (HSI) adapted the AFRL TRL Calculator for use within the DHS S&T Directorate by PMs or others interested in tracking the progress of technology development. The objective was to improve the transition of technologies into and out of development within S&T, provide more uniformity in program management, improve the documentation of technology readiness decisions, and facilitate development of TRA methodologies, including entrance and exit criteria for a given maturity level throughout the development life cycle of a technology within a DHS program. Version 1.0 of the DHS S&T Readiness Level (RL) Calculator was released in December 2008. [3] It was developed using Excel 2003. In 2009, HSI's successor, HSSAI, revised the calculator. Version 1.1 is now compatible with Excel 2007. The revised User's Manual provides instructions for use with either version of Excel. In addition, the revised Final Report updates the Institute's recommendations for continued development and use of the calculator.

Methodology and Scope

The DHS S&T RL Calculator was originally developed as follows. Using open-source information, HSI analysts gathered information specific to TRL scales and historical modifications of those scales. They met with various DHS S&T stakeholders to facilitate development of end user requirements, and then attempted to clarify ambiguous RL questions, define all terms used in a DHS-specific orientation, and apply the lessons learned from historical modifications. As a result, the modified calculator agreed with DHS S&T terminology and documented management procedures. In addition, modifications included advanced user functionality, such as the ability to update the calculator's questions as new DHS lexicon and/or management processes are adopted and to generate RL-specific reports for DHS PM analysis. How RL categories impact one another, however, or the importance of various questions in each of the RLs, was not addressed in the modified calculator. Also outside the scope of this task of calculator development were RL entrance and exit criteria. Potential users were assumed to reside within the Chemical/Biological Division (CBD) of DHS S&T, although it was anticipated that the product would be useful in other divisions tasked with technology development.

[3] McGarvey, D., et al. (31 December 2008). Department of Homeland Security, Science and Technology Directorate, Readiness Level Calculator (HSI Publication). Version 1.0 of the calculator was developed under contract W81XWH-04-D011, Homeland Security Institute.
The Homeland Security Institute was established in 2004 as DHS's Federally Funded Research and Development Center (FFRDC) for homeland security studies and analyses. In March 2009, DHS established a successor organization, the Homeland Security Studies and Analysis Institute (HSSAI).

BACKGROUND

Technology Readiness Levels (TRLs)

TRLs were originally developed by NASA. In the early 1980s, NASA observed delays and cost overruns in its programs. Analysis by Werner Gruhl, NASA comptroller, concluded that immature technologies resulted in unpredictable development costs and schedules, and the resultant cost growth and schedule slip. [4] To address this cause and effect, NASA instituted TRLs, a methodical system that provides a consistent framework for assessing technology maturity and maintaining cost and schedule within a program. [5]

TRLs are relevant to both simple and complex technological systems, as well as to their component subsystems. They are applicable to software and/or hardware, or to systems encompassing both software and hardware elements. TRLs can also be used as exit criteria for program life cycle phases. In terms of program management, knowing a program's TRL can prescribe an action plan of activities that still need to be accomplished in order to facilitate transition of a particular technology to an operational end user. TRLs provide measures that can indicate a program's risk and potential for success in transitioning a technology to an end user; in other words, the value of investment. For example, a low TRL, e.g., TRL 1, represents high programmatic risk because the technology requires investment in more developmental milestones, while a high TRL, e.g., TRL 7, indicates the technology has matured further in development, achieved more milestones, and is associated with less programmatic risk.

By the 1990s, TRLs had been modified to address the nine readiness levels used across NASA today. The NASA TRL levels and definitions are included below in Table 1.

Table 1: NASA TRL scale [5]
TRL 1: Basic principles observed and reported
TRL 2: Technology concept and/or application formulated
TRL 3: Analytical and experimental critical function and/or characteristic proof-of-concept
TRL 4: Component and/or breadboard validation in laboratory environment
TRL 5: Component and/or breadboard validation in relevant environment
TRL 6: System/subsystem model or prototype demonstration in a relevant environment (ground or space)
TRL 7: System prototype demonstration in a space environment
TRL 8: Actual system completed and flight qualified through test and demonstration (ground or space)
TRL 9: Actual system flight proven through successful mission operations

In 1999, the DoD adopted the use of TRLs during its acquisition phase to aid in the decisions made during technology development. [6, 7]

[4] Bilbro, J. JB Consulting International - Technology Readiness Levels. Last accessed on 12/19/08.
[5] Mankins, J. (April 6, 1995). Technology Readiness Levels: A White Paper. Advanced Concepts Office, Office of Space Access and Technology, NASA.
[6] DoD (October 23, 2000). The Defense Acquisition System.

Modification of the TRL Scale

Because the TRL scale is applicable to many different technologies and can be interpreted or modified for specific types of technologies, several groups have adapted TRL definitions for their own use. The Army has developed a mapping of the TRLs to software, and the Army Medical Research and Materiel Command has defined corollaries for biomedical TRLs. [8] In addition, TRL scales have since been adopted internationally in Canada, the United Kingdom, and Japan.

The TRL scale, however, only provides a snapshot of the maturity of a technology or system at a given point in time. [9] In fact, in an ongoing development program, the technology will invariably be in constant flux. And depending on the type of technology, the TRL may decrease or increase as a result of availability of components, changes in funding, or mission directives. Other shortcomings of the TRL scale, pointed out by Jim Smith, [10] include blurring contributions to readiness (e.g., how programmatics influence TRLs), product criticality (e.g., an increased importance for developing a technology may push the necessity for skipping TRLs or assuming completion of steps to reach self-imposed or mandated deadlines), software aging (e.g., passé or critical software components of systems may require updating, impacting the overall system's TRL), and readiness in context (e.g., who is really looking at the RL and why?).

In general, readiness is a measure of the suitability of a technology or product for use within a larger system in a particular context. It is also a measure of the risks associated with developing or investing in a program associated with developing the technology or system. But TRLs do not provide a full assessment of the difficulty of integrating technology into an operational system, provide no guidance on the potential uncertainty in moving through the maturation of the technology, and offer no comparative analysis techniques for alternative TRLs. Alternative methodologies for assessing these RLs are being developed. Some combine the desirable aspects of TRLs with additional readiness attributes, such as PRLs and MRLs, to better assess program risk. These new methodologies include developing evaluation criteria, or milestones, tailored to the context of the RL assessment. In short, improvement of existing methodologies and development of alternative methodologies to assess RLs is an ongoing field of study.

In an effort to address readiness attributes not captured when using the NASA or DoD TRL scale, numerous types of readiness levels have been created since the inception of TRLs, including MRLs, PRLs, Integration Readiness Levels (IRLs), and System Readiness Levels (SRLs). Further, some of these new methodologies suggest assessing a combination of scales in order to generate a more accurate picture of a technology's maturity.

[7] Sauser, B., et al. (April 7-8, 2006). From TRL to SRL: The Concept of System Readiness Levels. Conference on Systems Engineering Research, Los Angeles, CA.
[8] Graettinger, C., et al. (August 2002). Using the Technology Readiness Levels Scale to Support Technology Management in the DoD's ATD/STO Environments: A Findings and Recommendations Report by the Software Engineering Institute Conducted for Army CECOM. Carnegie Mellon Software Engineering Institute, Pittsburgh, PA.
[9] Nolte, W. Technology Readiness Level Calculator (ver 2.2).
[10] Smith, J. (2005). An Alternative to Technology Readiness Levels for Non-Developmental Item (NDI) Software. Proceedings of the 38th Hawaii International Conference on System Sciences.

Integration Readiness Levels (IRLs)

The existing TRL scale does not accurately address integration of a component technology into a complete system. In other words, component technologies may advance at different speeds along the TRL scale; however, as a system, an adequate TRL is difficult to assess. Developers of IRLs contend that most developments of complex systems fail at these integration points. IRLs were developed for an accurate assessment of interface maturity between developing technologies. [7] At this time IRLs are not formally a part of the DHS acquisition process, and hence are not included in the DHS S&T RL Calculator.

System Readiness Levels (SRLs)

TRLs most accurately apply to individual technologies or system components. It becomes more complex to apply the existing TRL scale to a system. SRLs indicate the level of maturity applied at the system level. SRLs are determined by using the current concept of TRLs combined with IRLs. The SRL of a given system is a function of individual component TRL maturities and the links between them, as indicated by the IRL. [7] SRLs are useful when going from the individual technology to a system context, which may involve multiple technologies, as is the case with most technologies in the operational environment. At this time SRLs are not formally a part of the DHS acquisition process, and hence are not included in the DHS S&T RL Calculator.

Manufacturing Readiness Levels (MRLs)

In 2003, the Government Accountability Office (GAO) recommended establishing cost, schedule, and quality targets for product manufacturing early in technology development in order to obtain process maturity. [11] The report suggests that design and manufacturing knowledge should be obtained early in product development for a product to be successful. In response, the Joint Defense Manufacturing Technology Panel developed MRL definitions as well as Manufacturing Readiness Assessments (MRAs). [12] This MRL scale helps program managers assess manufacturing risks, which facilitates identification of areas that require additional management attention or investment. Manufacturing readiness and producibility are as important to the successful development of a system as are readiness and the capabilities for the system. [1]

Though MRLs were created from the manufacturing perspective to evaluate manufacturing readiness of a product and supplement existing TRLs, they, too, have limitations. A limitation of the MRLs is that the lower MRL levels can be difficult to correlate to corresponding TRL numbers due to technology immaturity (i.e., it is difficult to know what types of manufacturing steps are required when a technology concept hasn't yet been proven). As a result of these limitations, MRL levels 1 and 2 are not used in the DHS Readiness Level Calculator.

Programmatic Readiness Levels (PRLs)

PRLs were developed to address program management concerns, such as documentation of programmatic milestones seen as vital to successful technology product development. [13] A PRL scale was developed by HSI in 2008 to align with the TRLs. The PRL scale follows basic systems engineering steps and is discussed further in the following sections.

[11] GAO Report (May 2003). Defense Acquisitions: Assessments of Major Weapons Programs.
[12] Joint Defense Manufacturing Technology Panel Manufacturing Readiness Level Working Group. (February 2007). MRL Guide.
[13] Nolte, W. (October 20, 2003). Technology Readiness Level Calculator. NDIA Systems Engineering Conference.
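The SRL concept cited above (reference 7) composes component TRLs with the pairwise IRLs that link them. As a rough illustration only, the following Python sketch follows one published formulation attributed to Sauser et al.; the normalization to a 0-1 range, the convention that a component fully integrates with itself, and all names in the code are assumptions made for this example, and none of it is part of the DHS S&T RL Calculator, which does not implement SRLs or IRLs.

import numpy as np

def system_readiness(trl, irl):
    # Sketch of one published SRL formulation (Sauser et al., reference 7):
    # normalize component TRLs (1-9) and the pairwise IRL matrix (1-9,
    # diagonal = 9 for self-integration, an assumed convention) to [0, 1],
    # take each component's IRL-weighted average maturity, and average the
    # results for a composite SRL.  Illustrative only.
    trl = np.asarray(trl, dtype=float) / 9.0
    irl = np.asarray(irl, dtype=float) / 9.0
    n_links = (irl > 0).sum(axis=1)            # integrations counted per component
    component_srl = (irl @ trl) / n_links      # weighted average maturity per component
    return component_srl, float(component_srl.mean())

# Example: three components, the first two tightly integrated, the third loosely coupled.
trls = [6, 5, 3]
irls = [[9, 7, 0],
        [7, 9, 2],
        [0, 2, 9]]
per_component, composite = system_readiness(trls, irls)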


AFRL TRL CALCULATOR (VER 2.2)

As described in the Introduction, William Nolte of the AFRL developed a TRL Calculator in 2001 to create a standard, repeatable method for determining TRLs. This original TRL calculator (ver 1.0), released in January 2002, is a Microsoft Excel spreadsheet application that presents a user with a series of questions, by TRL, about the technology. The methodology for the TRL assessment used in the calculator was refined by May 2004 to include questions relating to TRLs, MRLs, and PRLs (TRL Calculator (ver 2.2)). The user has the option of assessing the overall TRL based on three combinations of questions: 1) only TRL questions, 2) TRL questions and PRL or MRL questions, or 3) all three categories of questions together. In any case, TRL questions are always required. [14]

The AFRL TRL Calculator (ver 2.2) calculates the overall TRL of the technology or system in question by averaging the responses to all selected categories in an RL, as sketched below. In other words, PRL and MRL questions are ultimately treated as additional TRL questions, if included. If the user cannot answer the questions associated with manufacturing, for example, the overall TRL of the technology or system for that RL will be low, even if the TRL questions alone can be adequately addressed. The user always has the option of looking at the results if only the technology readiness questions are considered, but not the option of looking at results for only manufacturing or programmatic readiness.

[14] Nolte, W. (2004). TRL Calculator Version 2.2 Release Notes.
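As a rough illustration of the pooling behavior just described, the Python sketch below averages per-question completion fractions across whichever categories are selected, so unanswered manufacturing or programmatic questions pull a level's combined score down even when the technology questions are fully answered. The data layout, names, and simple averaging are assumptions made for this example; they are not the AFRL workbook's actual structure or macro code.

def pooled_level_scores(levels, include=("TRL",)):
    # Average the answers for all selected categories within each level.
    # TRL questions are always required; MRL and PRL answers are pooled in
    # only if the user selects them.  `levels` maps level -> {category: [fractions]}.
    scores = {}
    for level, by_category in levels.items():
        answers = [a for cat in include for a in by_category.get(cat, [])]
        scores[level] = sum(answers) / len(answers) if answers else 0.0
    return scores

# Example: at level 4 the TRL questions are complete but the MRL questions are not,
# so including MRL lowers the pooled score for that level.
level4 = {4: {"TRL": [1.0, 1.0, 1.0], "MRL": [0.0, 0.0]}}
print(pooled_level_scores(level4))                   # {4: 1.0}  (TRL questions only)
print(pooled_level_scores(level4, ("TRL", "MRL")))   # {4: 0.6}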


OVERVIEW OF THE DHS S&T RL CALCULATOR (VER 1.1)

The DHS S&T RL Calculator Version 1.0 was developed by HSI in 2008 for the DHS S&T Chem/Bio Division PMs and others interested in assessing technology and associated programs. HSI facilitated developing end user requirements with potential DHS S&T users. Like its predecessor, Version 1.1 of the calculator provides the user options to assess Technological, Programmatic, and/or Manufacturing Readiness Levels (herein referred to as categories) for a given technology or system. The calculator allows a user to assess any or all of the RLs required for successful technology development and transition: TRLs, MRLs, and PRLs. The user has the ability to do these assessments separately or at the same time. Each of these assessments is independent of the others. In other words, it is essentially three calculators in one workbook. All of these categories have been defined along a scale of 1-9. (However, the lowest MRL at which a user can begin to address readiness is 3, since it is assumed that a technology or system at TRL 1 or 2 does not allow for a meaningful ability to address manufacturability.) In general, the levels can be grouped into three higher-level activities:

1. RLs 1-3: Research and Development (R&D): these activities most likely occur in a basic laboratory setting, prior to identification of a sponsoring organization.
2. RLs 4-6: Technology Demonstration: these activities occur as a result of funding provided by a sponsoring organization, such as DHS S&T.
3. RLs 7-9: Production and Deployment: these activities occur once the technology has been transferred from the sponsoring organization to the customer or end user.

Deviations from this grouping occur. For instance, in some cases systems are taken into development when some of the required technologies are only at TRL 2, and in some cases the sponsoring organization supports development through TRL 7 or higher before transition to the customer or end user. These basic groups begin to show how all three RL categories (technology, manufacturing, programmatic) are intimately linked, generally requiring information and interaction with one another. How RL categories impact one another, or the importance of various questions in each of the RLs, is not addressed in the DHS S&T RL Calculator. Nor does the calculator address the question of what TRL, MRL, or PRL levels are required for different entrance and exit criteria. This aspect is left for future modifications based on DHS S&T programmatic directives.

As with the AFRL TRL Calculator (ver 2.2), the DHS S&T RL Calculator (Ver 1.1) is an Excel workbook. [15] The user specifies which categories of RLs are to be calculated (TRL, MRL, or PRL separately, or any combination of these). In addition to a visual scale for each category that indicates the readiness levels achieved and not achieved, this calculator also has the capability to generate category-specific summary reports. Each summary report details each RL and the responses provided by the user. These reports are intended to facilitate discussions about the results, in addition to preparing a PM for next steps in order to progress to the next readiness level.

[15] Two variants are provided, one for use with Excel 2003 and compatible versions of Excel and one for use with Excel 2007.
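The three-phase grouping described earlier in this section, and the MRL floor of 3, can be restated as a small helper for readers who prefer code. This is an illustrative sketch only, not something contained in the workbook, and the function name is an assumption.

MRL_FLOOR = 3   # per the report, MRL 1 and 2 are not used in the calculator

def rl_phase(level):
    # Map a readiness level (1-9) to the high-level activity grouping
    # described above; the report notes that real programs deviate from it.
    if not 1 <= level <= 9:
        raise ValueError("readiness levels run from 1 to 9")
    if level <= 3:
        return "Research and Development"
    if level <= 6:
        return "Technology Demonstration"
    return "Production and Deployment"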

The following tables provide the top-level definition for each level in the RL categories (as modified by HSI) (Tables 2-4). Definitions of specific terms can be found in Appendix C: Glossary of Terms. Full explanations of the RLs are provided in Appendix B: TRL, MRL, and PRL Definitions and Descriptions Used in the DHS S&T RL Calculator (Ver 1.1).

Table 2: DHS S&T RL Calculator (Ver 1.1) high-level TRL definitions
TRL 1: Basic principles observed and reported.
TRL 2: Technology concept and/or application formulated.
TRL 3: Analytical and experimental critical function and/or characteristic proof-of-concept.
TRL 4: Component and/or breadboard validation in laboratory environment.
TRL 5: Component and/or breadboard validation in relevant environment.
TRL 6: System/subsystem model or prototype demonstration in a relevant environment.
TRL 7: System prototype demonstration in an operational environment.
TRL 8: Actual system completed and qualified through test and demonstration.
TRL 9: Actual system proven through successful mission operations.

Table 3: DHS S&T RL Calculator (Ver 1.1) high-level MRL definitions
MRL 3: Manufacturing concepts identified.
MRL 4: Laboratory manufacturing process demonstration.
MRL 5: Manufacturing process development.
MRL 6: Critical manufacturing processes prototyped.
MRL 7: Prototype manufacturing system.
MRL 8: Manufacturing process maturity demonstration.
MRL 9: Manufacturing processes proven.

Table 4: DHS S&T RL Calculator (Ver 1.1) high-level PRL definitions
PRL 1: Identification of basic scientific concepts and performers.
PRL 2: Establishment of program with identified customer and technology.
PRL 3: Program risk, requirements, and performance characteristics and measures are determined.
PRL 4: Integrated Product Teams and working groups for developing and transitioning technology are established.
PRL 5: Systems engineering methodology, system architecture, and end user involvement are established.
PRL 6: Formal requirements documents, final Test and Evaluation Master Plan, and Systems Engineering Plan are complete.
PRL 7: Finalized Verification, Validation, and Accreditation of system.
PRL 8: Training and Test and Evaluation documentation are complete.
PRL 9: Safety and training are complete.

Appendix A provides operating instructions for the calculator. While there are several worksheets within the workbook, only two require a user to physically enter information in order to generate TRL-, MRL-, and PRL-specific summary reports: the START HERE and RL Calculator worksheets.


SUMMARY AND RECOMMENDATIONS

HSSAI has modified the existing AFRL TRL Calculator for use within DHS S&T. This new calculator, the DHS S&T RL Calculator (Ver 1.1), allows users to assess the RLs of technology, manufacturing, and programmatics independently of one another. This calculator is a step towards developing a standardized TRA methodology within DHS for assessing and tracking progress in technology development programs. While this calculator should be useful as a guide for Program Managers (PMs) and for conducting program- and technology-related budget allocations, there is still work to be done. HSSAI makes the following recommendations based on its efforts associated with the research into TRAs, RLs, and the resultant DHS S&T RL Calculator (Ver 1.1).

1. Develop entrance and exit criteria for each RL category. Currently, the DHS S&T RL Calculator has generic types of milestones presented as questions that carry no more weight than a previous or following question. Further, a user can achieve any RL without achieving lower RLs, because there is no requirement that the level below, or specific steps, be achieved first. HSSAI recommends that certain questions be identified as mandatory for passing to the next RL. In this way, PMs would be assured that all projects are addressing the same criteria for a given level and, when a technology has been deemed RL 5, for example, would have an understanding of exactly what has been accomplished with that classification.

2. Develop guidelines for specific applications of the calculator. While this task focused on DHS S&T Chem/Bio Division PMs, the results of an assessment may vary by user. A laboratory PM, for example, may require different information than a sponsoring organization (e.g., manufacturing questions may not impact a laboratory's information collection in RLs 1-3, but may impact a sponsoring organization's willingness to assume responsibility for a technology development).

3. Continue refining concepts and terms specific to DHS S&T. Having a common understanding of the meaning of terms is important for ensuring minimal confusion among those assessing RLs and those making policy decisions based on them.

4. Develop a DHS S&T-specific TRA methodology. TRAs historically have been modified to apply to specific organizations. To facilitate understanding between DHS and other organizations (i.e., laboratories, manufacturers), a DHS TRA methodology should be vetted and made into a management directive. This would also allow a user to add questions specific to documented DHS programmatic milestones.

5. Validate DHS S&T RL Calculator results. Consider having an outside organization assess a given project's RL using the calculator and compare the results to the same assessment performed by DHS S&T PMs. Differences between the assessments would be valuable starting points for discussions and setting expectations.

6. Apply lessons learned by other organizations to modification of RL methodologies. TRLs have shortfalls. For instance, they do not tell us what efforts have preceded or what efforts are expected to follow the achievement of the TRL. In addition, they do not account for system integration issues. There is precedent, however, for modifying the RL methodology in a way that accounts for other factors that should be considered, in order to more accurately understand what is required to move from one RL to the next or to address integration issues.

For example, James Bilbro developed the Advancement Degree of Difficulty (AD2) as another assessment scale for technology maturity. AD2 is a description of what is required to move a system, subsystem, or component from one TRL to another, taking into account cost, schedule, risk, the people and tools available, and organizational aspects, such as the ability of an organization to reproduce existing technology.

APPENDIX A: USER'S MANUAL FOR THE DHS S&T RL CALCULATOR (VER 1.1)

Starting Up

There is no installation of the DHS RL Calculator. The calculator is a Microsoft (MS) Excel workbook template. Two variants are provided. One, DHS RL Calculator for Excel 2003.xls, is for use with Excel 2003 and some compatible versions of Excel. The other, DHS RL Calculator for Excel 2007.xlsm, is for use with Excel 2007.

For users with Excel 2003 and compatible versions of Excel

Use the version of the calculator titled DHS RL Calculator for Excel 2003.xls. Before attempting to use the calculator, make sure that Excel is set to Medium security so that the macros in the workbook will run. To do this, open Microsoft Excel. Select "Tools", then "Macros" and "Security." Pick the "Medium" radio button. Open the calculator workbook, DHS RL Calculator for Excel 2003.xls. A dialog box will pop up asking whether or not you wish to allow macros to run. The default selection is "Disable Macros." Change this to "Enable Macros" or the calculator won't work. The calculator will open to the START HERE worksheet.

For users with Excel 2007

Use the version of the calculator titled DHS RL Calculator for Excel 2007.xlsm. When using Excel 2007 it is necessary that macros be enabled each time the calculator is opened. To do this, click on the Options box next to the "Security Warning: Some active content has been disabled" warning that appears when the calculator is first opened. Choose "Enable this content" from the options that appear. The calculator will open to the START HERE worksheet.

Quick Start Instructions

1. Enter identification information (START HERE)
2. Select RL categories to be used (START HERE)
3. If desired, select the level to begin RL questions for each RL from the drop-down scales (START HERE)
4. Click "Continue to Calculator"
5. Save the workbook with a unique identifier
6. Set green and yellow set points (RL Calculator)
7. Begin answering questions until ready to generate summary report(s)
8. Save the workbook with a different name
9. Click "Generate Report"
10. Save the workbook with yet another name
11. Print summary reports

Detailed Operating Instructions

The DHS S&T RL Calculator (Ver 1.1) is an Excel workbook containing six worksheets visible to a user:

1. START HERE: This is the default opening worksheet. The user should enter identification information, select RL categories and, if desired, the level for each category at which to begin assessing specific RLs on the next worksheet.
2. RL Calculator: The user should set threshold values for the green and yellow set points and answer questions until the desired stopping point (see the RL Calculator worksheet explanation below). Also, the user can generate reports from this worksheet.
3. Three (3) summary report worksheets: An RL summary report worksheet will only appear if that RL category (TRL, MRL, or PRL) has been selected on the START HERE worksheet.
4. Glossary: Definitions for terms used throughout the calculator. This can be accessed from each worksheet or by directly clicking on the tab. See Appendix C: Glossary of Terms.

DHS S&T RL Calculator Worksheets

START HERE Worksheet

It is essential that the user begin by selecting the categories desired on the START HERE worksheet. Deviation from this will cause the calculator to function incorrectly. Figure 1 shows the START HERE worksheet. On the START HERE worksheet the user should enter the Project Name, Project Manager, and Date the RL is being calculated in the spaces provided at the top of the worksheet.

Figure 1: DHS S&T RL Calculator START HERE worksheet.

Glossary: This button opens the glossary of terms worksheet. These terms are used in the calculator and/or related to concepts addressed in the questions. The glossary may be consulted at any point in the operation of the calculator.

Select technology types to be included: The user selects whether the project to be evaluated consists of hardware only, software only, or a combination of hardware and software.

Select types of RLs to evaluate: The user selects which types of RLs are to be included in the evaluation: Technology, Manufacturing, and/or Programmatic. For each RL selected, the user has the option of choosing a specific level at which to begin the questions. In addition, full definitions for each scale are provided.

Assuming a given RL has already been achieved: The user also has the option of assuming that a given RL has been achieved for each of the RLs to be included in the evaluation. This allows the user to avoid answering the questions for that and lower RLs. Note: If the user decides to begin assessing RLs at a given level, all levels below the one selected will be assumed 100% completed for that category. The user will not be able to change this once "Continue to Calculator" is clicked.

"Continue to Calculator": Once the user is satisfied with the answers supplied on the START HERE worksheet, but not before, the user must select the "Continue to Calculator" button. This is an irreversible step that prompts the user to save the workbook with a suitable name in a suitable location and takes the user to the RL Calculator worksheet, which is the calculator proper. The newly named workbook can now be saved and opened at will while the questions on the RL Calculator worksheet are being answered. Note: Once the file is saved after "Continue to Calculator" is selected, the user cannot go back to the START HERE worksheet and change or enter new information to generate new reports without causing a malfunction. The only way to change information on the START HERE page at this point is to completely start over by closing the document and reopening the template.

RL Calculator Worksheet

Upper portion of the worksheet

Program Name, Program Manager, and Date will have been copied from the START HERE worksheet. There are two buttons on the top of the worksheet that will be of interest only if parts of the bottom portion of the worksheet have been completed.

1. The "Start Over on this Page" button unassumes all levels and unchecks all boxes, but does not show as omitted any RL selected on the START HERE page.
2. The "Create Report" button generates RL summary reports of the questions answered. See RL Summary Reports below.

RL scales appear for each of the RL categories selected on the START HERE worksheet. Figure 2 shows the key for those scales, based on the answers to the questions in the lower part of the worksheet (see the detailed explanation in the following section, Lower portion of the worksheet).

Red: This TRL has not been achieved.
Yellow: Many of the tasks required for this TRL are justifiably achieved.
Green: Most, if not all, tasks required for this TRL have been achieved.
Figure 2: Color-code key for RL scales in RL Calculator and Summary Report worksheets.

Colors are calculated based on two set points. Set points are percentages of questions required by the user to be answered in order to achieve a green or yellow grade.

Figure 3 shows the default values of 100% and 75%, respectively. If a set of questions has not been answered to a combined total percentage of at least the yellow set point, the level will be red.

Green Set Point: 100%   Yellow Set Point: 75%
Figure 3: Green and yellow set points. In this example, the user must answer 100% of the questions for a given level to achieve green. If the user answers between 99.9% and 75%, the level will be yellow. Any percentage below 75 will show red.

Lower portion of the worksheet

For each RL category selected on the START HERE worksheet, the calculator then poses a set of questions for each level on its related scale of 1-9. See Appendix E for the list of questions by category and level. Figure 4 provides an example of how the questions appear in the calculator.

Level 1 (check all that apply or use the slider for % complete):
- Do rough calculations support the concept?
- Are physical laws/assumptions for new technology defined?
- Does it appear the concept can be supported by software?
- Know what software needs to do in general terms?
- Do paper studies confirm basic scientific principles of new technology?
- Have mathematical formulations of concepts that might be realizable in software been developed?
- Have the basic principles of a possible algorithm been formulated?
Figure 4: Example of TRL questions for RL 1.

Questions are preceded by several data fields. Figure 5 shows these fields, and explanations are provided below.

Apply? (Y/N): There is a drop-down menu here for Y or N. The default setting is that all questions apply. The user can decide that certain questions do not apply to the project/technology being evaluated. Entering an N in this field will remove a question from the evaluation (the green Y turns to a red N). The question will still appear in the list, but any answer supplied will be ignored in the calculations. If N is selected for a question, it will also appear on the appropriate RL Summary Report worksheet in a field named "Questions marked Not Applicable."

H/W S/W Both: In this field, H indicates hardware, S indicates software, and B indicates both hardware and software. The user should not try to change these.

Ques Catgry: In the Question Category field, T indicates a technology question, M indicates manufacturing, and P indicates programmatic. The user should not try to change these.

% Complete: The user answers each question according to a percentage of 100 completed. For example, if the question is successfully answered as yes, the user will check the box that indicates 100%. If the answer is a partial yes, but not 100%, a percentage of 100 can also be inserted manually by changing the green 100 or by using the sliding bar associated with the question. The default for all questions is no, or 0%. Answers provided will appear in the Summary Report. The green cell displays 100 by default and is technically counted as 0%, or not completed, unless the checkbox is manually clicked or a different value is added.

"Do you want to assume completion" checkbox: Above the questions for each level there is a red check box that allows the user to assume completion of this RL. Note: This box applies to all the RLs being considered at that level (technology, manufacturing, and/or programmatics).

The user cannot assume completion of individual RLs on this page, only on the START HERE worksheet using the drop-down scales.

Figure 5: Example of RL layout. The figure shows the Readiness Level 3 questions with their Apply? (Y/N), H/W S/W Both, Ques Catgry, and % Complete fields:
- Have system performance characteristics and measures been documented?
- Does basic laboratory research equipment verify physical principles?
- Has a customer representative to work with the R&D team been identified?
- Is the customer participating in requirements generation?
- Have design techniques been identified and/or developed?
- Has a Technology Transition Agreement (TTA), including a possible TRL for transition, been drafted?
- Have scaling studies been started?
- Have current manufacturability concepts been assessed?
- Can key components needed for breadboard be produced?
- Has an analysis of alternatives been completed?
- Have programmatic risks been identified?
- Have programmatic risk mitigation strategies been documented?
- Has a preliminary value analysis been performed?

The background of the questions is color-coded to indicate whether the question refers to technology, manufacturing, or programmatics, information that is also included in the Ques Catgry field (Blue: Technology, Green: Programmatic, and Pink: Manufacturing). Questions should be answered beginning with those at the lowest RL and proceeding down the page. As the questions are answered, the calculator continually recomputes the RLs achieved and displays the results in the scales in the upper portion of the worksheet.

At its core, the calculator sums the responses to the questions for a given category and RL and follows a few pre-defined rules: summation of all questions answered yes, some percentage of yes, not applicable, and not answered. As the posed questions are answered, the calculator displays a red, green, yellow color-coded scale associated with each category of question (this appears in the upper section of the worksheet). The calculator contains two user-defined thresholds, or "set points": green and yellow. As questions are answered, the calculator displays a color-coded scale for that RL according to the set points. The calculator displays a green status for a given RL if the percentage of questions answered equals or exceeds the green set point. The default setting for the green set point is 100% (i.e., each question for a given category must be answered 100% to achieve green), but this can be changed by the user. The calculator displays a yellow status for a given RL if the percentage of questions answered equals or exceeds the yellow set point, but is below the green set point. The default setting for the yellow set point is 75% (i.e., the combined totals of all the questions in that category for that RL must equal 75% or greater to achieve yellow).

NOTE: The calculator has been modified to calculate the portion of 100% that each question contributes. No question within a category carries more value than another. For example, if there are 10 technology questions for RL 1, each question has a value of 10% of the total. If a percentage of 100 is inserted for a given question, that percentage is multiplied by the value of that question, which is linked to the total number of questions for that category and RL.
Answered questions are automatically summed for that RL in real time. If the total percentage of combined answers meets or exceeds a given set point, the scale for that RL displays the corresponding color.
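Pulling the pieces of this section together, the Python sketch below mirrors the scoring behavior described above: every applicable question in a category carries equal weight, per-question completion percentages are summed in real time, and the total is compared against the user-set green and yellow set points (defaults 100% and 75%). The function and field names are assumptions made for this example; the workbook's actual decision algorithms are documented in Appendix D.

def level_status(answers, green=100.0, yellow=75.0):
    # `answers` is a list of (percent_complete, applies) pairs for one
    # category and level.  Questions marked "N" (not applicable) are
    # excluded, and the remaining questions share the weight equally
    # (10 applicable questions -> 10% of the level each).
    applicable = [pct for pct, applies in answers if applies]
    if not applicable:
        return 100.0, "green"   # assumption: a level with no applicable questions is shown as complete
    total = sum(applicable) / len(applicable)
    if total >= green:
        color = "green"
    elif total >= yellow:
        color = "yellow"
    else:
        color = "red"
    return total, color

# Example: 10 questions; 8 fully complete, one half complete, one marked not applicable.
answers = [(100, True)] * 8 + [(50, True)] + [(0, False)]
print(level_status(answers))   # (94.44..., 'yellow') with the default set points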


More information

Event Planner Portal Quick Reference Guide

Event Planner Portal Quick Reference Guide Event Planner Portal Quick Reference Guide Table of Contents 1 Overview 1 About this guide 1 Who is this Quick Reference Guide designed for? 2 What s in it for me? 2 How long will it take me to go through

More information

DRAFT RESEARCH SUPPORT BUILDING AND INFRASTRUCTURE MODERNIZATION RISK MANAGEMENT PLAN. April 2009 SLAC I 050 07010 002

DRAFT RESEARCH SUPPORT BUILDING AND INFRASTRUCTURE MODERNIZATION RISK MANAGEMENT PLAN. April 2009 SLAC I 050 07010 002 DRAFT RESEARCH SUPPORT BUILDING AND INFRASTRUCTURE MODERNIZATION RISK MANAGEMENT PLAN April 2009 SLAC I 050 07010 002 Risk Management Plan Contents 1.0 INTRODUCTION... 1 1.1 Scope... 1 2.0 MANAGEMENT

More information

COMPLIANCE WITH THIS PUBLICATION IS MANDATORY

COMPLIANCE WITH THIS PUBLICATION IS MANDATORY BY ORDER OF THE COMMANDER AIR FORCE RESEARCH LABORATORY (AFRL) AIR FORCE RESEARCH LABORATORY INSTRUCTION 61-104 16 OCTOBER 2013 Scientific/Research and Development SCIENCE AND TECHNOLOGY (S&T) SYSTEMS

More information

Tutorial Segmentation and Classification

Tutorial Segmentation and Classification MARKETING ENGINEERING FOR EXCEL TUTORIAL VERSION 1.0.8 Tutorial Segmentation and Classification Marketing Engineering for Excel is a Microsoft Excel add-in. The software runs from within Microsoft Excel

More information

Release 2.1 of SAS Add-In for Microsoft Office Bringing Microsoft PowerPoint into the Mix ABSTRACT INTRODUCTION Data Access

Release 2.1 of SAS Add-In for Microsoft Office Bringing Microsoft PowerPoint into the Mix ABSTRACT INTRODUCTION Data Access Release 2.1 of SAS Add-In for Microsoft Office Bringing Microsoft PowerPoint into the Mix Jennifer Clegg, SAS Institute Inc., Cary, NC Eric Hill, SAS Institute Inc., Cary, NC ABSTRACT Release 2.1 of SAS

More information

Practice Overview. REQUIREMENTS DEFINITION Issue Date: <mm/dd/yyyy> Revision Date: <mm/dd/yyyy>

Practice Overview. REQUIREMENTS DEFINITION Issue Date: <mm/dd/yyyy> Revision Date: <mm/dd/yyyy> DEPARTMENT OF HEALTH AND HUMAN SERVICES ENTERPRISE PERFORMANCE LIFE CYCLE FRAMEWORK PRACTIICES GUIIDE REQUIREMENTS DEFINITION Issue Date: Revision Date: Document

More information

Intellect Platform - The Workflow Engine Basic HelpDesk Troubleticket System - A102

Intellect Platform - The Workflow Engine Basic HelpDesk Troubleticket System - A102 Intellect Platform - The Workflow Engine Basic HelpDesk Troubleticket System - A102 Interneer, Inc. Updated on 2/22/2012 Created by Erika Keresztyen Fahey 2 Workflow - A102 - Basic HelpDesk Ticketing System

More information

Writing a Systems Engineering Plan, or a Systems Engineering Management Plan? Think About Models and Simulations

Writing a Systems Engineering Plan, or a Systems Engineering Management Plan? Think About Models and Simulations Writing a Systems Engineering Plan, or a Systems Engineering Management Plan? Think About Models and Simulations Philomena Zimmerman Office of the Deputy Assistant Secretary of Defense for Systems Engineering

More information

Indiana County Assessor Association Excel Excellence

Indiana County Assessor Association Excel Excellence Indiana County Assessor Association Excel Excellence Basic Excel Data Analysis Division August 2012 1 Agenda Lesson 1: The Benefits of Excel Lesson 2: The Basics of Excel Lesson 3: Hands On Exercises Lesson

More information

Advanced Excel 10/20/2011 1

Advanced Excel 10/20/2011 1 Advanced Excel Data Validation Excel has a feature called Data Validation, which will allow you to control what kind of information is typed into cells. 1. Select the cell(s) you wish to control. 2. Click

More information

Technology Program Management Model (TPMM) A Systems-Engineering Approach to Technology Development Program Management

Technology Program Management Model (TPMM) A Systems-Engineering Approach to Technology Development Program Management UNCLASSIFIED Technology Program Management Model (TPMM) A Systems-Engineering Approach to Technology Development Program Management 10-26-2006 Mike Ellis TPMM Development Manager Dynetics, Inc. Mike.Ellis@Dynetics.com

More information

Department of Administration Portfolio Management System 1.3 June 30, 2010

Department of Administration Portfolio Management System 1.3 June 30, 2010 E 06/ 30/ 2010 EX AM PL 1. 3 06/ 28/ 2010 06/ 24/ 2010 06/ 23/ 2010 06/ 15/ 2010 06/ 18/ 2010 Portfolio System 1.3 June 30, 2010 Contents Section 1. Project Overview... 1 1.1 Project Description... 1 1.2

More information

Best Practices Statement Project Management. Best Practices for Managing State Information Technology Projects

Best Practices Statement Project Management. Best Practices for Managing State Information Technology Projects State of Arkansas Office of Information Technology 124 W. Capitol Ave. Suite 990 Little Rock, AR 72201 501.682.4300 Voice 501.682.4020 Fax http://www.cio.arkansas.gov/techarch Best Practices Statement

More information

QUality Assessment of System ARchitectures (QUASAR)

QUality Assessment of System ARchitectures (QUASAR) Pittsburgh, PA 15213-3890 QUality Assessment of System ARchitectures (QUASAR) Donald Firesmith Acquisition Support Program (ASP) Sponsored by the U.S. Department of Defense 2006 by Carnegie Mellon University

More information

ASAP Roadmap. Solution Use. Bill Wood Bill.wood@R3Now.com (704) 905 5175 http://www.r3now.com

ASAP Roadmap. Solution Use. Bill Wood Bill.wood@R3Now.com (704) 905 5175 http://www.r3now.com ASAP Roadmap Solution Use Bill Wood Bill.wood@R3Now.com (704) 905 5175 http://www.r3now.com 704. 905. 5175 Bill.Wood@R3Now.com www.r3now.com http://www.linkedin.com/in/billwood Page 1 of 22 Contents THE

More information

Modeling and Simulation (M&S) for Homeland Security

Modeling and Simulation (M&S) for Homeland Security Modeling and Simulation (M&S) for Homeland Security Dr. Charles Hutchings Deputy Director, Modeling and Simulation Test and Standards Division Science and Technology Directorate June 23, 2008 Outline How

More information

Excel Reports User Guide

Excel Reports User Guide Excel Reports User Guide Copyright 2000-2006, E-Z Data, Inc. All Rights Reserved. No part of this documentation may be copied, reproduced, or translated in any form without the prior written consent of

More information

Introduction to the CMMI Acquisition Module (CMMI-AM)

Introduction to the CMMI Acquisition Module (CMMI-AM) Pittsburgh, PA 15213-3890 Introduction to the CMMI Acquisition Module (CMMI-AM) Module 2: CMMI-AM and Project Management SM CMM Integration, IDEAL, and SCAMPI are service marks of Carnegie Mellon University.

More information

Session 4. System Engineering Management. Session Speaker : Dr. Govind R. Kadambi. M S Ramaiah School of Advanced Studies 1

Session 4. System Engineering Management. Session Speaker : Dr. Govind R. Kadambi. M S Ramaiah School of Advanced Studies 1 Session 4 System Engineering Management Session Speaker : Dr. Govind R. Kadambi M S Ramaiah School of Advanced Studies 1 Session Objectives To learn and understand the tasks involved in system engineering

More information

The Program Managers Guide to the Integrated Baseline Review Process

The Program Managers Guide to the Integrated Baseline Review Process The Program Managers Guide to the Integrated Baseline Review Process April 2003 Table of Contents Foreword... 1 Executive Summary... 2 Benefits... 2 Key Elements... 3 Introduction... 4 IBR Process Overview...

More information

Project Zeus. Risk Management Plan

Project Zeus. Risk Management Plan Project Zeus Risk Management Plan 1 Baselined: 5/7/1998 Last Modified: N/A Owner: David Jones/Zeus Project Manager Page Section 1. Introduction 3 1.1 Assumptions, Constraints, and Policies 3 1.2 Related

More information

BVR. Free Download. PitchBook Plugin for Excel USER GUIDE. What It s Worth

BVR. Free Download. PitchBook Plugin for Excel USER GUIDE. What It s Worth BVR What It s Worth Free Download PitchBook Plugin for Excel USER GUIDE Thank you for visiting Business Valuation Resources, the leading provider of quality acquisition data and analysis. For more information

More information

User Guide Package Exception Management

User Guide Package Exception Management User Guide Package Exception Management 70-3262-4.6 PRECISION Applications 2012 September 2012 2012 Precision Software, a division of QAD Inc. Precision Software products are copyrighted and all rights

More information

Call Centre Helper - Forecasting Excel Template

Call Centre Helper - Forecasting Excel Template Call Centre Helper - Forecasting Excel Template This is a monthly forecaster, and to use it you need to have at least 24 months of data available to you. Using the Forecaster Open the spreadsheet and enable

More information

Investments in major capital assets proposed for funding in the Administration's budget should:

Investments in major capital assets proposed for funding in the Administration's budget should: APPENDIX J PRINCIPLES OF BUDGETING FOR CAPITAL ASSET ACQUISITIONS Introduction and Summary The Administration plans to use the following principles in budgeting for capital asset acquisitions. These principles

More information

Procurement Programmes & Projects P3M3 v2.1 Self-Assessment Instructions and Questionnaire. P3M3 Project Management Self-Assessment

Procurement Programmes & Projects P3M3 v2.1 Self-Assessment Instructions and Questionnaire. P3M3 Project Management Self-Assessment Procurement Programmes & Projects P3M3 v2.1 Self-Assessment Instructions and Questionnaire P3M3 Project Management Self-Assessment Contents Introduction 3 User Guidance 4 P3M3 Self-Assessment Questionnaire

More information

Microsoft Access Basics

Microsoft Access Basics Microsoft Access Basics 2006 ipic Development Group, LLC Authored by James D Ballotti Microsoft, Access, Excel, Word, and Office are registered trademarks of the Microsoft Corporation Version 1 - Revision

More information

GAO MAJOR AUTOMATED INFORMATION SYSTEMS. Selected Defense Programs Need to Implement Key Acquisition Practices

GAO MAJOR AUTOMATED INFORMATION SYSTEMS. Selected Defense Programs Need to Implement Key Acquisition Practices GAO United States Government Accountability Office Report to Congressional Addressees March 2013 MAJOR AUTOMATED INFORMATION SYSTEMS Selected Defense Programs Need to Implement Key Acquisition Practices

More information

Department of Veterans Affairs VA DIRECTIVE 6071

Department of Veterans Affairs VA DIRECTIVE 6071 Department of Veterans Affairs VA DIRECTIVE 6071 Washington, DC 20420 Transmittal Sheet PROJECT MANAGEMENT ACCOUNTABILITY SYSTEM (PMAS) 1. REASON FOR ISSUE. To set forth policies and responsibilities for

More information

Your Software Quality is Our Business. INDEPENDENT VERIFICATION AND VALIDATION (IV&V) WHITE PAPER Prepared by Adnet, Inc.

Your Software Quality is Our Business. INDEPENDENT VERIFICATION AND VALIDATION (IV&V) WHITE PAPER Prepared by Adnet, Inc. INDEPENDENT VERIFICATION AND VALIDATION (IV&V) WHITE PAPER Prepared by Adnet, Inc. February 2013 1 Executive Summary Adnet is pleased to provide this white paper, describing our approach to performing

More information

Sample- for evaluation purposes only! Advanced Excel. TeachUcomp, Inc. A Presentation of TeachUcomp Incorporated. Copyright TeachUcomp, Inc.

Sample- for evaluation purposes only! Advanced Excel. TeachUcomp, Inc. A Presentation of TeachUcomp Incorporated. Copyright TeachUcomp, Inc. A Presentation of TeachUcomp Incorporated. Copyright TeachUcomp, Inc. 2012 Advanced Excel TeachUcomp, Inc. it s all about you Copyright: Copyright 2012 by TeachUcomp, Inc. All rights reserved. This publication,

More information

UNITED STATES DEPARTMENT OF EDUCATION OFFICE OF INSPECTOR GENERAL

UNITED STATES DEPARTMENT OF EDUCATION OFFICE OF INSPECTOR GENERAL UNITED STATES DEPARTMENT OF EDUCATION OFFICE OF INSPECTOR GENERAL AUDIT SERVICES August 24, 2015 Control Number ED-OIG/A04N0004 James W. Runcie Chief Operating Officer U.S. Department of Education Federal

More information

GAO INFORMATION TECHNOLOGY DASHBOARD. Opportunities Exist to Improve Transparency and Oversight of Investment Risk at Select Agencies

GAO INFORMATION TECHNOLOGY DASHBOARD. Opportunities Exist to Improve Transparency and Oversight of Investment Risk at Select Agencies GAO United States Government Accountability Office Report to Congressional Requesters October 2012 INFORMATION TECHNOLOGY DASHBOARD Opportunities Exist to Improve Transparency and Oversight of Investment

More information

CS 2 SAT: The Control Systems Cyber Security Self-Assessment Tool

CS 2 SAT: The Control Systems Cyber Security Self-Assessment Tool INL/CON-07-12810 PREPRINT CS 2 SAT: The Control Systems Cyber Security Self-Assessment Tool ISA Expo 2007 Kathleen A. Lee January 2008 This is a preprint of a paper intended for publication in a journal

More information

CDC UNIFIED PROCESS JOB AID

CDC UNIFIED PROCESS JOB AID Purpose The purpose of this document is to provide guidance on the practice of using Microsoft Project and to describe the practice overview, requirements, best practices, activities, and key terms related

More information

MICROSOFT EXCEL 2011 MANAGE WORKBOOKS

MICROSOFT EXCEL 2011 MANAGE WORKBOOKS MICROSOFT EXCEL 2011 MANAGE WORKBOOKS Last Edited: 2012-07-10 1 Open, create, and save Workbooks... 3 Open an existing Excel Workbook... 3 Create a new Workbook... 6 Save a Workbook... 6 Set workbook properties...

More information

ELECTRO-MECHANICAL PROJECT MANAGEMENT

ELECTRO-MECHANICAL PROJECT MANAGEMENT CHAPTER-9 ELECTRO-MECHANICAL PROJECT MANAGEMENT Y K Sharma,SDE(BS-E), 9412739241(M) E-Mail ID: yogeshsharma@bsnl.co.in Page: 1 Electro-mechanical Project Management using MS Project Introduction: When

More information

CDC UNIFIED PROCESS PRACTICES GUIDE

CDC UNIFIED PROCESS PRACTICES GUIDE Document Purpose The purpose of this document is to provide guidance on the practice of Requirements Definition and to describe the practice overview, requirements, best practices, activities, and key

More information

Introduction to Microsoft Access 2003

Introduction to Microsoft Access 2003 Introduction to Microsoft Access 2003 Zhi Liu School of Information Fall/2006 Introduction and Objectives Microsoft Access 2003 is a powerful, yet easy to learn, relational database application for Microsoft

More information

MAJOR AUTOMATED INFORMATION SYSTEMS. Selected Defense Programs Need to Implement Key Acquisition Practices

MAJOR AUTOMATED INFORMATION SYSTEMS. Selected Defense Programs Need to Implement Key Acquisition Practices United States Government Accountability Office Report to Congressional Committees March 2014 MAJOR AUTOMATED INFORMATION SYSTEMS Selected Defense Programs Need to Implement Key Acquisition Practices GAO-14-309

More information

How to Use a Data Spreadsheet: Excel

How to Use a Data Spreadsheet: Excel How to Use a Data Spreadsheet: Excel One does not necessarily have special statistical software to perform statistical analyses. Microsoft Office Excel can be used to run statistical procedures. Although

More information

Subject: Defense Software: Review of Defense Report on Software Development Best Practices

Subject: Defense Software: Review of Defense Report on Software Development Best Practices United States General Accounting Office Washington, DC 20548 Accounting and Information Management Division B-285626 June 15, 2000 The Honorable John Warner Chairman The Honorable Carl Levin Ranking Minority

More information

Scheduling Process Maturity Level Self Assessment Questionnaire

Scheduling Process Maturity Level Self Assessment Questionnaire Scheduling Process Maturity Level Self Assessment Questionnaire Process improvement usually begins with an analysis of the current state. The purpose of this document is to provide a means to undertake

More information

User Guide. Opening secure email from the State of Oregon Viewing birth certificate edits reports in MS Excel

User Guide. Opening secure email from the State of Oregon Viewing birth certificate edits reports in MS Excel User Guide Opening secure email from the State of Oregon Viewing birth certificate edits reports in MS Excel Birth Certifier Edition Last Revised: August, 0 PUBLIC HEALTH DIVISION Center for Public Health

More information

OVERVIEW. Microsoft Project terms and definitions

OVERVIEW. Microsoft Project terms and definitions PROJECT 2003 DISCLAIMER: This reference guide is meant for experienced Microsoft Project users. It provides a list of quick tips and shortcuts for familiar features. This guide does NOT replace training

More information

Quality Assurance Guide. IRMS-5.8.4.0c-Quality Assurance Guide.doc 01.0 November 2, 2009

Quality Assurance Guide. IRMS-5.8.4.0c-Quality Assurance Guide.doc 01.0 November 2, 2009 Quality Assurance Guide IRMS-5.8.4.0c-Quality Assurance Guide.doc 01.0 November 2, 2009 IRMS Quality Assurance Guide 1011 State Street, Suite 210 Lemont, IL 60439 Phone: 630-243-9810 Fax: 630-243-9811

More information

AUDIT REPORT. The Department of Energy's Management of Cloud Computing Activities

AUDIT REPORT. The Department of Energy's Management of Cloud Computing Activities U.S. Department of Energy Office of Inspector General Office of Audits and Inspections AUDIT REPORT The Department of Energy's Management of Cloud Computing Activities DOE/IG-0918 September 2014 Department

More information

Software Quality Assurance: VI Standards

Software Quality Assurance: VI Standards Software Quality Assurance: VI Standards Room E 3.165 Tel. 60-3321 Email: hg@upb.de Outline I Introduction II Software Life Cycle III Quality Control IV Infrastructure V Management VI Standards VII Conclusion

More information

EXCEL Tutorial: How to use EXCEL for Graphs and Calculations.

EXCEL Tutorial: How to use EXCEL for Graphs and Calculations. EXCEL Tutorial: How to use EXCEL for Graphs and Calculations. Excel is powerful tool and can make your life easier if you are proficient in using it. You will need to use Excel to complete most of your

More information

Guidelines for Completing the VDOT Form C 13CPM

Guidelines for Completing the VDOT Form C 13CPM Guidelines for Completing the VDOT Form C 13CPM CONSTRUCTION DIVISION 1. OVERVIEW The VDOT Form C 13CPM is required to prepare and submit the Contractor s Progress Earnings Schedule as specified in the

More information

3 What s New in Excel 2007

3 What s New in Excel 2007 3 What s New in Excel 2007 3.1 Overview of Excel 2007 Microsoft Office Excel 2007 is a spreadsheet program that enables you to enter, manipulate, calculate, and chart data. An Excel file is referred to

More information

Inquisite Reporting Plug-In for Microsoft Office. Version 7.5. Getting Started

Inquisite Reporting Plug-In for Microsoft Office. Version 7.5. Getting Started Inquisite Reporting Plug-In for Microsoft Office Version 7.5 Getting Started 2006 Inquisite, Inc. All rights reserved. All Inquisite brand names and product names are trademarks of Inquisite, Inc. All

More information

Surveying and evaluating tools for managing processes for software intensive systems

Surveying and evaluating tools for managing processes for software intensive systems Master Thesis in Software Engineering 30 Credits, Advanced Level Surveying and evaluating tools for managing processes for software intensive systems Anuradha Suryadevara IDT Mälardalen University, ABB

More information

INFORMATION TECHNOLOGY PROJECT REQUESTS

INFORMATION TECHNOLOGY PROJECT REQUESTS INFORMATION TECHNOLOGY PROJECT REQUESTS Guidelines & Instructions for Maryland State Agencies Revised Two Step PPR/PIR Approval Process Fiscal Year 2013 Table of Contents Part 1: Overview... 2 1.1 Introduction...

More information

The Online Health Program Planner Part 1: Beginner's Guide

The Online Health Program Planner Part 1: Beginner's Guide The Online Health Program Planner Part 1: Beginner's Guide 1.1 Introduction This audio presentation is the first in a series of six parts that will provide an overview on how to use the Online Health Program

More information

EzyScript User Manual

EzyScript User Manual Version 1.4 Z Option 417 Oakbend Suite 200 Lewisville, Texas 75067 www.zoption.com (877) 653-7215 (972) 315-8800 fax: (972) 315-8804 EzyScript User Manual SAP Transaction Scripting & Table Querying Tool

More information

SERVICE EXCELLENCE SUITE

SERVICE EXCELLENCE SUITE USERS GUIDE Release Management Service Management and ServiceNow SERVICE EXCELLENCE SUITE Table of Contents Introduction... 3 Overview, Objectives, and Current Scope... 4 Overview... 4 Objectives... 4

More information

NOTICE: This publication is available at: http://www.nws.noaa.gov/directives/.

NOTICE: This publication is available at: http://www.nws.noaa.gov/directives/. Department of Commerce $ National Oceanic & Atmospheric Administration $ National Weather Service NATIONAL WEATHER SERVICE POLICY DIRECTIVE 80-3 October 28, 2009 Science and Technology SYSTEMS ENGINEERING

More information

DATA VALIDATION and CONDITIONAL FORMATTING

DATA VALIDATION and CONDITIONAL FORMATTING DATA VALIDATION and CONDITIONAL FORMATTING Data validation to allow / disallow certain types of data to be entered within a spreadsheet Using Data Validation to choose a value for a cell from a dropdown

More information

SIEMENS. Teamcenter 11.2. Change Manager PLM00140 11.2

SIEMENS. Teamcenter 11.2. Change Manager PLM00140 11.2 SIEMENS Teamcenter 11.2 Change Manager PLM00140 11.2 Contents What is Change Manager?.............................................. 1-1 Related topics........................................................

More information

GENERAL SERVICES ADMINISTRATION Federal Supply Schedule Authorized Federal Supply Schedule

GENERAL SERVICES ADMINISTRATION Federal Supply Schedule Authorized Federal Supply Schedule GENERAL SERVICES ADMINISTRATION Federal Supply Schedule Authorized Federal Supply Schedule On line access to contract ordering information, terms and conditions, up-to-date pricing, and the option to create

More information

Getting Started with Excel 2008. Table of Contents

Getting Started with Excel 2008. Table of Contents Table of Contents Elements of An Excel Document... 2 Resizing and Hiding Columns and Rows... 3 Using Panes to Create Spreadsheet Headers... 3 Using the AutoFill Command... 4 Using AutoFill for Sequences...

More information

Chapter 17 School Cash Catalog

Chapter 17 School Cash Catalog Chapter 17 School Cash Catalog In Palm Beach County, schools have the ability to accept online payments from parents, guardians, other relatives, and members of the public. Acceptable methods of payment

More information

National Institute of Standards and Technology. HIPAA Security Rule Toolkit. User Guide

National Institute of Standards and Technology. HIPAA Security Rule Toolkit. User Guide National Institute of Standards and Technology HIPAA Security Rule Toolkit User Guide October 31, 2011 Table of Contents Background... 1 Purpose... 1 Audience... 1 Intended Use of the HSR Toolkit... 1

More information

Microsoft Project 2007 Level 1: Creating Project Tasks

Microsoft Project 2007 Level 1: Creating Project Tasks Microsoft Project 2007 Level 1: Creating Project Tasks By Robin Peers Robin Peers, 2008 ABOUT THIS CLASS Regardless of job title, most of us have needed to act as a project manager, at one time or another.

More information

Phase I Conduct a Security Self-Assessment

Phase I Conduct a Security Self-Assessment 61 The SEARCH IT Security Self- and Risk- Assessment Tool: Easy to Use, Visible Results To complete your self-assessment, you can use the questions we have adopted and revised from the NIST guidance under

More information

The Application Readiness Level Metric

The Application Readiness Level Metric The Application Readiness Level Metric NASA Application Readiness Levels (ARLs) The NASA Applied Sciences Program has instituted a nine-step Application Readiness Level (ARL) index to track and manage

More information

Development of User Requirements and Use Cases for a Contamination Warning System Dashboard

Development of User Requirements and Use Cases for a Contamination Warning System Dashboard May 2013 Philadelphia Water Department Contamination Warning System Demonstration Pilot Project: Development of User Requirements and Use Cases for a Contamination Warning System Dashboard Online Water

More information

U.S. Department of Education Federal Student Aid

U.S. Department of Education Federal Student Aid U.S. Department of Education Federal Student Aid Lifecycle Management Methodology Stage Gate Review Process Description Version 1.3 06/30/2015 Final DOCUMENT NUMBER: FSA_TOQA_PROC_STGRW.NA_001 Lifecycle

More information

Siebel Professional Services Automation Guide

Siebel Professional Services Automation Guide Siebel Professional Services Automation Guide Version 7.7 November 2004 Siebel Systems, Inc., 2207 Bridgepointe Parkway, San Mateo, CA 94404 Copyright 2004 Siebel Systems, Inc. All rights reserved. Printed

More information

Using the ITSM Metrics Modeling Tool

Using the ITSM Metrics Modeling Tool Using the ITSM Metrics Modeling Tool ITSM Metrics Model Tool Overview The ITSM Metrics Model is a simple spreadsheet tool that can be used for a variety of measurement and reporting purposes. The model

More information

AARP Tax-Aide Helpful Hints for Using the Volunteer Excel Expense Form and Other Excel Documents

AARP Tax-Aide Helpful Hints for Using the Volunteer Excel Expense Form and Other Excel Documents AARP Tax-Aide Helpful Hints for Using the Volunteer Excel Expense Form and Other Excel Documents This document is designed to give you information to help you perform Excel file functions associated with

More information

Implementation of the DoD Management Control Program for Navy Acquisition Category II and III Programs (D-2004-109)

Implementation of the DoD Management Control Program for Navy Acquisition Category II and III Programs (D-2004-109) August 17, 2004 Acquisition Implementation of the DoD Management Control Program for Navy Acquisition Category II and III Programs (D-2004-109) Department of Defense Office of the Inspector General Quality

More information

Targeted Risk Assessment Addendum to the User Guide for the Standalone Consumer Tool Version 3.1 -

Targeted Risk Assessment Addendum to the User Guide for the Standalone Consumer Tool Version 3.1 - Targeted Risk Assessment Addendum to the User Guide for the Standalone Consumer Tool Version 3.1 - June 2014 EUROPEAN CENTRE FOR ECOTOXICOLOGY AND TOXICOLOGY OF CHEMICALS www.ecetoc.org Contents ADDENDUM

More information

September 2015. IFAC Member Compliance Program Strategy, 2016 2018

September 2015. IFAC Member Compliance Program Strategy, 2016 2018 September 2015 IFAC Member Compliance Program Strategy, 2016 2018 This Strategy is issued by the International Federation of Accountants (IFAC ) with the advice and oversight of the Compliance Advisory

More information

NGNP Risk Management Database: A Model for Managing Risk

NGNP Risk Management Database: A Model for Managing Risk INL/EXT-09-16778 Revision 1 NGNP Risk Management Database: A Model for Managing Risk John M. Beck November 2011 DISCLAIMER This information was prepared as an account of work sponsored by an agency of

More information