Table of Contents

i. Preface
1. Background
2. Applicability and Scope
2.1 Cautionary Note
2.2 Policy
3. Guidance
4. Performance Measures
4.1 Input/Process Measures
4.1.1 Input/Process Measures Examples
4.2 Output Measures
4.2.1 Output Measures Examples
4.3 Outcome Measures
4.3.1 Outcome Measures Examples
4.4 Note on the Examples
4.5 Performance Measurement Process Chart
5. Performance Measurement Implementation
5.1 Headquarters and Field Level Interaction
6. Conclusion
7. References
Appendix A: Quick Reference Guide
Appendix B: Annotated Bibliography
i. Preface

The Interagency Security Committee (ISC) originated by Executive Order 12977 after the 1995 Oklahoma City bombing of the Alfred P. Murrah Federal Building. The day after the attack, the President ordered an assessment of the vulnerability of Federal facilities to terrorism or violence. The resulting Vulnerability Report developed minimum physical security standards for civilian federally owned or leased facilities.

Protecting employees and private citizens who visit U.S. government-owned or leased facilities from all hazards is a complex and challenging responsibility. It is one of the top national priorities and the mission of the ISC.

In keeping with the authority provided in Section 5 of Executive Order 12977, as amended by Executive Order 13286, this document provides ISC policy, which requires Federal departments and agencies to use performance measurement and testing to assess physical security programs. This document outlines recommended guidance to Federal departments and agencies for implementing this policy. The guidance provides a basic performance model that measures inputs and accomplishments. It identifies the performance measurement cycle processes and provides examples of performance metrics for physical security. The ISC recognizes Federal departments and agencies will implement this policy and guidance in a manner reflecting the unique and varied mission requirements of their respective components.
1. Background

All major terrorist attacks in the United States (the Oklahoma City bombing, the two World Trade Center attacks, the attack on the Pentagon, and the Brentwood, Maryland anthrax attack) involved Federal facilities. Subsequently, many government facilities have received increased security funding in response to domestic and international terrorism. In several homeland security studies, the Government Accountability Office (GAO) concluded that much remains to be done to improve the overall management and protection of the Federal facility infrastructure.

In two separate reports, the GAO identified policy and management issues specifically directed to the ISC. The GAO recommended the ISC promote key practices associated with the management of physical security programs (GAO-05-49), including the development and use of performance measurement. In another study, the GAO found there is no government-wide guidance or standard for measuring facility protection performance (GAO-06-612). Without effective performance measurement data, the GAO said, decision makers may not have sufficient information to evaluate whether their investments have improved security, reduced the vulnerability of Federal facilities, and reduced the level of risk to an acceptable level. The GAO concluded the ISC should issue guidance, applicable to all Federal departments and agencies, on the use of performance measurement and testing procedures to assess the effectiveness of their security programs.

Measurement is an essential component of the requirements of the Government Performance and Results Act of 1993 (GPRA) and the Program Assessment Rating Tool (PART) from the Office of Management and Budget (OMB). GPRA requires a five-year strategic plan providing mission, goals, and a description of how the accomplishment of goals will be measured.
PART is a tool by which the OMB assesses the effectiveness of an agency's or department's program based on responses to a series of questions generic to all programs. OMB rates the respective program as Effective, Moderately Effective, Adequate, Ineffective, or Results Not Demonstrated, and then makes budget decisions accordingly. The GPRA and PART principles should be adhered to for internal goal setting, program assessment, and resource allocation.

2. Applicability and Scope

Performance measurement data is essential to appropriate decision making on the allocation of resources. Objective, unbiased information about what is being accomplished, what needs additional attention (management focus and resources), and what is performing at target expectation levels is vital to appropriate resource allocation decisions. Security countermeasures must compete with other program objectives for limited funding. Performance measurement tools offer security professionals a way to measure a program's capabilities and effectiveness and can help demonstrate the need to obligate funds for facility security.

2.1 Cautionary Note

While performance measurement and testing are necessary for effective management and oversight, they can become burdensome if senior management does not utilize them properly. The GAO observed in a study (GAO-06-612) that agencies face obstacles in developing meaningful, outcome-oriented performance goals and in collecting data that can be used to assess the true impact of facility protection efforts. Further, in some programs, such as facility protection, outcomes are not quickly achieved or readily observable, or their relationship to the program is
often not clearly defined. Without consistent management support, performance measurement and testing have the potential to become counterproductive and could evolve into ends in themselves rather than serving as a means of ensuring program success. Overcoming these obstacles will require sustained leadership; long-term investment; and clearly defined performance goals, metrics, and data. The costs associated with developing the initial requirements, particularly to establish performance databases, will require significant front-end funding. At the agency level, leadership must communicate the mission-related priority and commitment assigned to performance measurement actions. Management attention will be required at the facility level as well to ensure buy-in and cooperation among facility operators, security managers, building occupants, and other stakeholders. If management can meet these challenges, physical security performance measures will help to ensure accountability, prioritize security needs, and justify investment decisions to maximize available resources.

2.2 Policy

Pursuant to Section 5 of Executive Order 12977, the following policy is hereby established for the security and protection of all buildings and facilities in the United States occupied by Federal employees for nonmilitary activities. Federal departments and agencies shall take the necessary action to comply with the following policies as soon as practicable:

- Federal departments and agencies shall assess and document the effectiveness of their physical security programs through performance measurement and testing.
- Performance measures shall be based on agency mission goals and objectives.
- Performance results shall be linked to goals and objectives development, resource needs, and program management.

3. Guidance

This guidance is provided to assist departments and agencies in establishing or refining a comprehensive measurement and testing program for assessing the effectiveness of their physical security programs. It is recognized that within large agencies or departments, security performance measurement and testing might best function at the major component organizational level (bureau, directorate, or office) and its field locations rather than at the senior management headquarters level. Nonetheless, senior management (the Chief Security Officer or equivalent) should ensure the consistent application and testing of performance measures throughout the agency or department.

4. Performance Measures

Performance measures can be categorized into three basic groups: input/process measures, output measures, and outcome measures. For consistency in the assessment of the effectiveness of physical security programs, the following definitions apply:

4.1 Input/Process Measures

Inputs are the budgetary resources, human capital, materials and services, and facilities and equipment associated with a goal or objective. Process measures are the functions and activities undertaken that are geared toward accomplishing an objective.
4.1.1 Input/Process Measures Examples

The following are examples of input measures, including descriptions explaining how they relate to program assessment:

Asset Inventory: This measure may encompass the entire facility asset inventory or a subset. For example, program managers could track only those assets that have been (or need to be) assessed, or only those whose level of risk is acceptable. The inventory measure could also reflect various classifications, such as the ISC Facility Security Level (FSL) designations or other mission-driven criteria, to establish priorities. Depending on the status of the inventory, program managers should establish intermediate and long-term target objectives for tracking and achieving long-term goals. An example is a measure indicating whether all assets have an acceptable risk rating.

Number of countermeasures in use: Similar to the inventory of facilities, this measure provides a baseline for the number of countermeasures (by type) requiring maintenance or testing, or scheduled for replacement. This number may increase or decrease as the asset inventory fluctuates or as recurring risk assessments indicate the need for additional security equipment. As the number of countermeasures in use increases, and the number of tested and repaired or replaced countermeasures increases, the acceptable risk rating for the asset inventory should also increase, as suggested in the first example.

Resource Requirements: These measures track the resources required to accomplish the security program mission:
o Full-Time Equivalent (FTE) employees, contract support, and training;
o FSL determinations and risk assessments;
o Countermeasure installation, maintenance, testing, evaluation, and replacement; and
o Overall security program management (salaries, IT costs, administrative costs).
Tracking the resources applied to physical security efforts provides program managers with an understanding of the resources, including expenditures and personnel, required for effective physical security program operations. Program managers can use this information to determine program growth, increases in cost, efficiency gains, and output costs. Essentially, this information provides an overview of the resources required to achieve program goals and to accomplish the overall program mission. When considered in conjunction with output and outcome measures, these measures help determine the benefit of applying various resource levels. Moreover, program managers should use this information to plan and justify resource requirements for future efforts.

4.2 Output Measures

Outputs are the products and services produced by the organization and generally can be observed and measured. Efficiency is a measure of the relationship between an organization's inputs/processes and its outputs.

4.2.1 Output Measures Examples

The following are examples of output measures and how they relate to assessing program effectiveness:
Security assessments completed versus planned: A core component of a physical security program is the scheduling of initial and recurring risk assessments and the accompanying FSL determination. Every agency or department should have an established schedule for assessing each facility. Tracking and measuring the percentage of completed assessments versus what was planned for the year, quarter, or other period indicates management's commitment to maintaining an organized and efficient physical security program. More importantly, risk assessments performed on a regular schedule provide a means of effectively addressing changes in threats and vulnerabilities, and corresponding countermeasure needs. A typical target objective would be to complete a specific number of assessments annually, based on a planned schedule.

Countermeasures deployed: This measure reflects how well the deployment of countermeasures is managed throughout the procurement, installation, and acceptance cycle. Once funding has been made available, target dates (e.g., a specific date, month, or quarter) should be established. The target date is then compared with the actual deployment date. If no existing data is available for projecting a reasonable target date, a baseline should be established using representative countermeasures to determine the typical time frame for deploying various kinds of countermeasures. This enables the manager to reasonably project target dates for future countermeasures. A typical target objective for this measure may be to deploy all fully funded countermeasures on time (on or prior to the scheduled date) 95 percent of the time. The 5 percent margin allows for unforeseen events or circumstances that could not have been reasonably anticipated when the target dates were initially established.
Once actual results are achieved, incremental improvement targets may be necessary until the processes, planning, and scheduling procedures can be refined to ensure successful deployment 95 percent of the time. Note: This measure encompasses capital investments (facility enhancements and equipment), new process changes, and countermeasure activities. Separate reporting is encouraged for each of these categories, since the responsibility for each may differ, and corrective process improvements vary, among the organizational elements involved.

Countermeasures tested [1]: This measure focuses on accomplishing an established schedule for testing countermeasures to determine how well they are working. Testing encompasses such elements as determining whether equipment is calibrated properly, security guards are knowledgeable in post order procedures, and intrusion detection systems (IDSs) are activating properly. For critical infrastructure, testing may include planned exercises to breach security to ensure existing countermeasures are capable of securing the facility against the most sophisticated attempts to gain illegal access. All testing should be based on an established set of testing protocols. Because individual facilities may have numerous countermeasures in place, it is unrealistic to attempt to test all countermeasures annually; random sampling may be necessary for larger facilities.

[1] Testing encompasses those procedures used to assess the performance of security equipment, security guards, and emergency planning and response. Security equipment testing includes, but is not limited to, alarm/detection systems testing, examining equipment calibration, detection of training weapons and other simulated contraband, and appropriate positioning of surveillance equipment.
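Output measures such as "countermeasures deployed on time" reduce to a simple ratio of on-time events to total events, compared against the target. The sketch below illustrates the calculation; the records, names, and dates are hypothetical, not drawn from any agency's data.

```python
from datetime import date

# Hypothetical deployment records: (countermeasure, target date, actual date).
deployments = [
    ("X-ray upgrade",     date(2024, 3, 31), date(2024, 3, 15)),
    ("Vehicle barrier",   date(2024, 6, 30), date(2024, 7, 10)),
    ("CCTV expansion",    date(2024, 6, 30), date(2024, 6, 30)),
    ("IDS recalibration", date(2024, 9, 30), date(2024, 9, 1)),
]

# A deployment counts as on time if it occurs on or before the target date.
on_time = sum(1 for _, target, actual in deployments if actual <= target)
rate = 100.0 * on_time / len(deployments)

print(f"On-time deployment rate: {rate:.0f}% (target: 95%)")
print("Target met" if rate >= 95 else "Incremental improvement target needed")
```

Here three of four deployments were on time (75 percent), so the program would fall short of the 95 percent objective and an incremental improvement target would be set.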
Incident Response Time: This measure is suitable for a number of security-related requirements, but only when the security manager has operational control over response capability or has negotiated a service agreement with a response provider. Use of this type of measure usually requires a baseline assessment of existing average response times. This average should be compared with a benchmark or desired standard. If there is a high volume of incidents within a given facility inventory and no automated time-recording database is available, random sampling of incidents may be necessary. The sample should be large enough to reflect normal operational circumstances. Incremental performance target objectives may be necessary to guide development of improved procedures and future funding needs.

4.3 Outcome Measures

Outcomes, or results, represent the impact of the organization upon its customers or problems. Results are often classified in terms of the achievement of a desired condition, the prevention of an undesired condition, or user satisfaction. Effectiveness is a measure of the relationship between an organization's inputs/processes and its outcomes/results.

4.3.1 Outcome Measures Examples

Outcome measures are used to assess the cumulative results of output activities in achieving objectives, and indicate how well individual tasks or target objectives contribute to the accomplishment of broad-based security program goals. Outcome measures may also support more than one program objective or goal. Examples include:

Facility Asset Inventory Secured (Strategic Goal): This measure reflects the cumulative impact of reducing individual facility risk levels through the deployment of security countermeasures throughout the asset inventory. The strategic goal is to achieve and sustain an acceptable risk rating for all facilities. Tracking this strategic goal is a multi-year process. The risk rating reflects countermeasures in place and working properly throughout the inventory. An acceptable risk rating may be defined based on a scoring system for evaluating the perimeter, facility envelope, and interior security features of an asset, or it could simply be defined as being ISC standard compliant.

Emergency Preparedness (Strategic Goal): This measure focuses on the degree to which employees and senior management are trained and perform up to expectations in emergency training exercises. It reflects the cumulative results of Continuity of Operations Plan (COOP) activation training exercises, Occupant Emergency Plan (OEP) drills, and other emergency exercises. Assuming all output measure target objectives are met, a typical strategic outcome goal for this measure might be to achieve an overall 98 percent success rate in accordance with expected behaviors.

Program Efficiency (Program Goal): This outcome measure is intended to capture the cumulative effect of individual process efficiency initiatives (outputs). A typical long-term goal might be to limit overall security program cost increases to a variable percentage per year. The results of individual efficiencies must be tracked, recorded, and summed.
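An outcome goal such as the 98 percent emergency-preparedness success rate is a cumulative ratio of observed to expected behaviors across all exercises. A minimal sketch of the roll-up, using entirely hypothetical exercise records:

```python
# Hypothetical exercise results: (exercise, participants evaluated,
# participants performing to expected behaviors).
exercises = [
    ("COOP activation drill", 120, 118),
    ("OEP evacuation drill",  400, 396),
    ("Shelter-in-place test",  80,  77),
]

# Cumulative success rate across all exercises, not a per-exercise average,
# so large drills weigh more than small ones.
evaluated = sum(n for _, n, _ in exercises)
succeeded = sum(k for _, _, k in exercises)
success_rate = 100.0 * succeeded / evaluated

print(f"Overall success rate: {success_rate:.1f}% (strategic goal: 98%)")
print("Goal met" if success_rate >= 98 else "Below goal")
```

Summing participants before dividing (rather than averaging per-exercise percentages) keeps the measure weighted by how many people were actually evaluated in each exercise.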
4.4 Note on the Examples

The examples above are provided for agencies as they develop or refine their performance measurement programs. They may be adopted or modified to meet particular mission and program needs. Departments and agencies should utilize only those measures suitable to and supportive of their particular physical security programs. Variances among department or agency components, in both number and content, may also be appropriate due to program or budgetary constraints. In short, the examples are provided to assist departments and agencies, and their components, in developing the measures that best suit their needs. Additional comments can be found in Appendix A.

4.5 Performance Measurement Process Chart

The following chart (Table 1) illustrates how the process of using performance measures ties to mission, goals, objectives, specific actions (outputs), and outcomes. This hypothetical example is based on the mission of securing all facilities and a goal of ensuring all facilities comply with ISC security standards within 36 months. To achieve the goal, two program objectives were established. The first objective is to assess all 100 of the hypothetical agency's facilities within 18 months; the second is to deploy all approved security measures identified in those assessments within 18 months after the last assessment is completed. The chart identifies several tasks or actions required to accomplish the objectives, but they should not be viewed as all-inclusive. In the example, the results indicate some slippage, but overall, the delay in approving all recommended countermeasures did not adversely affect the accomplishment of the goal within the target timeframe. The bottom portion of the process chart shows how the input, output, and outcome measures support each phase of the process, and ultimately the goal of ensuring all facilities are ISC compliant within 36 months was achieved.
Mission: Secure Facilities
Goal: Ensure all [agency] facilities are ISC compliant within 36 months.

Objective 1: Assess all 100 [agency] facilities for compliance within 18 months
  Actions:
  1. Complete all scheduled risk assessments on time (quarterly schedule)
  2. Obtain consensus/approval on recommended corrective measures (CMs) within 45 days of each risk assessment
  Results:
  - 100% of risk assessments completed on time; 18 compliant facilities
  - 90% of recommended CMs approved within 45 days (remaining 10% approved within 60 days)
  - 250 CMs identified as needed to make facilities ISC compliant

Objective 2: Implement corrective measures as needed within 18 months of last assessment
  Actions:
  1. Identify priority CMs and coordinate as appropriate with facility managers
  2. Award ID/IQ contract(s) for CM installation
  3. Conduct post-deployment inspections
  4. Conduct ISC compliance inspection
  Results:
  - Five ID/IQ contracts awarded to install 250 CMs in 82 facilities within 18 months of last risk assessment
  - All CMs installed and validated
Inputs:
1. Necessary travel and support funding budgeted
2. Quarterly risk assessment schedule developed, with dates
3. Estimated CM purchase and installation funding budgeted
4. CM installation plan developed and approved (multiple ID/IQ contracts)

Outputs:
1. Approved assessments
2. Approved CMs prioritized
3. CMs deployed within 18 months of last risk assessment
4. Post-CM-deployment inspection reports completed

Outcome:
1. All 100 [agency] facilities are ISC compliant within 36 months (goal achieved)

Table 1: Performance Measurement Process Chart

5. Performance Measurement Implementation

Performance measures are a useful tool for decision makers at all levels. Program managers at the agency headquarters level use performance measures to determine whether their security program is accomplishing or supporting agency mission, goals, and objectives. Field-level managers may use performance measures to demonstrate program effectiveness to stakeholders, assess emergency preparedness capabilities, oversee security equipment maintenance and testing programs, and determine the adequacy of resources to support operational security requirements. Physical security performance measures provide valuable information used to support funding requests, accomplish program goals, and identify areas for improvement, process changes, or additional training.

5.1 Headquarters and Field Level Interaction

Implementing a performance measurement program at the agency level is required to link the specific measures to the agency's established goals. Generally, a strategic plan contains one or more goals that impact or require the direct support of physical security program operations over a multi-year time span. Therefore, performance measurement initiatives at the agency headquarters level are also generally multi-year efforts, with phased implementation aligned with the agency strategic plan.
At the field level, performance measurement activities must support the agency-level goals and objectives. However, they may include measures aimed at assessing and demonstrating the effectiveness of the security program at the local level in ways different from the agency program measures. These field performance measures may be short-term or multi-year initiatives. The Performance Measurement Process Chart (Table 1) illustrates the implementation of an agency headquarters-level goal [ensure all facilities are ISC compliant within 36 months] with two supporting objectives [assess 100 facilities within 18 months, and implement corrective measures within 18 months of the last assessment]. These two objectives support the goal of achieving ISC compliance within a three-year timeframe for the entire organization. At the field level, the security program manager may be heavily involved in conducting the risk assessments and, once funding is available, implementing the approved countermeasures. The security program manager may also be involved in measuring the time and resources needed to complete individual assessments or the time required to obtain full approval of recommended countermeasures. This information may be helpful in justifying additional resource requirements necessary to meet the headquarters assessment schedule or to initiate process changes to reduce
approval timeframes. The security program manager may track the accuracy of countermeasure deployment costs compared to the budget provided by headquarters. This will provide valuable information in developing input measure data for preparing a future budget submission. The field manager may also establish local objectives. For example, the manager may establish a performance objective to develop and issue revised guard orders addressing the use of the new security equipment recommended in the required risk assessments. This output measure could be based on measuring the planned versus actual issuance date, using the date of countermeasure deployment as the planned date. Another example of a field manager establishing a performance measure is testing existing countermeasures to ensure they are working properly, such as setting a goal of 99 percent effectiveness. Testing confirms the reliability, or lack thereof, of maintenance programs, ensures credibility with facility occupants, and provides empirical data to support countermeasure replacement if necessary, all of which would be essential to support the conclusion that all facilities are ISC compliant. Whether performance measures are driven by agency headquarters goals or field manager initiatives, all performance measures should provide a basis for assessing program effectiveness, establish objective data for resource and process improvements, and lead to overall security program effectiveness. Whether goals and objectives are established at the headquarters or field level, the effective use of performance measures requires a collaborative effort. The team should be led by the security professional but should include budget, procurement, and facility management officials and, where appropriate, human resource and training officials.
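The planned-versus-actual issuance measure described above is simply a date comparison against the countermeasure deployment date. A minimal sketch, with hypothetical dates:

```python
from datetime import date

# Hypothetical field-level output measure: revised guard orders should issue
# by the countermeasure deployment date (the planned date).
planned_issue = date(2024, 5, 1)  # countermeasure deployment date
actual_issue = date(2024, 5, 8)   # date revised guard orders were issued

# Positive slippage means the orders issued after the planned date.
slippage_days = (actual_issue - planned_issue).days
print(f"Guard order issuance slipped {slippage_days} days past the planned date")
```

Tracked across many countermeasure deployments, the same comparison yields the on-time percentage the field manager would report against a target objective.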
Each participant should be fully briefed and share a common understanding of the measurement initiative, including an understanding of the actual measures, definitions of terms, and data sources, and, most importantly, a commitment to utilize the results to improve program performance.

6. Conclusion

The guidance in this document provides the foundation for a measurement program that will endure, both in terms of the metrics themselves and, more importantly, in the use of performance measurement as a management tool. The use of performance measurement and testing is one of six key management practices the ISC is promoting within the Federal physical security community. Through this document, combined with future ISC management documents, the ISC membership seeks to achieve consistent, professional, and cost-effective management of physical security programs across the Federal government that improves the protection of, and security within, Federal facilities. To assist in the development of a physical security performance measurement program, Appendix B includes an annotated bibliography of source documents.
7. References

7.1 Appendix A: Quick Reference Guide

Input/Process Measures

- Asset Inventory
  Example: Number of facilities; number assessed; number at acceptable level of risk
  Purpose: Program scope identification

- Countermeasures in Use
  Example: CM inventory by type: guards, surveillance systems, magnetometers, x-rays, canines, blast protection, vehicle barrier protection, etc.
  Purpose: Program scope; resource development; CM repair/replacement cost base; testing inventory

- Resource Requirements
  Example: FTE (number and salary); FSL and risk assessment workload; countermeasure procurement, installation, maintenance, and testing costs; database expense; contract support; training; travel; contract security guards; equipment
  Purpose: Oversight; program management; efficiency targets; trends/projections

- Process Governing Approval of Facility Security Assessment (FSA)
  Example: Track time and costs from initial completion to final approval of the FSA recommendations
  Purpose: Maximize efficient use of resources (human capital)

Output Measures

- Security Assessments Completed
  Example: Percentage of planned assessments completed within the timeframe
  Purpose: Program management (annual target objective); stakeholder communication

- Level of Risk
  Example: Number/percentage of facilities at acceptable risk levels (e.g., ISC compliant); annual target/incremental improvement
  Purpose: Program management; stakeholder communication

- Countermeasures Deployed
  Example: Installation/deployment schedule (percentage of planned completed by target date); track procurement, installation, and acceptance progress
  Purpose: Program management; stakeholder communication

- Countermeasures Needed (backlog)
  Example: Inventory of new and replacement countermeasures (annual backlog reduction target)
  Purpose: Program management

- Countermeasures Tested
  Example: Testing schedule (percentage passing vs. failing); annual target leading to long-term performance objective
  Purpose: Program management; assessment validation

- Response Time
  Example: Time required for responders (guard, law enforcement, emergency response technician) to arrive/initiate response protocol
  Purpose: Program management; response readiness; stakeholders' trust/confidence

- Emergency Exercises
  Example: OEP and COOP exercises (actual vs. expected behaviors); after-action report assessment
  Purpose: Emergency response enhancement; program management; stakeholder communication

Outcome Measures

- Stakeholder Satisfaction
  Example: Tenant or customer satisfaction assessment (survey); annual improvement targets
  Purpose: Program assessment; stakeholder confidence; identification of areas needing improvement

- Development and Training
  Example: Staff development (scheduled training vs. actual); customer training (crime awareness, security training) planned vs. actual
  Purpose: Program development; stakeholder communication and feedback

- Inventory Secured
  Example: All facilities are protected to an acceptable risk level rating and are ISC compliant
  Purpose: Strategic goal accomplishment; facilities equipped with adequate countermeasures

- Security Measures Working
  Example: Security countermeasure inventory working at strategic goal level
  Purpose: Strategic goal accomplishment; security measures are effective

- Emergency Preparedness
  Example: Employees, contractors, and senior management trained and prepared to respond to an emergency incident
  Purpose: Strategic goal accomplishment; OEP and COOP plans validated and employees prepared based on successful training

- Incident Reduction
  Example: Security violations, thefts, and vandalism reduced
  Purpose: Strategic goal accomplishment; inventory experienced fewer security violations

- Program Efficiency
  Example: Physical security program operating more efficiently
  Purpose: Strategic goal accomplishment; mission accomplished within resources/more cost-effective delivery
14 7.2 Appendix B: Annotated Bibliography The following reflect publications reviewed in preparing this document. They assist the physical security manager in developing a physical security measurement and testing program. 1. Homeland Security: Further Actions Needed to Coordinate Federal Agencies Facility Protection Efforts and Promote Key Practices, Government Accountability Office, GAO-05-49, November Document reflects GAO s review of the progress made in coordinating the government s facility protection efforts. It recommended the ISC develop an action plan to guide future initiatives and that it promotes key management practices, one of which is performance measurement. 2. Homeland Security: Guidance and Standards Are Needed for Measuring the Effectiveness of Agencies Facility Protection Efforts, Government Accountability Office, GAO Document reflects GAO s review of physical security performance measurement and testing practices at Federal agencies, State, and local governments, private industry, and foreign countries. The GAO found no standards governing physical security measures and recommend the ISC develop such guidance for Federal agencies. 3. Performance Measurement and Evaluation, Government Accountability Office, GAO SP, May Document provides definitions and summarizes the relationships between performance measurement and program evaluation including outcome evaluation, impact evaluation and cost-benefit and cost-effective analyses. 4. National Infrastructure Protection Plan (NIPP), Department of Homeland Security, 2006, The NIPP provides a comprehensive risk management framework that can apply across all 18 Critical Infrastructure and Key Resource Sectors to enhance protection of the Nation s vital assets. 5. Government Performance Results Act of 1993, Law establishes the requirements for Federal departments and agencies to submit a 5-year strategic plan with mission goals as well as, methods used to accomplish strategic goals. 
It focuses on budget and performance integration.

6. Program Assessment Rating Tool (PART). First initiated in 2002 for the FY 2004 budget cycle, the PART is a tool used by OMB to assess program performance, including program management, measurement, and program results.

7. ExpectMore.gov, Office of Management and Budget (OMB). OMB provides the results of its ratings for all Federal programs.

8. Measures and Metrics in Corporate Security, George K. Campbell, Security Executive Council Publication Series, 2006. This document provides guidance on developing a metrics program that aligns with business goals.

9. A Background Paper on Measuring Police Agency Performance, Edward R. Maguire, Ph.D., George Mason University; commissioned by the Commission on Accreditation for Law Enforcement Agencies, Inc., May 1.
This document provides guidance on developing comparative performance measures for police organizations, used to compare organizations or multiple agencies over time.

10. The Implausibility of Benchmarking Police Performance, James R. Brunet, Department of Public Administration, North Carolina State University. This document discusses the difficulties in using traditional police performance measures and suggests alternatives for better managing police performance.

11. Disaster Exercise Management, Part 1: Performance Measurement, William F. Comtois. This document discusses the need for, and the value of, performance measurement in conducting disaster exercises to better prepare for emergency incidents.

12. Annual Performance Progress Report (APPR) for Fiscal Year, Oregon Department of Energy, October 16, 2006. This report provides an overview of how the Oregon Department of Energy develops and uses performance measures.

13. Performance Data and Analysis, Part 2, FY 2006 Performance and Accountability Report, Department of the Interior. This report provides important financial and performance information for the Department of the Interior (DOI). It is DOI's principal publication and report to Congress and the American people on the stewardship, management, and leadership of the public funds entrusted to DOI.

14. OMB Circular No. A-123, Management's Responsibility for Internal Control, December 21, 2004.

15. Federal Preparedness Circular 65 (FPC 65).
Interagency Security Committee Participants

ISC Chair
James Snyder (Acting Chair)
Deputy Assistant Secretary for Infrastructure
U.S. Department of Homeland Security

ISC Executive Director
Austin L. Smith
U.S. Department of Homeland Security
Office of Infrastructure Protection

Working Group Chair
Mark Strickland
Security Specialist
Administrative Office of the U.S. Courts, Court Security Office

Working Group Members
Gwainevere C. Hess
Senior Policy Analyst
U.S. Department of Homeland Security
Office of Infrastructure Protection

Joseph Gerber
Physical Security Specialist
U.S. Department of Homeland Security
Office of the Chief Security Officer

Acknowledgement
This working group acknowledges the work of Mr. Mark Harvey (Federal Protective Service) on the first draft of the Performance Measures document.