OFFICE OF THE SECRETARY OF DEFENSE 1700 DEFENSE PENTAGON WASHINGTON, DC 20301-1700


OPERATIONAL TEST AND EVALUATION

FEB 01 2013

MEMORANDUM FOR COMMANDER, ARMY TEST AND EVALUATION COMMAND
DIRECTOR, MARINE CORPS OPERATIONAL TEST AND EVALUATION ACTIVITY
COMMANDER, OPERATIONAL TEST AND EVALUATION FORCE
COMMANDER, AIR FORCE OPERATIONAL TEST AND EVALUATION CENTER
COMMANDER, JOINT INTEROPERABILITY TEST CENTER

SUBJECT: Test and Evaluation of Information Assurance in Acquisition Programs

DOT&E has provided detailed guidance for the test and evaluation of information assurance in acquisition programs,[1] and I recently reviewed the implementation of this guidance in FY12 across a number of test reports. I have attached a copy of that review. While we have collectively improved information assurance testing and evaluation, my review identified a number of areas where we can further improve, as follows:

Independent Penetration Testing - Due to limited test durations, sharing system information and interconnections between the cooperative cyber vulnerability assessment teams (usually a "Blue Team") and the independent cyber penetration/exploitation teams (usually a "Red Team") is acceptable. However, shared information should not include specific vulnerabilities or system shortfalls. The effort of the cyber penetration/exploitation team should go beyond merely validating prior findings and focus on examining the system under test in an operationally and threat-representative event. Additionally, separate teams should preferably perform the vulnerability assessments and penetration/exploitation assessments to enhance independence and the opportunities for assessing the protect, detect, react, and respond components. Finally, correction of the vulnerabilities discovered in the cooperative assessments should be an entrance criterion for subsequent penetration/exploitation testing.

Network Defense Analysis - The test environment should encompass those network defense elements (including trained personnel, standard tools, and normal network defense procedures) that may not be locally resident and are increasingly provided at higher tiers by other activities. The test should quantitatively examine not only the inherent system/network protections for the system under test, but also the network defense's ability to detect penetration or exploitation, react to those events (either procedurally or automatically), and restore the system to full capability following the event. Where appropriate, continuity of operations should be demonstrated or assessed for enterprise system programs; weapons system programs should coordinate with my office to confirm the requirement for continuity of operations testing.

Operational Effects Analysis - Testing should include an assessment of the operational risk presented by vulnerabilities and shortfalls exploited by a representative threat, and the most direct way to assess that risk is to demonstrate and record the relevant operational effects. When operationally threat-representative effects cannot be conducted on live networks, alternate evaluation approaches (including the use of cyber range facilities) should be employed and included in the test planning.

We must routinely review information assurance test and evaluation procedures and outcomes as the cyber environment in the Department evolves. I request your support in continuing to improve these information assurance tests so that they are not only as rigorous as every other test you conduct, but as rigorous and challenging as the cyber threats these systems will confront.

J. Michael Gilmore
Director

Attachment:
As stated

[1] Director, Operational Test and Evaluation, "Procedures for Operational Test and Evaluation of Information Assurance in Acquisition Programs," 21 January 2009; "Clarification of Procedures for Operational Test and Evaluation of Information Assurance in Acquisition Programs," 4 November 2010.

Attachment

FY12 Overarching Assessment of Information Assurance During Operational Test and Evaluation
17 January 2013

Purpose
- Provide an initial assessment of FY12 information assurance testing during OT&E.
- The objective was to assess information assurance testing for consistency of implementation and commonality of vulnerabilities, and to identify improvements.
- Data from 14 oversight systems that underwent penetration testing were obtained from OED research staff.
- These systems did not go through the DOT&E TEMP and Test Plan checklist reviews first implemented during late FY11.

Acronyms on this slide: Operational Test and Evaluation (OT&E), Operational Evaluation Division (OED), Test and Evaluation Master Plan (TEMP).

Background
- The DOT&E information assurance assessment procedure is given in the memorandum dated 21 January 2009 and the clarification memorandum dated 4 November 2010.
- The assessment procedure is defined by:
  Step 1: Determine applicability of DOT&E information assurance procedures
  Step 2: Initial review
  Step 3: Risk assessment
  Step 4: Cooperative operational vulnerability evaluation
  Step 5: Independent realistic penetration testing to assess PDRR
  Step 6: Continuity of operations evaluation
- Steps 4, 5, and 6 primarily constitute the operational assessment portion of the information assurance assessment process; Steps 1, 2, and 3 are done during the certification and accreditation and developmental test phases.
- DOT&E and AT&L/DT&E are currently jointly reviewing this policy for potential updates and enhancements.

Acronyms on this slide: Protect, Detect, React, and Restore (PDRR), Acquisition, Technology, and Logistics (AT&L)/Developmental Test and Evaluation (DT&E).
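The six-step procedure above can be sketched as a small data structure (an author's illustration for clarity, not DOT&E tooling), showing which steps fall in the operational assessment phase versus the certification/accreditation and developmental test phase:

```python
# Hypothetical sketch of the six-step DOT&E information assurance
# assessment procedure. Phase labels reflect the slide's note that
# Steps 1-3 occur during certification/accreditation and developmental
# test, while Steps 4-6 form the operational assessment portion.

STEPS = [
    (1, "Determine applicability of DOT&E information assurance procedures", "developmental"),
    (2, "Initial review", "developmental"),
    (3, "Risk assessment", "developmental"),
    (4, "Cooperative operational vulnerability evaluation", "operational"),
    (5, "Independent realistic penetration testing to assess PDRR", "operational"),
    (6, "Continuity of operations evaluation", "operational"),
]

def operational_steps(steps):
    """Return the step numbers that make up the operational assessment portion."""
    return [num for num, _desc, phase in steps if phase == "operational"]

print(operational_steps(STEPS))  # -> [4, 5, 6]
```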

OT&E Assessment Teams Organization

Service            | Cooperative Vulnerability Assessment (Step 4) | Independent Penetration Assessment (Step 5)
U.S. Army          | 1st IOC, ARL-SLAD                             | TSMO
U.S. Navy          | NIOC, COTF                                    | NIOC, COTF
U.S. Air Force     | 92nd IOS                                      | 92nd IOS
U.S. Marine Corps  | MCIAAT                                        | MCIAAT

- Inconsistency in how the Services address the cooperative vulnerability assessment (Step 4) and the independent penetration assessment (Step 5)
- Potential for lessened independence between Steps 4 and 5
- Need to improve consistency across the Services

Acronyms on this slide: Information Operations Command (IOC), Army Research Laboratory-Survivability/Lethality Analysis Directorate (ARL-SLAD), Threat Systems Management Office (TSMO), Navy Information Operations Command (NIOC), Commander, Operational Test and Evaluation Force (COTF), Information Operations Squadron (IOS), Marine Corps Information Assurance Assessment Team (MCIAAT), Operational Test and Evaluation (OT&E).

Systems Used in Assessment

Army:
  AB3         Apache Block III Helicopter
  WIN-T       Warfighter Information Network-Tactical
  JTRS        Joint Tactical Radio System Manpack
  PAC-3       Patriot
  Gray Eagle  Gray Eagle UAV
  GCSS-A      Global Combat Support System-Army
  DCGS-A      Distributed Common Ground System-Army
  GPS-SAASM   Global Positioning System-Selective Availability Anti-Spoofing Module

Air Force:
  BCS-F       Battle Control System-Fixed
  ISPAN       Integrated Strategic Planning and Analysis Network

Navy:
  DCGS-N      Distributed Common Ground System-Navy
  T-AKE       Dry Cargo/Ammunition Ships
  AEGIS       AEGIS Weapon System
  NMT         Navy Multiband Terminal

Marine Corps:
  None reviewed for this study

Test Conduct Assessment
- Cooperative vulnerability assessment (Step 4) testing generally conforms to DOT&E expectations, but independent penetration assessment (Step 5) generally does not
- Vulnerability findings were not corrected before penetration testing in any of the systems reviewed
- Penetration assessment testing did not have participation of trained network defenders

Acronyms on this slide: Director, Operational Test and Evaluation (DOT&E).

Test Conduct Deficiencies
[Bar chart: percentage of the 14 systems (0-100%) exhibiting each deficiency]
- Inadequate testing of continuity of operations
- Mission effects assessment not conducted
- Detect, React, and Restore data not collected during independent penetration assessment
- High threat level not represented in penetration assessment
- Vulnerability assessment data provided to penetration assessment team
- Vulnerabilities not corrected prior to independent penetration assessment

Vulnerability Assessment Methodology
- Identified vulnerabilities from 14 systems
- Vulnerabilities were categorized using Defense Information Systems Agency categories:
  - CAT I: Allows an attacker immediate access into a machine and allows super-user access
  - CAT II: Provides information that has a high potential of giving access to an intruder
  - CAT III: Provides information that could potentially lead to compromise
- Individual vulnerabilities were collapsed into seven common categories: Log Configuration, Account Management, Openly Available Information, Network Vulnerabilities, Open Accessible Physical Ports, Password, and Software Configuration
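The categorization scheme can be sketched as a small routine (a hypothetical illustration of the collapsing step, not the assessment teams' actual tooling; the finding records are invented examples):

```python
# Sketch: collapse individual (category, severity) findings into the seven
# common categories, keeping the worst DISA severity per category.
# CAT I is the most severe; CAT III the least.

COMMON_CATEGORIES = [
    "Log Configuration",
    "Account Management",
    "Openly Available Information",
    "Network Vulnerabilities",
    "Open Accessible Physical Ports",
    "Password",
    "Software Configuration",
]

def collapse(findings):
    """Group findings by common category, keeping the worst severity seen."""
    rank = {"CAT I": 1, "CAT II": 2, "CAT III": 3}  # lower rank = more severe
    worst = {}
    for category, severity in findings:
        if category not in COMMON_CATEGORIES:
            raise ValueError(f"unknown category: {category}")
        if category not in worst or rank[severity] < rank[worst[category]]:
            worst[category] = severity
    return worst

# Invented findings for a single hypothetical system
findings = [
    ("Password", "CAT II"),
    ("Password", "CAT I"),
    ("Software Configuration", "CAT III"),
]
print(collapse(findings))
```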

Observed Information Assurance Deficiencies

Vulnerability scores for each of the 14 systems (columns), by category; 0.0 indicates no vulnerability reported. In the original chart, cells were color-coded as No Vulnerability Reported, CAT I, CAT II, or CAT III.

Category                       | Scores across the 14 systems
Log Configuration              | 0.0 2.5 2.0 2.5 2.0 2.5 2.0 0.0 0.0 0.0 0.0 0.0 0.0 2.0
Account Management             | 0.0 0.0 2.0 0.0 0.0 2.2 2.0 1.0 0.0 0.0 0.0 0.0 1.0 3.0
Openly Available Information   | 0.0 1.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 3.0 1.0 0.0 0.0
Network Vulnerabilities        | 1.0 0.0 0.0 0.0 0.0 1.0 0.0 1.0 0.0 0.0 0.0 0.0 0.0 1.0
Open Accessible Physical Ports | 0.0 1.0 3.0 1.3 0.0 2.7 2.0 0.0 1.0 1.5 0.0 3.0 0.0 0.0
Password                       | 0.0 1.3 1.5 1.0 0.0 1.3 0.0 1.0 0.0 1.0 1.3 1.0 0.0 1.3
Software Configuration         | 1.3 1.0 1.8 2.1 1.0 1.9 1.3 1.5 0.0 2.0 1.0 1.5 1.7 2.3
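A percentage-of-systems summary can be derived from these scores by treating 0.0 as "no reported vulnerability" (a minimal sketch; the values are transcribed from the matrix above, and the `pct_affected` helper is the author's illustration):

```python
# Scores per category for the 14 systems, transcribed from the slide.
scores = {
    "Log Configuration":              [0.0, 2.5, 2.0, 2.5, 2.0, 2.5, 2.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 2.0],
    "Account Management":             [0.0, 0.0, 2.0, 0.0, 0.0, 2.2, 2.0, 1.0, 0.0, 0.0, 0.0, 0.0, 1.0, 3.0],
    "Openly Available Information":   [0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 3.0, 1.0, 0.0, 0.0],
    "Network Vulnerabilities":        [1.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0],
    "Open Accessible Physical Ports": [0.0, 1.0, 3.0, 1.3, 0.0, 2.7, 2.0, 0.0, 1.0, 1.5, 0.0, 3.0, 0.0, 0.0],
    "Password":                       [0.0, 1.3, 1.5, 1.0, 0.0, 1.3, 0.0, 1.0, 0.0, 1.0, 1.3, 1.0, 0.0, 1.3],
    "Software Configuration":         [1.3, 1.0, 1.8, 2.1, 1.0, 1.9, 1.3, 1.5, 0.0, 2.0, 1.0, 1.5, 1.7, 2.3],
}

def pct_affected(values):
    """Percentage of systems with any reported vulnerability (nonzero score)."""
    return round(100 * sum(v > 0 for v in values) / len(values))

for category, values in scores.items():
    print(f"{category}: {pct_affected(values)}% of systems")
```

Under this reading, Software Configuration issues touch 13 of the 14 systems (93%), consistent with the briefing's conclusion that most systems have easy-to-exploit vulnerabilities.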

Observed Information Assurance Deficiencies
[Stacked bar chart: for each of the seven categories, the percentage of systems (0-100%) with CAT I, CAT II, CAT III, or no reported vulnerabilities]
Categories: Log Configuration, Account Management, Openly Available Information, Network Vulnerabilities, Open Accessible Physical Ports, Password, Software Configuration

Conclusions
- Cooperative vulnerability assessments (Step 4) show most systems have easy-to-exploit vulnerabilities
- Independent penetration assessment (Step 5) execution is inconsistent and inadequate among the Services
- Failure to correct vulnerabilities limits the value of the independent penetration assessment
- Detect and React data are lacking
- Information on mission effects is lacking because testers were not allowed to exploit data and affect operations
- Continuity of operations evaluation (Step 6) is generally lacking

Recommendations
- Improve consistency and adequacy of assessments
  - Require Detect and React data to be collected for all systems
  - Require separate teams for the cooperative vulnerability assessment and the threat-representative independent penetration assessment
- Improve operational realism of independent penetration testing
  - Provide penetration testers with limited or no system and network data to ensure appropriate threat realism, even if more time is required to conduct the test
  - Require the full cyber defense capability to participate during penetration testing to measure detection ability
  - Require information assurance testing as a concurrent part of operational testing, and permit exploitation to determine mission effects
- Revise the current DOT&E information assurance memorandum
  - Introduce a criterion that known vulnerabilities be corrected before penetration testing
  - Clarify that continuity of operations testing applies only to enterprise systems and not to individual weapon systems