
FY15 Quarter 1 Chief Information Officer Federal Information Security Management Act Reporting Metrics V1.0

Prepared by:
US Department of Homeland Security
Office of Cybersecurity and Communications
Federal Network Resilience

November 14, 2014

Document History

Version 0.1 (10/29/2014): Initial draft of FY15 CIO Q1 FISMA Metrics. Authors: P. Morimoto, B. Frey, J. McGrath, C. Chase, A. Heydman. Pages: All.
Version 0.2 (11/10/2014): Added new appendix with calculations. Authors: J. McGrath, B. Frey. Page: 12.
Version 1.0 (11/14/2014): Added clarification on remote access. Author: FNR CPM. Page: 13.

Name: FY 2015 Chief Information Officer Q1 Federal Information Security Management Act Reporting Metrics
Created: October 29, 2014
Authors: Paige Morimoto, Brian Frey, Jim McGrath, Craig Chase, Amy Heydman
Branch: Cybersecurity Performance Management
Division: Federal Network Resilience

GENERAL INSTRUCTIONS

RESPONSIBILITIES

Organization[1] heads are responsible for complying with the Federal Information Security Management Act of 2002 (FISMA) and have full authority to require reporting by the components that form their enterprise.

Fiscal Year (FY) 15 FISMA Metric Development Process

While we move the Federal government toward Information Security Continuous Monitoring (ISCM) solutions, such as Continuous Diagnostics and Mitigation (CDM), it is important that we take appropriate actions to continue making the current direct-entry reporting methods less burdensome to Departments and Agencies (D/As) and to improve the quality of the data being reported. The current FISMA Chief Information Officer (CIO) metrics have been improved to provide more value to congressional and executive audiences, as well as to individual D/As.

In coordination with the Office of Management and Budget (OMB) and the National Security Council (NSC) staff, the Federal Network Resilience (FNR) Division of the Department of Homeland Security (DHS) is developing long-term solutions to automate the CIO reporting process by leveraging the benefits of emerging continuous monitoring capabilities and other data collection mechanisms. However, FNR knows there are opportunities in the short term to improve the FISMA cybersecurity metrics. This year DHS/FNR did so by facilitating an online collaborative effort, incorporating the input of more than 100 cybersecurity professionals from over 24 D/As and utilizing an Agile methodology. The goal of this effort was to improve the validity, quality, and efficiency of cybersecurity governance data and collection efforts. The participating cybersecurity professionals made over 200 recommendations, and the DHS/FNR cybersecurity experts incorporated these recommendations into this set of FY 2015 CIO Annual FISMA Metrics.

This set of metrics, for use in FY15 quarterly reporting, represents a selection of Administration Priority metrics derived from the FY15 Annual FISMA CIO metrics. OMB requires that CFO Act agencies report quarterly per OMB M-15-01. A full set of the Annual CIO Metrics, with accompanying definitions, references, and guidance, may be found here. Appendix A provides a correlation of the quarterly and annual metric question sets.

[1] The term "organization" refers to each Federal D/A that is a reporting unit under CyberScope.

Expected Levels of Performance

Cross-Agency Priorities (CAP)

The expected levels of performance for CAP FISMA metrics are based on review and input from multiple cybersecurity experts, as well as threat information from public, private, and intelligence sources.[2] Q1 and Q2 FY15 will be used to establish a baseline, which will then be used to generate a future scoring methodology for the CAP goals (see Appendix C).

The Administration's Priority (AP) cybersecurity capabilities are currently:

Information Security Continuous Monitoring: Provide ongoing observation, assessment, analysis, and diagnosis of an organization's cybersecurity posture, hygiene, and operational readiness.

Identity Credential and Access Management: Implement a set of capabilities that ensure users must authenticate to information technology resources and have access to only those resources that are required for their job function.

Anti-phishing and Malware Defense: Implement technologies, processes, and training that reduce the risk of malware introduced through email and malicious or compromised web sites.

Key FISMA Metrics (KFM)

The expected level of performance for these metrics is defined as "adequate security," which means security commensurate with the risk and magnitude of the harm resulting from the loss, misuse, or unauthorized access to or modification of government information. This includes assuring that systems and applications used by the organization operate effectively and provide appropriate confidentiality, integrity, and availability through the use of cost-effective management, personnel, operational, and technical controls.[3] In compliance with OMB FISMA guidance (M-11-33, FAQ 15), the D/A head is responsible for determining the acceptable level of risk, with input from system owners, program officials, and CIOs.

National Institute of Standards and Technology Special Publication (NIST SP) 800 Revisions

For legacy information systems, D/As are expected to be in compliance with NIST guidelines within one year of the publication date. D/As must become compliant with any new or updated materials in revised NIST guidelines within one year of the revision. For information systems under development, or for legacy systems undergoing significant changes, D/As are expected to be in compliance with the NIST publications immediately upon deployment of the information system. Each D/A should consider its ability to meet this requirement when developing the Plan of Action and Milestones (POA&M).

Federal Information Processing Standards (FIPS) Versions

References in this document to FIPS Standards refer to the latest (non-draft) published version.

[2] See Cross-Agency Priority Goals for further details.
[3] Office of Management and Budget (OMB) Circular A-130, Appendix III, definitions.

1. INFORMATION SECURITY CONTINUOUS MONITORING

Hardware Asset Management

1.1 What is the total number of the organization's hardware assets connected to the organization's unclassified[4] network(s)?[5] (Base)
1.1.1 What is the total number of endpoints connected to the organization's unclassified network(s)? (Base)
1.2 Percent (%) of the organization's network fabric covered by a capability to detect and alert on the addition of unauthorized hardware assets onto the organization's network. (AP)
1.3 Percent (%) of the organization's network fabric covered by an automatic capability (scans/device discovery processes) that provides enterprise-level visibility into the current state of all hardware assets. (AP)
1.4 What is the mean time[6] to detect a new device (time between scans in 1.2)? (AP)

Software Asset Management

1.5 Percent (%) of endpoints from 1.1.1 covered by an automated software asset inventory capability to scan the current state of installed software (e.g., .bat, .exe, .dll). (AP)
1.6 Percent (%) of endpoints from 1.1.1 covered by a desired-state software asset management capability to detect and block unauthorized software from executing (e.g., AppLocker, certificate, path, hash value, services, and behavior-based whitelisting solutions).[7] (AP)

[4] Unclassified refers to low impact (non-sensitive) and sensitive but unclassified (SBU) data.
[5] Unless specified otherwise in a footnote, add numbers across networks and organizational components to get the reportable result.
[6] Mean time is measured in calendar days.
[7] This may include software whitelisting tools that identify executable software by a digital fingerprint and selectively block these. It might also include sandboxing of mobile code to determine before execution whether to allow it to run, where static files do not allow whitelisting. In general, any method included should be able to block zero-day and APT threats.
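Because metric 1.4 defines mean time to detect a new device as the time between the discovery scans counted in 1.2, it reduces to a mean over scan intervals in calendar days. A minimal sketch of that calculation; the function name and the scan dates are hypothetical, not taken from the metrics document:

```python
from datetime import date

def mean_days_between_scans(scan_dates):
    """Mean calendar days between consecutive discovery scans.

    Metric 1.4 treats mean time to detect a new device as the mean
    interval between the hardware discovery scans reported in 1.2.
    """
    dates = sorted(scan_dates)
    if len(dates) < 2:
        raise ValueError("need at least two scan dates")
    # Calendar-day gaps between each pair of consecutive scans.
    gaps = [(later - earlier).days for earlier, later in zip(dates, dates[1:])]
    return sum(gaps) / len(gaps)

# Hypothetical quarter with weekly-to-biweekly scan cadence:
scans = [date(2014, 10, 1), date(2014, 10, 8),
         date(2014, 10, 22), date(2014, 11, 5)]
print(round(mean_days_between_scans(scans), 2))  # 11.67
```

The same computation serves metric 1.9 (mean time between vulnerability scans), with the credentialed scan dates from 1.8 as input.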

Secure Configuration Management (SecCM)

1.7 Please complete Table 1. Future configurations will be added as needed. Data calls for layer 2, layer 3, mobile, printers, or other devices or operating systems will be used as needed.

For each of the top U.S. Government operating systems, as reported in SCAP feeds, Table 1 asks:

1.7.1 What is the number of hardware assets with each OS? (Base)
1.7.2 What is the common security configuration baseline for each OS listed (e.g., USGCB)? (Base)
1.7.3 How many configuration exceptions are granted by the enterprise? (Base)
1.7.4 What is the organization's enterprise policy for maximum audit interval (target)? (Base)
1.7.5 What is the organization's enterprise average audit interval (actual)? (AP)
1.7.6 Percent (%) of assets in 1.7.1 covered by the auditing activities described in 1.7.4 and 1.7.5. (AP)

Operating systems covered by Table 1: Windows 8.x; Windows 7.x; Windows Vista; Windows Unsupported (include XP); Windows Server 2003; Windows Server 2008; Windows Server 2012; Linux (all versions); Unix/Solaris (all versions); Mac OS X.

Table 1: Metrics 1.7.1-1.7.6.

Vulnerability and Weakness Management

1.8 Percent (%) of hardware assets listed in 1.1 assessed using credentialed scans with Security Content Automation Protocol (SCAP) validated vulnerability tools.[8] (AP)
1.9 What is the mean time[9] between vulnerability scans? (AP)

[8] Vulnerability scanning tools are SCAP validated; the assets themselves are not.
[9] Mean time is measured in calendar days.

1.10 What is the mean time[10] to mitigate for high[11] findings? (AP)

2. IDENTITY CREDENTIAL AND ACCESS MANAGEMENT

Unprivileged Network Users

2.1 How many users have unprivileged network accounts? (Exclude privileged user accounts and non-user accounts.) (Base)
2.1.1 Percent (%) of users from 2.1 technically required to log onto the network with a two-factor PIV card.[12] (AP)

Privileged Network Users

2.2 How many users have privileged network accounts? (Exclude unprivileged network accounts and non-user accounts.) (KFM)
2.2.1 Percent (%) of users from 2.2 technically required to log onto the network with a two-factor PIV card.[13] (AP)

3. ANTI-PHISHING AND MALWARE DEFENSE

3.1 Percent (%) of privileged user accounts that have a technical control preventing internet access. (AP)
3.2 Percent (%) of incoming email traffic analyzed for clickable URLs, embedded content, and attachments. (AP)
3.3 Percent (%) of hardware assets covered by a host-based intrusion prevention system. (AP)
3.4 Percent (%) of hardware assets covered by an antivirus (AV) solution using file reputation services, checking files against cloud-hosted, continuously updated malware information. (AP)

[10] Mean time is measured in calendar days.
[11] The National Vulnerability Database (NVD) provides severity rankings of Low, Medium, and High for all Common Vulnerabilities and Exposures (CVE) in the database. The NVD is accessible at http://nvd.nist.gov.
[12] For a person with one or more unprivileged network accounts, the person should be counted in the percentage only if the person is required to use a two-factor PIV card to authenticate to all network accounts. User-based Enforcement (UBE) at the user account level and Machine-based Enforcement (MBE) solutions that adhere to the principles of Identity and Access Management are counted as PIV-enabled for HSPD-12 reporting.
[13] For a person with one or more privileged network accounts, the person should be counted in the percentage only if the person is required to use a two-factor PIV card to authenticate to all network accounts. User-based Enforcement (UBE) at the user account level and Machine-based Enforcement (MBE) solutions that adhere to the principles of Identity and Access Management are counted as PIV-enabled for HSPD-12 reporting.
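The PIV footnotes impose an all-accounts rule: a person counts toward 2.1.1 or 2.2.1 only when two-factor PIV authentication is enforced on every one of that person's network accounts. A small sketch of that counting rule; the data model (a user ID mapped to per-account enforcement flags) is an illustrative assumption, not specified by the metrics:

```python
def piv_enforcement_percent(users):
    """Percent of users technically required to use a two-factor PIV card.

    Per the metric footnotes, a person with several network accounts is
    counted only when PIV authentication is enforced on ALL of them.
    `users` maps a user ID to a list of booleans, one per account,
    True where that account technically requires PIV (hypothetical model).
    """
    if not users:
        return 0.0
    fully_enforced = sum(1 for accounts in users.values() if all(accounts))
    return 100.0 * fully_enforced / len(users)

# Hypothetical population: alice fully enforced, bob has one exempt
# account (so bob does not count), carol has a single enforced account.
sample = {"alice": [True, True], "bob": [True, False], "carol": [True]}
print(round(piv_enforcement_percent(sample), 1))  # 66.7
```

Run the same function over the unprivileged population from 2.1 for metric 2.1.1 and over the privileged population from 2.2 for metric 2.2.1.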

3.5 Percent (%) of email attachments opened in a sandboxed environment or detonation chamber. (AP)
3.6 Percent (%) of incoming emails using email sender authentication protocols such as DomainKeys Identified Mail (DKIM), Author Domain Signing Practices (ADSP), Domain-based Message Authentication, Reporting & Conformance (DMARC), Vouch by Reference (VBR), or IP Reverse (iprev). (AP)
3.7 Percent (%) of incoming emails scanned using a reputation filter[14] tool to perform threat assessment of the email sender. (AP)
3.8 Percent (%) of hardware assets covered by an anti-exploitation tool (e.g., Microsoft's Enhanced Mitigation Experience Toolkit (EMET) or similar). (AP)
3.9 Percent (%) of inbound email traffic passing through anti-phishing/anti-spam filtration technology at the outermost border Mail Transport Agent or email server. (AP)
3.10 Percent (%) of inbound network traffic that passes through a web content filter that provides anti-phishing, anti-malware, and blocking of malicious websites (e.g., fake software updates, fake antivirus offers, and phishing offers). (AP)
3.11 Percent (%) of hardware assets that have implemented a browser-based (e.g., Microsoft Phishing Filter) or enterprise-based tool to block known phishing websites and IP addresses. (AP)
3.12 Percent (%) of outbound communications traffic checked at the external boundaries to detect covert exfiltration of information. (AP)
3.13 Percent (%) of sent email that is digitally signed. (AP)
3.14 Percent (%) of email traffic quarantined or otherwise blocked. (AP)
3.15 Percent (%) of remote access connections scanned for malware upon connection. (AP)
3.16 Percent (%) of the users that participated in cybersecurity-focused exercises who successfully completed exercises focusing on phishing, designed to increase awareness and/or measure effectiveness of previous training (e.g., the organization sends spoofed phishing emails; clicking the link leads to a phishing information page). (AP)

[14] The outer layer of email protection filters potentially malicious email based on sender reputation, sender IP address, or other sender information.
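One way to approximate metric 3.6 is to inspect Authentication-Results headers on inbound mail, which record results for methods such as dkim, dmarc, vbr, and iprev. The sketch below is an illustrative assumption about how an agency might count, not a procedure prescribed by the metrics; the header layout follows RFC 8601-style reporting, and the sample messages are invented:

```python
import email
from email import policy

# Subset of the sender-authentication methods named in metric 3.6, as
# their results appear in Authentication-Results headers (assumption).
AUTH_METHODS = ("dkim", "dmarc", "vbr", "iprev")

def uses_sender_auth(raw_message: str) -> bool:
    """True if the message records a result for any listed method."""
    msg = email.message_from_string(raw_message, policy=policy.default)
    headers = msg.get_all("Authentication-Results") or []
    results = " ".join(str(h) for h in headers).lower()
    return any(method + "=" in results for method in AUTH_METHODS)

def percent_authenticated(messages) -> float:
    """Metric 3.6 numerator/denominator over a batch of inbound mail."""
    checked = [uses_sender_auth(m) for m in messages]
    return 100.0 * sum(checked) / len(checked) if checked else 0.0

# Two hypothetical inbound messages: one DKIM-verified at the border
# MTA, one carrying no authentication results at all.
signed = ("Authentication-Results: mx.example.gov; dkim=pass\n"
          "From: a@example.com\n\nbody\n")
plain = "From: b@example.com\n\nbody\n"
print(percent_authenticated([signed, plain]))  # 50.0
```

In practice this count would come from border MTA logs rather than per-message parsing; the point is only that the metric is a straight ratio of authenticated to total inbound messages.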

Appendix A: Correlation of the Quarterly and Annual Metric Question Sets

Each FY15 quarterly FISMA CIO metric corresponds to an FY15 annual FISMA CIO metric as listed below. The question text of each pair is identical to the metric text in the body of this document.

FY15 Quarterly Metric | FY15 Annual Metric
1.1   | 2.1
1.1.1 | 2.1.2
1.2   | 2.2
1.3   | 2.3
1.4   | 2.4
1.5   | 2.6
1.6   | 2.7
1.7   | 2.10
1.8   | 2.11
1.9   | 2.12
1.10  | 2.14
2.1   | 3.1
2.1.1 | 3.1.1
2.2   | 3.2
2.2.1 | 3.2.1
3.1   | 4.1
3.2   | 4.2
3.3   | 4.3
3.4   | 4.4
3.5   | 4.5
3.6   | 4.6
3.7   | 4.7
3.8   | 4.8
3.9   | 4.9
3.10  | 4.10
3.11  | 4.11
3.12  | 4.12
3.13  | 4.13
3.14  | 4.14
3.15  | 6.1.4
3.16  | 8.2.1

Appendix B: Metric Algorithms / Additional Calculations Context

Metrics 1.2 and 1.3 (network fabric coverage): As it relates to FISMA, network fabric is defined as the overall total of the Agency's networked hardware assets. This includes the network topology of the organization, such as servers, storage, client machines, and other networked assets in a cohesive switched infrastructure. This may also be referred to as the Agency's network infrastructure(s).

Metric 1.9 (mean time between vulnerability scans): Based on the credentialed scans in 1.8.

Metric 1.10 (mean time to mitigate high findings): Based on the credentialed scans in 1.8; measured from identification/detection to mitigated (remediated and/or compensating controls in place).

Metric 2.1.1 (unprivileged users with two-factor PIV): 100% = (percent of users from 2.1 technically required to log onto the network with a two-factor PIV card) + (percent of users from 2.1 with PIV cards but not technically required to use them for two-factor authentication) + (percent of users from 2.1 without PIV cards).

Metric 2.2.1 (privileged users with two-factor PIV): 100% = (percent of users from 2.2 technically required to log onto the network with a two-factor PIV card) + (percent of users from 2.2 with PIV cards but not technically required to use them for two-factor authentication) + (percent of users from 2.2 without PIV cards).

Metric 3.1 (privileged accounts with a control preventing internet access): Based on the user accounts from 2.2.

Metrics 3.3, 3.4, 3.8, and 3.11 (host-based intrusion prevention, antivirus with file reputation, anti-exploitation tool, phishing-site blocking): Based on the assets in 1.1.

Metric 3.14 (email traffic quarantined or otherwise blocked): Based on total inbound messages.

Metric 3.15 (remote access connections scanned for malware): Remote access connections are defined as the ability for an organization's users to access its non-public computing resources from locations external to the organization's facilities. This applies to remote access solutions that protect access to the organization's desktop LAN/WAN resources and services. Remote access excludes non-GFE systems using externally facing applications (e.g., Outlook Web Access, Remote Desktop/Citrix solutions, Good Messaging, etc.).
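The context for metric 1.10 (time from identification/detection to mitigated) reduces to a mean over closed findings. A minimal sketch under stated assumptions: the finding data model is invented, and excluding still-open findings from the mean is an illustrative choice, not something the document specifies:

```python
from datetime import date

def mean_time_to_mitigate(findings):
    """Mean calendar days from detection to mitigation for high findings.

    Follows the Appendix B context for metric 1.10: the clock runs from
    identification/detection until the finding is mitigated (remediated
    and/or compensating controls in place). `findings` is a list of
    (detected, mitigated) date pairs, with mitigated=None while open.
    Open findings are skipped here, which is an assumption.
    """
    closed = [(d, m) for d, m in findings if m is not None]
    if not closed:
        return None
    days = [(mitigated - detected).days for detected, mitigated in closed]
    return sum(days) / len(days)

# Hypothetical NVD-rated high findings from the credentialed scans in 1.8:
sample = [(date(2014, 11, 3), date(2014, 11, 10)),   # closed in 7 days
          (date(2014, 11, 5), date(2014, 11, 26)),   # closed in 21 days
          (date(2014, 11, 20), None)]                # still open
print(mean_time_to_mitigate(sample))  # 14.0
```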

Appendix C: CAP Goal Scoring

This section is a placeholder for the FY15 Q1 and Q2 reporting periods. By FY15 Q3, it is anticipated that scoring and baseline metrics will be developed and documented in this section.