Ground Truth Competency Assessment for Smart Grid Cyber Security
TCIPG, May 4, 2012
Michael Assante, President, NBISE
David H. Tobey, Ph.D., Director of Research
The Need for Cyber Defenders
"Everywhere I go, across the country, CEOs and business leaders tell me that one of their chief concerns is having the highly skilled workers they need to power their companies. They believe, and this administration believes, that a globally competitive economy requires a globally competitive workforce."
- Secretary of Commerce John Bryson
Grid Modernization
- Explosion of interconnected automation & intelligence
- Growing complexity & sophistication of cyber concerns
- Develop adaptive & resilient systems (people & technology)
- Onboard new talent & transform our current workforce
21st Century Grid: Cross-Cutting Challenge of Cyber Security
[Diagram: the smart grid linking conventional & hydro generation, nuclear, wind & variable generation, rooftop solar / local wind, plug-in hybrid electric vehicles / storage, demand response, and energy efficiency; reliability, demand, development, and cyber security are cross-cutting concerns; drivers: policy, security, economic.]
Building the 21st century grid requires a comprehensive and coordinated approach to policy and resource development, looking at the grid as a whole, not as component parts.
Growing Numbers/Complexity
[Chart, 1960-2010: computing scale grows from mainframes and minicomputers serving departments and organizations, to PCs and Internet hosts serving people, to devices, microcontrollers, and the Internet of Things.]
This growth requires major productivity gains and accelerated skill development.
NBISE's Mission
Our mission is to increase the security of our nation's critical infrastructure by improving the science by which we identify and measure the proficiency, the performance, and the potential of the cyber security workforce. Our vision is to establish a national, open-source library of assessment, instruction, target practice, and performance support tools that advance cyber defense skills by combining lessons learned from cognitive science and psychometrics. We seek to codify, validate, and disseminate evidence-based practices for accelerating the development of competencies that enable effective performance in addressing the growing cybersecurity threat.
How can we assess and develop the future cybersecurity workforce?
1. Job Definition and Competency Analysis
2. Aptitude Assessment
3. Instructional & Simulation Design
4. Proficiency and Performance Assessment
5. Professional Development Plans
6. Ongoing Performance Support & Simulation
Source: Assante, M. J. & Tobey, D. H. (2011) Enhancing the Cybersecurity Workforce. IEEE IT Professional, 13: 12-15.
Moving from Summative to Formative Assessments
Example from the NBME Examinee Performance Profile
Foundational Support to Achieving the Benefits of Grid Modernization
Purpose: The project contributes to the Department of Energy's efforts to develop a competency model and explore assessment methods focused on the job responsibilities and unique skill set of Smart Grid cybersecurity specialists. This work is designed to provide a foundation for industry's ongoing efforts to transform and develop the workforce necessary to achieve the benefits of grid modernization.
Who: Those primarily responsible for operational security functions in day-to-day operations, but not engineering and architecture, in smart grid environments.
How: Examination of the technical, problem-solving, social, and analytical skills used by senior cyber security staff in the daily execution of their responsibilities.
Verify: A measurement model for assessing knowledge, skills, and abilities in the areas of technical and operational skills.
Subject Matter Expert Panel and Advisory Group

Panel Officers:
- Chair: Justin Searle - UtiliSec
- Vice Chair: Scott King - Sempra Energy

Advisory Group:
- John Allen - IEIA Forum
- Joel Garmon - Wake Forest Baptist Medical Center (CISO)
- Dr. Emannuel Hooper - Global Info Intel and Harvard
- Bill Hunteman
- Jamey Sample - PG&E

Panel member representation: service, government, industry, research, vendor.

Panel Members:
- Lee Aber - OPower
- Sandeep Agrawal - Neilsoft Limited
- Bora Akyol - PNNL
- Andres Andreu - NeuroFuzz, LLC
- Balusamy Arumugam - Infosys
- Chris Blask - AlienVault
- Andy Bochman - IBM
- Jason Christopher - FERC
- Art Conklin - University of Houston
- Benjamin Damm - Silver Springs Network
- Anthony David Scott - Accenture
- Steve Dougherty - IBM Global Technology Services
- Ido Dubrawsky - Itron
- Michael Echols - Salt River Project
- Dr. Barbara Endicott-Popovsky - University of Washington
- Cliff Eyre - PNNL
- Maria Hayden - Pentagon
- Charles Reilly - Southern California Edison
- Craig Rosen - PG&E
- Scott Saunders - SMUD
- Chris Sawall - Ameren
- Paul Skare - PNNL
- Clay Storey - Avista
- Dan Thanos - GE Digital Energy
- Kevin Tydings - SAIC
- Don Weber - InGuardians
- Mike Wenstrom - Mike Wenstrom Development Partners
- Nic Ziccardi - Network & Security Technologies
Security Testing & Smart Grid Source: Searle, Justin (2012) AMI Penetration Test Plan, National Electric Sector Cybersecurity Organization.
Current Draft OST Competency Model
Constructs (Goal & Task Categories):
- Manage the project
  - Develop project plan
  - Monitor project plan
- Identify the critical vulnerabilities
  - Identify critical vulnerabilities
  - Analyze and map critical vulnerabilities
  - Develop and execute mitigation strategy
- Penetrate targets
  - Identify targets to penetrate
  - Analyze targets to penetrate
  - Develop and execute penetration strategy
- Exploit vulnerabilities
  - Infrastructure
  - Web and applications
  - Other
- Perform tasks in a safe and lawful fashion
- Mitigate vulnerabilities
  - Identify resources
  - Plan and document actions
  - Communicate actions
- Understand and demonstrate impact
  - Specify target-specific impact
  - Determine implications/plan response
  - Communicate impact
- Educate team and clients
  - Educate team
  - Educate clients
Developing the Science of Competency Assessment: JOB PERFORMANCE MODELING
What is a competency?
[Framework: three components define levels of expertise along three axes -
- Knowledge (understanding of strategy or procedure): shallow to deep
- Skills (consistency of performance): inconsistent to consistent
- Ability (transfer across domains): narrow to broad
Positions along these axes distinguish the Novice, Apprentice, Journeyman, and Master.]
Source: Tobey, D. H. et al. (in press) "Predictive Performance Modeling: An innovative approach to defining critical competencies that distinguish levels of performance," National Board of Information Security Examiners, Idaho Falls, ID, OST Working Group Report NBISE-OST-11-01.
Defining proficiency at multiple levels for multiple roles
Source: Tobey, D. H. et al. (2011) "Predictive Performance Modeling: An innovative approach to defining critical competencies that distinguish levels of performance," National Board of Information Security Examiners, Idaho Falls, ID, OST Working Group Report NBISE-OST-11-01.
Critical Differentiation Analysis
[Matrix: tasks #1-#5 plotted along two axes, task criticality (low to high) and task differentiation (low to high).]
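The analysis above positions tasks by criticality and differentiation; tasks high on both axes are the best candidates for assessment items. A minimal sketch of how such a quadrant selection might be computed (the task names, scores, 0-10 scale, and cutoff are illustrative assumptions, not from the source):

```python
# Hypothetical sketch: select tasks that are both highly critical and
# highly differentiating (the upper-right quadrant of the analysis).
# All scores and the 5.0 cutoff are assumed for illustration.

def quadrant(criticality, differentiation, cutoff=5.0):
    """Classify a task by its position on the two analysis axes."""
    crit = "high" if criticality >= cutoff else "low"
    diff = "high" if differentiation >= cutoff else "low"
    return (crit, diff)

tasks = {
    "task_1": (8.2, 7.5),   # (criticality, differentiation)
    "task_2": (3.1, 6.4),
    "task_3": (7.0, 2.2),
}

# Tasks in the high/high quadrant best distinguish performance levels.
critical_differentiators = [
    name for name, (c, d) in tasks.items()
    if quadrant(c, d) == ("high", "high")
]
print(critical_differentiators)  # → ['task_1']
```

The cutoff could equally be a percentile split over the rated task pool; the point is only that the two ratings jointly select a quadrant.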
Defining the path to performance: Apprentices, Journeymen, Masters
Source: Tobey, D. H. et al. (2011) "Predictive Performance Modeling: An innovative approach to defining critical competencies that distinguish levels of performance," National Board of Information Security Examiners, Idaho Falls, ID, OST Working Group Report NBISE-OST-11-01.
Self-Assessment Instrument
Knowledge (Understanding):
- Learning modes/hours
- Degree of understanding
- Degree of difficulty
Skill (Consistency):
- Self-efficacy scale
- Frequency scale
- Performance scale
Ability:
- Planning and monitoring
- Problem solving
Individual Competency Profile: Radial Chart
[Radial chart: views by composite, knowledge, skill, and ability scores (0-10 scale) across four areas: Mitigate Vulnerabilities, Penetrate Targets, Exploit Vulnerabilities, Identify Vulnerabilities.]
Performance levels: Red = low performance; Yellow = borderline performance; Green = high performance.
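The red/yellow/green bands on the radial chart imply a simple thresholding of each area's score. A hedged sketch of that mapping (the cut points and the example scores are assumptions; the source does not state them):

```python
def performance_band(score, low=4.0, high=7.0):
    """Map a 0-10 composite score to the chart's color bands.
    The cut points low=4.0 and high=7.0 are illustrative assumptions."""
    if score < low:
        return "red"      # low performance
    if score < high:
        return "yellow"   # borderline performance
    return "green"        # high performance

# Hypothetical profile over the four chart areas.
profile = {
    "Mitigate Vulnerabilities": 8.1,
    "Penetrate Targets": 5.2,
    "Exploit Vulnerabilities": 3.4,
    "Identify Vulnerabilities": 6.9,
}
for area, score in profile.items():
    print(area, "->", performance_band(score))
```

In practice the bands would likely be norm-referenced (set from the comparison population) rather than fixed constants.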
Individual Competency Profile: Drill-down (Penetrate Targets)

Foundational Tasks (weight) | Composite score | Comparative score
- Identify ownership of gateway devices (16.77) | 83.8 | Average
- Identify recon that is within project scope (15.63) | 46.8 | Low
- Search online sources for useful information about a target (15.45) | 53.5 | Average

Differentiating Tasks (weight) | Composite score | Comparative score
- Analyze data found on compromised machines to enable exploitation deeper into the network (24.02) | 36.0 | Average
- Identify major assets subject to attacks (23.67) | 87.2 | High
- Identify targets for potential exploitation (23.67) | 56.0 | High
- Analyze data found on compromised machines for strategic value as seen by a worst-case attacker (23.60) | 26.2 | Low

Overall Score (My Score): 54.9 | Average

Example knowledge/skill/ability breakdown - Identify recon that is within project scope: Knowledge 86.5, Skill 27.4, Ability 47.3
Workforce Planning Source: International Center for Leadership in Education (ICLE) Rigor/Relevance Framework (Daggett, 2000)
Rigor/Relevance Framework Analysis
- Symbols represent major competencies; instances represent task areas
- Location of instances represents current achievement (KSAs) by quadrant:
  1. Foundational Knowledge
  2. Foundational Skill
  3. Differentiating Skill
  4. Ability
- Drill down on any instance to a personal development plan
Personal Development Plan
Clicking on any symbol instance in the RRF brings up this table.

State Analysis:
- What are my strengths?
- What are my weaknesses?
- What feedback have I received?
- Conclusion

Resource Listing:
- US-CERT: Cyber Security Policy Planning and Preparation (PDF download)
- NIST: Guide to Industrial Control Systems (ICS) Security (PDF download)
- SANS: Hacker Techniques, Exploits & Incident Handling (LMS course)
- SANS: Reverse-Engineering Malware: Analysis Tools and Techniques (LMS course)

Improvement Timeline:
- When do I want or need to achieve the desired state? Focus Area #1: short term / medium term / long term; Focus Area #2: short term / medium term / long term
- What is my schedule to work on these focus areas? Focus Area #1: day/time 1, 2, 3; Focus Area #2: day/time 1, 2, 3

Notes: Document any important notes or other items in this section.
The GTED Cybersecurity Performance Management System
- Workforce planning and development
- Organization assessment summary
- Individual assessment summary
- Task assessment results with KSA metrics
Advanced Defender Aptitude, Performance Testing and Simulation (ADAPTS) Framework
[Diagram: Competency Model, Content Model, Object Model, and Development Model linked to assessments, content, metrics, simulations, etc.; a SCORM/CMI-5 XML library holds Assessment Objects (TestLets), Training Objects (CourseLets), and Simulation Objects (SimLets).]
Adapted from Ostyn (2005), Competency Data for Training Automation.
Why job performance models are important
- Facilitate translation of functional roles into job roles
- Clearly distinguish knowledge, skill, and ability
- Determine factors that differentiate performance at varying levels of skill
- Identify the critical factors that predict performance
Competency models describe; job performance models prescribe. JPMs help determine who should be developed (aptitude), how to develop them (skill profiles), and when they are ready to take the next step (performance-based learning).
Questions? Contact: Michael Assante Michael.assante@nbise.org