Applying Software Quality Models to Software Security
Carol Woody, Ph.D.
Software Engineering Institute, Carnegie Mellon University, Pittsburgh, PA 15213
April 21, 2015
Copyright 2015 Carnegie Mellon University

This material is based upon work funded and supported by the Department of Defense under Contract No. FA8721-05-C-0003 with Carnegie Mellon University for the operation of the Software Engineering Institute, a federally funded research and development center. Any opinions, findings and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the United States Department of Defense.

NO WARRANTY. THIS CARNEGIE MELLON UNIVERSITY AND SOFTWARE ENGINEERING INSTITUTE MATERIAL IS FURNISHED ON AN AS-IS BASIS. CARNEGIE MELLON UNIVERSITY MAKES NO WARRANTIES OF ANY KIND, EITHER EXPRESSED OR IMPLIED, AS TO ANY MATTER INCLUDING, BUT NOT LIMITED TO, WARRANTY OF FITNESS FOR PURPOSE OR MERCHANTABILITY, EXCLUSIVITY, OR RESULTS OBTAINED FROM USE OF THE MATERIAL. CARNEGIE MELLON UNIVERSITY DOES NOT MAKE ANY WARRANTY OF ANY KIND WITH RESPECT TO FREEDOM FROM PATENT, TRADEMARK, OR COPYRIGHT INFRINGEMENT.

This material has been approved for public release and unlimited distribution except as restricted below. This material may be reproduced in its entirety, without modification, and freely distributed in written or electronic form without requesting formal permission. Permission is required for any other use. Requests for permission should be directed to the Software Engineering Institute at permission@sei.cmu.edu.

Team Software Process(SM) and TSP(SM) are service marks of Carnegie Mellon University.

DM-0001890
Cyber Security Engineering (CSE) Team

Mission: Build Security In. Address security, software assurance, and survivability throughout the development and acquisition lifecycle by creating methods, solutions, and training that can be integrated into existing practices.

CSE focus areas:
- Education and Competencies
- Measurement and Analysis
- Lifecycle Management
- Engineering

http://www.cert.org/cybersecurity-engineering/

2015 Carnegie Mellon University
CSE Portfolio

Software Assurance Education and Competencies
- Master of Software Assurance Curriculum Model, endorsed by IEEE and ACM
- Software Assurance Competency Model
- Software Assurance Course Delivery and Material Development

Security & Software Assurance Measurement and Analysis
- Predictive Analytics Research
- Researching the use of quality models to support software assurance (focus of today's presentation)

Security & Software Assurance Management
- Mission Risk Diagnostic (MRD)
- Survivability Analysis Framework (SAF)

Security & Software Assurance Engineering
- Security Quality Requirements Engineering (SQUARE)
- Security Engineering Risk Analysis (SERA)
- Risk in the Software Supply Chain
Cyber Security is a Lifecycle Challenge

[Figure: weaknesses introduced across the lifecycle of a mission thread (business process)]
- Design weaknesses
- Coding weaknesses
- Implementation weaknesses
Can Predictions of Quality Inform Security Risk Predictions?

The SEI has quality data for over 100 Team Software Process (TSP) development projects, used to predict operational quality. Data from five projects with low defect density in system testing showed very low or zero safety-critical and security defects in production use.
Semantic Gaps

Quality tracks defects/faults (engineering and testing):
- Defect: non-fulfilment of intended usage requirements (ISO/IEC 9126) [essentially nonconformity to a specified requirement, or missing or incorrect requirements]
- Software fault: an accidental condition that causes a functional unit to fail to perform its required function (IEEE Std 982.1-1988, Standard Dictionary of Measures to Produce Reliable Software)

Security cares about vulnerabilities (operations):
- Information security vulnerability: a mistake in software that can be exploited by a hacker to gain access to a system or network (http://cve.mitre.org/about/terminology.html)
- Software vulnerability: an instance of an error in the specification, development, or configuration of software such that its execution can violate a security policy (Shin and Williams, 2010)
Vulnerabilities are Defects

1-5% of defects are vulnerabilities.

Analysis of defects for five versions of the Microsoft Windows operating system and two versions of Red Hat Linux (Alhazmi et al., 2007):
- Win 95 (14.5 MLOC) and Win 98 (18 MLOC): vulnerabilities are 1.00% and 0.84%, respectively, of identified defects.
- Red Hat Linux 6.2 (1.8 MLOC) and 7.1 (6.4 MLOC): vulnerabilities are 5.63% and 4.34%, respectively, of identified defects.

Tom Longstaff asserted that vulnerabilities might represent 5% of total defects (http://research.microsoft.com/en-us/um/redmond/events/swsecinstitute/slides/longstaff.pdf).

Ross Anderson: it is reasonable to expect a 35,000,000-line program like Windows 2000 to have 1,000,000 bugs, only 1% of them security-critical (Anderson, 2001).
Data: Five Projects from Three Organizations

Project types: legacy system replacement, medical devices. All had successful security/safety-critical results in operation for at least a year.

Org   Project   Type              Secure/Safety-Critical Defects   Defect Density (defects/MLOC)   Size
D     D1        Safety Critical   20                               46.07                           2.8 MLOC
D     D2        Safety Critical   0                                4.4                             4.9 MLOC
D     D3        Safety Critical   0                                9.23                            1.3 MLOC
A     A1        Secure            0                                91.7                            0.6 MLOC
T     T1        Secure            0                                20.0                            0.1 MLOC

Quality threshold: with one exception, projects implemented below 20 defects per MLOC had no reported operational security or safety-critical defects. The exception utilized specialized defect removal practices for secure systems.
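The threshold observation above can be sketched as a short script. This is an illustrative reconstruction, not the SEI's analysis code; the project tuples are transcribed from the table, and the 20-defects/MLOC cutoff is the threshold the slide reports.

```python
# Each tuple: (org, project, type, critical_defects, defect_density, size_mloc)
# Values transcribed from the five-project table above.
projects = [
    ("D", "D1", "Safety Critical", 20, 46.07, 2.8),
    ("D", "D2", "Safety Critical", 0, 4.4, 4.9),
    ("D", "D3", "Safety Critical", 0, 9.23, 1.3),
    ("A", "A1", "Secure", 0, 91.7, 0.6),
    ("T", "T1", "Secure", 0, 20.0, 0.1),
]

THRESHOLD = 20.0  # defects per MLOC in system test, per the slide

for org, name, ptype, critical, density, size in projects:
    status = "below" if density < THRESHOLD else "at/above"
    print(f"{name}: density={density:>6.2f} ({status} threshold), "
          f"operational critical defects={critical}")
```

Running it shows that every project below the threshold (D2, D3) reported zero operational security or safety-critical defects; A1, the exception above the threshold with zero defects, is the project that used specialized defect removal practices.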
Quality Focuses on Defect Injection and Removal

[Chart: early defect removal across the life cycle]

- Poor quality does predict poor security: 1-5% of defects are vulnerabilities.
- The cost to fix a defect increases substantially the later it is discovered.
Software Faults: Introduction, Discovery, and Cost

- Faults account for 30-50 percent of total software project costs.
- Most faults are introduced before coding (~70%).
- Most faults are discovered at system integration or later (~80%).
Successful Projects Embed Quality and Safety/Security Inspection at Each Lifecycle Step
Successful Projects Use Metrics Extensively

Development metrics:
- Incoming change requests per week
- Triage rate
- % closed
- Development work for the cycle
- Software change requests per developer per week; # developers
- Software change requests per verifier & validator per week; # verification persons

Software change metrics:
- Fixed work per cycle
- Deferred planned work per cycle

Measure constantly from many dimensions to identify problems early.
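A couple of the development metrics above can be sketched from a change-request log. This is a minimal hypothetical example: the `ChangeRequest` record and its sample values are invented for illustration and are not project data from the talk.

```python
from dataclasses import dataclass

@dataclass
class ChangeRequest:
    """Hypothetical change-request record for metric sketches."""
    week: int      # week the request arrived
    closed: bool   # whether triage/resolution has closed it

# Invented sample log: four requests over two weeks, three closed.
log = [
    ChangeRequest(week=1, closed=True),
    ChangeRequest(week=1, closed=False),
    ChangeRequest(week=2, closed=True),
    ChangeRequest(week=2, closed=True),
]

weeks = {cr.week for cr in log}
incoming_per_week = len(log) / len(weeks)                    # incoming/week
pct_closed = 100 * sum(cr.closed for cr in log) / len(log)   # % closed

print(f"incoming/week = {incoming_per_week:.1f}, % closed = {pct_closed:.0f}%")
# → incoming/week = 2.0, % closed = 75%
```

The point is not the arithmetic but the practice: computing such figures every week, from several dimensions at once, is what lets a project spot problems early.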
Successful Projects Show Improved Reliability
How Will Quality Help Security?

Good quality will ensure proper implementation of specified results:
- Effective code checking will identify improper implementations of specifications (11 of the SANS Top 25).
- Effective design reviews will identify missing requirements (12 of the SANS Top 25),
  - if appropriate security results are considered in the development of requirements, and
  - if requirements are effectively translated into detailed designs and code specifications to support the required security results.

SANS Top 25: SysAdmin, Audit, Network, Security Top 25 Most Dangerous Programming Errors (http://cwe.mitre.org/top25)

Security requirements must be properly specified.
Poor Quality Predicts Poor Security

- If you have a quality problem, then you have a security problem.
- Quality does not happen by accident, and neither does security.
- Neither quality nor security can be tested in.
- Quality approaches such as TSP focus on personal accountability at each stage of the life cycle.
- Effective results require
  - clearly defining what "right" looks like,
  - measuring and rewarding the right behaviors, and
  - reinforcement through training, tracking, and independent review.
Linking Security and Quality Measures

If defects are measured, 1-5% of them should be considered security vulnerabilities. Conversely, when security vulnerabilities are measured, code quality can be estimated by treating them as 1-5% of the expected defects.

Reducing defects reduces vulnerabilities.

8-10 Feb 2011, International Conference on Software Quality (ICSQ)
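The two-way estimate described above can be written out directly. A minimal sketch, assuming only the 1-5% ratio from the slides; the function names and the example counts (1,000 defects, 25 vulnerabilities) are illustrative.

```python
# 1-5% of defects are vulnerabilities, per the analyses cited earlier.
VULN_FRACTION_LOW, VULN_FRACTION_HIGH = 0.01, 0.05

def vulns_from_defects(defects: int) -> tuple[float, float]:
    """Expected vulnerability range given a measured defect count."""
    return defects * VULN_FRACTION_LOW, defects * VULN_FRACTION_HIGH

def defects_from_vulns(vulns: int) -> tuple[float, float]:
    """Estimated total-defect range given a measured vulnerability count."""
    return vulns / VULN_FRACTION_HIGH, vulns / VULN_FRACTION_LOW

print(vulns_from_defects(1000))  # → (10.0, 50.0)
print(defects_from_vulns(25))    # → (500.0, 2500.0)
```

So a project that finds 1,000 defects should expect roughly 10-50 of them to be exploitable, and 25 reported vulnerabilities suggest on the order of 500-2,500 total defects in the code.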
Challenges for Applicability

- Metrics are not collected about vulnerabilities specific to each product release (open source products; National Vulnerability Database).
- Data about vulnerabilities are not collected in a form that can be parsed and analyzed with quality tools and measurements.
- Update histories do not report product vulnerability data.
- The size of products (lines of code or function points) is difficult to evaluate.
- Life cycles such as Agile typically do not collect defects until integration.
Contact Information

Carol Woody, Ph.D.
Technical Manager, CERT/CSF/CSE
Telephone: +1 412-268-5800
Email: info@sei.cmu.edu

U.S. Mail
Software Engineering Institute
Customer Relations
4500 Fifth Avenue
Pittsburgh, PA 15213-2612
USA

Web: http://www.cert.org/cybersecurity-engineering/
SEI Fax: +1 412-268-6257