Computer Security
CIS 5370
Evaluating Systems
Chapter 21

Evaluation Methodology
1. Set of security functionality requirements
2. Set of assurance requirements
3. Methodology to determine if the system meets (1) based on (2)
4. A measure of the evaluation (scale for level of trust)

Value of Independent Analysis
There is a theory that with independent analysis, the evaluated product is less likely to contain major flaws than a product that has not been evaluated (Bishop, p. 573, 2nd full paragraph). There is NO proof that this is true.
Meaningful Measurement Requires a Standard
Trusted Computer System Evaluation Criteria (TCSEC)
Information Technology Security Evaluation Criteria (ITSEC)
Federal Information Processing Standards (FIPS)
Common Criteria (CC)
Systems Security Engineering Capability Maturity Model (SSE-CMM)

The Orange Book (1983-99)
Trusted Computer System Evaluation Criteria (TCSEC)
Yes, it's orange
Categorizes systems by their security characteristics

TCSEC Functional Requirements
Discretionary Access Control
Object Reuse Requirements (see the sketch after this list)
Mandatory Access Control
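To make the object reuse requirement concrete, here is a minimal Python sketch, assuming a hypothetical fixed-size buffer pool (BufferPool and its method names are illustrative, not from TCSEC or Bishop): storage must be scrubbed before it is reassigned, so no residue from one subject is visible to the next.

# Illustrative sketch of the TCSEC object reuse requirement:
# storage must be scrubbed before it is reassigned to a new subject,
# so that no residual data leaks across users. Names are hypothetical.

class BufferPool:
    """Fixed-size buffers reused across subjects."""

    def __init__(self, count: int, size: int):
        self.size = size
        self.free = [bytearray(size) for _ in range(count)]

    def allocate(self) -> bytearray:
        return self.free.pop()

    def release(self, buf: bytearray) -> None:
        # Object reuse requirement: clear contents before the buffer
        # can be observed by the next subject that allocates it.
        buf[:] = bytes(self.size)
        self.free.append(buf)

pool = BufferPool(count=2, size=16)
b = pool.allocate()
b[:6] = b"secret"
pool.release(b)
assert pool.allocate() == bytearray(16)  # no residue visible to the next user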
Orange Book Ratings (Cumulative)
A1: Verified Design (same requirements as B3, more rigorous implementation)
B3: Security Domains (auditing, secure crash)
B2: Structured Protection (big step: covert channels, secure kernel, etc.)
B1: Labeled Security Protection (users, files, processes, etc. must be labeled)
C2: Controlled Access Protection (access control at user level, clear memory when released)
C1: Discretionary Security Protection (resources protected by ACLs, memory overwrites prevented)
D: Minimal Protection (did not qualify for a higher rating, or not rated)

C2 Security Audit Data
Orange Book C2 certification audit requirement
Records each crossing between instructions executed by the processor in user space and instructions executed in the TCB
Abstracts to a full system call trace (sketched below)
The following do not appear in the audit trace:
Context switches
Memory allocation
Internal semaphores
Consecutive file reads

C2 Security Audit Data: Advantages and Disadvantages
Advantages:
Strong authentication
Categorization of audit events facilitates audit system configuration
Fine-grain parameterization of info gathered
Shutdown response to detected bad acts
Disadvantages:
Resource intensive
Possible DoS attack by filling the audit file
Complex setup
System size/complexity
User/action-based; rarely considers objects
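As an illustration of the audit requirement above, here is a minimal Python sketch, assuming a hypothetical record layout (AuditRecord and AuditTrail are illustrative names, not a real C2 format): each record abstracts one user-space-to-TCB crossing, i.e. one system call, and consecutive reads of the same file are collapsed, as the slide notes.

# Minimal sketch (hypothetical record layout, not an actual C2 format)
# of auditing crossings from user space into the TCB: each audited
# event abstracts one system call, while TCB-internal activity such as
# context switches, memory allocation, and internal semaphores is
# deliberately not recorded.

import time
from dataclasses import dataclass, field

@dataclass
class AuditRecord:
    timestamp: float
    uid: int          # authenticated user on whose behalf the call ran
    syscall: str      # the user-space -> TCB crossing being recorded
    obj: str          # object named in the call, if any
    outcome: str      # "success" or "failure"

@dataclass
class AuditTrail:
    records: list = field(default_factory=list)
    last: tuple = None

    def log(self, uid: int, syscall: str, obj: str, outcome: str) -> None:
        key = (uid, syscall, obj)
        if syscall == "read" and key == self.last:
            return  # consecutive reads of the same file collapse to one record
        self.last = key
        self.records.append(AuditRecord(time.time(), uid, syscall, obj, outcome))

trail = AuditTrail()
trail.log(1001, "open", "/etc/passwd", "success")
trail.log(1001, "read", "/etc/passwd", "success")
trail.log(1001, "read", "/etc/passwd", "success")  # suppressed
assert len(trail.records) == 2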
Orange Book Paradox
Certification takes time
Technology won't wait

ITSEC (1991-2000+)
No TCB
Introduced the Target of Evaluation (TOE)
Effectiveness requirements:
Suitability of requirements
Binding of requirements

In TCSEC but not ITSEC
Tamperproof reference validation
Process isolation
Least privilege (see the sketch below)
Well-defined user interface
System integrity requirements
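A minimal Python sketch of the least privilege requirement, assuming a Unix system where the process starts as root and uid/gid 65534 is an unprivileged "nobody" account (start_service and the port number are hypothetical, for illustration only): hold privilege only long enough to acquire the privileged resource, then drop it irrevocably.

# Sketch of least privilege: acquire the privileged resource, then
# drop root so a later compromise cannot reuse it. Unix-specific.

import os
import socket

def start_service(port: int = 80, unpriv_uid: int = 65534,
                  unpriv_gid: int = 65534) -> socket.socket:
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.bind(("", port))        # ports below 1024 require root
    sock.listen(5)
    # Drop group first, then user; once setuid() succeeds the process
    # can never regain root, bounding the damage of a compromise.
    os.setgid(unpriv_gid)
    os.setuid(unpriv_uid)
    assert os.getuid() == unpriv_uid
    return sock

# Usage (as root): sock = start_service()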
ITSEC Evaluation Levels
E1: Security target & architecture
E2: Detailed design description, configuration control, etc.
E3: Source code standards, more stringent detailed design standards

ITSEC Evaluation Levels (cont)
E4: Formal security model
E5: Requirements, design, and implementation tracing
E6: Formal methods

FIPS 140 (1994-Present)
Crypto standard, four levels
Areas covered:
Basic design
Crypto algorithms
Module interfaces
OS security
Roles
Key management
Services
Emanations
Physical security
Self-testing (see the sketch below)
Software security
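To illustrate the self-testing area, here is a minimal Python sketch of a FIPS 140-style power-up known-answer test (the function name and error handling are hypothetical; the test vector is the published SHA-256 value for "abc" from FIPS 180): before the module offers any service, each algorithm is checked against a known answer, and the module refuses to operate on a mismatch.

# Sketch of a FIPS 140-style power-up self-test: run each algorithm
# against a known-answer test (KAT) vector before offering services.

import hashlib

SHA256_KAT_INPUT = b"abc"
SHA256_KAT_DIGEST = (
    "ba7816bf8f01cfea414140de5dae2223"
    "b00361a396177a9cb410ff61f20015ad"
)

def power_up_self_test() -> None:
    digest = hashlib.sha256(SHA256_KAT_INPUT).hexdigest()
    if digest != SHA256_KAT_DIGEST:
        # A failed self-test must place the module in an error state
        # in which no cryptographic services are provided.
        raise RuntimeError("SHA-256 known-answer test failed")

power_up_self_test()  # module may serve requests only if this returns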
Common Criteria (1998-)
Developed through a collaboration among security and standards organizations from:
Canada
France
Germany
The Netherlands
The United Kingdom
The United States
As a common standard to replace existing security evaluation criteria

Interested Parties (figure)

Relevance to Interested Parties (figure)
CC Informally Defined
A standard method of expressing requirements
A catalog of possible security requirements
A matching set of assurance test procedures
A standard testing methodology

Common Criteria
Security requirements: privacy, integrity, anonymity, traceability, etc.
Standards: best practices, rules of thumb, operating procedures, etc.
Evaluation methodology: metrics, test plans, etc.

Three Parts of the Common Criteria
CC Part 1: Background information and references
CC Part 2: Guidance and reference when formulating requirements for security functions
CC Part 3: Guidance and reference when formulating assurance requirements (the security assurance components, including the EALs)

Evaluation Process
CC Evaluation Assurance Levels
EAL1 - Functionally tested
EAL2 - Structurally tested
EAL3 - Methodically tested & checked
EAL4 - Methodically designed, tested & reviewed
EAL5 - Semiformally designed and tested
EAL6 - Semiformally verified design and tested
EAL7 - Formally verified design and tested

Evaluation Assurance Levels
EAL1: Some confidence in correct operation required
EAL2: Low to moderate security, no substantial investment
EAL3: Moderate security
EAL4: More secure, but requires no substantial specialist knowledge, skills, or other resources
EAL5: Security engineering based on rigorous commercial practice
EAL6: High assurance, counters high risk
EAL7: Counters extremely high risk

Terminology
Target of Evaluation (TOE): An IT product or system that is the subject of an evaluation
Protection Profile (PP): An implementation-independent set of security requirements for a category of TOEs
Security Target (ST): A set of security requirements and specifications used as the basis for evaluation of an identified TOE
CC Methodology Overview
Protection Profile:
1. Introduction
2. Product description
3. Product Security Environment
4. Security Objectives (traced to threats)
5. IT Functional Security Requirements
6. Rationale
Objectives are traceable to assumptions, threats, and policies
Requirements are traceable to objectives (see the sketch below)
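The rationale's traceability rules can be checked mechanically. A minimal Python sketch, assuming a hypothetical Protection Profile (the threats, objectives, and requirement identifiers below are made-up examples, though FCS_COP.1 and FDP_SDI.2 follow the CC Part 2 naming style): every objective must trace to at least one assumption, threat, or policy, and every requirement to at least one objective.

# Sketch of the PP rationale checks; the PP contents are hypothetical.

pp = {
    "threats": {"T.EAVESDROP", "T.TAMPER"},
    "objectives": {
        "O.ENCRYPT": {"T.EAVESDROP"},   # objective -> threats/policies
        "O.INTEGRITY": {"T.TAMPER"},
    },
    "requirements": {
        "FCS_COP.1": {"O.ENCRYPT"},     # requirement -> objectives
        "FDP_SDI.2": {"O.INTEGRITY"},
    },
}

def check_rationale(pp: dict) -> list:
    problems = []
    for obj, sources in pp["objectives"].items():
        if not sources & pp["threats"]:
            problems.append(f"{obj} traces to no assumption/threat/policy")
    for req, targets in pp["requirements"].items():
        if not targets & set(pp["objectives"]):
            problems.append(f"{req} traces to no security objective")
    return problems

assert check_rationale(pp) == []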
Comparative Evaluation Processes
TCSEC  ITSEC  CC
D      E0     NA
NA     NA     EAL1
C1     E1     EAL2
C2     E2     EAL3
B1     E3     EAL4
B2     E4     EAL5
B3     E5     EAL6
A1     E6     EAL7

SSE-CMM Process Areas
1. Administer Security Controls
2. Assess Impact
3. Assess Security Risk
4. Assess Threat
5. Assess Vulnerability
6. Build Assurance Argument
7. Coordinate Security
8. Monitor System Security Posture
9. Provide Security Input
10. Specify Security Needs
11. Verify and Validate Security

CMM Levels
1. Performed Informally
2. Planned and Tracked
3. Well-Defined
4. Quantitatively Controlled
5. Continuously Improving

Questions?