Static versus Dynamic Data Information Fusion Analysis using DDDAS for Cyber Security Trust
Erik Blasch, AFRL/RIEA, Rome, NY (erik.blasch@gmail.com)
Youssif Al-Nashif, Salim Hariri, NSF Center for Cloud and Autonomic Computing, The University of Arizona
Sponsor: AFRL/AFOSR
ICCS 2014, June 2014
PRESENTATION OUTLINE
- Motivation: Information Fusion for Decision Making (Trust); understanding the flow of information in man and machines
- Information Fusion and DDDAS: Information Management Flow; managers and QoS flow; SOA-based DDDAS (DaaS)
- End-to-End Trust Metrics: summary of metrics; calculations
- Potential Use Case Scenarios for DaaS: DDDAS-based Crisis Management
High-Level Information Fusion
Human Situation Awareness (SAW) versus Machine Information Fusion (MIF), by level, with interface visualization (evaluation) between them:
- Level 0: SAW Sensation / MIF Sub-Object Assessment; observables (dots on maps)
- Level 1: SAW Perception / MIF Object Assessment; objects (lines on maps)
- Level 2: SAW Comprehension / MIF Situation Assessment; situations (storytelling)
- Level 3: SAW Projection / MIF Impact Assessment; scenarios (forecasting)
Approach:
- Decompose the problem into elements of LLIF and HLIF
- Determine the user (situation awareness) and machine (computation) roles
- Discussion of evaluation/visualization and projection
Blasch's book: HLIF (figure): information/resource management spans observable, object, situation, and scenario assessment; low-level information fusion supports sensation and perception (awareness), high-level information fusion supports comprehension and projection (situation analysis), with TRUST at the human-machine interface between machine systems design and human evaluation.
Information Fusion and DDDAS
IF-DDDAS Design Challenges
- How do you handle, manage, and analyze large volumes of data, results, and users that are heterogeneous, dynamic, and continuously changing?
- How do you quantify the trust of users, data, results, etc.?
- How do you design and adapt the QoS and Quality of Protection (QoP) of such DDDAS environments?
- How can you integrate the design stage and the runtime stage to achieve continuous integration of design and runtime?
OUTLINE
- Motivation: Information Fusion for Decision Making (Trust); understanding the flow of information in man and machines
- Information Fusion and DDDAS: Information Management Flow; managers and QoS flow; SOA-based DDDAS (DaaS)
- End-to-End Trust Metrics: summary of metrics; calculations
- Potential Use Case Scenarios for DaaS: DDDAS-based Crisis Management
Information Management Model
IF-DDDAS Environments
Adapt DDDAS QoS and QoP at runtime:
- DDDAS Analytics
- Resilient DDDAS
- Trustworthy DDDAS
Managed Information Object (MIO): comprises a payload and metadata that characterize the object, such as topic, time, and location.
Actor roles (diagram):
- Managers: security policy, resource allocation, data mediation, monitor & audit, maintain federations, maintain currency
- Producers: publish, advertise, get feedback, receive RFIs, retract
- Federates: seamless access, restrictions, mediation, integrity, information SLAs
- Consumers: search/browse, subscribe, query, transform, assess
Distributed Information Management
Coalition issues:
- Acquisition cannot be standardized or synchronized.
- Universal data standards are unrealistic.
- Data standards version skew is the norm, not the exception.
- Coalition information spaces must adapt quickly to new partners and processes.
Information management best practices:
1. Information Sharing: package information for sharing; adopt common syntax and semantics for common information attributes such as location, time, and subject.
2. Reduce Complexity: adopt dedicated IM infrastructure; create simple, ubiquitous common services.
3. Control and Flexibility: common information space.
Proposed Solution: DDDAS as a Service (DaaS)
(Diagram) Producers interact with an Information Brokering Service and a Submission Service; a Repository Service backed by online storage feeds a Dissemination Service for subscribing consumers and a Query Service for query consumers.
OUTLINE
- Motivation: Information Fusion for Decision Making (Trust); understanding the flow of information in man and machines
- Information Fusion and DDDAS: Information Management Flow; managers and QoS flow (trust is a QoS service); policies for information flow
- End-to-End Trust Metrics: summary of metrics; calculations
- Potential Use Case Scenarios: DDDAS-based Crisis Management
Trust in Action
Trust (multidiscipline):
- Users
- Systems
[J.-H. Cho, A. Swami, and I.-R. Chen, "A Survey on Trust Management for Mobile Ad Hoc Networks," IEEE Communications Surveys & Tutorials, vol. 13, no. 4, 2011]
Human Trust (Blasch, 2001, SPIE)
Rempel's stages (Muir & Moray, 1996) postulated that:
Trust = Predictability + Dependability + Faith + Competence + Responsibility + Reliability
Trust dimensions: operator's trust and allocation of function versus quality of the automation:
- Trusts and uses the automation, automation good: Appropriate Trust (optimize system performance)
- Trusts and uses the automation, automation poor: False Trust (risk automated disaster)
- Distrusts and rejects the automation, automation good: False Distrust (lose benefits of automation, increased workload)
- Distrusts and rejects the automation, automation poor: Appropriate Distrust (optimize system performance)
Trust in Action
(Figure) [J.-H. Cho, A. Swami, and I.-R. Chen, "A Survey on Trust Management for Mobile Ad Hoc Networks," IEEE Communications Surveys & Tutorials, vol. 13, no. 4, 2011]
Trust and Risk Management
Trust (communication):
- Networks
- Message passing
[J.-H. Cho, A. Swami, and I.-R. Chen, "A Survey on Trust Management for Mobile Ad Hoc Networks," IEEE Communications Surveys & Tutorials, vol. 13, no. 4, 2011]
Proposed Approach: Trust Analysis and Quantification
- Trust can be defined as a state of mind consisting of:
  - Expectancy of what the trustee will do
  - Belief that the expected behavior will be performed
  - Willingness to take a risk on the trustee
- Trust can be based on reputation, collected from collaborating and interacting entities
- Trust is a factor of technology and computations:
End-to-end trust = f(Trust in infrastructure, Trust in entities)
which can be as simple as:
End-to-end trust = (Trust in infrastructure) × (Trust in entities)
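A minimal sketch of the multiplicative form above; the component values are illustrative, not taken from the slides:

```python
def end_to_end_trust(infrastructure_trust: float, entity_trust: float) -> float:
    """Combine infrastructure and entity trust; both lie in [0, 1]."""
    return infrastructure_trust * entity_trust

# A highly trusted entity on a moderately trusted network still yields
# only moderate end-to-end trust: 0.9 * 0.66 = 0.594.
print(end_to_end_trust(0.9, 0.66))
```

The product form means either component can cap the end-to-end trust: a perfect entity on an untrusted infrastructure is still untrusted end to end.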
Trust Stack (layers, diagram):
- Brokerage: policies, enforcement
- Domain Trust Authority: QoS, QoP
- Analysis (situation awareness): collecting raw metrics, behavior analysis
- Workflow: authentication and authorization
- Security: secure communication
Domainless Trust Evaluation Architecture
Domain-Based Trust Evaluation: Cross-Domain Trust
ATM Functionalities (diagram): trust metrics flow between analysis and filtering
- Context monitoring of the environment, entity, activity, and user yields collected information
- Situation analysis: vulnerability analysis, anomaly analysis, security policy analysis
- Directly collected metrics pass through context-based filtering to trust metrics quantification
- Trust evaluation (with a trust database): self-trust, machine-trust, and peer-trust evaluation
- Outputs: data, decision, dissemination
TrustFlow (diagram): metrics collected per entity type, feeding TRUST
- Hardware: voltage, frequency, temperature, I/O, paging, fan, # of cores, CPU utilization, memory read/write
- Software: privilege level, behavior, logs, application, apps, middleware, OS, firmware
- User: manager, developer, machine manager
- Network: number of connections, correct signature, logs, location, protection, bandwidth, packet rate
- Machines
Trust Flow
Trust Metrics
Trust
The trust of an entity is a function of its CIA components: Confidentiality, Integrity, and Availability:
T(E) = f(Confidentiality, Integrity, Availability)
Since trust metrics are used to determine the values of the CIA components, a function h maps the trust metrics onto the CIA components to obtain the trust:
T(E) = f(h(trust metrics))
- Directly collected metrics
- Measured trust outputs
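The slides leave f and h abstract. The sketch below assumes a simple averaging for h and a conservative min() for f; the metric names are invented for illustration:

```python
def h(metrics: dict) -> dict:
    """Map raw trust metrics (each in [0, 1]) onto CIA components."""
    return {
        "confidentiality": metrics["encryption_ok"],
        "integrity": (metrics["correct_signature"] + metrics["log_consistency"]) / 2,
        "availability": metrics["uptime_ratio"],
    }

def f(cia: dict) -> float:
    """Combine CIA into a single trust value; min() is a conservative choice."""
    return min(cia.values())

metrics = {"encryption_ok": 1.0, "correct_signature": 0.9,
           "log_consistency": 0.7, "uptime_ratio": 0.95}
print(f(h(metrics)))  # 0.8 -> integrity is the weakest component
```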
Determining Initial Trust for a System
The NIST CVSS can be used to determine the risk in a system. CVSS evaluates its score using the following metric groups:
- Base metrics
- Temporal metrics
- Environmental metrics
Determining Initial Trust for a System
Base metrics:
- Exploitability metrics: access vector, access complexity, authentication
- Impact metrics: confidentiality, integrity, availability
Temporal metrics: exploitability, remediation level, report confidence
Environmental metrics: collateral damage potential, target distribution, security requirements
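A sketch of deriving an initial machine trust value from a CVSS score. CVSS v2 scores range from 0 (no risk) to 10 (critical); the linear mapping trust = 1 - score/10 is an assumption for illustration, since the slides only state that CVSS determines the initial system trust:

```python
def initial_machine_trust(cvss_score: float) -> float:
    if not 0.0 <= cvss_score <= 10.0:
        raise ValueError("CVSS scores lie in [0, 10]")
    return 1.0 - cvss_score / 10.0

# A host whose worst unpatched vulnerability scores 7.5 (high severity)
# starts with low initial trust: 1 - 0.75 = 0.25.
print(initial_machine_trust(7.5))
```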
User Trust
- The initial user trust is a function of the user's rank and reputation at the time the user is added to the system:
T(E,C) = T(User) × T(Machine based on CVSS)
- Trust measurements take values from 0 to 1, where 0 represents distrust and 1 represents blind (full) trust.
- We adopt the NIST SP 800 standard's four levels of trust:
  - Distrust: 0
  - Low trust: 0.33
  - Moderate trust: 0.66
  - High trust: 1
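A sketch combining user and machine trust per T(E,C) = T(User) × T(Machine), then snapping the result to the four NIST SP 800 trust levels. The nearest-level quantization rule is an illustrative assumption:

```python
LEVELS = {0.0: "Distrust", 0.33: "Low trust", 0.66: "Moderate trust", 1.0: "High trust"}

def quantize(trust: float) -> str:
    """Snap a continuous trust value to the nearest of the four levels."""
    nearest = min(LEVELS, key=lambda level: abs(level - trust))
    return LEVELS[nearest]

user_trust = 0.66         # moderate-trust user
machine_trust = 0.9       # low-risk machine per its CVSS-derived trust
combined = user_trust * machine_trust
print(combined, quantize(combined))  # 0.594 -> "Moderate trust"
```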
Adaptive Trust
- Trust Flow:
  - The trust value assigned to each component is not static and is updated continuously.
  - The Trust Authority module is responsible for re-evaluating trust at runtime.
  - Trust is measured per entity, with trust levels between 0 and 1.
- The trust of an entity E is given by: T(E) ∈ [0,1]
- Interaction between entities is defined by a context C, and the trust for entities is computed per context: T(E,C) ∈ [0,1]
Adaptive Trust (continued)
- M_i denotes a collected Trust Flow metric, where i is the metric identifier. The function m_i() is a quantifying function that returns a measurement between 0 and 1 for the metric M_i.
- The trust for an entity is computed using two types of trust:
  - Self-measured trust, and
  - Reputation-measured trust.
Adaptive Trust (continued)
Trust is evaluated by the Trust Authority.
Self-trust evaluation:
T_s(E,C) = T(ATM_E, C) · Σ_{i=1..L} I_i(C) · m_i(M_i)
Peer-trust evaluation:
T_p(E,C) = (1/K) · Σ_{j=1..K} T(ATM_j, C) · T_{p,E_j}(E, C)
where C is the context (mandatory fields such as Federal ID#, plus optional fields) and E is subject to the evaluation strategies / enforcement rules (e.g., space and time policies).
- The weight I_i(C) of metric i is determined by a feature selection technique, with: Σ_{i=1..L} I_i(C) = 1
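A sketch of these two evaluations, assuming the reconstructed forms above; the metric weights and sample values are illustrative:

```python
def self_trust(atm_trust: float, weights: list, measurements: list) -> float:
    """T_s(E,C): ATM trust times the weighted sum of quantified metrics.
    Weights must sum to 1 (the feature-selection constraint)."""
    assert abs(sum(weights) - 1.0) < 1e-9
    return atm_trust * sum(w * m for w, m in zip(weights, measurements))

def peer_trust(peer_reports: list) -> float:
    """T_p(E,C): average over K peers of (peer's ATM trust) * (peer's opinion of E)."""
    return sum(atm_j * opinion_j for atm_j, opinion_j in peer_reports) / len(peer_reports)

ts = self_trust(0.9, [0.5, 0.3, 0.2], [0.8, 1.0, 0.6])   # 0.9 * 0.82 = 0.738
tp = peer_trust([(0.9, 0.7), (0.8, 0.9)])                 # (0.63 + 0.72) / 2 = 0.675
print(ts, tp)
```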
Adaptive Trust (continued)
- Determining peer trust is based on the gain or loss in the measured self-trust.
- Assume that E_1 is communicating with E_2. After the communication we have the following facts: T_s(E_1, C, Before), T_s(E_1, C, After), and T(E_2, C, Before).
- Then the ratio r = T_s(E_1, C, After) / T_s(E_1, C, Before) determines:
  - r > 1: T_{p,E_1}(E_2, C, After) = Increase Trust(T(E_2, C, Before))
  - r = 1: T_{p,E_1}(E_2, C, After) = T(E_2, C, Before)
  - r < 1: T_{p,E_1}(E_2, C, After) = Decrease Trust(T(E_2, C, Before))
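A sketch of this update rule: E_1 raises or lowers its opinion of E_2 according to how its own self-trust changed across the interaction. The 10% adjustment step is an illustrative assumption; the slides only specify increase / keep / decrease:

```python
def update_peer_trust(self_before: float, self_after: float, peer_before: float) -> float:
    ratio = self_after / self_before
    if ratio > 1:
        return min(1.0, peer_before * 1.1)   # interaction helped: raise trust
    if ratio < 1:
        return max(0.0, peer_before * 0.9)   # interaction hurt: lower trust
    return peer_before                        # no change in self-trust

# E_1's self-trust dropped after talking to E_2, so E_2 loses trust.
print(update_peer_trust(0.8, 0.72, 0.66))  # 0.594
```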
Adaptive Trust (continued)
Based on the context and the type of operation, the end-to-end trust can be evaluated using three trust evaluation strategies: optimistic, pessimistic, and average. The end-to-end trust for each strategy is:
- Optimistic: T(E,C) = max(T_s(E,C), T_p(E,C))
- Pessimistic: T(E,C) = min(T_s(E,C), T_p(E,C))
- Average: T(E,C) = average(T_s(E,C), T_p(E,C))
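The three strategies in code, applied to the sample self-trust and peer-trust values from the earlier sketch:

```python
def combined_trust(t_self: float, t_peer: float, strategy: str = "average") -> float:
    if strategy == "optimistic":
        return max(t_self, t_peer)
    if strategy == "pessimistic":
        return min(t_self, t_peer)
    return (t_self + t_peer) / 2

t_self, t_peer = 0.738, 0.675
for strategy in ("optimistic", "average", "pessimistic"):
    print(strategy, combined_trust(t_self, t_peer, strategy))
# optimistic 0.738, average 0.7065, pessimistic 0.675
```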
Context
Context ID: consists of a mandatory part, the application protocol used in establishing the connection, and an optional part with additional information such as location and objectives.
Note: an optional Context ID field is not necessarily absent from a connection; it only means that not all connections require that field.
- Example: a fighter is trying to download a targets map.
  - Context ID mandatory fields: Federal ID#, affiliation, agency, rank, protocol: SFTP
  - Optional fields: location, operation
Context enforcement: the process of enforcing security policies based on the context of the communication; it is handled by the ATM.
- Context enforcement can take the following inputs:
  - Context ID
  - Entity ID: in the case of domain-based trust, a combination of the entity ID, group, and domain
  - Location
  - Trust level
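A hypothetical shape for a Context ID record, with a mandatory part (protocol plus identity fields) and an optional part; the field names follow the fighter/targets-map example:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ContextID:
    # Mandatory fields
    federal_id: str
    affiliation: str
    agency: str
    rank: str
    protocol: str            # application protocol of the connection, e.g. "SFTP"
    # Optional fields: not every connection requires them
    location: Optional[str] = None
    operation: Optional[str] = None

ctx = ContextID("F-12345", "USAF", "AFRL", "Captain", "SFTP",
                location="Rome, NY", operation="targets-map-download")
print(ctx.protocol)  # SFTP
```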
Example of Context Enforcement
Assume E_1 is communicating with E_2. The following is a possible policy for context enforcement on ATM_1, which is responsible for handling E_1's communications:
  if (E_2 ∉ Rome Labs) then Deny
  if (Context ∉ Acceptable Air Force Contexts) then Deny
  if (T(E_2, C) < Moderate Trust) then Deny
  if (Location(E_2) not in US) then Deny
  Accept
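A runnable rendering of this deny-first policy. The entity record and the set of acceptable contexts are hypothetical stand-ins; only the four rules and the final Accept come from the slide:

```python
MODERATE_TRUST = 0.66

def enforce(entity: dict, context: str, trust: float) -> str:
    if entity["group"] != "Rome Labs":
        return "Deny"
    if context not in {"SFTP", "HTTPS"}:        # stand-in for acceptable Air Force contexts
        return "Deny"
    if trust < MODERATE_TRUST:
        return "Deny"
    if entity["location"] != "US":
        return "Deny"
    return "Accept"

e2 = {"group": "Rome Labs", "location": "US"}
print(enforce(e2, "SFTP", 0.7))   # Accept
print(enforce(e2, "SFTP", 0.5))   # Deny: below moderate trust
```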
Trust Metrics
Trust (T), Context (C), Metric (M): CYBER example, ICCS 2014 (ATM specific to communications)
Trust confidence by trust evaluation strategy:
- Optimistic: T(E,C) = max{T_s(E,C), T_p(E,C)}
- Average: T(E,C) = ave{T_s(E,C), T_p(E,C)}
- Pessimistic: T(E,C) = min{T_s(E,C), T_p(E,C)}
OUTLINE
- Motivation: Information Fusion for Decision Making (Trust); understanding the flow of information in man and machines
- Information Fusion and DDDAS: Information Management Flow; managers and QoS flow (trust is a QoS service); SOA-based DDDAS (DaaS)
- End-to-End Trust Metrics: summary of metrics; calculations
- Potential Use Case Scenarios for DaaS: DDDAS-based Crisis Management
DDDAS-based Crisis Management (diagram)
- Management domain: decision makers and domain experts for battle management, nuclear disaster management, and terrorist/accident management
- DDDAS analytics for cybersecurity link actions with measurements
- Operations domain: sensors and responders (Air Force, police, firemen, first medical responders)
DDDAS Components as a Service (DaaS)
(Diagram) Producers interact with an Information Brokering Service and a Submission Service; a Repository Service backed by online storage feeds a Dissemination Service for subscribing consumers and a Query Service for query consumers.
SOA-based DDDAS Environment
We are collaborating with:
- Youakim Badr, LIRIS Lab, INSA-Lyon, France
- Ilkay Altintas, UCSD
(Diagram) The information trust infrastructure spans the business process, service providers (S_1 through S_5), design time, and the runtime (ESB) environment.
SOA-based DDDAS Environment
We are collaborating with Ilkay Altintas, UCSD, to use Kepler to implement DDDAS workflows that can meet QoS, QoP, or trust requirements for mission-critical DDDAS environments.
Resilient DDDAS-based Models
- Resilient models are based on (DDDAS) info-symbiotic systems
- Unify computing and measurements
(Diagram) Model evolution over time steps: at each step n, the model state S_n and the measurement D_n give the error E_n = S_n − D_n, and the next state S_{n+1} follows from S_n via f(D_n).
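A toy sketch of this info-symbiotic loop: each step compares the model state S_n with the measurement D_n, forms the error E_n = S_n − D_n, and nudges the model toward the data. The relaxation update with gain 0.5 is an illustrative choice of f:

```python
def evolve(state: float, measurement: float, gain: float = 0.5) -> float:
    error = state - measurement          # E_n = S_n - D_n
    return state - gain * error          # S_{n+1}: correct the model toward D_n

state = 10.0
for n, measurement in enumerate([12.0, 12.5, 12.4], start=1):
    state = evolve(state, measurement)
    print(f"step {n}: S = {state:.2f}")
# The model state converges toward the measured values: 11.00, 11.75, 12.08.
```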
Generalized DaaS (diagram)
- Design time: the business requirements model generates the QoS model, tolerance model, and security model; endogenous changes affect these models
- Runtime: contextual information, business logic, and the security model drive an ad-hoc composition approach; exogenous changes affect running processes and the infrastructure
Generalized Resilient DaaS (diagram)
- Business requirements generate the service model, security policy, and context model, with feedback
- Endogenous and exogenous changes affect the business domain, security objectives, and context model
- The risk model and annotation model affect web service composition
DDDAS-based Crisis Management
[Y. Badr, F. Biennier, and S. Tata, IEEE Transactions on Services Computing, vol. 4, 2011]
USE CASE SCENARIO: Crisis Management
[Y. Badr, F. Biennier, and S. Tata, IEEE Transactions on Services Computing, vol. 4, 2011]
Access Control Policies
[Y. Badr, F. Biennier, and S. Tata, IEEE Transactions on Services Computing, vol. 4, 2011]
Concluding Remarks
Challenge: how to continuously adapt DDDAS environments to changes occurring within and outside the DDDAS composition process, while satisfying a wide range of security/trust constraints in dynamic environments.
Solution: a resilient SOA-DDDAS architecture (DaaS) with continuous integration of runtime and design time.
Back Up Slides
Continuous Behavior Analysis
User Cyber DNA