Simulated Penetration Testing: From Dijkstra to Turing Test++
1 Jörg Hoffmann Simulated Penetration Testing 1/58 Simulated Penetration Testing: From Dijkstra to Turing Test++ Jörg Hoffmann June 26, 2015
7 What? Classical Attack Graphs POMDPs MDPs Taxonomy And Now?
12 Details: See paper.
13 Details: See old town.
16 Jörg Hoffmann Simulated Penetration Testing 2/49 Agenda 1 What is this all about? 2 Classical Planning: The Core Security Model [Lucangeli et al. (2010)] 3 Attack Graphs 4 Towards Accuracy: POMDP Models [Sarraute et al. (2012)] 5 The MDP Middle Ground 6 A Model Taxonomy 7 And Now?
17 Jörg Hoffmann Simulated Penetration Testing 3/49 Network Hacking Web Server Application Server DMZ Internet Router Firewall Attacker Workstation DB Server SENSITIVE USERS
21 Jörg Hoffmann Simulated Penetration Testing 4/49 Penetration Testing (Pentesting) Pentesting: Actively verifying network defenses by conducting an intrusion in the same way an attacker would. Well-established industry (roots going back to the 1960s). Points out specific dangerous attacks (as opposed to vulnerability scanners). Pentesting tools are sold by security companies like Core Security: Core IMPACT (since 2001); Immunity Canvas (since 2002); Metasploit (since 2003). These run security checks by launching exploits. Core IMPACT has used Metric-FF for automation since 2010.
25 Jörg Hoffmann Simulated Penetration Testing 5/49 Automation Security teams are typically small:
27 Jörg Hoffmann Simulated Penetration Testing 5/49 Automation Increase testing coverage:
28 Jörg Hoffmann Simulated Penetration Testing 5/49 Automation The security officer's "rat race":
30 Jörg Hoffmann Simulated Penetration Testing 5/49 Automation = Simulated Pentesting: Make a model of the network and exploits. Run attack planning on the model to simulate attacks. Running the rat race: update the model, go drink a coffee.
32 Jörg Hoffmann Simulated Penetration Testing 6/49 The Turing Test
33 Jörg Hoffmann Simulated Penetration Testing 7/49 The Turing Test++: Hacking, not Talking! Ultimate vision: realistically simulate a human hacker! Yes, hacking is more technical. However: there are socio-technical attacks, e.g. social network reconnaissance. The Turing Test becomes a sub-problem of spying on people (e.g. [Huber et al. (2009)]).
38 Jörg Hoffmann Simulated Penetration Testing 8/49 Agenda 1 What is this all about? 2 Classical Planning: The Core Security Model [Lucangeli et al. (2010)] 3 Attack Graphs 4 Towards Accuracy: POMDP Models [Sarraute et al. (2012)] 5 The MDP Middle Ground 6 A Model Taxonomy 7 And Now?
40 Jörg Hoffmann Simulated Penetration Testing 10/49 Simulated Pentesting at Core Security Core IMPACT system architecture: Exploits & Attack Modules are transformed into Actions; the Attack Workspace is transformed into Initial conditions; the Pentesting Framework hands a PDDL Description to the Planner, whose Plan is handed back for execution. In practice, the attack plans are used to point out to the security team where to look.
44 Jörg Hoffmann Simulated Penetration Testing 11/49 Classical Planning Definition: A STRIPS planning task is a tuple ⟨P, A, s0, G⟩: P: set of facts (Boolean state variables). A: set of actions a, each a tuple ⟨pre(a), add(a), del(a), c(a)⟩ of precondition, add list, delete list, and non-negative cost. s0: initial state; G: goal. Definition: A STRIPS planning task's state space is a tuple ⟨S, A, T, s0, SG⟩: S: set of all states; A: actions as above; T: state transitions (s, a, s'); s0: initial state as above; SG: goal states. Objective: Find a cheapest path from s0 to (a state in) SG.
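The STRIPS definition above can be made concrete with a small sketch. This is illustrative Python, not code from any pentesting tool; the two toy exploit actions and all names are invented for the example.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Action:
    """A STRIPS action <pre(a), add(a), del(a), c(a)> as on the slide."""
    name: str
    pre: frozenset     # precondition facts
    add: frozenset     # add list
    delete: frozenset  # delete list
    cost: float        # non-negative action cost

def apply_action(state, a):
    # Successor state: remove the delete list, then add the add list.
    return (state - a.delete) | a.add

def plan_cost(s0, goal, plan):
    """Execute a plan from s0; return its total cost if it is applicable
    and reaches the goal, else None."""
    s, total = frozenset(s0), 0.0
    for a in plan:
        if not a.pre <= s:  # precondition not satisfied
            return None
        s = apply_action(s, a)
        total += a.cost
    return total if frozenset(goal) <= s else None

# Toy two-hop attack: internet -> web server -> DB server.
exploit_web = Action("exploit-web", frozenset({"compromised(internet)"}),
                     frozenset({"compromised(web)"}), frozenset(), 10.0)
exploit_db = Action("exploit-db", frozenset({"compromised(web)"}),
                    frozenset({"compromised(db)"}), frozenset(), 20.0)
```

Here a state is simply a set of true facts, matching the Boolean state variables of the definition.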
46 Jörg Hoffmann Simulated Penetration Testing 12/49 Core Security Attack Planning PDDL Actions:
(:action HP_OpenView_Remote_Buffer_Overflow_Exploit
  :parameters (?s - host ?t - host)
  :precondition (and (compromised ?s) (connected ?s ?t)
    (has_OS ?t Windows) (has_OS_edition ?t Professional)
    (has_OS_servicepack ?t Sp2) (has_OS_version ?t WinXp)
    (has_architecture ?t I386) (has_service ?t ovtrcd))
  :effect (and (compromised ?t) (increase (time) 10)))
Action cost: average execution time. Success statistic against hosts with the same/similar observable configuration parameters.
51 Jörg Hoffmann Simulated Penetration Testing 13/49 Core Security Attack Planning PDDL, ctd. Actions:
(:action HP_OpenView_Remote_Buffer_Overflow_Exploit
  :parameters (?s - host ?t - host)
  :precondition (and (compromised ?s) (connected ?s ?t)
    (has_OS ?t Windows) (has_OS_edition ?t Professional)
    (has_OS_servicepack ?t Sp2) (has_OS_version ?t WinXp)
    (has_architecture ?t I386) (has_service ?t ovtrcd))
  :effect (and (compromised ?t) (increase (time) 10)))
Initial state: connected predicates: the network graph. has_* predicates: host configurations. One compromised host: models the internet. Goal: compromise one or several goal hosts.
52 Jörg Hoffmann Simulated Penetration Testing 14/49 Remarks History: A planning domain of this kind (less IT-level, also including physical actions like talking to somebody) was first proposed by [Boddy et al. (2005)]; used as a benchmark in IPC'08 and IPC'11. The presented encoding was proposed by [Lucangeli et al. (2010)]. Used commercially by Core Security in Core INSIGHT since 2010, running a variant of Metric-FF [Hoffmann (2003)]. Do Core Security's customers like this? I am told they do. In fact, they like it so much already that Core Security is very reluctant to invest money in making this better...
56 Jörg Hoffmann Simulated Penetration Testing 14/49 Remarks And now:... some remarks about the model.
57 Jörg Hoffmann Simulated Penetration Testing 15/49 Assumption (iii)
:precondition (and (compromised ?s) (connected ?s ?t)
  (has_OS ?t Windows) (has_OS_edition ?t Professional)
  (has_OS_servicepack ?t Sp2) (has_OS_version ?t WinXp)
  (has_architecture ?t I386) (has_service ?t ovtrcd))
:effect (and (compromised ?t) (increase (time) 10))
Which of the predicates are static? All except compromised.
60 Jörg Hoffmann Simulated Penetration Testing 16/49 Assumption (iv)
(:action HP_OpenView_Remote_Buffer_Overflow_Exploit
  :parameters (?s - host ?t - host)
  ...
  :effect (and (compromised ?t) (increase (time) 10)))
Are you missing something? There are no delete effects. The attack is monotonic (a growing set of attack assets) ⇒ delete-relaxed planning. Metric-FF solves this once in every search state... Generating an attack is polynomial-time; generating an optimal attack is NP-complete.
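The monotonicity observation is what makes attack generation polynomial-time: with no delete effects the set of true facts only grows, so a simple fixpoint suffices. A sketch in illustrative Python; `relaxed_reachable` and the action encoding are invented for this example.

```python
def relaxed_reachable(s0, actions, goal):
    """Delete-relaxed reachability: since actions only add facts, iterating
    until no action contributes anything new terminates after at most
    |facts| rounds, i.e. in polynomial time. Finding a *cheapest* relaxed
    plan is the NP-complete part."""
    facts = set(s0)
    changed = True
    while changed:
        changed = False
        for pre, add in actions:  # each action: (precondition set, add set)
            if pre <= facts and not add <= facts:
                facts |= add
                changed = True
    return goal <= facts
```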
64 Jörg Hoffmann Simulated Penetration Testing 17/49 Assumption (v)
:precondition (and (compromised ?s) (connected ?s ?t)
  (has_OS ?t Windows) (has_OS_edition ?t Professional)
  (has_OS_servicepack ?t Sp2) (has_OS_version ?t WinXp)
  (has_architecture ?t I386) (has_service ?t ovtrcd))
Which preconditions are not static? Just one: (compromised ?s). One positive precondition, one positive effect. Optimal attack planning for a single goal host = Dijkstra. A fixed number of goal hosts is polynomial-time [Bylander (1994)]. A scaling number of goal hosts = Steiner tree [Keyder and Geffner (2009)].
69 Jörg Hoffmann Simulated Penetration Testing 18/49 Concluding Remarks? Simulated Pentesting at Core Security = Dijkstra in the graph over network hosts, where weighted edges are defined as a function of configuration parameters and available exploits. Why they use planning & Metric-FF anyway: Extensibility to more fine-grained models of exploits, socio-technical aspects, detrimental side effects. Bounded sub-optimal search to suggest several solutions, not just a single optimal one. Quicker & cheaper than building a proprietary solver.
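The reduction claimed on this slide can be sketched directly: Dijkstra over the host graph, with edge weights standing in for exploit costs derived from the target's configuration. Illustrative Python only; the graph and the cost numbers are invented.

```python
import heapq

def cheapest_attack(edges, start, goal):
    """Dijkstra over network hosts. edges[u] lists (v, cost) pairs, where
    cost abstracts the cheapest applicable exploit from u against v's
    configuration. Returns the minimal attack cost, or infinity."""
    dist = {start: 0.0}
    queue = [(0.0, start)]
    while queue:
        d, u = heapq.heappop(queue)
        if u == goal:
            return d
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, c in edges.get(u, []):
            nd = d + c
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(queue, (nd, v))
    return float("inf")
```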
74 Jörg Hoffmann Simulated Penetration Testing 19/49 Agenda 1 What is this all about? 2 Classical Planning: The Core Security Model [Lucangeli et al. (2010)] 3 Attack Graphs 4 Towards Accuracy: POMDP Models [Sarraute et al. (2012)] 5 The MDP Middle Ground 6 A Model Taxonomy 7 And Now?
75 Jörg Hoffmann Simulated Penetration Testing 20/49 Attack Graphs in a Nutshell Community: application-oriented security. Approach: Describe attack actions by preconditions and effects. Identify/give an overview of dangerous action combinations. Example model (RSH Connection Spoofing):
requires: Trusted Partner TP with TP.service = RSH; Service Active SA with SA.service = RSH; ...
provides: push channel PSC with PSC.using := RSH; remote execution REX with REX.using := RSH; ...
77 Jörg Hoffmann Simulated Penetration Testing 21/49 Attack Graphs in a Nutshell, ctd. Brief overview of variants:
Who and When? | What? | Terminology
Schneier (1999); Templeton and Levitt (2000) | STRIPS actions | attack graph = action descriptions
Ritchey and Ammann (2000) | BDD model checking | attack graph = state space
Ammann et al. (2002) | Attacks are monotonic! |
Since then, e.g. Ammann et al. (2002); Noel et al. (2009) | Relaxed planning | attack graph = relaxed planning graph
Attack graphs = practical security-analysis tools based on variants of, and analyses on, relaxed planning graphs. An AI / attack-graphs community bridge could be quite useful...
83 Jörg Hoffmann Simulated Penetration Testing 22/49 Dimension (B): Action Model Two major dimensions for simulated pentesting models: (A) Uncertainty Model: up next. (B) Action Model: degree of interaction between individual attack components. Dimension (B) distinction lines: Explicit network graph: actions = hops from ?s to ?t; 1 positive precondition, 1 positive effect; state = subset of compromised hosts. Monotonic actions: the attacker can only gain new attack assets (installed software, access rights, knowledge such as passwords, etc.). General actions: no restrictions (STRIPS, in the simplest case); can model detrimental side effects.
85 Jörg Hoffmann Simulated Penetration Testing 23/49 Agenda 1 What is this all about? 2 Classical Planning: The Core Security Model [Lucangeli et al. (2010)] 3 Attack Graphs 4 Towards Accuracy: POMDP Models [Sarraute et al. (2012)] 5 The MDP Middle Ground 6 A Model Taxonomy 7 And Now?
86 Jörg Hoffmann Simulated Penetration Testing 24/49 An Additional Assumption...
87 Jörg Hoffmann Simulated Penetration Testing 24/49 Assumptions (i) and (ii) Known network graph: No uncertainty about network graph topology. Known host configurations: No uncertainty about host configurations.
88 Jörg Hoffmann Simulated Penetration Testing 25/49 An Overview Before We Begin... Uncertainty Model, Dimension (A): None: classical planning. CoreSec-Classical: Core Security's model, as seen. Assumptions (i)-(v). Uncertainty of action outcomes: MDPs. CoreSec-MDP: minimal extension of CoreSec-Classical. Assumptions (ii)-(viii). Uncertainty of state: POMDPs. CoreSec-POMDP: minimal extension of CoreSec-Classical. Assumptions (ii)-(vii).
89 Jörg Hoffmann Simulated Penetration Testing 26/49 Partially Observable MDP (POMDP) Definition: A POMDP is a tuple ⟨S, A, T, O, O, b0⟩: S states, A actions, O observations. T(s, a, s'): probability of reaching state s' when executing action a in state s. O(s, a, o): probability of making observation o when executing action a in state s. b0: initial belief, a probability distribution over S. Respectively, some (possibly factored) description thereof. I'll discuss optimization objectives later on. For now, assume observable goal states SG, minimizing undiscounted expected cost-to-goal in a Stochastic Shortest Path (SSP) formulation.
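The belief b0 evolves by Bayesian updating as actions are executed and observations made. A generic sketch of the update b'(s') ∝ O(s', a, o) · Σ_s T(s, a, s') b(s), in illustrative Python; the slide's O(s, a, o) leaves open whether the observation is conditioned on the pre- or post-state, so the successor state is assumed here.

```python
def belief_update(belief, action, obs, states, T, O):
    """One POMDP belief update. belief: dict state -> probability.
    T(s, a, s2) and O(s2, a, o) are caller-supplied probability functions."""
    new = {}
    for s2 in states:
        # Predict: push the belief through the transition model, then
        # weight by the likelihood of the received observation.
        p = O(s2, action, obs) * sum(T(s, action, s2) * b
                                     for s, b in belief.items())
        if p > 0:
            new[s2] = p
    z = sum(new.values())  # normalization constant
    return {s: p / z for s, p in new.items()} if z else new
```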
91 Jörg Hoffmann Simulated Penetration Testing 27/49 The Basic Problem
92 Jörg Hoffmann Simulated Penetration Testing 27/49 The Basic Idea [Sarraute et al. (2012)]
93 Jörg Hoffmann Simulated Penetration Testing 28/49 States: H0-win2000, H0-win2000-p445, H0-win2000-p445-SMB, H0-win2000-p445-SMB-vuln, H0-win2000-p445-SMB-agent, H0-win2003, H0-win2003-p445, H0-win2003-p445-SMB, H0-win2003-p445-SMB-vuln, H0-win2003-p445-SMB-agent, H0-winXPsp2, H0-winXPsp2-p445, H0-winXPsp2-p445-SMB, H0-winXPsp2-p445-SMB-vuln, H0-winXPsp2-p445-SMB-agent, terminal. Legend: H0: the host. winXXX: the OS. p445: is port 445 open? SMB: if so, a SAMBA server? vuln: is the SAMBA server vulnerable? agent: has the attacker exploited that vulnerability yet? terminal: the attacker has given up.
94 Jörg Hoffmann Simulated Penetration Testing 29/49 Assumptions (vi) and (vii) Succeed-or-nothing: exploits have only two possible outcomes, succeed or fail; fail has an empty effect. Abstraction mainly regarding detrimental side effects. Configuration-deterministic actions: the action outcome depends deterministically on the network configuration. Abstraction only in case of more fine-grained dependencies.
96 Jörg Hoffmann Simulated Penetration Testing 30/49 Exploit Actions Same syntax:
(:action HP_OpenView_Remote_Buffer_Overflow_Exploit
  :parameters (?s - host ?t - host)
  :precondition (and (compromised ?s) (connected ?s ?t)
    (has_OS ?t Windows) (has_OS_edition ?t Professional)
    (has_OS_servicepack ?t Sp2) (has_OS_version ?t WinXp)
    (has_architecture ?t I386) (has_service ?t ovtrcd))
  :effect (and (compromised ?t) (increase (time) 10)))
... but with a different semantics. Consider a transition s, a, s' with observation o:
T(s, a, s') = 1 if s ⊨ pre(a) and s' = appl(s, a); 1 if s ⊭ pre(a) and s' = s; 0 otherwise.
O(s', a, o) = 1 if s ⊨ pre(a), s' = appl(s, a), and o = success; 1 if s ⊭ pre(a), s' = s, and o = fail; 0 otherwise.
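Under assumptions (vi) and (vii) this semantics collapses to a deterministic function of the hidden configuration; a minimal sketch in illustrative Python (names invented):

```python
def exploit_outcome(state, precond, effect):
    """Succeed-or-nothing exploit semantics: if the hidden state satisfies
    pre(a), the effect is applied and 'success' is observed; otherwise the
    state is unchanged and 'fail' is observed. States and conditions are
    modeled as fact sets."""
    if precond <= state:
        return state | effect, "success"
    return state, "fail"
```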
98 Jörg Hoffmann Simulated Penetration Testing 31/49 Sensing Actions Example:
(:action OS_Detect
  :parameters (?s - host ?t - host)
  :precondition (and (compromised ?s) (connected ?s ?t))
  :observe (and (when (has_OS ?t Windows2000) (win))
                (when (has_OS ?t Windows2003) (win))
                (when (has_OS ?t WindowsXPsp2) (winxp))
                (when (has_OS ?t WindowsXPsp3) (winxp))))
Network reconnaissance also satisfies the benign assumption: a non-injective but deterministic function of the configuration.
100 Jörg Hoffmann Simulated Penetration Testing 32/49 So, we're done, right? Computation! But: can use the single-machine case + decomposition. [Figure: decomposition of the network graph into components and attack paths between them, omitted.]
103 Jörg Hoffmann Simulated Penetration Testing 32/49 So, we're done, right? Modeling! But: can we use the outcome of standard scanning scripts?
105 Jörg Hoffmann Simulated Penetration Testing 33/49 Agenda 1 What is this all about? 2 Classical Planning: The Core Security Model [Lucangeli et al. (2010)] 3 Attack Graphs 4 Towards Accuracy: POMDP Models [Sarraute et al. (2012)] 5 The MDP Middle Ground 6 A Model Taxonomy 7 And Now?
106 Jörg Hoffmann Simulated Penetration Testing 34/49 Markov Decision Process (MDP) Definition: An MDP is a tuple ⟨S, A, T, s0⟩: S states, A actions. T(s, a, s'): probability of reaching state s' when executing action a in state s. s0: initial state. Respectively, some (possibly factored) description thereof. I'll discuss optimization objectives later on. For now, assume goal states SG, minimizing undiscounted expected cost-to-goal in a Stochastic Shortest Path (SSP) formulation.
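The SSP objective can be computed by value iteration on the Bellman equation V(s) = min_a [c(s, a) + Σ_s' T(s, a, s') V(s')], with V = 0 at goal states. A sketch under the usual SSP assumptions (a proper policy exists); illustrative Python, not code from any planner mentioned here.

```python
def ssp_value_iteration(states, actions, T, cost, goals, iters=1000):
    """Undiscounted expected cost-to-goal. actions(s) lists applicable
    actions, T(s, a) returns a dict successor -> probability, cost(s, a)
    the action cost. Runs a fixed number of sweeps for simplicity."""
    V = {s: 0.0 for s in states}
    for _ in range(iters):
        for s in states:
            if s in goals:
                continue  # goal states cost nothing
            V[s] = min(cost(s, a) +
                       sum(p * V[s2] for s2, p in T(s, a).items())
                       for a in actions(s))
    return V
```

For instance, a single exploit that succeeds with probability 0.5 at cost 10, retried until it works, yields an expected cost-to-goal of 10/0.5 = 20.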
107 Jörg Hoffmann Simulated Penetration Testing 35/49 The Basic Idea
(:action HP_OpenView_Remote_Buffer_Overflow_Exploit
  :parameters (?s - host ?t - host)
  :precondition (and (compromised ?s) (connected ?s ?t)
    (has_OS ?t Windows) (has_OS_edition ?t Professional)
    (has_OS_servicepack ?t Sp2) (has_OS_version ?t WinXp)
    (has_architecture ?t I386) (has_service ?t ovtrcd))
  :effect (and (compromised ?t) (increase (time) 10)))
becomes:
(:action HP_OpenView_Remote_Buffer_Overflow_Exploit
  :parameters (?s - host ?t - host)
  :precondition (and (compromised ?s) (connected ?s ?t))
  :effect (and (probabilistic 0.3 (compromised ?t)) (increase (time) 10)))
108 Jörg Hoffmann Simulated Penetration Testing 36/49 How to Obtain Action Outcome Probabilities? The outcome occurs iff φ(host configurations) ⇒ outcome probability P(φ(host configurations), b0) ⇒ we just need the success probability as a function of the host configurations in b0 ⇒ use Core Security's success statistics as the success probabilities.
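The derivation above amounts to marginalizing the exploit's condition φ over the initial belief. A one-line sketch in illustrative Python; the belief over configurations and the predicate are invented for the example.

```python
def success_probability(belief, phi):
    """P(phi, b0): total probability mass of the configurations in the
    initial belief b0 that satisfy the exploit's condition phi. A success
    statistic over similar observed configurations plays exactly this
    role in the MDP model."""
    return sum(p for config, p in belief.items() if phi(config))
```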
112 Jörg Hoffmann Simulated Penetration Testing 37/49 MDP vs. POMDP Where did we cheat on the previous slide?
113 Jörg Hoffmann Simulated Penetration Testing 37/49 MDP vs. POMDP Where did we cheat on the previous slide? = outcome prob P (φ(host configs), b 0 ) b 0 just captures the attacker s initial knowledge.
114 Jörg Hoffmann Simulated Penetration Testing 37/49 MDP vs. POMDP Where did we cheat on the previous slide? = outcome prob P (φ(host configs), b 0 ) b 0 just captures the attacker s initial knowledge. Hence: Inability to learn. Success probabilities develop with knowledge in the POMDP, but remain constant in the MDP.
115 Jörg Hoffmann Simulated Penetration Testing 37/49 MDP vs. POMDP Where did we cheat on the previous slide? The outcome probability is P(φ(host configs), b0), but b0 just captures the attacker's initial knowledge. Hence: Inability to learn. Success probabilities develop with knowledge in the POMDP, but remain constant in the MDP. (But: Maintain flags for partial belief-tracking in the MDP?)
120 Jörg Hoffmann Simulated Penetration Testing 38/49 Assumption (viii) Assume that ?t doesn't have the required configuration:

(:action HP_OpenView_Remote_Buffer_Overflow_Exploit
 :parameters (?s - host ?t - host)
 :precondition (and (compromised ?s) (connected ?s ?t))
 :effect (and (probabilistic 0.3 (compromised ?t)) (increase (time) 10)))

The probability of eventually breaking into ?t is then 1. This contradicts our benign assumptions (iii) and (vii). Hence the apply-once constraint: each exploit may be applied, on each target host, at most once.
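The arithmetic behind the apply-once constraint can be checked directly: if an exploit with per-attempt success probability 0.3 may be repeated indefinitely, the probability of eventually compromising ?t tends to 1, whereas with apply-once it stays at 0.3. A minimal sketch:

```python
# Without apply-once: n independent attempts, each succeeding with probability p.
p = 0.3

def p_eventual_success(p, n):
    """Probability that at least one of n attempts succeeds: 1 - (1-p)^n."""
    return 1 - (1 - p) ** n

print(p_eventual_success(p, 1))   # ~0.3: the apply-once value
print(p_eventual_success(p, 50))  # ~0.99999998: repetition drives it to 1,
                                  # contradicting the intended 0.3 semantics
```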
121 Jörg Hoffmann Simulated Penetration Testing 39/49 Agenda 1 What is this all about? 2 Classical Planning: The Core Security Model [Lucangeli et al. (2010)] 3 Attack Graphs 4 Towards Accuracy: POMDP Models [Sarraute et al. (2012)] 5 The MDP Middle Ground 6 A Model Taxonomy 7 And Now?
124 Jörg Hoffmann Simulated Penetration Testing 40/49 A Model Taxonomy

[Figure: 3x3 taxonomy. Rows, axis (A) Uncertainty Model: States PO / Action Outcomes / None. Columns, axis (B) Action Model: Explicit Network Graph / Monotonic Actions / General Actions.
- States PO: CHP POMDP (i)-(iii),(viii) (CoreSec POMDP); Attack-Asset POMDP (i)-(iv),(vi),(viii); Factored POMDP (i)-(iii),(vii),(viii), the current POMDP model (Sarraute et al. 2012).
- Action Outcomes: Canadian Hacker Problem (CHP) (i)-(iii),(viii); Attack-Asset MDP (i)-(iv),(vi),(viii) (CoreSec MDP, Durkota and Lisy 2014); Factored MDP (i)-(iii),(vii),(viii).
- None: Graph Distance (i)-(v), Attack Graphs, e.g. (Ammann et al. 2002); Delete-Relaxed Classical Planning (i)-(iv); Classical Planning (i)-(iii), CoreSec Classical (Lucangeli et al. 2010), CyberSecurity (Boddy et al. 2005).]
128 Jörg Hoffmann Simulated Penetration Testing 41/49 The 3rd Dimension Three major dimensions for simulated pentesting models: (A) Uncertainty Model. (B) Action Model. (C) Optimization objective: What is the attacker trying to achieve? Options: Finite-horizon: OK. But: Offline problem, the horizon is not meaningful unless for the overall attack (see below). Maximize discounted reward: OK. But: Discounting is unintuitive. And who's to set the rewards? Minimize non-discounted expected cost-to-goal (SSP): Seems good. Non-0 action costs, give-up action. But: Give-up cost? Limited-budget goal probability maximization (MaxProb): My favorite. Non-0 action costs, give-up action, hence a finite-runs SSP. No "but" I can think of.
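Limited-budget goal probability maximization can be prototyped on a toy attack model: maximize the probability of reaching the goal before the cost budget runs out, with apply-once actions and a give-up option. The model, precondition, and numbers below are invented for illustration, not taken from the talk.

```python
from functools import lru_cache

# Toy attack model: actions = (name, success_prob, cost, gained_fact).
ACTIONS = (
    ("exploit_web", 0.6, 4, "web"),
    ("exploit_db",  0.3, 4, "db"),
)
GOAL = "db"

@lru_cache(maxsize=None)
def max_goal_prob(facts, remaining, budget):
    """Best achievable goal probability given gained facts, untried actions,
    and remaining cost budget (apply-once, give-up always allowed)."""
    if GOAL in facts:
        return 1.0
    best = 0.0  # giving up yields goal probability 0
    for i, (name, p, c, gain) in enumerate(remaining):
        if c > budget:
            continue  # budget exhausted for this action
        if gain == "db" and "web" not in facts:
            continue  # illustrative precondition: db reachable only via web
        rest = remaining[:i] + remaining[i + 1:]
        succ = max_goal_prob(facts | {gain}, rest, budget - c)
        fail = max_goal_prob(facts, rest, budget - c)
        best = max(best, p * succ + (1 - p) * fail)
    return best

print(max_goal_prob(frozenset(), ACTIONS, 8))  # ~0.18 = 0.6 * 0.3
print(max_goal_prob(frozenset(), ACTIONS, 4))  # 0.0: budget too small for both steps
```

With budget 8 the optimal policy chains both exploits; with budget 4 no goal-reaching run fits the budget, so the best achievable goal probability is 0.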
129 Jörg Hoffmann Simulated Penetration Testing 42/49 The Interesting Sub-Classes [Figure: the taxonomy restricted to the Explicit Network Graph and Monotonic Actions columns: CHP POMDP (CoreSec POMDP), Attack-Asset POMDP (i)-(iv),(vi),(viii), the Canadian Hacker Problem (CHP) (i)-(iii),(viii), and the Attack-Asset MDP (CoreSec MDP, Durkota and Lisy 2014).]
130 Jörg Hoffmann Simulated Penetration Testing 42/49 The Interesting Sub-Classes we will Discuss [Figure: highlights the Canadian Hacker Problem (CHP) (i)-(iii),(viii) and the Attack-Asset MDP (i)-(iv),(vi),(viii) (CoreSec MDP, Durkota and Lisy 2014).]
131 Jörg Hoffmann Simulated Penetration Testing 42/49 We Start With: [Figure: highlights the Canadian Hacker Problem (CHP) (i)-(iii),(viii): action-outcome uncertainty over an explicit network graph (CoreSec MDP).]
132 Jörg Hoffmann Simulated Penetration Testing 43/49 The Canadian Traveller Problem
133 Jörg Hoffmann Simulated Penetration Testing 44/49 The Canadian Hacker Problem [Figure sequence: Canadian Traveller Problem + action-outcome uncertainty = the Canadian Hacker Problem; followed by an animation of an attack unfolding across the network graph.]
145 Jörg Hoffmann Simulated Penetration Testing 45/49 The Canadian Hacker Problem: And Now? Wrap-up: Variant of the Canadian Traveller Problem where we have a monotonically growing set of nodes ("no need to drive back"). Research Challenges/Opportunities: 1001 CTP papers to be adapted to this...
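The "monotonically growing set of nodes" variant can be prototyped directly: a state is (set of compromised hosts, set of untried edges), each edge is a probabilistic hop that may be attempted once, and we minimize expected cost with a give-up option. The network, probabilities, and costs below are invented for illustration.

```python
from functools import lru_cache

# Edges: (source, target, success_prob, attempt_cost). Invented toy network.
EDGES = (
    ("internet", "web", 0.8, 2.0),
    ("internet", "dmz", 0.5, 1.0),
    ("web", "db", 0.4, 3.0),
    ("dmz", "db", 0.6, 3.0),
)
GOAL, GIVE_UP = "db", 100.0

@lru_cache(maxsize=None)
def exp_cost(owned, remaining):
    """Minimum expected cost to compromise GOAL. 'No need to drive back':
    the set of owned hosts only ever grows."""
    if GOAL in owned:
        return 0.0
    best = GIVE_UP  # giving up incurs a fixed penalty
    for i, (s, t, p, c) in enumerate(remaining):
        if s not in owned or t in owned:
            continue  # hop not enabled, or target already compromised
        rest = remaining[:i] + remaining[i + 1:]
        val = c + p * exp_cost(owned | {t}, rest) + (1 - p) * exp_cost(owned, rest)
        best = min(best, val)
    return best

print(exp_cost(frozenset({"internet"}), EDGES))
```

Because owned sets only grow and each edge is consumed on attempt, the recursion terminates; memoization over (owned, remaining) keeps the toy instance cheap.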
146 Jörg Hoffmann Simulated Penetration Testing 46/49 Attack-Asset MDPs [Figure: taxonomy cell highlighted: Attack-Asset MDP (i)-(iv),(vi),(viii), action-outcome uncertainty with monotonic actions (Durkota and Lisy 2014).]
149 Jörg Hoffmann Simulated Penetration Testing 46/49 Attack-Asset MDPs Definition: An Attack-Asset MDP is a tuple ⟨P, A, s0, G⟩: P: set of facts (Boolean state variables). A: set of actions a, each a tuple ⟨pre(a), add(a), p(a), c(a)⟩ of precondition, add list, success probability, and non-negative cost. s0: initial state; G: goal. The probabilistic transitions T arise from these rules: States: ⟨STRIPS state s, available actions A' ⊆ A⟩. a is applicable to (s, A') if pre(a) ⊆ s and a ∈ A'. With probability p(a) we obtain s' = s ∪ add(a), and with probability 1 − p(a) we obtain s' = s. In both cases, we pay cost c(a), and remove a from A'.
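The transition rule can be transcribed almost literally into code. A sketch (class and function names are mine, not from the talk):

```python
import random

# An Attack-Asset MDP action: the tuple <pre, add, p, c>.
class Action:
    def __init__(self, pre, add, p, c):
        self.pre, self.add, self.p, self.c = frozenset(pre), frozenset(add), p, c

def applicable(action, state, available):
    """a is applicable to (s, A) iff pre(a) is a subset of s and a is in A."""
    return action in available and action.pre <= state

def step(action, state, available, rng=random):
    """Apply a: with probability p(a), s' = s union add(a); otherwise s' = s.
    In both cases pay c(a) and remove a from the available set (apply-once)."""
    assert applicable(action, state, available)
    available = available - {action}
    if rng.random() < action.p:
        state = state | action.add
    return state, available, action.c
```

A quick check: an exploit with p = 0.3 adds its compromise fact on the success branch, leaves the state unchanged on the failure branch, and in both branches costs 10 and becomes unavailable.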
158 Jörg Hoffmann Simulated Penetration Testing 47/49 Attack-Asset MDPs: And Now? Wrap-up: Probabilistic delete-free STRIPS with success probabilities, no effect in case of failure, each action applicable at most once. Research Challenges/Opportunities: E.g. determinization: Only two outcomes, of which one is "nothing happens". Every probabilistic action yields a single deterministic action. These deterministic actions have no delete effects. Weak plans and determinization heuristics = standard delete-relaxation heuristics. Landmark action outcomes = deterministic delete-relaxation landmarks. Limited-budget goal probability maximization: landmarks reduce the budget à la [Mirkis and Domshlak (2014)].
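Since the determinized actions are delete-free, a standard delete-relaxation heuristic applies unchanged. A minimal additive-cost (h^add-style) sketch, over an invented toy task:

```python
# h_add-style delete-relaxation heuristic over determinized (delete-free)
# actions. Each action: (preconditions, adds, cost). Toy task for illustration.
ACTIONS = [
    (frozenset(), frozenset({"foothold"}), 1.0),
    (frozenset({"foothold"}), frozenset({"web"}), 2.0),
    (frozenset({"web"}), frozenset({"db"}), 3.0),
]

def h_add(state, goal, actions):
    """Additive delete-relaxation heuristic: fixpoint over fact costs,
    where an action's cost is its own cost plus its preconditions' costs."""
    cost = {f: 0.0 for f in state}
    changed = True
    while changed:
        changed = False
        for pre, adds, c in actions:
            if all(p in cost for p in pre):
                new = c + sum(cost[p] for p in pre)
                for f in adds:
                    if new < cost.get(f, float("inf")):
                        cost[f] = new
                        changed = True
    if any(g not in cost for g in goal):
        return float("inf")
    return sum(cost[g] for g in goal)

print(h_add(frozenset(), {"db"}, ACTIONS))  # 6.0: chain foothold -> web -> db
```

Here the estimate accumulates the whole chain (1 + (2+1) + (3+3) = 6 via the additive propagation), and an unreachable goal fact correctly yields infinity.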
159 Jörg Hoffmann Simulated Penetration Testing 48/49 Agenda 1 What is this all about? 2 Classical Planning: The Core Security Model [Lucangeli et al. (2010)] 3 Attack Graphs 4 Towards Accuracy: POMDP Models [Sarraute et al. (2012)] 5 The MDP Middle Ground 6 A Model Taxonomy 7 And Now?
167 Jörg Hoffmann Simulated Penetration Testing 49/49 An AI (Sequential Decision Making) Challenge Realistically simulate a human hacker! Model and algorithm design in a wide space of relevant complexity/accuracy trade-offs. (Sorry Scott: best modeled in PPDDL, at least the MDP variants.) Diverse attacks, meta-criteria, situation report, suggest fixes. Ultimately, an AI-complete problem.
168 References Thanks for Your Attention!... and enjoy the old city tour.
169 References References I
Paul Ammann, Duminda Wijesekera, and Saket Kaushik. Scalable, graph-based network vulnerability analysis. In ACM Conference on Computer and Communications Security, 2002.
Mark Boddy, Jonathan Gohde, Tom Haigh, and Steven Harp. Course of action generation for cyber security using classical planning. In Susanne Biundo, Karen Myers, and Kanna Rajan, editors, Proceedings of the 15th International Conference on Automated Planning and Scheduling (ICAPS-05), pages 12-21, Monterey, CA, USA, 2005. Morgan Kaufmann.
Rainer Böhme and Márk Félegyházi. Optimal information security investment with penetration testing. In Proceedings of the 1st International Conference on Decision and Game Theory for Security (GameSec'10), pages 21-37, 2010.
Tom Bylander. The computational complexity of propositional STRIPS planning. Artificial Intelligence, 69(1-2), 1994.
Russell Greiner, Ryan Hayward, Magdalena Jankowska, and Michael Molloy. Finding optimal satisficing strategies for and-or trees. Artificial Intelligence, 170(1):19-58, 2006.
170 References References II
Russell Greiner. Finding optimal derivation strategies in redundant knowledge bases. Artificial Intelligence, 50(1):95-115, 1991.
Jörg Hoffmann. The Metric-FF planning system: Translating "ignoring delete lists" to numeric state variables. Journal of Artificial Intelligence Research, 20:291-341, 2003.
Markus Huber, Stewart Kowalski, Marcus Nohlberg, and Simon Tjoa. Towards automating social engineering using social networking sites. In Proceedings of the 12th IEEE International Conference on Computational Science and Engineering (CSE'09). IEEE Computer Society, 2009.
Emil Keyder and Hector Geffner. Trees of shortest paths vs. Steiner trees: Understanding and improving delete relaxation heuristics. In C. Boutilier, editor, Proceedings of the 21st International Joint Conference on Artificial Intelligence (IJCAI 2009), Pasadena, California, USA, July 2009. Morgan Kaufmann.
Barbara Kordy, Sjouke Mauw, Sasa Radomirovic, and Patrick Schweitzer. Foundations of attack-defense trees. In Proceedings of the 7th International Workshop on Formal Aspects in Security and Trust (FAST'10), pages 80-95, 2010.
171 References References III
Barbara Kordy, Piotr Kordy, Sjouke Mauw, and Patrick Schweitzer. ADTool: Security analysis with attack-defense trees. In Proceedings of the 10th International Conference on Quantitative Evaluation of Systems (QEST'13), 2013.
Viliam Lisý and Radek Píbil. Computing optimal attack strategies using unconstrained influence diagrams. In Pacific Asia Workshop on Intelligence and Security Informatics, pages 38-46, 2013.
Jorge Lucangeli, Carlos Sarraute, and Gerardo Richarte. Attack planning in the real world. In Proceedings of the 2nd Workshop on Intelligent Security (SecArt'10), 2010.
Vitaly Mirkis and Carmel Domshlak. Landmarks in oversubscription planning. In Thorsten Schaub, editor, Proceedings of the 21st European Conference on Artificial Intelligence (ECAI'14), Prague, Czech Republic, August 2014. IOS Press.
Steven Noel, Matthew Elder, Sushil Jajodia, Pramod Kalapa, Scott O'Hare, and Kenneth Prole. Advances in topological vulnerability analysis. In Proceedings of the 2009 Cybersecurity Applications & Technology Conference for Homeland Security (CATCH'09), 2009.
172 References References IV
Ronald W. Ritchey and Paul Ammann. Using model checking to analyze network vulnerabilities. In IEEE Symposium on Security and Privacy, 2000.
Carlos Sarraute, Olivier Buffet, and Jörg Hoffmann. POMDPs make better hackers: Accounting for uncertainty in penetration testing. In Jörg Hoffmann and Bart Selman, editors, Proceedings of the 26th AAAI Conference on Artificial Intelligence (AAAI'12), Toronto, ON, Canada, July 2012. AAAI Press.
Bruce Schneier. Attack trees. Dr. Dobb's Journal, 1999.
Milind Tambe. Security and Game Theory: Algorithms, Deployed Systems, Lessons Learned. Cambridge University Press, 2011.
Steven J. Templeton and Karl E. Levitt. A requires/provides model for computer attacks. In Proceedings of the Workshop on New Security Paradigms (NSPW'00), pages 31-38, 2000.
174 References Attack Trees Community: Application-oriented security, some academic research. Approach: Graphical Security Models. Organize known possible attacks by top-down refinement over attack actions and sub-actions. On the side: Many attack tree models are equivalent to AI formula evaluation [e.g. Greiner (1991); Greiner et al. (2006)]. Apparently unnoticed by both communities; pointed out by Lisý and Píbil (2013).
177 References Dimension (B): In Other Words Explicit Network Graph: Actions = hops from ?s to ?t. Monotonic actions: Attacker can only gain new attack assets. General actions: No restrictions. Note that (v) implies (iv), and each of (iv) and (v) implies (iii).
178 References Dimension (B) Assumptions: Overview Explicit Network Graph: Actions = hops from ?s to ?t. Relax: More general attack assets (software/passwords...). Monotonic actions: Attacker can only gain new attack assets. Relax: E.g. detrimental side effects, crashing the host. Static network: Host connections & configurations not affected. Relax: E.g. detrimental side effects, crashing the host.
184 References Game-Theoretic Models What about modeling the defender? My 5 cents: How to get realistic models? Is a network intrusion actually a game? Typically mentioned, if at all, as "detection risk", i.e. a potential detrimental side effect of an attack action. GameSec series. Böhme and Félegyházi (2010) introduce a model of the entire pentesting life cycle, and prove that pentesting pays off. Attack-defense trees [Kordy et al. (2010, 2013)]. Security games (e.g. Tambe (2011)): Completely different application.
PCI Security Scan Procedures Version 1.0 December 2004 Disclaimer The Payment Card Industry (PCI) is to be used as a guideline for all entities that store, process, or transmit Visa cardholder data conducting
CyberNEXS Global Services
CyberNEXS Global Services CYBERSECURITY A cyber training, exercising, competition and certification product for maximizing the cyber skills of your workforce The Cyber Network EXercise System CyberNEXS
Outline. Outline. Outline
Network Forensics: Network Prefix Scott Hand September 30 th, 2011 1 What is network forensics? 2 What areas will we focus on today? Basics Some Techniques What is it? OS fingerprinting aims to gather
Metasploit The Elixir of Network Security
Metasploit The Elixir of Network Security Harish Chowdhary Software Quality Engineer, Aricent Technologies Shubham Mittal Penetration Testing Engineer, Iviz Security And Your Situation Would Be Main Goal
Aiming at Higher Network Security Levels Through Extensive PENETRATION TESTING. Anestis Bechtsoudis. http://bechtsoudis.com abechtsoudis (at) ieee.
Aiming at Higher Network Security Levels Through Extensive PENETRATION TESTING Anestis Bechtsoudis http://bechtsoudis.com abechtsoudis (at) ieee.org Athena Summer School 2011 Course Goals Highlight modern
CRYPTUS DIPLOMA IN IT SECURITY
CRYPTUS DIPLOMA IN IT SECURITY 6 MONTHS OF TRAINING ON ETHICAL HACKING & INFORMATION SECURITY COURSE NAME: CRYPTUS 6 MONTHS DIPLOMA IN IT SECURITY Course Description This is the Ethical hacking & Information
NERC CIP VERSION 5 COMPLIANCE
BACKGROUND The North American Electric Reliability Corporation (NERC) Critical Infrastructure Protection (CIP) Reliability Standards define a comprehensive set of requirements that are the basis for maintaining
Network and Host-based Vulnerability Assessment
Network and Host-based Vulnerability Assessment A guide for information systems and network security professionals 6600 Peachtree-Dunwoody Road 300 Embassy Row Atlanta, GA 30348 Tel: 678.443.6000 Toll-free:
A Framework for Analysis A Network Vulnerability
A Framework for Analysis A Tito Waluyo Purboyo 1, Kuspriyanto 2 1,2 School of Electrical Engineering & Informatics, Institut Teknologi Bandung Jl. Ganesha 10 Bandung 40132, Indonesia Abstract: administrators
Considerations In Developing Firewall Selection Criteria. Adeptech Systems, Inc.
Considerations In Developing Firewall Selection Criteria Adeptech Systems, Inc. Table of Contents Introduction... 1 Firewall s Function...1 Firewall Selection Considerations... 1 Firewall Types... 2 Packet
SAST, DAST and Vulnerability Assessments, 1+1+1 = 4
SAST, DAST and Vulnerability Assessments, 1+1+1 = 4 Gordon MacKay Digital Defense, Inc. Chris Wysopal Veracode Session ID: Session Classification: ASEC-W25 Intermediate AGENDA Risk Management Challenges
Penetration Testing Service. By Comsec Information Security Consulting
Penetration Testing Service By Consulting February, 2007 Background The number of hacking and intrusion incidents is increasing year by year as technology rolls out. Equally, there is no hiding place your
Optimal Network Security Hardening Using Attack Graph Games
Optimal Network Security Hardening Using Attack Graph Games Karel Durkota Dept. of Computer Science FEE, CU in Prague [email protected] Viliam Lisý Dept. of Computer Science FEE, CU in Prague
Security Optimization of Dynamic Networks with Probabilistic Graph Modeling and Linear Programming
1 Security Optimization of Dynamic Networks with Probabilistic Graph Modeling and Linear Programming Hussain M.J. Almohri, Member, IEEE, Layne T. Watson, Danfeng (Daphne) Yao, Member, IEEE and Xinming
DeltaV System Cyber-Security
January 2013 Page 1 This paper describes the system philosophy and guidelines for keeping your DeltaV System secure from Cyber attacks. www.deltav.com January 2013 Page 2 Table of Contents Introduction...
Complete Web Application Security. Phase1-Building Web Application Security into Your Development Process
Complete Web Application Security Phase1-Building Web Application Security into Your Development Process Table of Contents Introduction 3 Thinking of security as a process 4 The Development Life Cycle
Detecting SQL Injection Vulnerabilities in Web Services
Detecting SQL Injection Vulnerabilities in Web Services Nuno Antunes, {nmsa, mvieira}@dei.uc.pt LADC 2009 CISUC Department of Informatics Engineering University of Coimbra Outline n Web Services n Web
Professional Penetration Testing Techniques and Vulnerability Assessment ...
Course Introduction Today Hackers are everywhere, if your corporate system connects to internet that means your system might be facing with hacker. This five days course Professional Vulnerability Assessment
Understanding Security Testing
Understanding Security Testing Choosing between vulnerability assessments and penetration testing need not be confusing or onerous. Arian Eigen Heald, M.A., Ms.IA., CNE, CISA, CISSP I. Introduction Many
North Dakota 2013 IT Security Audit Vulnerability Assessment & Penetration Test Project Briefing
North Dakota 2013 IT Security Audit Vulnerability Assessment & Penetration Test Project Briefing Introduction ManTech Project Manager Mark Shaw, Senior Executive Director Cyber Security Solutions Division
ADDING NETWORK INTELLIGENCE TO VULNERABILITY MANAGEMENT
ADDING NETWORK INTELLIGENCE INTRODUCTION Vulnerability management is crucial to network security. Not only are known vulnerabilities propagating dramatically, but so is their severity and complexity. Organizations
Course Outline Department of Computing Science Faculty of Science. COMP 3710-3 Applied Artificial Intelligence (3,1,0) Fall 2015
Course Outline Department of Computing Science Faculty of Science COMP 710 - Applied Artificial Intelligence (,1,0) Fall 2015 Instructor: Office: Phone/Voice Mail: E-Mail: Course Description : Students
Scalable, Graph-Based Network Vulnerability Analysis
Scalable, Graph-Based Network Vulnerability Analysis Paul Ammann ISE Department, MS 4A4 Center for Secure Inf. Sys. George Mason University Fairfax, VA 22030, U.S.A. +1 703 993 1660 [email protected] Duminda
GUIDE TO INFORMATION SECURITY TESTING AND ASSESSMENT
GUIDE TO INFORMATION SECURITY TESTING AND ASSESSMENT Shirley Radack, Editor Computer Security Division Information Technology Laboratory National Institute of Standards and Technology A comprehensive approach
Automatic Generation of Correlation Rules to Detect Complex Attack Scenarios
Automatic Generation of Correlation Rules to Detect Complex Attack Scenarios Erwan Godefroy, Eric Totel, Michel Hurfin, Frédéric Majorczyk To cite this version: Erwan Godefroy, Eric Totel, Michel Hurfin,
Adversary-Driven State-Based System Security Evaluation
Adversary-Driven State-Based System Security Evaluation Elizabeth LeMay, Willard Unkenholz, Donald Parks, Carol Muehrcke*, Ken Keefe, William H. Sanders Information Trust Institute, Coordinated Science
Experiences from Educating Practitioners in Vulnerability Analysis
Experiences from Educating Practitioners in Vulnerability Analysis Abstract. This paper presents experiences from a vulnerability analysis course especially developed for practitioners. The described course
Visualizing the Business Impact of Technical Cyber Risks
Visualizing the Business Impact of Technical Cyber Risks May 21, 2014 Henk Jonkers Senior Research Consultant, BiZZdesign Agenda Introduction and problem statement Enterprise Architecture with ArchiMate
Learn Ethical Hacking, Become a Pentester
Learn Ethical Hacking, Become a Pentester Course Syllabus & Certification Program DOCUMENT CLASSIFICATION: PUBLIC Copyrighted Material No part of this publication, in whole or in part, may be reproduced,
Vulnerability analysis
Vulnerability analysis License This work by Z. Cliffe Schreuders at Leeds Metropolitan University is licensed under a Creative Commons Attribution-ShareAlike 3.0 Unported License. Contents License Contents
Next-Generation Penetration Testing. Benjamin Mossé, MD, Mossé Security
Next-Generation Penetration Testing Benjamin Mossé, MD, Mossé Security About Me Managing Director of Mossé Security Creator of an Mossé Cyber Security Institute - in Melbourne +30,000 machines compromised
Modelling and Analysing Network Security Policies in a Given Vulnerability Setting
Modelling and Analysing Network Security Policies in a Given Vulnerability Setting Roland Rieke Fraunhofer Institute for Secure Information Technology SIT, Darmstadt, Germany [email protected] Abstract.
THE TOP 4 CONTROLS. www.tripwire.com/20criticalcontrols
THE TOP 4 CONTROLS www.tripwire.com/20criticalcontrols THE TOP 20 CRITICAL SECURITY CONTROLS ARE RATED IN SEVERITY BY THE NSA FROM VERY HIGH DOWN TO LOW. IN THIS MINI-GUIDE, WE RE GOING TO LOOK AT THE
Information Security Assessment and Testing Services RFQ # 28873 Questions and Answers September 8, 2014
QUESTIONS ANSWERS Q1 How many locations and can all locations be tested from a A1 5 locations and not all tests can be performed from a central location? central location. Q2 Connection type between location
0days: How hacking really works. V 1.0 Jan 29, 2005 Dave Aitel [email protected]
0days: How hacking really works V 1.0 Jan 29, 2005 Dave Aitel [email protected] Who am I? NSA->@stake->Immunity CEO of Immunity, Inc. Consulting (product assessments) Immunity CANVAS Immunity Partner's
How To Protect A Network From Attack From A Hacker (Hbss)
Leveraging Network Vulnerability Assessment with Incident Response Processes and Procedures DAVID COLE, DIRECTOR IS AUDITS, U.S. HOUSE OF REPRESENTATIVES Assessment Planning Assessment Execution Assessment
Distributed Denial of Service Attack Tools
Distributed Denial of Service Attack Tools Introduction: Distributed Denial of Service Attack Tools Internet Security Systems (ISS) has identified a number of distributed denial of service tools readily
FIREWALLS & NETWORK SECURITY with Intrusion Detection and VPNs, 2 nd ed. Chapter 4 Finding Network Vulnerabilities
FIREWALLS & NETWORK SECURITY with Intrusion Detection and VPNs, 2 nd ed. Chapter 4 Finding Network Vulnerabilities Learning Objectives Name the common categories of vulnerabilities Discuss common system
Information Security Attack Tree Modeling for Enhancing Student Learning
Information Security Attack Tree Modeling for Enhancing Student Learning Jidé B. Odubiyi, Computer Science Department Bowie State University, Bowie, MD and Casey W. O Brien, Network Technology Department
Goal Oriented Pentesting Getting the most value out of Penetration Testing
Goal Oriented Pentesting Getting the most value out of Penetration Testing June 15, 2010 About me Joshua Jabra Abraham Joined Rapid7 in 2006 as a Security Consultant Extensive IT Security and Auditing
Microsoft Technologies
NETWORK ENGINEERING TRACK Microsoft Technologies QUARTER 1 DESKTOP APPLICATIONS - ESSENTIALS Module 1 - Office Applications This subject enables users to acquire the necessary knowledge and skills to use
VEA-bility Analysis of Network Diversification
VEA-bility Analysis of Network Diversification Melanie Tupper Supervised by Nur Zincir-Heywood Faculty of Computer Science, Dalhousie University [email protected] [email protected] August 31, 2007 Abstract:
Overview of Network Security The need for network security Desirable security properties Common vulnerabilities Security policy designs
Overview of Network Security The need for network security Desirable security properties Common vulnerabilities Security policy designs Why Network Security? Keep the bad guys out. (1) Closed networks
Software Modeling and Verification
Software Modeling and Verification Alessandro Aldini DiSBeF - Sezione STI University of Urbino Carlo Bo Italy 3-4 February 2015 Algorithmic verification Correctness problem Is the software/hardware system
An Analytical Framework for Measuring Network Security using Exploit Dependency Graph
An Analytical Framework for Measuring Network Security using Exploit Dependency Graph Parantapa Bhattacharya Department of Computer Science and Engineering Indian Institute of Technology, Kharagpur Email:
Intrusion Detection and Cyber Security Monitoring of SCADA and DCS Networks
Intrusion Detection and Cyber Security Monitoring of SCADA and DCS Networks Dale Peterson Director, Network Security Practice Digital Bond, Inc. 1580 Sawgrass Corporate Parkway, Suite 130 Sunrise, FL 33323
Topological Vulnerability Analysis
Topological Vulnerability Analysis Sushil Jajodia and Steven Noel Traditionally, network administrators rely on labor-intensive processes for tracking network configurations and vulnerabilities. This requires
