Artificial Intelligence Lecture 2


Representation: from AI admirers to AI programmers.
Step 1: Represent the problem so that it is computer-friendly.
Step 2: Code the problem in a programming language.
Step 3: Develop/code an algorithm to find a solution.
Step 4: Represent the solution so that it is human-friendly.

What is Artificial Intelligence? Representation...

Blind search: if we have no extra information to guide the search.
- Depth-first search
- Breadth-first search
- Iterative deepening

What is Artificial Intelligence? But you can still do some cool things.
The water jug problem: suppose you are given one 3L jug and one 4L jug. You also have a tap with which you can fill the jugs.
Goal: get exactly 2L in the 4L jug.

Representation. Step 1: Representing the problem for a machine. We represent the amount of water in the jugs with (X,Y).
1. (X,Y) -> (4,Y): Fill the 4-liter jug.
2. (X,Y) -> (X,3): Fill the 3-liter jug.
3. (X,Y) -> (0,Y): Empty the 4-liter jug.
4. (X,Y), if X+Y >= 4 and Y > 0, -> (4, Y-(4-X)): Fill the 4-liter jug with water from the 3-liter jug.

Representation

Representation in Python:

    # Each state is a tuple (x,y) of water
    def nextstates(current_state):
        x, y = current_state
        states = [(4, y), (x, 3), (0, y), (x, 0)]
        if x + y >= 4:
            # Fill 4 liter jug from the 3 liter one
            states = states + [(4, y - (4 - x))]
        else:
            # Pour everything from the 3 liter to the 4 liter one
            states = states + [(x + y, 0)]
        if x + y >= 3:
            # Fill the 3 liter jug from the four liter one
            states = states + [(x - (3 - y), 3)]
        else:
            # Pour everything from the 4 litre to the 3 litre jug
            states = states + [(0, x + y)]
        # Remove duplicate states
        return list(set(states))

Representation in Python:

    >>> nextstates( (0,0) )
    [(0, 3), (0, 0), (4, 0)]
    >>> nextstates( (0, 3) )
    [(3, 0), (0, 3), (0, 0), (4, 3)]
    >>> nextstates( (3, 0) )
    [(3, 0), (0, 3), (0, 0), (3, 3), (4, 0)]

[Figure: part of the state space for the water jug problem, showing states such as (4,2), (0,2), and (2,0).]

Silly implementation (don't do this): just try everything until it works.

    import random

    def silly( state, goal ):
        visited_states = [ state ]
        while state != goal:
            choices = nextstates(state)
            # Pick a random next state
            next = choices[random.randrange(0, len(choices))]
            state = next
            visited_states += [state]
        return visited_states

Silly implementation (it's really bad):

    >>> silly( (0,0), (3,0) )
    [(0, 0), (0, 3), (0, 0), (0, 3), (0, 0), (4, 0), (4, 0), (4, 0), (4, 3), (4, 3), (4, 3), (4, 0), (4, 0), (4, 0), (0, 0), (0, 3), (3, 0)]

Recursive depth-first search:

    def recursivedf(state, goal, previous):
        if state == goal:
            return previous
        for choice in nextstates(state):
            if choice in previous:
                # We have already been in that state before
                continue
            else:
                solution = recursivedf(choice, goal, previous + [choice])
                if solution != []:
                    return solution
        return []

Recursive depth first:

    >>> recursivedf( (0,0), (2,0), [(0,0)] )
    [(0, 0), (0, 3), (3, 0), (3, 3), (4, 2), (0, 2), (2, 0)]
    >>> recursivedf( (0,0), (0,1), [(0,0)] )
    [(0, 0), (0, 3), (3, 0), (3, 3), (4, 2), (0, 2), (2, 0), (2, 3), (4, 1), (0, 1)]

Iterative implementation of depth first:
- Store a list of states to visit.
- If the first state is the goal state, then we are finished.
- Remove the first state from the list.
- Compute all choices for this state.
- Remove choices where we have already been (loop detection!). Store not only the current state, but all previous states!
- Add all choices to the beginning of the list.

Iterative implementation of depth first:

    def dfsearch( start, goal ):
        l = [ [start] ]
        while l != []:
            path = l[0]
            l = l[1:]
            if path[-1] == goal:
                return path
            choices = nextstates( path[-1] )
            for c in choices:
                if c not in path:
                    l = [path + [c]] + l
        return []

Testing it:

    >>> dfsearch( (0,0), (2,0) )
    [(0, 0), (0, 3), (3, 0), (3, 3), (4, 2), (0, 2), (2, 0)]

Evaluation criteria:
- Complete: Does the algorithm always find a solution if one exists?
- Optimal: Is the solution always the best one, e.g. by length of solution?
- Space: How much memory does it take to find a solution?
- Time: How long does it take to find a solution?

Evaluating depth first:
- Complete: Only if we avoid loops and the search tree is finite.
- Optimal: No, we're satisfied with any solution.
- Space: Only as much as needed to remember the current path.

How can we find the best solution?
- Idea 1: Find all solutions and compare them. There can be very many...
- Idea 2: Explore the tree so that we always look at the shortest candidate solutions first.
[Tree figure comparing visiting orders - depth first: a b d e f g h j...; breadth first: a b c d e f...]

Iterative implementation of breadth first:
- Store a queue of states to visit.
- If the first state is the goal state, then we are finished.
- Remove the first state from the queue.
- Compute all choices for this state.
- Remove choices where we have already been (loop detection!). Store not only the current state, but all previous states!
- Add all choices to the end of the queue.

Iterative implementation of breadth first:

    def bfsearch( start, goal ):
        l = [ [start] ]
        while l != []:
            path = l.pop(0)
            if path[-1] == goal:
                return path
            choices = nextstates( path[-1] )
            for c in choices:
                if c not in path:
                    l.append(path + [c])
        return []

Testing it:

    >>> bfsearch( (0,0), (2,0) )
    [(0, 0), (0, 3), (3, 0), (3, 3), (4, 2), (0, 2), (2, 0)]

Evaluating breadth first:
- Complete: Yes, if a solution exists.
- Optimal: Yes, the first solution found must have the shortest path.
- Space: We need to remember the whole level of the tree (usually big!) above the solution.

Branching factor:
- If we have 2 choices at each state: 2^n states after n actions.
- If we have 3 choices at each state: 3^n states after n actions.
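To make this growth concrete for the water jug problem, here is a minimal sketch (not from the slides) that counts the loop-free paths of each length, reusing the nextstates function defined earlier; the helper name frontier_sizes is made up for illustration.

    # Count how many loop-free paths of each length exist, to see how the
    # frontier grows with the branching factor. Assumes nextstates() as above.
    def frontier_sizes(start, max_depth):
        frontier = [[start]]
        sizes = [len(frontier)]
        for _ in range(max_depth):
            new_frontier = []
            for path in frontier:
                for c in nextstates(path[-1]):
                    if c not in path:           # same loop detection as the searches
                        new_frontier.append(path + [c])
            frontier = new_frontier
            sizes.append(len(frontier))
        return sizes

For example, frontier_sizes((0,0), 1) gives [1, 2], since nextstates((0,0)) returns three states but one of them is (0,0) itself.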

Analysing breadth-first search: if the solution exists at depth n, then breadth-first search takes O(B^n) time, where B is the branching factor of the problem, and uses O(B^n) space. If the found solution lies at depth n, depth-first search can find it in as little as O(n) time and uses only O(B*n) space, but in the worst case its running time is also O(B^n).

The problem:
- Depth first: not optimal, but uses only O(n) space.
- Breadth first: optimal, but uses O(B^n) space.
Can we combine the advantages of both approaches?

Iterative deepening (IDA): let M be a maximum depth. Run depth-first search, but only down to the given depth. Repeat for increasing values of M: M=1, M=2, ...

Iterative deepening:

    def idasearch( start, goal ):
        M = 0; l = []
        while 1:
            if l == []:
                M = M + 1; l = [[start]]
            path = l.pop()
            if path[-1] == goal:
                return path
            if len(path) > M:
                continue
            choices = nextstates( path[-1] )
            for c in choices:
                if c not in path:
                    l = [path + [c]] + l
        return []

Iterative deepening: the water jugs problem

    >>> idasearch( (0,0), (2,0) )
    [(0, 0), (0, 3), (3, 0), (3, 3), (4, 2), (0, 2), (2, 0)]
    >>> idasearch( (0,0), (0,1) )
    [(0, 0), (4, 0), (1, 3), (1, 0), (0, 1)]

How many nodes were visited?

    >>> idasearch( (0,0), (0,2) )
    73 nodes visited
    [(0, 0), (0, 3), (3, 0), (3, 3), (4, 2), (0, 2)]
    >>> dfsearch( (0,0), (0,2) )
    25 nodes visited
    [(0, 0), (0, 3), (3, 0), (3, 3), (4, 2), (0, 2)]
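The search functions as shown do not print node counts; the sketch below is one guess at how such figures could be produced, by instrumenting dfsearch with a counter. The counting convention is an assumption, so the numbers may differ slightly from those on the slide.

    # Instrumented depth-first search that also reports how many paths were
    # taken off the list. The counting convention is an assumption, not taken
    # from the slides.
    def dfsearch_counting(start, goal):
        l = [[start]]
        visited = 0
        while l != []:
            path = l[0]
            l = l[1:]
            visited += 1                     # count every expanded path
            if path[-1] == goal:
                print(visited, "nodes visited")
                return path
            for c in nextstates(path[-1]):
                if c not in path:
                    l = [path + [c]] + l
        print(visited, "nodes visited")
        return []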

Analysing iterative deepening:
- Completeness: Yes, and it will return the best (shortest) solution.
- Space complexity: O(n).
- Time complexity: Each iteration with limit M takes O(B^M) time. Total time: O(B^1) + O(B^2) + ... + O(B^N) = O(B^N). That's the same complexity as both depth first and breadth first! In practice, if we skip the big-O notation, IDA takes roughly B times longer.
- Use breadth first if we have enough memory, otherwise IDA. Using too much memory causes swapping, which is slow!
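The step from the per-iteration cost to the total can be spelled out with the geometric sum; this is a standard identity, not something stated on the slides:

    \sum_{M=1}^{N} B^{M} = \frac{B\,(B^{N}-1)}{B-1} \le \frac{B}{B-1}\, B^{N} = O(B^{N})

so all the shallower iterations together add only a constant factor (for a fixed branching factor B) on top of the deepest one.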

Heuristic search. Basic idea: use some domain knowledge to create a heuristic that tells how close to the goal a given state is.
- Example: in navigation, use the distance to the destination.
- The heuristic does not have to be perfect; it only has to give a rough guide to how good or bad a state is.
- When searching, expand first the nodes that have a good heuristic value (a hypothetical example for the water jug problem is sketched below).
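The lecture does not give a heuristic for the water jug problem; the function below is a hypothetical example, invented here for illustration. With every action costing 1, it never overestimates the number of remaining actions, which is exactly the admissibility property needed on the next slide.

    # Hypothetical heuristic for the goal "exactly 2L in the 4L jug".
    # Returns 0 when the 4-litre jug already holds 2 litres and 1 otherwise;
    # any other state needs at least one more action to reach a goal such as
    # (2,0), so this never overestimates (it is admissible), although it is weak.
    def h(state):
        x, y = state
        return 0 if x == 2 else 1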

A* search. Use a cost function f(n) = g(n) + h(n):
- g(n): cost from the root node to this node.
- h(n): admissible heuristic cost from this node to the goal. An admissible heuristic must never overestimate the distance to the goal.
- For each node, keep track of the cost f(n). Expand the node n that has the lowest cost.
- Compute the cost of the children and insert them sorted into the list of nodes: either sort explicitly (inefficient), or iterate over the list and insert at the right place. (One possible implementation is sketched below.)
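The slides describe A* but do not show code for it; the sketch below is one possible version for the water jug problem, assuming the nextstates function and the hypothetical heuristic h from above, with every action counted as cost 1. It uses Python's heapq module as the priority queue instead of the explicitly sorted list mentioned on the slide.

    import heapq

    # Sketch of A* search: g(n) is the number of actions taken so far,
    # h(n) is the heuristic, and nodes are expanded in order of f(n) = g(n) + h(n).
    def astar(start, goal):
        frontier = [(h(start), [start])]     # priority queue of (f, path)
        best_g = {start: 0}                  # cheapest known cost to each state
        while frontier:
            f, path = heapq.heappop(frontier)
            state = path[-1]
            if state == goal:
                return path
            g = len(path) - 1
            for c in nextstates(state):
                new_g = g + 1
                if c not in best_g or new_g < best_g[c]:
                    best_g[c] = new_g
                    heapq.heappush(frontier, (new_g + h(c), path + [c]))
        return []

On the water jug problem, astar((0,0), (2,0)) should return a path of the same (shortest) length as bfsearch.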

A* search:
- The efficiency of A* depends on the heuristic.
- Provides a good way of combining domain knowledge with general search.
- Provides optimal solutions if the heuristic is admissible.
- Time complexity: in the worst case O(B^n). Space complexity: in the worst case O(B^n).