Artificial Intelligence Exam DT2001 / DT2006 Ordinarie tentamen




Date: 2010-01-11
Time: 08:15-11:15
Teacher: Mathias Broxvall
Phone: 301438
Aids: Calculator and/or a Swedish-English dictionary
Points: The exam consists of 5 exercises with a total of 40 points.
Grading:
DT2001: 20 points are required for grade 3, 30 points for grade 4, and 35 points for grade 5.
DT2006: 20 points are required for grade G, 32.5 points for grade VG.
Other:
You may answer in either Swedish or English.
Use a new sheet for each exercise.
Motivate all answers thoroughly.
If anything is unclear, make reasonable assumptions and explain those assumptions.

NOTE: ALWAYS USE A NEW SHEET FOR EACH EXERCISE

Exercise 1 (9 points)
Answer the following questions in your own words and argue briefly for the points made.
a) What is the Turing test and what is it supposed to prove? Argue briefly for the merits of the Turing test.
b) What is the Chinese room experiment and what is it supposed to demonstrate? Explain the argument behind it.
c) What is the Loebner prize competition? Explain how it, and recent winning entrants, relate to the Turing test and to the Chinese room experiment.

Exercise 2 (12 points)
Assume that we are searching the search space given by the tree below, where the goal node is node L. The left-hand side of each node is its label, and the right-hand side is the heuristic value used in exercise c.
a) Explain the difference between depth-first search, breadth-first search, and iterative deepening. Explain which of them are complete, how much time they require, and how much memory they require.
b) Demonstrate depth-first search, breadth-first search, and iterative deepening by showing the order in which they visit the nodes of the search tree below; stop when the goal node has been reached.
c) Assume that we want to perform A* search on the tree above, with a cost function in which each step of the tree costs 1 and the number in each node gives its heuristic cost. Demonstrate the order in which the nodes of the tree would be visited. Stop when the goal node has been reached.
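The uninformed strategies in Exercise 2 can be sketched as follows. The tree used here is a hypothetical stand-in (the exam's figure is not reproduced in this text); both searches return the visit order up to and including the goal.

```python
from collections import deque

# Hypothetical example tree, child lists in left-to-right order.
tree = {
    'A': ['B', 'C'],
    'B': ['D', 'E'],
    'C': ['F'],
    'D': [], 'E': [], 'F': [],
}

def bfs(start, goal):
    """Visit nodes level by level: complete, but O(b^d) time and memory."""
    frontier = deque([start])
    visited = []
    while frontier:
        node = frontier.popleft()
        visited.append(node)
        if node == goal:
            return visited
        frontier.extend(tree[node])
    return visited

def dfs(start, goal):
    """Visit nodes branch by branch: O(bd) memory, but not complete on infinite trees."""
    frontier = [start]
    visited = []
    while frontier:
        node = frontier.pop()
        visited.append(node)
        if node == goal:
            return visited
        frontier.extend(reversed(tree[node]))  # keep left-to-right visiting order
    return visited
```

On this example, `bfs('A', 'E')` visits A, B, C, D, E while `dfs('A', 'E')` visits A, B, D, E; iterative deepening would repeat depth-limited DFS with limits 0, 1, 2, ..., combining DFS's memory use with BFS's completeness.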

Exercise 3 (6 points)
a) Explain what forward chaining in expert systems is and what a proof tree is. Give an example of forward chaining and proof trees using a few rules and a few facts.
b) Explain what backward chaining in expert systems is and what the difference between forward and backward chaining is. Give an example of backward chaining using a few rules.
c) Assume that we want to compute the probability that a patient has diabetes mellitus given that she tests positive for ketoacidosis. We know that 1% of the population at large has diabetes mellitus, and that 2% of the population tests positive for ketoacidosis. Furthermore, we have 100 patient records of newly diagnosed diabetics, of which 50 tested positive for ketoacidosis. Compute P(Diabetes | Ketoacidosis) based on these numbers. Show all the steps of the computation.

Exercise 4 (6 points)
What does syntactic ambiguity mean? Create, using the grammar below, a sentence that is syntactically ambiguous and demonstrate that it is so using parse trees. If needed, you may add extra nouns, verbs, prepositions, determiners, and adjectives (but no new rules).

S → NP VP
S → NP VP PP
VP → V
VP → V NP
NP → N
NP → ADJ NP
NP → DET NP
NP → NP PP
PP → P NP
N → boy | girl | binoculars | Homer | hat
P → with
V → chases | sees | run
DET → the | a
ADJ → young

Exercise 5 (7 points)
a) Explain what a linear classifier is. Give an example of a two-dimensional classification problem that can be correctly classified using a linear classifier, and one that cannot.
b) Explain what a neural network is and how a perceptron functions.
c) Assume that we have a classification problem of one variable and two examples. The first example has input 0.6 and should give a negative output (-1), while the second example has input 0.4 and should give a positive output (+1). We start with a perceptron with input weight 0.0 and threshold 0.0.
Demonstrate how this perceptron is trained during four iterations over the examples, with learning rates of 0.5, 0.4, 0.3, and 0.2 respectively. What response would the perceptron give for the input signal 0.8 after training?
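The training loop in Exercise 5c can be sketched as below, assuming one common threshold-unit convention: output +1 when w·x − t > 0 and −1 otherwise, with updates w ← w + lr·(target − output)·x and t ← t − lr·(target − output). The course may use a different activation or update convention, so this is illustrative only, not the expected answer.

```python
def train(examples, w, t, rates):
    """One pass over the examples per learning rate, updating on misclassification."""
    for lr in rates:
        for x, target in examples:
            output = 1 if w * x - t > 0 else -1   # threshold activation
            w += lr * (target - output) * x        # weight update
            t -= lr * (target - output)            # threshold update
    return w, t

# The exam's data: input 0.6 -> -1, input 0.4 -> +1.
examples = [(0.6, -1), (0.4, +1)]
w, t = train(examples, 0.0, 0.0, [0.5, 0.4, 0.3, 0.2])

# Response to a new input is read off the same activation:
response = 1 if w * 0.8 - t > 0 else -1
```

Tracing w and t by hand after each update, as the exercise asks, follows exactly the two update lines in the loop.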