Why language is hard, and what Linguistics has to say about it. Natalia Silveira. Participation code: eagles





Language processing is so easy for humans that it is like trying to sell cargo airplanes to eagles. They just don't get what is hard, what is easy, and the necessity of infrastructure. "Mr. Eagle, um, well, we really need a runway to get the 20 tons of product into the air." Mr. Eagle responds with "What are you talking about? I can take off, land, and raise a family on a tree branch. Cargo planes are easy because flying is easy for me. So I will give you a fish to do the job." (Breck Baldwin, in the LingPipe blog)

Overview of today's lecture:
(A few reasons) Why language is hard
Exploring coarse processing vs. fine processing
In between: approaches to sentiment and difficult problems
Intermission
(A little bit of) What Linguistics has to say about it
Part-of-speech tagging
Dependency parsing

Language is multidimensional

Coarse vs. fine processing. How do we do text classification? How do we do relation extraction? Why?

Why language is hard: relation extraction. The target is a structured fact: <PER>Bill Gates</PER> founded <ORG>Microsoft</ORG>. But the same relation surfaces in many forms:
Bill Gates founded Microsoft.
Bill Gates, now retired, founded the famous Microsoft.
Microsoft, which Bill Gates founded, is a big software company.
Microsoft was founded by Bill Gates.
Microsoft was founded not by Larry Page but by Bill Gates.
Microsoft was founded not only by Bill Gates but also Paul Allen.
And that's not even the hard stuff!

Why language is hard: sentiment analysis. Small changes flip or complicate the sentiment:
I liked this movie.
I didn't like this movie.
I thought I would like this movie.
I thought this movie would be great.
I knew this movie would be great.
I didn't know this movie would be so great.
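A deliberately naive word-counting scorer (with a hypothetical, made-up lexicon) makes the slide's point concrete: every one of these sentences gets the same positive score, even though their actual sentiments differ, because bag-of-words scoring ignores negation and modality.

```python
# Hypothetical toy lexicon; a real system would use a large sentiment lexicon.
POSITIVE = {"liked", "like", "great"}
NEGATIVE = set()

def naive_sentiment(sentence):
    """Count positive minus negative words; ignores negation and modality."""
    tokens = sentence.lower().rstrip(".").split()
    return sum(t in POSITIVE for t in tokens) - sum(t in NEGATIVE for t in tokens)

# All of these score +1, which is exactly the failure the slides illustrate.
for s in ["I liked this movie.",
          "I didn't like this movie.",
          "I thought I would like this movie."]:
    print(s, naive_sentiment(s))
```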

Part-of-speech tagging: a simple but useful form of linguistic analysis. Slides by Christopher Manning and Ray Mooney. Participation code: eagles

Open class (lexical) words:
Nouns: Proper (IBM, Italy), Common (cat/cats, snow)
Verbs: Main (see, registered)
Adjectives (old, older, oldest)
Adverbs (slowly)
Numbers (one, 122,312, more)
Closed class (functional) words:
Modals (can, had)
Determiners (the, some)
Prepositions (to, with)
Conjunctions (and, or)
Particles (off, up)
Pronouns (he, its)
Interjections (Ow, Eh)

Open vs. closed classes. Closed: determiners (a, an, the), pronouns (she, he, I), prepositions (on, under, over, near, by, ...). Why closed? Closed classes rarely admit new members, while open classes gain new words all the time. Open: nouns, verbs, adjectives, adverbs.

POS tagging. Words often have more than one POS. For example, "back": The back door = JJ; On my back = NN; Win the voters back = RB; Promised to back the bill = VB. The POS tagging problem is to determine the POS tag for a particular instance of a word.

POS tagging. Input: Plays well with others. Ambiguity: NNS/VBZ UH/JJ/NN/RB IN NNS. Output: Plays/VBZ well/RB with/IN others/NNS. Uses: text-to-speech (how do we pronounce "lead"?); writing regexps like (Det) Adj* N+ over the output for phrases, etc.; as input to, or to speed up, a full parser; if you know the tag, you can back off to it in other tasks. (Tagset: Penn Treebank POS tags.)
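The "regexps over the output" idea can be sketched in a few lines (a toy illustration, not a tool from the slides): encode each Penn Treebank tag as one letter, then chunk noun phrases matching (Det) Adj* N+ with an ordinary regex.

```python
import re

# Map each tag to one letter so the tag sequence becomes a plain string.
TAG_TO_LETTER = {"DT": "D", "JJ": "A", "NN": "N", "NNS": "N", "NNP": "N"}

def np_chunks(tagged):
    """Return noun-phrase chunks matching (Det)? Adj* Noun+ over POS tags."""
    letters = "".join(TAG_TO_LETTER.get(tag, "x") for _, tag in tagged)
    chunks = []
    for m in re.finditer(r"D?A*N+", letters):
        chunks.append(" ".join(w for w, _ in tagged[m.start():m.end()]))
    return chunks

tagged = [("the", "DT"), ("big", "JJ"), ("back", "NN"), ("door", "NN"),
          ("opened", "VBD"), ("slowly", "RB")]
print(np_chunks(tagged))  # ['the big back door']
```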

POS tagging performance. How many tags are correct? (Tag accuracy.) About 97% currently, but the baseline is already 90%. The baseline is the performance of the stupidest possible method: tag every word with its most frequent tag, and tag unknown words as nouns. Partly easy because many words are unambiguous, and you get points for them (the, a, etc.) and for punctuation marks!
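The ~90% baseline just described can be sketched as follows (the training data here is a tiny made-up fragment, not the Brown corpus): tag each word with its most frequent training tag, and tag unknown words as NN.

```python
from collections import Counter, defaultdict

# Toy tagged training data (illustrative, not a real corpus).
train = [("the", "DT"), ("back", "NN"), ("door", "NN"),
         ("promised", "VBD"), ("to", "TO"), ("back", "VB"),
         ("the", "DT"), ("bill", "NN"), ("on", "IN"), ("my", "PRP$"),
         ("back", "NN")]

counts = defaultdict(Counter)
for word, tag in train:
    counts[word][tag] += 1
most_frequent = {w: c.most_common(1)[0][0] for w, c in counts.items()}

def baseline_tag(tokens):
    """Most-frequent-tag baseline; unknown words default to NN."""
    return [(t, most_frequent.get(t, "NN")) for t in tokens]

# "back" was seen as NN twice and VB once, so the baseline always says NN;
# "quickly" is unknown, so it also gets NN.
print(baseline_tag(["back", "the", "bill", "quickly"]))
```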

Deciding on the correct part of speech can be difficult even for people: Mrs/NNP Shaefer/NNP never/RB got/VBD around/RP to/TO joining/VBG. All/DT we/PRP gotta/VBN do/VB is/VBZ go/VB around/IN the/DT corner/NN. Chateau/NNP Petrus/NNP costs/VBZ around/RB 250/CD.

How difficult is POS tagging? About 11% of the word types in the Brown corpus are ambiguous with regard to part of speech, but they tend to be very common words. E.g., "that": I know that he is honest = IN; Yes, that play was nice = DT; You can't go that far = RB. 40% of the word tokens are ambiguous.
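The type/token gap above (11% of types but 40% of tokens) can be made concrete with a toy measurement (the corpus here is a made-up fragment, not Brown): a type is ambiguous if it is observed with more than one tag, and ambiguous types inflate the token rate because they are frequent.

```python
from collections import defaultdict

# Toy tagged corpus: "that" appears with three different tags.
corpus = [("I", "PRP"), ("know", "VBP"), ("that", "IN"), ("that", "DT"),
          ("play", "NN"), ("was", "VBD"), ("that", "RB"), ("nice", "JJ")]

tags_seen = defaultdict(set)
for word, tag in corpus:
    tags_seen[word.lower()].add(tag)

ambiguous_types = {w for w, tags in tags_seen.items() if len(tags) > 1}
type_rate = len(ambiguous_types) / len(tags_seen)
token_rate = sum(w.lower() in ambiguous_types for w, _ in corpus) / len(corpus)

# One ambiguous type out of six, but three ambiguous tokens out of eight.
print(type_rate, token_rate)
```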

Sources of information. What are the main sources of information for POS tagging? Knowledge of neighboring words: in "Bill saw that man yesterday", each word admits several tags (Bill: NNP/VB; saw: NN/VBD; that: DT/IN; man: NN/VB; yesterday: NN). Knowledge of word probabilities: "man" is rarely used as a verb. The latter proves the most useful, but the former also helps.

More and better features: a feature-based tagger. It can do surprisingly well just looking at a word by itself:
Word: the → DT
Lowercased word: Importantly → importantly → RB
Prefixes: unfathomable → un- → JJ
Suffixes: Importantly → -ly → RB
Capitalization: Meridian → CAP → NNP
Word shapes: 35-year → d-x → JJ
Then build a maxent (or whatever) model to predict the tag. Maxent P(t|w): 93.7% overall / 82.6% on unknown words.
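The per-word features listed above might be extracted like this (a sketch; the run-collapsing rule for word shapes is one plausible reading of the "35-year → d-x" example):

```python
def word_features(word):
    """Feature sketch from the slide: word identity, lowercase, affixes,
    capitalization, and a collapsed word shape (digits -> d, letters -> x)."""
    shape = "".join("d" if c.isdigit() else "x" if c.isalpha() else c
                    for c in word)
    # Collapse repeated characters: "35-year" -> "dd-xxxx" -> "d-x".
    collapsed = "".join(c for i, c in enumerate(shape)
                        if i == 0 or c != shape[i - 1])
    return {
        "word": word,
        "lower": word.lower(),
        "prefix2": word[:2],
        "suffix2": word[-2:],
        "is_capitalized": word[:1].isupper(),
        "shape": collapsed,
    }

print(word_features("35-year")["shape"])        # d-x
print(word_features("Importantly")["suffix2"])  # ly
```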

What else can we do? Build better features!
They/PRP left/VBD as/IN soon/RB as/IN he/PRP arrived/VBD ./. — the first "as" should be RB; we could fix this with a feature that looked at the next word.
Intrinsic/NNP flaws/NNS remained/VBD undetected/VBN ./. — "Intrinsic" should be JJ; we could fix this by linking capitalized words to their lowercase versions.

Sequence labeling as classification. Classify each token independently, but use information about the surrounding tokens as input features (sliding window). John saw the saw and decided to take it to the table. Running the classifier at each position yields: John/NNP saw/VBD the/DT saw/NN and/CC decided/VBD to/TO take/VB it/PRP to/IN the/DT table/NN.

Forward classification. Tags are assigned left to right, and each already-assigned tag is available as a feature for the next decision: John/NNP saw/VBD the/DT saw/NN and/CC decided/VBD to/TO take/VB it/PRP to/IN the/DT table/NN.
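Greedy forward tagging can be sketched as follows; the classify rule here is a hypothetical hand-written stand-in for a trained classifier, just to show how each decision can condition on the previous predicted tag:

```python
def classify(word, prev_tag):
    """Hand-written stand-in for a trained classifier (illustrative only)."""
    if word == "saw":
        # "saw" right after a name is a verb; after a determiner, a noun.
        return "VBD" if prev_tag == "NNP" else "NN"
    lexicon = {"John": "NNP", "the": "DT", "and": "CC"}
    return lexicon.get(word, "NN")

def forward_tag(tokens):
    """Assign tags left to right, feeding each prediction to the next step."""
    tags = []
    for word in tokens:
        prev_tag = tags[-1] if tags else "<S>"
        tags.append(classify(word, prev_tag))
    return tags

print(forward_tag("John saw the saw".split()))  # ['NNP', 'VBD', 'DT', 'NN']
```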

Backward classification. Disambiguating "to" in this case would be even easier backward. Tags are assigned right to left: table/NN, the/DT, to/IN, it/PRP, take/VB, to/TO, decided/VBD, and/CC, saw/NN, the/DT, saw/VBD, John/NNP.

Overview: POS tagging accuracies. Rough accuracies (overall / unknown words):
Most frequent tag: ~90% / ~50%
Trigram HMM: ~95% / ~55%
Maxent P(t|w): 93.7% / 82.6%
TnT (HMM++): 96.2% / 86.0%
MEMM tagger: 96.9% / 86.9%
Bidirectional dependencies: 97.2% / 90.0%
Upper bound: ~98% (human agreement)
Most errors are on unknown words.

Tagging without sequence information.
Baseline model (features: t0 from w0): 56,805 features; 93.69% token accuracy; 82.61% unknown-word accuracy; 26.74% sentence accuracy.
3Words model (features: t0 from w-1, w0, w+1): 239,767 features; 96.57% token accuracy; 86.78% unknown-word accuracy; 48.27% sentence accuracy.
Using words only in a straight classifier works as well as a basic (HMM or discriminative) sequence model!

Summary of POS tagging. For tagging, the change from a generative to a discriminative model does not by itself result in great improvement. One profits from models that allow dependence on overlapping features of the observation, such as spelling, suffix analysis, etc. An MEMM allows integration of rich features of the observations, but can suffer strongly from assuming independence from following observations; this effect can be relieved by adding dependence on following words. This additional power (of the MEMM, CRF, and perceptron models) has been shown to result in improvements in accuracy. The higher accuracy of discriminative models comes at the price of much slower training.

Dependency parsing: building structure. Based on slides by Christopher Manning. Participation code: eagles

Dependency grammar and dependency structure. Dependency syntax postulates that syntactic structure consists of lexical items linked by binary asymmetric relations ("arrows") called dependencies. The arrows are commonly typed with the name of grammatical relations (subject, prepositional object, apposition, etc.).

Methods of dependency parsing.
1. Dynamic programming (as in the CKY algorithm). You can do it similarly to lexicalized PCFG parsing: an O(n^5) algorithm. Eisner (1996) gives a clever algorithm that reduces the complexity to O(n^3) by producing parse items with heads at the ends rather than in the middle.
2. Graph algorithms. You create a maximum spanning tree for a sentence. McDonald et al.'s (2005) MSTParser scores dependencies independently using a machine-learning classifier (it uses MIRA, for online learning, but it could be MaxEnt).
3. Constraint satisfaction. Edges that don't satisfy hard constraints are eliminated. Karlsson (1990), etc.
4. Deterministic parsing. Greedy choice of attachments guided by machine-learning classifiers. MaltParser (Nivre et al. 2008), discussed in the next segment.
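The deterministic strategy in item 4 can be sketched with an arc-standard-style transition loop (shift / left-arc / right-arc). The oracle below is hard-coded for one toy sentence, standing in for the trained classifiers that a MaltParser-style system would use; it is an illustration of the data structures, not MaltParser's actual algorithm.

```python
def parse(words, oracle):
    """Arc-standard sketch: the oracle picks a transition at each step."""
    stack, buffer, arcs = ["ROOT"], list(words), []
    while buffer or len(stack) > 1:
        action = oracle(stack, buffer)
        if action == "shift":
            stack.append(buffer.pop(0))
        elif action == "left-arc":      # stack[-2] becomes a dependent of stack[-1]
            dep = stack.pop(-2)
            arcs.append((stack[-1], dep))
        elif action == "right-arc":     # stack[-1] becomes a dependent of stack[-2]
            dep = stack.pop()
            arcs.append((stack[-1], dep))
    return arcs

def toy_oracle(stack, buffer):
    # Hard-coded decisions for "dogs bark": dogs <- bark, ROOT -> bark.
    if buffer:
        return "shift"
    return "left-arc" if stack[-2] != "ROOT" else "right-arc"

print(parse(["dogs", "bark"], toy_oracle))
```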

Dependency conditioning preferences. What are the sources of information for dependency parsing?
1. Bilexical affinities: [girl, the] is plausible.
2. POS tags: [VB, NN] is plausible.
3. Dependency distance: mostly with nearby words.
4. Intervening material: dependencies rarely span intervening verbs or punctuation.
5. Valency of heads: how many dependents on which side are usual for a head?

Dependency paths identify relations like protein interaction [Erkan et al. EMNLP 07, Fundel et al. 2007]. Example sentence: "The results demonstrated that KaiC interacts rhythmically with SasA, KaiA, and KaiB." [Figure: dependency parse with det, nsubj, ccomp, advmod, prep_with, and conj_and links; the paths KaiC -nsubj-> interacts -prep_with-> SasA/KaiA/KaiB extract the interactions KaiC-SasA, KaiC-KaiA, and KaiC-KaiB.]
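The path-based extraction idea can be sketched over (relation, head, dependent) triples like those in the figure. The relation names follow the slide's Stanford-dependencies style; the pairing logic is an illustrative simplification, not the method of the cited papers.

```python
# Typed-dependency triples for the example sentence (relation, head, dependent).
deps = [("nsubj", "interacts", "KaiC"),
        ("prep_with", "interacts", "SasA"),
        ("conj_and", "SasA", "KaiA"),
        ("conj_and", "SasA", "KaiB")]

def interaction_pairs(deps, verb="interacts"):
    """Pair the nsubj of the verb with its prep_with arguments, expanding
    conjunctions so coordinated objects are also extracted."""
    subjects = [d for rel, h, d in deps if rel == "nsubj" and h == verb]
    objects = [d for rel, h, d in deps if rel == "prep_with" and h == verb]
    # Conjuncts of an object are objects too (SasA and KaiA and KaiB).
    objects += [d for rel, h, d in deps if rel == "conj_and" and h in objects]
    return [(s, o) for s in subjects for o in objects]

print(interaction_pairs(deps))
```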

BioNLP 2009/2011 relation extraction shared tasks [Björne et al. 2009]. [Chart: histogram of distances between related entities, binned 0 through 10 and >10, comparing dependency distance with linear distance.]