Natural Language Processing (CSE 517): Dependency Syntax and Parsing


1 Natural Language Processing (CSE 517): Dependency Syntax and Parsing Noah A. Smith Swabha Swayamdipta c 2018 University of Washington {nasmith,swabha}@cs.washington.edu May 11, 2018 1 / 93

2 Recap: Phrase Structure [Phrase-structure tree for "The luxury auto maker last year sold 1,214 cars in the U.S.", with nonterminals S, NP, VP, PP and part-of-speech tags DT, NN, JJ, VBD, CD, IN, NNP.]

3 Parent Annotation (Johnson, 1998) [The same tree with every nonterminal annotated with its parent's category, e.g., NP^S, VP^S, NP^VP.] Increases the vertical Markov order: p(children | parent, grandparent)

4 Headedness [The same tree with the head child of each phrase marked.] Suggests horizontal Markovization: p(children | parent) = p(head | parent) · ∏_i p(ith sibling | head, parent)

5 Lexicalization [The same tree with every nonterminal annotated with its lexical head, e.g., S(sold), NP(maker), VP(sold), NP(cars), PP(in).] Each node shares a lexical head with its head child.

6 Dependencies Informally, you can think of dependency structures as a transformation of phrase-structures that maintains the word-to-word relationships induced by lexicalization, adds labels to them, and eliminates the phrase categories. There are also linguistic theories built on dependencies (Tesnière, 1959; Mel'čuk, 1987), as well as treebanks corresponding to those. Free(r)-word order languages (e.g., Czech)

7 Dependency Tree: Definition Let x = x_1, ..., x_n be a sentence. Add a special root symbol as x_0. A dependency tree consists of a set of tuples ⟨p, c, l⟩, where p ∈ {0, ..., n} is the index of a parent, c ∈ {1, ..., n} is the index of a child, and l ∈ L is a label. Different annotation schemes define different label sets L, and different constraints on the set of tuples. Most commonly: The tuple is represented as a directed edge from x_p to x_c with label l. The directed edges form an arborescence (directed tree) with x_0 as the root (sometimes denoted root).
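The definition above can be made concrete with a short sketch (my own illustration, not from the slides; the label names are merely illustrative): a dependency tree as a set of ⟨parent, child, label⟩ tuples over word indices, plus a check that the directed edges form an arborescence.

```python
# A minimal sketch: index 0 is the special root symbol; every non-root
# word must have exactly one parent, and every word must reach the root
# by following parent pointers (which also rules out cycles).

def is_arborescence(n, edges):
    """n = number of words; edges = set of (parent, child, label) tuples."""
    parents = {}
    for p, c, label in edges:
        if c in parents:          # a word with two parents is not a tree
            return False
        parents[c] = p
    if set(parents) != set(range(1, n + 1)):
        return False              # some word has no incoming edge
    for c in range(1, n + 1):     # every word must reach the root
        seen = set()
        while c != 0:
            if c in seen:         # cycle
                return False
            seen.add(c)
            c = parents[c]
    return True

# "we wash our cats", indexed 1..4, with the root symbol as index 0
tree = {(0, 2, "root"), (2, 1, "sbj"), (2, 4, "obj"), (4, 3, "det")}
assert is_arborescence(4, tree)
# replacing the root's edge to "wash -> cats" with a 3 <-> 4 cycle fails
assert not is_arborescence(4, {(0, 2, "root"), (2, 1, "sbj"),
                               (3, 4, "obj"), (4, 3, "det")})
```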

8 Example [Phrase-structure tree: S → NP VP; NP → Pronoun "we"; VP → Verb "wash" NP; NP → Determiner "our" Noun "cats".] Phrase-structure tree.

9 Example [The same tree with the head child of each phrase marked.] Phrase-structure tree with heads.

10 Example [The same tree, lexicalized: S(wash), NP(we), VP(wash), NP(cats).] Phrase-structure tree with heads, lexicalized.

11 Example we wash our cats Bare-bones dependency tree.

12 Example we wash our cats who stink

13 Example we vigorously wash our cats who stink

14 Content Heads vs. Function Heads Credit: Nathan Schneider [Two dependency trees for "little kids were always watching birds with fish": one choosing content words as heads, one choosing function words.]

15 Labels [Labeled dependency tree for "kids saw birds with fish", with arcs labeled root, sbj, obj, prep, pobj.] Key dependency relations captured in the labels include: subject, direct object, preposition object, adjectival modifier, adverbial modifier. In this lecture, I will mostly not discuss labels, to keep the algorithms simpler.

16 Coordination Structures we vigorously wash our cats and dogs who stink The bugbear of dependency syntax.

17 Example we vigorously wash our cats and dogs who stink Make the first conjunct the head?

18 Example we vigorously wash our cats and dogs who stink Make the coordinating conjunction the head?

19 Example we vigorously wash our cats and dogs who stink Make the second conjunct the head?

20 Nonprojective Example [Dependency tree for "A hearing is scheduled on the issue today.", with labels ROOT, ATT, SBJ, VC, PC, TMP, PU; the arc attaching "on the issue" to "hearing" crosses the arc attaching "today" to "scheduled", so the tree is nonprojective.]

21 Dependency Schemes Direct annotation. Transform the treebank: define head rules that can select the head child of any node in a phrase-structure tree and label the dependencies. More powerful, less local rule sets, possibly collapsing some words into arc labels. Stanford dependencies are a popular example (de Marneffe et al., 2006). Only results in projective trees. Rule-based dependencies, followed by manual correction.

22 Approaches to Dependency Parsing 1. Transition-based parsing with a stack. 2. Chu-Liu-Edmonds algorithm for arborescences (directed graphs). 3. Dynamic programming with the Eisner algorithm.

23 Transition-Based Parsing Dependency tree represented as a linear sequence of transitions.

24 Transition-Based Parsing Dependency tree represented as a linear sequence of transitions. Transitions: Simple operations to be executed on a parser configuration

25 Transition-Based Parsing Dependency tree represented as a linear sequence of transitions. Transitions: Simple operations to be executed on a parser configuration Parser Configuration: a stack S and a buffer B.

26 Transition-Based Parsing Dependency tree represented as a linear sequence of transitions. Transitions: Simple operations to be executed on a parser configuration Parser Configuration: a stack S and a buffer B. During parsing, apply a classifier to decide which transition to take next, greedily. No backtracking.

27 Transition-Based Parsing: Transitions Parser Configuration: Initial Configuration: buffer contains x and the stack contains the root. Final Configuration: Empty buffer and the stack contains the entire tree.

28 Transition-Based Parsing: Transitions Parser Configuration: Initial Configuration: buffer contains x and the stack contains the root. Final Configuration: Empty buffer and the stack contains the entire tree. Transitions: arc-standard transition set (Nivre, 2004): shift: move the word at the front of the buffer B onto the stack S. right-arc: u = pop(S); v = pop(S); push(S, v → u) (v becomes the head of u). left-arc: u = pop(S); v = pop(S); push(S, v ← u) (u becomes the head of v).

29 Transition-Based Parsing: Transitions Parser Configuration: Initial Configuration: buffer contains x and the stack contains the root. Final Configuration: Empty buffer and the stack contains the entire tree. Transitions: arc-standard transition set (Nivre, 2004): shift: move the word at the front of the buffer B onto the stack S. right-arc: u = pop(S); v = pop(S); push(S, v → u) (v becomes the head of u). left-arc: u = pop(S); v = pop(S); push(S, v ← u) (u becomes the head of v). (For labeled parsing, add labels to the right-arc and left-arc transitions.)

30 Transition-Based Parsing: Example Stack S: root Buffer B: we vigorously wash our cats who stink Actions: (none yet)

31 Transition-Based Parsing: Example Stack S: we, root Buffer B: vigorously wash our cats who stink Actions: shift

32 Transition-Based Parsing: Example Stack S: vigorously, we, root Buffer B: wash our cats who stink Actions: shift shift

33 Transition-Based Parsing: Example Stack S: wash, vigorously, we, root Buffer B: our cats who stink Actions: shift shift shift

34 Transition-Based Parsing: Example Stack S: wash [vigorously], we, root (attached children shown in brackets) Buffer B: our cats who stink Actions: shift shift shift left-arc

35 Transition-Based Parsing: Example Stack S: wash [we, vigorously], root Buffer B: our cats who stink Actions: shift shift shift left-arc left-arc

36 Transition-Based Parsing: Example Stack S: our, wash [we, vigorously], root Buffer B: cats who stink Actions: shift shift shift left-arc left-arc shift

37 Transition-Based Parsing: Example Stack S: cats, our, wash [we, vigorously], root Buffer B: who stink Actions: shift shift shift left-arc left-arc shift shift

38 Transition-Based Parsing: Example Stack S: cats [our], wash [we, vigorously], root Buffer B: who stink Actions: shift shift shift left-arc left-arc shift shift left-arc

39 Transition-Based Parsing: Example Stack S: who, cats [our], wash [we, vigorously], root Buffer B: stink Actions: shift shift shift left-arc left-arc shift shift left-arc shift

40 Transition-Based Parsing: Example Stack S: stink, who, cats [our], wash [we, vigorously], root Buffer B: (empty) Actions: shift shift shift left-arc left-arc shift shift left-arc shift shift

41 Transition-Based Parsing: Example Stack S: who [stink], cats [our], wash [we, vigorously], root Buffer B: (empty) Actions: shift shift shift left-arc left-arc shift shift left-arc shift shift right-arc

42 Transition-Based Parsing: Example Stack S: cats [our, who [stink]], wash [we, vigorously], root Buffer B: (empty) Actions: shift shift shift left-arc left-arc shift shift left-arc shift shift right-arc right-arc

43 Transition-Based Parsing: Example Stack S: wash [we, vigorously, cats [our, who [stink]]], root Buffer B: (empty) Actions: shift shift shift left-arc left-arc shift shift left-arc shift shift right-arc right-arc right-arc

44 Transition-Based Parsing: Example Stack S: root [wash [we, vigorously, cats [our, who [stink]]]] Buffer B: (empty) Actions: shift shift shift left-arc left-arc shift shift left-arc shift shift right-arc right-arc right-arc right-arc
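The transition sequence above can be replayed in a few lines (a sketch of my own, not the slides' code): the arc-standard transitions operate on a stack of word indices and a buffer, and emit ⟨head, child⟩ arcs.

```python
# A minimal arc-standard runner, replaying the slides' example sequence.

def parse(words, actions):
    buffer = list(range(1, len(words) + 1))   # word indices, front first
    stack = [0]                               # 0 is the root symbol
    arcs = set()
    for action in actions:
        if action == "shift":
            stack.append(buffer.pop(0))
        else:
            u, v = stack.pop(), stack.pop()
            if action == "right-arc":         # v becomes the head of u
                arcs.add((v, u))
                stack.append(v)
            else:                             # left-arc: u becomes the head of v
                arcs.add((u, v))
                stack.append(u)
    return arcs

words = "we vigorously wash our cats who stink".split()
actions = (["shift"] * 3 + ["left-arc"] * 2 + ["shift"] * 2 + ["left-arc"]
           + ["shift"] * 2 + ["right-arc"] * 4)
arcs = parse(words, actions)
# wash (3) heads we (1), vigorously (2), and cats (5); root (0) heads wash
assert (0, 3) in arcs and (3, 1) in arcs and (3, 2) in arcs and (3, 5) in arcs
assert (5, 4) in arcs and (5, 6) in arcs and (6, 7) in arcs
```

Each word is shifted once and attached once, giving the 2n actions and 7 arcs of the example.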

45 The Core of Transition-Based Parsing: Classification At each iteration, choose among {shift, right-arc, left-arc}. (Actually, among all L-labeled variants of right- and left-arc.)

46 The Core of Transition-Based Parsing: Classification At each iteration, choose among {shift, right-arc, left-arc}. (Actually, among all L-labeled variants of right- and left-arc.) Features can look at S, B, and the history of past actions; usually there is no decomposition into local structures.

47 The Core of Transition-Based Parsing: Classification At each iteration, choose among {shift, right-arc, left-arc}. (Actually, among all L-labeled variants of right- and left-arc.) Features can look at S, B, and the history of past actions; usually there is no decomposition into local structures. Training data: an oracle transition sequence that gives the right tree converts into 2n pairs: ⟨state, correct transition⟩.

48 The Core of Transition-Based Parsing: Classification At each iteration, choose among {shift, right-arc, left-arc}. (Actually, among all L-labeled variants of right- and left-arc.) Features can look at S, B, and the history of past actions; usually there is no decomposition into local structures. Training data: an oracle transition sequence that gives the right tree converts into 2n pairs: ⟨state, correct transition⟩. Each word gets shifted once and participates as a child in exactly one arc. Linear time.

49 Transition-Based Parsing: Remarks Can also be applied to phrase-structure parsing (e.g., Sagae and Lavie, 2006). Keyword: shift-reduce parsing. The algorithm for making decisions doesn't need to be greedy; it can maintain multiple hypotheses. E.g., beam search, which we'll discuss in the context of machine translation later. Potential flaw: the classifier is typically trained under the assumption that previous classification decisions were all correct. As yet, there is no principled solution to this problem, but see dynamic oracles (Goldberg and Nivre, 2012).

50 Approaches to Dependency Parsing 1. Transition-based parsing with a stack. 2. Chu-Liu-Edmonds algorithm for arborescences (directed graphs). 3. Dynamic programming with the Eisner algorithm.

51 Acknowledgment Slides are mostly adapted from those by Swabha Swayamdipta and Sam Thomson.

52 Features in Dependency Parsing For transition-based parsing, we could use any past decisions to score the current decision: s_global(y) = s(a) = ∑_{i=1}^{|a|} s(a_i | a_{0:i−1}) We gave up on any guarantee of finding the best possible y in favor of arbitrary features. For a neural network-based model that fully exploits this, see Dyer et al. (2015).

53 Graph-Based Dependency Parsing Selects structures which are globally optimal.

54 Graph-Based Dependency Parsing Selects structures which are globally optimal. Start with a fully connected graph. Set of O(n²) edges, E.

55 Graph-Based Dependency Parsing Selects structures which are globally optimal. Start with a fully connected graph. Set of O(n²) edges, E. No incoming edges to x_0, ensuring that it will be the root.

56 First-Order Graph-Based (FOG) Dependency Parsing (McDonald et al., 2005) Every possible directed edge e between a parent p and a child c gets a local score, s(e). y* = argmax_{y ⊆ E} s_global(y) = argmax_{y ⊆ E} ∑_{e ∈ y} s(e), subject to the constraint that y is an arborescence. Classical algorithm to efficiently solve this problem: Chu and Liu (1965), Edmonds (1967)
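To make the global objective concrete, here is a brute-force sketch (my own illustration, not from the slides; the scores are invented): enumerate every parent assignment for a tiny sentence, filter out non-arborescences, and return the argmax. This is exactly the quantity Chu-Liu-Edmonds computes efficiently.

```python
# Brute-force FOG decoding for a 3-word sentence (exponential; only for
# illustrating the argmax-over-arborescences objective).
import itertools

def reaches_root(c, parent):
    seen = set()
    while c != 0:
        if c in seen:             # cycle
            return False
        seen.add(c)
        c = parent[c]
    return True

def brute_force_parse(n, s):
    """n words (root is index 0); s[(p, c)] is the local edge score."""
    best, best_tree = float("-inf"), None
    for parents in itertools.product(range(n + 1), repeat=n):
        edges = [(p, c) for c, p in zip(range(1, n + 1), parents) if p != c]
        if len(edges) < n:
            continue              # some word chose itself as parent
        parent = {c: p for p, c in edges}
        if any(not reaches_root(c, parent) for c in range(1, n + 1)):
            continue              # not an arborescence
        total = sum(s[e] for e in edges)
        if total > best:
            best, best_tree = total, set(edges)
    return best_tree, best

# invented scores for "we wash cats"; they favor wash (2) as root's child
s = {(0, 1): 1, (0, 2): 10, (0, 3): 2, (1, 2): 0, (1, 3): 0,
     (2, 1): 8, (2, 3): 9, (3, 1): 0, (3, 2): 0}
tree, score = brute_force_parse(3, s)
assert tree == {(0, 2), (2, 1), (2, 3)} and score == 27
```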

57 Chu-Liu-Edmonds Intuitions Every non-root node needs exactly one incoming edge.

58 Chu-Liu-Edmonds Intuitions Every non-root node needs exactly one incoming edge. In fact, every connected component that doesn't contain x_0 needs exactly one incoming edge.

59 Chu-Liu-Edmonds Intuitions Every non-root node needs exactly one incoming edge. In fact, every connected component that doesn't contain x_0 needs exactly one incoming edge. Maximum spanning tree.

60 Chu-Liu-Edmonds Intuitions Every non-root node needs exactly one incoming edge. In fact, every connected component that doesn't contain x_0 needs exactly one incoming edge. Maximum spanning tree. High-level view of the algorithm: 1. For every c, pick an incoming edge (i.e., pick a parent) greedily. 2. If this forms an arborescence, you are done! 3. Otherwise, it's because there's a cycle, C. Arborescences can't have cycles, so some edge in C needs to be kicked out. We also need to find an incoming edge for C. Choosing the incoming edge for C determines which edge to kick out.

61 Chu-Liu-Edmonds: Recursive (Inefficient) Definition
def maxarborescence(V, E, root):
    # returns best arborescence as a map from each node to its incoming edge
    for c in V \ {root}:
        bestinedge[c] ← argmax_{e ∈ E : e = ⟨p, c⟩} e.s  # i.e., s(e)
    if bestinedge contains a cycle C:
        # build a new graph where C is contracted into a single node
        vC ← new Node()
        V ← V ∪ {vC} \ C
        E ← {adjust(e, vC) for e ∈ E \ C}
        A ← maxarborescence(V, E, root)
        return {e.original for e ∈ A} ∪ C \ {A[vC].kicksout}
    # each node got a parent without creating any cycles
    return bestinedge

62 Understanding Chu-Liu-Edmonds There are two stages: Contraction (the stuff before the recursive call) Expansion (the stuff after the recursive call)

63 Chu-Liu-Edmonds: Contraction For each non-root node v, set bestinedge[v] to be its highest-scoring incoming edge. If a cycle C is formed: contract the nodes in C into a new node v_C. The adjust subroutine on the next slide performs the following: Edges incoming to any node in C now get destination v_C. For each node v in C, and for each edge e incoming to v from outside of C: set e.kicksout to bestinedge[v], and set e.s to e.s − e.kicksout.s. Edges outgoing from any node in C now get source v_C. Repeat until every non-root node has an incoming edge and no cycles are formed.

64 Chu-Liu-Edmonds: Edge Adjustment Subroutine
def adjust(e, vC):
    e′ ← copy(e)
    e′.original ← e
    if e.dest ∈ C:
        e′.dest ← vC
        e′.kicksout ← bestinedge[e.dest]
        e′.s ← e.s − e′.kicksout.s
    elif e.src ∈ C:
        e′.src ← vC
    return e′

65 Contraction Example [Graph with nodes ROOT, V1, V2, V3 and candidate edges a:5, b:1, c:1, d:11, e:4, f:5, g:10, h:9, i:8; the bestinedge and kicksout tables are empty.]

66 Contraction Example [Same graph; g (score 10) is chosen as bestinedge for its destination node.]

67 Contraction Example [Same graph; the next greedy choice, d (score 11), completes a cycle with g.]

68 Contraction Example [The cycle through g and d is contracted into a new node V4. Incoming edge scores are adjusted: a: 5 − 10, b: 1 − 11, h: 9 − 10, i: 8 − 11; kicksout: a kicks out g, b kicks out d, h kicks out g, i kicks out d.]

69 Contraction Example [Graph after contraction: ROOT, V4, V3, with a:−5, b:−10, c:1, e:4, f:5, h:−1, i:−3; bestinedge so far: g (inside V4).]

70 Contraction Example [bestinedge for V3 is set to f (score 5).]

71 Contraction Example [bestinedge for V4 is set to h (score −1), which creates a new cycle through f and h.]

72 Contraction Example [The cycle through f and h is contracted into V5; remaining incoming edges are adjusted: a: −5 − (−1), b: −10 − (−1), c: 1 − 5; kicksout is updated: a kicks out g and h, b kicks out d and h, c kicks out f.]

73 Contraction Example [Graph after the second contraction: ROOT and V5, with a:−4, b:−9, c:−4.]

74 Contraction Example [bestinedge for V5 is set to a (score −4). Every non-root node now has an incoming edge and no cycles remain, so contraction is complete.]

75 Chu-Liu-Edmonds: Expansion After the contraction stage, every contracted node will have exactly one bestinedge. This edge will kick out one edge inside the contracted node, breaking the cycle. Go through each bestinedge e in the reverse order in which we added them: lock down e, and remove every edge in kicksout(e) from bestinedge.

76 Expansion Example [Expansion starts from the fully contracted graph: ROOT and V5, with a:−4, b:−9, c:−4; bestinedge for V5 is a.]

77 Expansion Example [a is locked down; the edges it kicks out are removed from bestinedge.]

78 Expansion Example [V5 is expanded back into V4 and V3; the cycle edge kicked out by a is removed, and the other cycle edge is locked down.]

79 Expansion Example [Same graph, with the surviving bestinedge choices highlighted.]

80 Expansion Example [V4 is expanded back into V1 and V2, restoring the original scores; the cycle edge kicked out is removed and the other cycle edge is locked down.]

81 Expansion Example [Final arborescence: the locked-down edges, one incoming edge per non-root node.]
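The contraction/expansion procedure above can be written as a compact recursive sketch (my own illustration, not the slides' code; the example graph and scores are invented). On a cycle, the greedy picks are contracted into a fresh node, incoming edges are rescored by subtracting the score of the cycle edge they would kick out, and after the recursive call exactly one cycle edge is removed.

```python
# A self-contained recursive Chu-Liu-Edmonds sketch over integer node ids.

def find_cycle(best_in):
    # follow parent pointers from each node; report a cycle if one exists
    for start in best_in:
        path, v = [], start
        while v in best_in:
            if v in path:
                return set(path[path.index(v):])
            path.append(v)
            v = best_in[v][0]
    return None

def max_arborescence(nodes, root, score):
    """score: dict mapping (parent, child) -> weight. Returns a set of edges."""
    best_in = {c: max((e for e in score if e[1] == c), key=score.get)
               for c in nodes if c != root}
    cycle = find_cycle(best_in)
    if cycle is None:
        return set(best_in.values())
    v_c = max(nodes) + 1                      # fresh node for the contraction
    new_score, original = {}, {}
    for (p, c), w in score.items():
        if p in cycle and c in cycle:
            continue                          # edge inside the cycle: drop
        e = (v_c if p in cycle else p, v_c if c in cycle else c)
        # incoming edges are adjusted by the score of the edge they kick out
        w = w - score[best_in[c]] if c in cycle else w
        if e not in new_score or w > new_score[e]:
            new_score[e], original[e] = w, (p, c)
    sub = max_arborescence((nodes - cycle) | {v_c}, root, new_score)
    edges, kicked = set(), None
    for e in sub:
        edges.add(original[e])
        if e[1] == v_c:                       # the edge entering the cycle...
            kicked = best_in[original[e][1]]  # ...kicks out this cycle edge
    edges |= {best_in[c] for c in cycle if best_in[c] != kicked}
    return edges

# invented 3-word example whose greedy picks form a 1 <-> 2 cycle
score = {(0, 1): 5, (0, 2): 1, (0, 3): 1, (1, 2): 11, (2, 1): 10,
         (3, 1): 9, (2, 3): 8, (1, 3): 4, (3, 2): 5}
tree = max_arborescence({0, 1, 2, 3}, root=0, score=score)
assert tree == {(0, 1), (1, 2), (2, 3)}       # total score 5 + 11 + 8 = 24
```

Here the greedy picks (2,1) and (1,2) form a cycle; contraction and expansion recover the optimal chain 0 → 1 → 2 → 3.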

82 Observation The set of arborescences strictly includes the set of projective dependency trees. Is this a good thing or a bad thing?

83 Chu-Liu-Edmonds: Notes This is a greedy algorithm with a clever form of delayed backtracking to recover from inconsistent decisions (cycles).

84 Chu-Liu-Edmonds: Notes This is a greedy algorithm with a clever form of delayed backtracking to recover from inconsistent decisions (cycles). CLE is exact: it always recovers an optimal arborescence.

85 Chu-Liu-Edmonds: Notes This is a greedy algorithm with a clever form of delayed backtracking to recover from inconsistent decisions (cycles). CLE is exact: it always recovers an optimal arborescence. What about labeled dependencies? As a matter of preprocessing, for each ⟨p, c⟩, keep only the top-scoring labeled edge.

86 Chu-Liu-Edmonds: Notes This is a greedy algorithm with a clever form of delayed backtracking to recover from inconsistent decisions (cycles). CLE is exact: it always recovers an optimal arborescence. What about labeled dependencies? As a matter of preprocessing, for each ⟨p, c⟩, keep only the top-scoring labeled edge. Tarjan (1977) offered a more efficient, but unfortunately incorrect, implementation. Camerini et al. (1979) corrected it. The approach is not recursive, instead using a disjoint-set data structure to keep track of collapsed nodes. Even better: Gabow et al. (1986) use a Fibonacci heap to keep incoming edges sorted, and find cycles in a more sensible way. It also constrains the root to have only one outgoing edge. With these tricks, O(n² + n log n) runtime.

87 More Details on Statistical Dependency Parsing What about the scores? McDonald et al. (2005) use carefully-designed features and (something close to) the structured perceptron; Kiperwasser and Goldberg (2016) use bidirectional recurrent neural networks.

88 More Details on Statistical Dependency Parsing What about the scores? McDonald et al. (2005) use carefully-designed features and (something close to) the structured perceptron; Kiperwasser and Goldberg (2016) use bidirectional recurrent neural networks. What about higher-order parsing? Requires approximate inference, e.g., dual decomposition (Martins et al., 2013).

89 Important Tradeoffs (and Not Just in NLP) 1. Two extremes: A specialized algorithm that efficiently solves your problem, under your assumptions. E.g., Chu-Liu-Edmonds for FOG dependency parsing. A general-purpose method that solves many problems, allowing you to test the effect of different assumptions. E.g., dynamic programming, transition-based methods, some forms of approximate inference.

90 Important Tradeoffs (and Not Just in NLP) 1. Two extremes: A specialized algorithm that efficiently solves your problem, under your assumptions. E.g., Chu-Liu-Edmonds for FOG dependency parsing. A general-purpose method that solves many problems, allowing you to test the effect of different assumptions. E.g., dynamic programming, transition-based methods, some forms of approximate inference. 2. Two extremes: Fast (linear-time) but greedy. Model-optimal but slow.

91 Important Tradeoffs (and Not Just in NLP) 1. Two extremes: A specialized algorithm that efficiently solves your problem, under your assumptions. E.g., Chu-Liu-Edmonds for FOG dependency parsing. A general-purpose method that solves many problems, allowing you to test the effect of different assumptions. E.g., dynamic programming, transition-based methods, some forms of approximate inference. 2. Two extremes: Fast (linear-time) but greedy. Model-optimal but slow. Dirty secret: the best way to get (English) dependency trees is to run phrase-structure parsing, then convert.

92 References I
Paolo M. Camerini, Luigi Fratta, and Francesco Maffioli. A note on finding optimum branchings. Networks, 9(4), 1979.
Y. J. Chu and T. H. Liu. On the shortest arborescence of a directed graph. Science Sinica, 14, 1965.
Marie-Catherine de Marneffe, Bill MacCartney, and Christopher D. Manning. Generating typed dependency parses from phrase structure parses. In Proc. of LREC, 2006.
Chris Dyer, Miguel Ballesteros, Wang Ling, Austin Matthews, and Noah A. Smith. Transition-based dependency parsing with stack long short-term memory. In Proc. of ACL, 2015.
Jack Edmonds. Optimum branchings. Journal of Research of the National Bureau of Standards, 71B, 1967.
Harold N. Gabow, Zvi Galil, Thomas Spencer, and Robert E. Tarjan. Efficient algorithms for finding minimum spanning trees in undirected and directed graphs. Combinatorica, 6(2), 1986.
Yoav Goldberg and Joakim Nivre. A dynamic oracle for arc-eager dependency parsing. In Proc. of COLING, 2012.
Mark Johnson. PCFG models of linguistic tree representations. Computational Linguistics, 24(4):613–632, 1998.
Eliyahu Kiperwasser and Yoav Goldberg. Simple and accurate dependency parsing using bidirectional LSTM feature representations. Transactions of the Association for Computational Linguistics, 4, 2016.
André F. T. Martins, Miguel Almeida, and Noah A. Smith. Turning on the turbo: Fast third-order non-projective turbo parsers. In Proc. of ACL, 2013.

93 References II
Ryan McDonald, Fernando Pereira, Kiril Ribarov, and Jan Hajič. Non-projective dependency parsing using spanning tree algorithms. In Proc. of HLT-EMNLP, 2005.
Igor A. Mel'čuk. Dependency Syntax: Theory and Practice. State University of New York Press, 1987.
Joakim Nivre. Incrementality in deterministic dependency parsing. In Proc. of ACL, 2004.
Kenji Sagae and Alon Lavie. A best-first probabilistic shift-reduce parser. In Proc. of COLING-ACL, 2006.
Robert E. Tarjan. Finding optimum branchings. Networks, 7:25–35, 1977.
L. Tesnière. Éléments de Syntaxe Structurale. Klincksieck, 1959.


Questions? Post on piazza, or  Radhika (radhika at eecs.berkeley) or Sameer (sa at berkeley)! EE122 Fall 2013 HW3 Instructions Recor your answers in a file calle hw3.pf. Make sure to write your name an SID at the top of your assignment. For each problem, clearly inicate your final answer, bol an

More information

Transition-Based Dependency Parsing with MaltParser

Transition-Based Dependency Parsing with MaltParser Transition-Based Dependency Parsing with MaltParser Joakim Nivre Uppsala University and Växjö University Transition-Based Dependency Parsing 1(13) Introduction Outline Goals of the workshop Transition-based

More information

arxiv: v1 [cs.cl] 25 Apr 2017

arxiv: v1 [cs.cl] 25 Apr 2017 Joint POS Tagging and Dependency Parsing with Transition-based Neural Networks Liner Yang 1, Meishan Zhang 2, Yang Liu 1, Nan Yu 2, Maosong Sun 1, Guohong Fu 2 1 State Key Laboratory of Intelligent Technology

More information

Introduction to Data-Driven Dependency Parsing

Introduction to Data-Driven Dependency Parsing Introduction to Data-Driven Dependency Parsing Introductory Course, ESSLLI 2007 Ryan McDonald 1 Joakim Nivre 2 1 Google Inc., New York, USA E-mail: ryanmcd@google.com 2 Uppsala University and Växjö University,

More information

Additional Divide and Conquer Algorithms. Skipping from chapter 4: Quicksort Binary Search Binary Tree Traversal Matrix Multiplication

Additional Divide and Conquer Algorithms. Skipping from chapter 4: Quicksort Binary Search Binary Tree Traversal Matrix Multiplication Aitional Divie an Conquer Algorithms Skipping from chapter 4: Quicksort Binary Search Binary Tree Traversal Matrix Multiplication Divie an Conquer Closest Pair Let s revisit the closest pair problem. Last

More information

CS269I: Incentives in Computer Science Lecture #8: Incentives in BGP Routing

CS269I: Incentives in Computer Science Lecture #8: Incentives in BGP Routing CS269I: Incentives in Computer Science Lecture #8: Incentives in BGP Routing Tim Roughgaren October 19, 2016 1 Routing in the Internet Last lecture we talke about elay-base (or selfish ) routing, which

More information

Classifying Facial Expression with Radial Basis Function Networks, using Gradient Descent and K-means

Classifying Facial Expression with Radial Basis Function Networks, using Gradient Descent and K-means Classifying Facial Expression with Raial Basis Function Networks, using Graient Descent an K-means Neil Allrin Department of Computer Science University of California, San Diego La Jolla, CA 9237 nallrin@cs.ucs.eu

More information

I Know Your Name: Named Entity Recognition and Structural Parsing

I Know Your Name: Named Entity Recognition and Structural Parsing I Know Your Name: Named Entity Recognition and Structural Parsing David Philipson and Nikil Viswanathan {pdavid2, nikil}@stanford.edu CS224N Fall 2011 Introduction In this project, we explore a Maximum

More information

Transition-based dependency parsing

Transition-based dependency parsing Transition-based dependency parsing Syntactic analysis (5LN455) 2014-12-18 Sara Stymne Department of Linguistics and Philology Based on slides from Marco Kuhlmann Overview Arc-factored dependency parsing

More information

6.854J / J Advanced Algorithms Fall 2008

6.854J / J Advanced Algorithms Fall 2008 MIT OpenCourseWare http://ocw.mit.eu 6.854J / 18.415J Avance Algorithms Fall 2008 For inormation about citing these materials or our Terms o Use, visit: http://ocw.mit.eu/terms. 18.415/6.854 Avance Algorithms

More information

Advanced Search Algorithms

Advanced Search Algorithms CS11-747 Neural Networks for NLP Advanced Search Algorithms Daniel Clothiaux https://phontron.com/class/nn4nlp2017/ Why search? So far, decoding has mostly been greedy Chose the most likely output from

More information

Non-projective Dependency Parsing using Spanning Tree Algorithms

Non-projective Dependency Parsing using Spanning Tree Algorithms Non-projective Dependency Parsing using Spanning Tree Algorithms Ryan McDonald Fernando Pereira Department of Computer and Information Science University of Pennsylvania {ryantm,pereira}@cis.upenn.edu

More information

Chapter 4. Lexical and Syntax Analysis. Topics. Compilation. Language Implementation. Issues in Lexical and Syntax Analysis.

Chapter 4. Lexical and Syntax Analysis. Topics. Compilation. Language Implementation. Issues in Lexical and Syntax Analysis. Topics Chapter 4 Lexical and Syntax Analysis Introduction Lexical Analysis Syntax Analysis Recursive -Descent Parsing Bottom-Up parsing 2 Language Implementation Compilation There are three possible approaches

More information

Throughput Characterization of Node-based Scheduling in Multihop Wireless Networks: A Novel Application of the Gallai-Edmonds Structure Theorem

Throughput Characterization of Node-based Scheduling in Multihop Wireless Networks: A Novel Application of the Gallai-Edmonds Structure Theorem Throughput Characterization of Noe-base Scheuling in Multihop Wireless Networks: A Novel Application of the Gallai-Emons Structure Theorem Bo Ji an Yu Sang Dept. of Computer an Information Sciences Temple

More information

A shortest path algorithm in multimodal networks: a case study with time varying costs

A shortest path algorithm in multimodal networks: a case study with time varying costs A shortest path algorithm in multimoal networks: a case stuy with time varying costs Daniela Ambrosino*, Anna Sciomachen* * Department of Economics an Quantitative Methos (DIEM), University of Genoa Via

More information

Learning convex bodies is hard

Learning convex bodies is hard Learning convex boies is har Navin Goyal Microsoft Research Inia navingo@microsoftcom Luis Raemacher Georgia Tech lraemac@ccgatecheu Abstract We show that learning a convex boy in R, given ranom samples

More information

Questions? Post on piazza, or Radhika (radhika at eecs.berkeley) or Sameer (sa at berkeley)!

Questions? Post on piazza, or  Radhika (radhika at eecs.berkeley) or Sameer (sa at berkeley)! EE122 Fall 2013 HW3 Instructions Recor your answers in a file calle hw3.pf. Make sure to write your name an SID at the top of your assignment. For each problem, clearly inicate your final answer, bol an

More information

Loop Scheduling and Partitions for Hiding Memory Latencies

Loop Scheduling and Partitions for Hiding Memory Latencies Loop Scheuling an Partitions for Hiing Memory Latencies Fei Chen Ewin Hsing-Mean Sha Dept. of Computer Science an Engineering University of Notre Dame Notre Dame, IN 46556 Email: fchen,esha @cse.n.eu Tel:

More information

Learning Latent Linguistic Structure to Optimize End Tasks. David A. Smith with Jason Naradowsky and Xiaoye Tiger Wu

Learning Latent Linguistic Structure to Optimize End Tasks. David A. Smith with Jason Naradowsky and Xiaoye Tiger Wu Learning Latent Linguistic Structure to Optimize End Tasks David A. Smith with Jason Naradowsky and Xiaoye Tiger Wu 12 October 2012 Learning Latent Linguistic Structure to Optimize End Tasks David A. Smith

More information

Message Transport With The User Datagram Protocol

Message Transport With The User Datagram Protocol Message Transport With The User Datagram Protocol User Datagram Protocol (UDP) Use During startup For VoIP an some vieo applications Accounts for less than 10% of Internet traffic Blocke by some ISPs Computer

More information

Backpropagating through Structured Argmax using a SPIGOT

Backpropagating through Structured Argmax using a SPIGOT Backpropagating through Structured Argmax using a SPIGOT Hao Peng, Sam Thomson, Noah A. Smith @ACL July 17, 2018 Overview arg max Parser Downstream task Loss L Overview arg max Parser Downstream task Head

More information

LING/C SC/PSYC 438/538. Lecture 3 Sandiway Fong

LING/C SC/PSYC 438/538. Lecture 3 Sandiway Fong LING/C SC/PSYC 438/538 Lecture 3 Sandiway Fong Today s Topics Homework 4 out due next Tuesday by midnight Homework 3 should have been submitted yesterday Quick Homework 3 review Continue with Perl intro

More information

Iterative CKY parsing for Probabilistic Context-Free Grammars

Iterative CKY parsing for Probabilistic Context-Free Grammars Iterative CKY parsing for Probabilistic Context-Free Grammars Yoshimasa Tsuruoka and Jun ichi Tsujii Department of Computer Science, University of Tokyo Hongo 7-3-1, Bunkyo-ku, Tokyo 113-0033 CREST, JST

More information

Disjoint Multipath Routing in Dual Homing Networks using Colored Trees

Disjoint Multipath Routing in Dual Homing Networks using Colored Trees Disjoint Multipath Routing in Dual Homing Networks using Colore Trees Preetha Thulasiraman, Srinivasan Ramasubramanian, an Marwan Krunz Department of Electrical an Computer Engineering University of Arizona,

More information

Transition-based Dependency Parsing with Rich Non-local Features

Transition-based Dependency Parsing with Rich Non-local Features Transition-based Dependency Parsing with Rich Non-local Features Yue Zhang University of Cambridge Computer Laboratory yue.zhang@cl.cam.ac.uk Joakim Nivre Uppsala University Department of Linguistics and

More information

Stack- propaga+on: Improved Representa+on Learning for Syntax

Stack- propaga+on: Improved Representa+on Learning for Syntax Stack- propaga+on: Improved Representa+on Learning for Syntax Yuan Zhang, David Weiss MIT, Google 1 Transi+on- based Neural Network Parser p(action configuration) So1max Hidden Embedding words labels POS

More information

Just-In-Time Software Pipelining

Just-In-Time Software Pipelining Just-In-Time Software Pipelining Hongbo Rong Hyunchul Park Youfeng Wu Cheng Wang Programming Systems Lab Intel Labs, Santa Clara What is software pipelining? A loop optimization exposing instruction-level

More information

Statistical Dependency Parsing

Statistical Dependency Parsing Statistical Dependency Parsing The State of the Art Joakim Nivre Uppsala University Department of Linguistics and Philology joakim.nivre@lingfil.uu.se Statistical Dependency Parsing 1(29) Introduction

More information

Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory. Testing Techniques

Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory. Testing Techniques Politehnica University of Timisoara Mobile Computing, Sensors Network an Embee Systems Laboratory ing Techniques What is testing? ing is the process of emonstrating that errors are not present. The purpose

More information

Natural Language Dependency Parsing. SPFLODD October 13, 2011

Natural Language Dependency Parsing. SPFLODD October 13, 2011 Natural Language Dependency Parsing SPFLODD October 13, 2011 Quick Review from Tuesday What are CFGs? PCFGs? Weighted CFGs? StaKsKcal parsing using the Penn Treebank What it looks like How to build a simple

More information

CS 106 Winter 2016 Craig S. Kaplan. Module 01 Processing Recap. Topics

CS 106 Winter 2016 Craig S. Kaplan. Module 01 Processing Recap. Topics CS 106 Winter 2016 Craig S. Kaplan Moule 01 Processing Recap Topics The basic parts of speech in a Processing program Scope Review of syntax for classes an objects Reaings Your CS 105 notes Learning Processing,

More information

THE APPLICATION OF ARTICLE k-th SHORTEST TIME PATH ALGORITHM

THE APPLICATION OF ARTICLE k-th SHORTEST TIME PATH ALGORITHM International Journal of Physics an Mathematical Sciences ISSN: 2277-2111 (Online) 2016 Vol. 6 (1) January-March, pp. 24-6/Mao an Shi. THE APPLICATION OF ARTICLE k-th SHORTEST TIME PATH ALGORITHM Hua Mao

More information

Image Segmentation using K-means clustering and Thresholding

Image Segmentation using K-means clustering and Thresholding Image Segmentation using Kmeans clustering an Thresholing Preeti Panwar 1, Girhar Gopal 2, Rakesh Kumar 3 1M.Tech Stuent, Department of Computer Science & Applications, Kurukshetra University, Kurukshetra,

More information

Learning Polynomial Functions. by Feature Construction

Learning Polynomial Functions. by Feature Construction I Proceeings of the Eighth International Workshop on Machine Learning Chicago, Illinois, June 27-29 1991 Learning Polynomial Functions by Feature Construction Richar S. Sutton GTE Laboratories Incorporate

More information

PART 2. Organization Of An Operating System

PART 2. Organization Of An Operating System PART 2 Organization Of An Operating System CS 503 - PART 2 1 2010 Services An OS Supplies Support for concurrent execution Facilities for process synchronization Inter-process communication mechanisms

More information

Finite Automata Implementations Considering CPU Cache J. Holub

Finite Automata Implementations Considering CPU Cache J. Holub Finite Automata Implementations Consiering CPU Cache J. Holub The finite automata are mathematical moels for finite state systems. More general finite automaton is the noneterministic finite automaton

More information

The CKY algorithm part 2: Probabilistic parsing

The CKY algorithm part 2: Probabilistic parsing The CKY algorithm part 2: Probabilistic parsing Syntactic analysis/parsing 2017-11-14 Sara Stymne Department of Linguistics and Philology Based on slides from Marco Kuhlmann Recap: The CKY algorithm The

More information

Hybrid Combination of Constituency and Dependency Trees into an Ensemble Dependency Parser

Hybrid Combination of Constituency and Dependency Trees into an Ensemble Dependency Parser Hybrid Combination of Constituency and Dependency Trees into an Ensemble Dependency Parser Nathan David Green and Zdeněk Žabokrtský Charles University in Prague Institute of Formal and Applied Linguistics

More information

AT&T: The Tag&Parse Approach to Semantic Parsing of Robot Spatial Commands

AT&T: The Tag&Parse Approach to Semantic Parsing of Robot Spatial Commands AT&T: The Tag&Parse Approach to Semantic Parsing of Robot Spatial Commands Svetlana Stoyanchev, Hyuckchul Jung, John Chen, Srinivas Bangalore AT&T Labs Research 1 AT&T Way Bedminster NJ 07921 {sveta,hjung,jchen,srini}@research.att.com

More information

Acknowledgement. Flow Graph Theory. Depth First Search. Speeding up DFA 8/16/2016

Acknowledgement. Flow Graph Theory. Depth First Search. Speeding up DFA 8/16/2016 8/6/06 Program Analysis https://www.cse.iitb.ac.in/~karkare/cs68/ Flow Graph Theory Acknowlegement Slies base on the material at http://infolab.stanfor.eu/~ullman/ragon/ w06/w06.html Amey Karkare Dept

More information

Cluster Center Initialization Method for K-means Algorithm Over Data Sets with Two Clusters

Cluster Center Initialization Method for K-means Algorithm Over Data Sets with Two Clusters Available online at www.scienceirect.com Proceia Engineering 4 (011 ) 34 38 011 International Conference on Avances in Engineering Cluster Center Initialization Metho for K-means Algorithm Over Data Sets

More information

A Quick Guide to MaltParser Optimization

A Quick Guide to MaltParser Optimization A Quick Guide to MaltParser Optimization Joakim Nivre Johan Hall 1 Introduction MaltParser is a system for data-driven dependency parsing, which can be used to induce a parsing model from treebank data

More information

B-Splines and NURBS Week 5, Lecture 9

B-Splines and NURBS Week 5, Lecture 9 CS 430/585 Computer Graphics I B-Splines an NURBS Week 5, Lecture 9 Davi Breen, William Regli an Maxim Peysakhov Geometric an Intelligent Computing Laboratory Department of Computer Science Drexel University

More information

Clustering Expression Data. Clustering Expression Data

Clustering Expression Data. Clustering Expression Data www.cs.washington.eu/ Subscribe if you Din t get msg last night Clustering Exression Data Why cluster gene exression ata? Tissue classification Fin biologically relate genes First ste in inferring regulatory

More information

Here are a couple of warnings to my students who may be here to get a copy of what happened on a day that you missed.

Here are a couple of warnings to my students who may be here to get a copy of what happened on a day that you missed. Preface Here are my online notes for my Calculus I course that I teach here at Lamar University. Despite the fact that these are my class notes, they shoul be accessible to anyone wanting to learn Calculus

More information

Multilevel Linear Dimensionality Reduction using Hypergraphs for Data Analysis

Multilevel Linear Dimensionality Reduction using Hypergraphs for Data Analysis Multilevel Linear Dimensionality Reuction using Hypergraphs for Data Analysis Haw-ren Fang Department of Computer Science an Engineering University of Minnesota; Minneapolis, MN 55455 hrfang@csumneu ABSTRACT

More information

Uninformed search methods

Uninformed search methods CS 1571 Introuction to AI Lecture 4 Uninforme search methos Milos Hauskrecht milos@cs.pitt.eu 539 Sennott Square Announcements Homework assignment 1 is out Due on Thursay, September 11, 014 before the

More information

APPLYING GENETIC ALGORITHM IN QUERY IMPROVEMENT PROBLEM. Abdelmgeid A. Aly

APPLYING GENETIC ALGORITHM IN QUERY IMPROVEMENT PROBLEM. Abdelmgeid A. Aly International Journal "Information Technologies an Knowlege" Vol. / 2007 309 [Project MINERVAEUROPE] Project MINERVAEUROPE: Ministerial Network for Valorising Activities in igitalisation -

More information

The Application of Constraint Rules to Data-driven Parsing

The Application of Constraint Rules to Data-driven Parsing The Application of Constraint Rules to Data-driven Parsing Sardar Jaf The University of Manchester jafs@cs.man.ac.uk Allan Ramsay The University of Manchester ramsaya@cs.man.ac.uk Abstract In this paper,

More information

Hidden Markov Models. Natural Language Processing: Jordan Boyd-Graber. University of Colorado Boulder LECTURE 20. Adapted from material by Ray Mooney

Hidden Markov Models. Natural Language Processing: Jordan Boyd-Graber. University of Colorado Boulder LECTURE 20. Adapted from material by Ray Mooney Hidden Markov Models Natural Language Processing: Jordan Boyd-Graber University of Colorado Boulder LECTURE 20 Adapted from material by Ray Mooney Natural Language Processing: Jordan Boyd-Graber Boulder

More information

MORA: a Movement-Based Routing Algorithm for Vehicle Ad Hoc Networks

MORA: a Movement-Based Routing Algorithm for Vehicle Ad Hoc Networks : a Movement-Base Routing Algorithm for Vehicle A Hoc Networks Fabrizio Granelli, Senior Member, Giulia Boato, Member, an Dzmitry Kliazovich, Stuent Member Abstract Recent interest in car-to-car communications

More information

Principles of B-trees

Principles of B-trees CSE465, Fall 2009 February 25 1 Principles of B-trees Noes an binary search Anoe u has size size(u), keys k 1,..., k size(u) 1 chilren c 1,...,c size(u). Binary search property: for i = 1,..., size(u)

More information

Learning Subproblem Complexities in Distributed Branch and Bound

Learning Subproblem Complexities in Distributed Branch and Bound Learning Subproblem Complexities in Distribute Branch an Boun Lars Otten Department of Computer Science University of California, Irvine lotten@ics.uci.eu Rina Dechter Department of Computer Science University

More information

Preamble. Singly linked lists. Collaboration policy and academic integrity. Getting help

Preamble. Singly linked lists. Collaboration policy and academic integrity. Getting help CS2110 Spring 2016 Assignment A. Linke Lists Due on the CMS by: See the CMS 1 Preamble Linke Lists This assignment begins our iscussions of structures. In this assignment, you will implement a structure

More information

Turning on the Turbo: Fast Third-Order Non- Projective Turbo Parsers

Turning on the Turbo: Fast Third-Order Non- Projective Turbo Parsers Carnegie Mellon University Research Showcase @ CMU Language Technologies Institute School of Computer Science 8-2013 Turning on the Turbo: Fast Third-Order Non- Projective Turbo Parsers Andre F.T. Martins

More information

Spanning Tree Methods for Discriminative Training of Dependency Parsers

Spanning Tree Methods for Discriminative Training of Dependency Parsers University of Pennsylvania ScholarlyCommons Technical Reports (CIS) Department of Computer & Information Science January 2006 Spanning Tree Methods for Discriminative Training of Dependency Parsers Ryan

More information

Figure 1: 2D arm. Figure 2: 2D arm with labelled angles

Figure 1: 2D arm. Figure 2: 2D arm with labelled angles 2D Kinematics Consier a robotic arm. We can sen it commans like, move that joint so it bens at an angle θ. Once we ve set each joint, that s all well an goo. More interesting, though, is the question of

More information

BIJECTIONS FOR PLANAR MAPS WITH BOUNDARIES

BIJECTIONS FOR PLANAR MAPS WITH BOUNDARIES BIJECTIONS FOR PLANAR MAPS WITH BOUNDARIES OLIVIER BERNARDI AND ÉRIC FUSY Abstract. We present bijections for planar maps with bounaries. In particular, we obtain bijections for triangulations an quarangulations

More information

Queueing Model and Optimization of Packet Dropping in Real-Time Wireless Sensor Networks

Queueing Model and Optimization of Packet Dropping in Real-Time Wireless Sensor Networks Queueing Moel an Optimization of Packet Dropping in Real-Time Wireless Sensor Networks Marc Aoun, Antonios Argyriou, Philips Research, Einhoven, 66AE, The Netherlans Department of Computer an Communication

More information

Ling 571: Deep Processing for Natural Language Processing

Ling 571: Deep Processing for Natural Language Processing Ling 571: Deep Processing for Natural Language Processing Julie Medero January 14, 2013 Today s Plan Motivation for Parsing Parsing as Search Search algorithms Search Strategies Issues Two Goal of Parsing

More information

An Adaptive Routing Algorithm for Communication Networks using Back Pressure Technique

An Adaptive Routing Algorithm for Communication Networks using Back Pressure Technique International OPEN ACCESS Journal Of Moern Engineering Research (IJMER) An Aaptive Routing Algorithm for Communication Networks using Back Pressure Technique Khasimpeera Mohamme 1, K. Kalpana 2 1 M. Tech

More information

A Classification of 3R Orthogonal Manipulators by the Topology of their Workspace

A Classification of 3R Orthogonal Manipulators by the Topology of their Workspace A Classification of R Orthogonal Manipulators by the Topology of their Workspace Maher aili, Philippe Wenger an Damien Chablat Institut e Recherche en Communications et Cybernétique e Nantes, UMR C.N.R.S.

More information

THE BAYESIAN RECEIVER OPERATING CHARACTERISTIC CURVE AN EFFECTIVE APPROACH TO EVALUATE THE IDS PERFORMANCE

THE BAYESIAN RECEIVER OPERATING CHARACTERISTIC CURVE AN EFFECTIVE APPROACH TO EVALUATE THE IDS PERFORMANCE БСУ Международна конференция - 2 THE BAYESIAN RECEIVER OPERATING CHARACTERISTIC CURVE AN EFFECTIVE APPROACH TO EVALUATE THE IDS PERFORMANCE Evgeniya Nikolova, Veselina Jecheva Burgas Free University Abstract:

More information

Comparison of Methods for Increasing the Performance of a DUA Computation

Comparison of Methods for Increasing the Performance of a DUA Computation Comparison of Methos for Increasing the Performance of a DUA Computation Michael Behrisch, Daniel Krajzewicz, Peter Wagner an Yun-Pang Wang Institute of Transportation Systems, German Aerospace Center,

More information

Distributed Planning in Hierarchical Factored MDPs

Distributed Planning in Hierarchical Factored MDPs Distribute Planning in Hierarchical Factore MDPs Carlos Guestrin Computer Science Dept Stanfor University guestrin@cs.stanfor.eu Geoffrey Goron Computer Science Dept Carnegie Mellon University ggoron@cs.cmu.eu

More information

Accurate Parsing and Beyond. Yoav Goldberg Bar Ilan University

Accurate Parsing and Beyond. Yoav Goldberg Bar Ilan University Accurate Parsing and Beyond Yoav Goldberg Bar Ilan University Syntactic Parsing subj root rcmod rel xcomp det subj aux acomp acomp The soup, which I expected to be good, was bad subj root rcmod rel xcomp

More information

Undirected Dependency Parsing

Undirected Dependency Parsing Computational Intelligence, Volume 59, Number 000, 2010 Undirected Dependency Parsing CARLOS GÓMEZ-RODRÍGUEZ cgomezr@udc.es Depto. de Computación, Universidade da Coruña. Facultad de Informática, Campus

More information

Non-Projective Dependency Parsing in Expected Linear Time

Non-Projective Dependency Parsing in Expected Linear Time Non-Projective Dependency Parsing in Expected Linear Time Joakim Nivre Uppsala University, Department of Linguistics and Philology, SE-75126 Uppsala Växjö University, School of Mathematics and Systems

More information

On the Role of Multiply Sectioned Bayesian Networks to Cooperative Multiagent Systems

On the Role of Multiply Sectioned Bayesian Networks to Cooperative Multiagent Systems On the Role of Multiply Sectione Bayesian Networks to Cooperative Multiagent Systems Y. Xiang University of Guelph, Canaa, yxiang@cis.uoguelph.ca V. Lesser University of Massachusetts at Amherst, USA,

More information