Bottom-Up Parsing II. Lecture 8


1 Bottom-Up Parsing II. Lecture 8
2 Review: Shift-Reduce Parsing. Bottom-up parsing uses two actions: Shift: ABC | xyz ⇒ ABCx | yz. Reduce: Cbxy | ijk ⇒ CbA | ijk (reducing by the production A → xy).
3 Recall: The Stack. The left string can be implemented by a stack; the top of the stack is the |. Shift pushes a terminal on the stack. Reduce pops 0 or more symbols off the stack (the production rhs) and pushes a nonterminal on the stack (the production lhs).
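The two moves can be sketched directly as stack operations. This is an illustrative sketch of my own, not code from the slides; the trace parses int + int with the lecture's example grammar E → T + E | T, T → int * T | int | (E).

```python
# Illustrative sketch: shift and reduce as stack operations.

def shift(stack, tokens):
    """Shift: move the next input token onto the top of the stack."""
    stack.append(tokens.pop(0))

def reduce(stack, lhs, rhs):
    """Reduce: pop the production's rhs off the stack, push its lhs."""
    assert stack[len(stack) - len(rhs):] == rhs, "rhs must be on top of the stack"
    del stack[len(stack) - len(rhs):]
    stack.append(lhs)

# Trace for "int + int":           stack              | remaining input
stack, tokens = [], ["int", "+", "int"]
shift(stack, tokens)                 # ['int']            | + int
reduce(stack, "T", ["int"])          # ['T']              | + int
shift(stack, tokens)                 # ['T', '+']         | int
shift(stack, tokens)                 # ['T', '+', 'int']  |
reduce(stack, "T", ["int"])          # ['T', '+', 'T']    |
reduce(stack, "E", ["T"])            # ['T', '+', 'E']    |
reduce(stack, "E", ["T", "+", "E"])  # ['E']              |
```

Note that the final sequence of reduces mirrors a rightmost derivation in reverse, as the slides emphasize.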
4 Key Issue. How do we decide when to shift or reduce? Example grammar: E → T + E | T, T → int * T | int | (E). Consider the step int | * int + int. We could reduce by T → int, giving T | * int + int. A fatal mistake! There is now no way to reduce to the start symbol: basically, no production has a rhs that looks like T * …
5 So What This Shows. We definitely don't always want to reduce just because we have the right-hand side of a production on top of the stack. To repeat: just because the right-hand side of a production is sitting on top of the stack, it might be a mistake to do a reduction; we might instead want to wait and do our reduction someplace else. The idea behind how we decide: we only want to reduce if the result can still be reduced to the start symbol.
6 Recall Some Terminology. If S is the start symbol of a grammar G, and α is such that S →* α, then α is a sentential form of G. Note that α may contain both terminals and nonterminals. A sentence of G is a sentential form that contains no nonterminals. Technically speaking, the language of G, L(G), is the set of sentences of G.
7 Handles. Intuition: we want to reduce only if the result can still be reduced to the start symbol. Informally, a handle is a substring that matches the body of a production, and whose reduction (using that production) represents one step along the reverse of a rightmost derivation.
8 Handles. Intuition: we want to reduce only if the result can still be reduced to the start symbol. Assume a rightmost derivation S →* αXω → αβω. (Remember, the parser is going from right to left here.) Then the production X → β is a handle of αβω. What this means is that you can reduce using this production and still potentially get back to the start symbol.
9 Handles (Cont.). Handles formalize the intuition. A handle is a string that can be reduced and also allows further reductions back to the start symbol (using a particular production at a specific spot). We only want to reduce at handles: clearly so, since if we do a reduction at a place that is not a handle, we won't be able to get back to the start symbol! Note: we have said what a handle is, not how to find handles. Finding handles consumes most of the rest of our discussion of parsing.
10 Techniques for Recognizing Handles. Bad news: there is no known efficient algorithm that recognizes handles in general. So for an arbitrary grammar, we don't have a fast way to find handles while parsing. Good news: there are heuristics for guessing handles, AND for some fairly large classes of context-free grammars, these heuristics always identify the handles correctly. For the heuristics we will use, these are the SLR (Simple LR) grammars (L = left-to-right scan; R = rightmost derivation in reverse).
11 Important Fact #2. Important Fact #2 about bottom-up parsing: in shift-reduce parsing, handles appear only at the top of the stack, never inside. (Recall that Important Fact #1, from the previous slide set, was: a bottom-up parser traces a rightmost derivation in reverse.)
12 Why? Informal induction on the number of reduce moves: True initially, since the stack is empty. Immediately after reducing a handle, the rightmost nonterminal is on top of the stack; the next handle must be to the right of the rightmost nonterminal, because this is a rightmost derivation. A sequence of shift moves reaches the next handle.
13 Summary of Handles. In shift-reduce parsing, handles always appear at the top of the stack. Handles are never to the left of the rightmost nonterminal. Therefore, shift-reduce moves are sufficient; the | need never move left. Bottom-up parsing algorithms are based on recognizing handles.
14 Recognizing Handles. There are no known efficient algorithms to recognize handles. Solution: use heuristics to guess which stacks are handles. On some CFGs, the heuristics always guess correctly; for the heuristics we use here, these are the SLR grammars. Other heuristics work for other grammars.
15 Classes of CFGs (equivalently, parsers). Unambiguous CFG: only one parse tree for a given string of tokens. (Equivalently, an ambiguous grammar is one that produces more than one leftmost derivation or more than one rightmost derivation for a given sentence.) LR(k) CFG: bottom-up parser that scans from left to right, produces a rightmost derivation, and uses k tokens of lookahead.
16 Classes of CFGs. LALR(k) CFG: Lookahead LR grammar. Basically a more memory-efficient variant of an LR parser (at the time of development, the memory requirements of LR parsers made them impractical). Most parser generator tools use this class. SLR(k) CFG: Simple LR parser. Smaller parse table and simpler algorithm.
17 Grammars. Strict containment relations here: All CFGs ⊃ Unambiguous CFGs ⊃ LR(k) CFGs ⊃ LALR(k) CFGs ⊃ SLR(k) CFGs. (Grammars outside the class a parser handles will generate conflicts.)
18 Viable Prefixes. It is not obvious how to detect handles. At each step the parser sees only the stack, not the entire input; start with that... α is a viable prefix if there is an ω such that α | ω is a valid state of a shift-reduce parse (α is the stack, ω the rest of the input; the parser might know the first token of ω, but typically not more than that).
19 Huh? What does this mean? A few things: A viable prefix does not extend past the right end of the handle. It's called a viable prefix because it is a prefix of the handle. As long as a parser has viable prefixes on the stack, no parsing error has been detected. This all isn't very deep. Saying that α is a viable prefix is just saying that if α is at the top of the stack, we haven't yet made an error. (These are the valid states of a shift-reduce parse.)
20 Important Fact #3. Important Fact #3 about bottom-up parsing: for any grammar, the set of viable prefixes is a regular language. This is an amazing fact, and one that is the key to bottom-up parsing. Almost all the bottom-up parsing tools are based on this fact.
21 Important Fact #3 (Cont.). Important Fact #3 is non-obvious. We show how to compute automata that accept viable prefixes. But first, we'll need a number of additional definitions.
22 Items. An item is a production with a "." somewhere on the rhs. We can create these from the usual productions. E.g., the items for T → (E) are: T → .(E), T → (.E), T → (E.), T → (E).
23 Items (Cont.). The only item for X → ε is X → . This is a special case. Items are often called LR(0) items.
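The item construction is mechanical; a small sketch (the encoding of an item as a triple of lhs, symbols before the dot, and symbols after the dot is my own choice, not the slides'):

```python
# Sketch: enumerate the LR(0) items of a grammar by placing a dot at every
# position in every rhs. An item is a triple (lhs, before_dot, after_dot).

def lr0_items(grammar):
    items = []
    for lhs, rhss in grammar.items():
        for rhs in rhss:
            for dot in range(len(rhs) + 1):
                items.append((lhs, rhs[:dot], rhs[dot:]))
    return items

# The production T -> (E) yields four items:
# T -> .(E), T -> (.E), T -> (E.), T -> (E).
example = lr0_items({"T": [("(", "E", ")")]})

# The special case: a production X -> epsilon (empty rhs) yields
# the single item X -> .
epsilon_case = lr0_items({"X": [()]})
```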
24 Intuition. The problem in recognizing viable prefixes is that the stack has only bits and pieces of the rhs of productions. If it had a complete rhs, we could reduce. These bits and pieces are always prefixes of the rhs of productions.
25 It Turns Out that what is on the stack is not random. It has a special structure: these bits and pieces are always prefixes of the right-hand sides of productions. That is, in any successful parse, what is on the stack always has to be a prefix of the right-hand side of some production (or productions).
26 Example. Consider the input (int). Then (E | ) is a valid state of a shift-reduce parse (stack | input). (E is a prefix of the rhs of T → (E); it will be reduced after the next shift. The item T → (E.) says that so far we have seen (E of this production and hope to see ). I.e., this item describes this state of affairs (what we've seen and what we hope to see). Note: we may not see what we hope to see.
27 Stated Another Way. The part left of the dot tells what we're working on; the part right of the dot tells what we're waiting to see before we can perform a reduction.
28 Generalization. The stack always has the structure of (possibly multiple) prefixes of rhs's. Several prefixes are literally stacked up on the stack: Prefix_1 Prefix_2 ... Prefix_{n-1} Prefix_n.
29 Generalization. Let Prefix_i be a prefix of the rhs of X_i → α_i. Prefix_i will eventually reduce to X_i. The missing part of α_{i-1} starts with X_i (i.e., the missing suffix of α_{i-1} must start with X_i). That is, when we eventually replace α_i with X_i, X_i must extend prefix Prefix_{i-1} to be closer to a complete rhs of the production X_{i-1} → α_{i-1}. I.e., there is a production X_{i-1} → Prefix_{i-1} X_i β for some β (which may be empty).
30 Generalization. Recursively, Prefix_{k+1} ... Prefix_n must eventually reduce to the missing part of α_k. That is, all the prefixes above Prefix_k must eventually reduce to the missing part of α_k. So the image: a stack of prefixes. We are always working on the topmost one (shifting and reducing). But every time we perform a reduction, it must extend the prefix immediately below it on the stack. And when a bunch of these have been removed via reduction, then we get to work on prefixes lower in the stack.
31 An Example. E → T + E | T, T → int * T | int | (E). Consider the string (int * int): ( int * | int ) is a state of a shift-reduce parse (stack | input). ( is a prefix of the rhs of T → (E): we've seen (, and are working on the rest of this production. ε is a prefix of the rhs of E → T: we've seen none of this. int * is a prefix of the rhs of T → int * T: we've seen the start of this.
32 An Example (Cont.). We can record this with a stack of items: T → (.E), E → .T, T → int *.T (top of stack). This says: we've seen ( of T → (E); we've seen ε of E → T; we've seen int * of T → int * T. Note how the lhs of each production is going to become part of the rhs of the previous production.
33 Recognizing Viable Prefixes. Idea: to recognize viable prefixes, we must recognize a sequence of partial right-hand sides of productions, where each sequence can eventually reduce to part of the missing suffix of its predecessor.
34 Now, onto the algorithm for implementing this idea. That is, we finally get to the technical highlight of bottom-up parsing!
35 An NFA Recognizing Viable Prefixes. We're going to build an NFA that recognizes the viable prefixes of G. This should be possible, since we claim that the set of viable prefixes of G is a regular language. The input to the NFA is the stack (the NFA reads the stack, from bottom to top). The output of the NFA is either yes (viable prefix) or no (not a viable prefix).
36 An NFA Recognizing Viable Prefixes. But let's be clear about what this is saying. Yes: this stack is OK; we could end up parsing the input. No: what we've got on the stack now doesn't resemble any valid stack for any possible parse of any input string for this grammar.
37 An NFA Recognizing Viable Prefixes. 1. Add a dummy production S' → S to G (this simplifies some things). 2. The NFA states are the items of G, including the extra production. 3. For an item A → α.Xβ, add the transition A → α.Xβ →X A → αX.β. X can be a terminal or a nonterminal; →X here means "on input X". And remember, items are the states of the NFA, so this is telling us when to move from one state to another (i.e., a transition).
38 An NFA Recognizing Viable Prefixes. 4. For an item A → α.Xβ and a production X → γ, add the ε-transition A → α.Xβ →ε X → .γ. (X must be a nonterminal here.) Is it possible that there could be a valid configuration of the parser where we saw α but then X didn't appear next? Yes: the stack is a sequence of partial rhs's. It could be that all that's on the stack right now for this production is α, and the next thing on the stack is eventually going to reduce to X.
39 An NFA Recognizing Viable Prefixes. What does this mean? It means that whatever is there on the stack has to be derived from X (has to be something that can be generated by using a sequence of X-productions), because eventually it's going to reduce to X. So, for every item that looks like this, and for every production for X, we say: if there is no X on the stack, we make an ε-move; we shift to a state where we can try to recognize the rhs (plus something derived from X).
40 An NFA Recognizing Viable Prefixes. Summary so far: 1. Add a dummy production S' → S to G. 2. The NFA states are the items of G, including the extra production. 3. For each item (NFA state) A → α.Xβ, add the transition A → α.Xβ →X A → αX.β. 4. For each item A → α.Xβ and production X → γ, add A → α.Xβ →ε X → .γ (the ε-move must land at the start of another rhs).
41 An NFA Recognizing Viable Prefixes (Cont.). 5. Every state is an accepting state: if the automaton successfully consumes the entire stack, then the stack is viable. (Note that not every state is going to have a transition on every symbol, so lots of possible stacks will be rejected.) 6. The start state is S' → .S. (The dummy production was added so we can conveniently name the start state.)
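Rules 1-6 translate almost directly into code. A sketch of my own (items are encoded as (lhs, before_dot, after_dot) triples, and an edge label of None stands for an ε-move; neither convention comes from the slides):

```python
# Sketch: build the NFA transitions of rules 3 and 4 for the augmented
# example grammar. Every state accepts (rule 5); the start state is
# S' -> .E (rule 6).

GRAMMAR = {
    "S'": [("E",)],                                  # rule 1: dummy production
    "E":  [("T", "+", "E"), ("T",)],
    "T":  [("int", "*", "T"), ("int",), ("(", "E", ")")],
}

def nfa_edges(grammar):
    edges = []
    for lhs, rhss in grammar.items():                # rule 2: states = items
        for rhs in rhss:
            for dot in range(len(rhs)):
                item = (lhs, rhs[:dot], rhs[dot:])
                X = rhs[dot]
                # Rule 3: on input X, move the dot over X.
                edges.append((item, X, (lhs, rhs[:dot + 1], rhs[dot + 1:])))
                # Rule 4: if X is a nonterminal, epsilon-move to X -> .gamma.
                if X in grammar:
                    for gamma in grammar[X]:
                        edges.append((item, None, (X, (), gamma)))
    return edges

start = ("S'", (), ("E",))
edges = nfa_edges(GRAMMAR)
```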
42 An Example. S' → E, E → T + E | T, T → int * T | int | (E). The usual grammar, augmented with the extra production.
43 NFA for Viable Prefixes of the Example. [Diagram: the full NFA whose states are the LR(0) items of the augmented grammar, with transitions given by rules 3 and 4.]
44 The Example Automaton. It's actually quite complicated: many states and transitions. And this is for a relatively simple grammar, so these NFAs for recognizing viable prefixes of grammars are actually quite elaborate. Let's build it!
45 NFA for Viable Prefixes in Detail (1). S' → .E. So, this is saying we're hoping to see an E on the stack. If we don't, then we're hoping to see something derived from E.
46 NFA for Viable Prefixes in Detail (2). S' → .E →E S' → E.; S' → .E →ε E → .T + E; S' → .E →ε E → .T. If we see an E on the stack, we move the dot over it, saying we've read the E. This would indicate we're probably done with parsing (you would have reduced to the old start symbol and be about to reduce to the new start symbol). Lacking an E at the top of the stack, hope for either a T + E or a T (use Rule 4).
47 NFA for Viable Prefixes in Detail (2). Notice how we crucially use the power of nondeterministic automata. We don't know which rhs of a production is going to appear on the stack (and this grammar isn't even left-factored), so we just use the guessing power of the NFA and let it choose which one to use (because it can always guess correctly).
48 NFA for Viable Prefixes in Detail (2). Another way of describing the intuition here: from the state S' → .E, we'd like to see an E as the next stack symbol. If we do, then we can reduce to the start symbol S'. But if we don't see an E next, then we'd like to see something that can be derived from E, because then we could eventually reduce that derived thing back to E, and then back to S'. The ε-transitions take us to things that can be derived from E.
49 NFA for Viable Prefixes in Detail (2). Of course we can compile this down to a DFA, but at this point it's really nice not to have to know which choice to make. (Grammar: S' → E, E → T + E | T, T → int * T | int | (E).)
50 NFA for Viable Prefixes in Detail (3). [Diagram: the next items and transitions are added.] Same as before. Note that when we see a dot at the end of a rhs, we've got a handle that we might want to reduce. We'll talk about this later, but this is kind of how we're going to go about recognizing handles.
51 NFA for Viable Prefixes in Detail (3). [Diagram, continued.] Note that in these newly added links, the dot is all the way to the left, indicating that we haven't actually seen anything from the rhs of the production yet.
52 NFA for Viable Prefixes in Detail (4). [Diagram, continued.] Note that when we see a nonterminal to the right of the dot, we create a link corresponding to that nonterminal being on top of the stack, and other links (ε-transitions) to the other productions that derive things from that nonterminal.
53 NFA for Viable Prefixes in Detail (5). [Diagram, continued.] Note that when we see a terminal to the right of the dot, there is only one possible move: we create a link corresponding to that terminal being on top of the stack.
54-61 NFA for Viable Prefixes in Detail (6)-(13). [Diagram-only slides: the remaining items and transitions are added step by step until the NFA is complete.]
62-65 NFA (for Viable Prefixes) to DFA. [Diagram-only slides: starting from the NFA's start state, ε-closures of sets of items are grouped together to form DFA states.]
66 Translation to the DFA. [Diagram: the resulting DFA; each DFA state is a set of items.]
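The NFA-to-DFA step is the standard subset construction. A self-contained sketch of my own (the item-NFA edge list is rebuilt inline; items are (lhs, before_dot, after_dot) triples and None labels ε-moves):

```python
# Sketch: subset construction over the item NFA. Each DFA state is a
# frozenset of items -- a "canonical collection of LR(0) items".

GRAMMAR = {
    "S'": [("E",)],
    "E":  [("T", "+", "E"), ("T",)],
    "T":  [("int", "*", "T"), ("int",), ("(", "E", ")")],
}

def nfa_edges(grammar):
    edges = []
    for lhs, rhss in grammar.items():
        for rhs in rhss:
            for dot in range(len(rhs)):
                item = (lhs, rhs[:dot], rhs[dot:])
                edges.append((item, rhs[dot], (lhs, rhs[:dot + 1], rhs[dot + 1:])))
                if rhs[dot] in grammar:
                    for gamma in grammar[rhs[dot]]:
                        edges.append((item, None, (rhs[dot], (), gamma)))
    return edges

EDGES = nfa_edges(GRAMMAR)

def closure(items):
    """Add every item reachable by epsilon moves."""
    out = set(items)
    frontier = list(items)
    while frontier:
        src = frontier.pop()
        for (s, label, dst) in EDGES:
            if s == src and label is None and dst not in out:
                out.add(dst)
                frontier.append(dst)
    return frozenset(out)

def goto(state, X):
    """DFA transition: move the dot over X in every item, then close."""
    return closure({dst for (src, label, dst) in EDGES
                    if src in state and label == X})

start_state = closure({("S'", (), ("E",))})
```

The start state computed here contains six items (S' → .E plus the dotted-at-the-left items for every E- and T-production), matching the picture built up on the preceding slides.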
67 Lingo. The states of the DFA are called canonical collections of items, or canonical collections of LR(0) items. The Dragon book gives another way of constructing LR(0) items; it's a little more complex (and a bit more difficult to understand if you're seeing this for the first time).
68 Valid Items. An item X → β.γ is valid for a viable prefix αβ if S' →* αXω → αβγω by a rightmost derivation. After parsing αβ, the valid items are the possible tops of the stack of items. Remember what this item is saying: I hope to see γ at the top of the stack, so that I can reduce βγ to X. Saying the item is valid is saying that doing that reduction leaves open the possibility of a successful parse.
69 Simpler Way: Items Valid for a Prefix. For a given viable prefix α, the items that are valid (in that prefix) are exactly the items that are in the final state of the DFA after it reads α. An item I is valid for a given viable prefix α if the DFA that recognizes viable prefixes terminates on input α in a state s containing I. The items in s describe what the top of the item stack might be after reading input α. So valid items describe what might be on top of the stack after you've seen stack α.
70 Valid Items Example. An item is often valid for many prefixes. Example: the item T → (.E) is valid for the prefixes (, ((, (((, ... To see this, look at the DFA.
71 Valid Items for ( (( ((( ... [Diagram: the DFA; running it on any string of open parentheses ends in the state containing T → (.E).]
72 So Now. We're finally going to give the actual bottom-up parsing algorithm. But first (because we like to keep you in suspense), we give a few minutes to a very weak, but very similar, bottom-up parsing algorithm. All this builds on what we've been talking about so far.
73 LR(0) Parsing Algorithm. Idea: assume the stack contains α, the next input is t, and the DFA on input α terminates in state s (i.e., s is a final state of the DFA). Then: Reduce by X → β if s contains the item X → β. (we've seen the entire rhs here). Shift if s contains the item X → β.tω (it's OK to add t to the stack; equivalent to saying s has a transition labeled t).
74 LR(0) Conflicts (i.e., when does LR(0) get into trouble). LR(0) has a reduce/reduce conflict if any state has two reduce items: X → β. and Y → ω. (two complete rhs's, could do either; we might not be able to decide between two possible reduce moves). LR(0) has a shift/reduce conflict if any state has both a reduce item and a shift item: X → β. and Y → ω.tδ (do we shift or reduce?). This is only an issue if t is the next token in the input.
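Detecting these two conflict kinds in a DFA state is a direct check. A sketch of my own, using the same (lhs, before_dot, after_dot) triple encoding for items:

```python
# Sketch: classify LR(0) conflicts in one DFA state. A state is a set of
# items (lhs, before_dot, after_dot); after_dot == () marks a reduce item.

def lr0_conflicts(state, terminals):
    reduce_items = [i for i in state if i[2] == ()]
    shift_items = [i for i in state if i[2] != () and i[2][0] in terminals]
    out = set()
    if len(reduce_items) > 1:
        out.add("reduce/reduce")    # two complete rhs's, could do either
    if reduce_items and shift_items:
        out.add("shift/reduce")     # do we shift or reduce?
    return out

TERMINALS = {"int", "+", "*", "(", ")"}

# The example grammar's DFA state { T -> int., T -> int.*T } has a
# shift/reduce conflict under the LR(0) rules:
state = {("T", ("int",), ()), ("T", ("int",), ("*", "T"))}
```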
75 LR(0) Conflicts. [Diagram: the DFA, with the state containing E → T. and E → T.+E highlighted; if the next input is +, we can do a shift or a reduce here.] Two shift/reduce conflicts with the LR(0) rules.
76 LR(0) Conflicts. [Diagram: the same DFA, with the state containing T → int. and T → int.*T highlighted; if the next input is *, we can do a shift or a reduce here.] Two shift/reduce conflicts with the LR(0) rules.
77 SLR. LR = Left-to-right scan; SLR = Simple LR. SLR improves on the LR(0) shift/reduce heuristics: we add one small heuristic so that fewer states have conflicts.
78 SLR Parsing. Idea: assume the stack contains α, the next input is t, and the DFA on input α terminates in state s. Reduce by X → β if s contains the item X → β. (β is on top of the stack and valid) and t ∈ Follow(X). Shift if s contains the item X → β.tω. Without the Follow condition, the decision is based entirely on the stack contents; but it might not make sense to reduce, given what the next input symbol is.
79 So what's going on here? We have our stack contents, and it ends in a β. Now we're going to make a move (a reduction) and replace that β by X. So things look like this: …β | t… And we're going to make them look like this: …X | t… And this makes no sense whatsoever if t cannot follow X. So we add the restriction that we only make the move if t can follow X.
80 SLR Parsing (Cont.). If there are conflicts under these rules, the grammar is not SLR. The rules amount to a heuristic for detecting handles. Defn: the SLR grammars are those where the heuristics detect exactly the handles. So we take into account two pieces of information: 1) the contents of the stack (that's what the DFA does for us: it tells us what items are possible when we get to the top of the stack), and 2) what's coming in the input.
81 SLR Conflicts. [Diagram: the same DFA as before.] Follow(E) = { ), $ }; Follow(T) = { +, ), $ }. No conflicts with the SLR rules!
82 Precedence Declarations Digression. Lots of grammars aren't SLR, including all ambiguous grammars. So SLR is an improvement, but not a really very general class of grammars. We can make SLR parse more grammars by using precedence declarations: instructions for resolving conflicts.
83 Precedence Declarations (Cont.). Consider our favorite ambiguous grammar: E → E + E | E * E | (E) | int. The DFA for this grammar contains a state with the items E → E * E. and E → E. + E: a shift/reduce conflict! This is exactly the question of the precedence of + and *. Declaring that * has higher precedence than + resolves this conflict in favor of reducing. Note that reducing here places the * lower in the parse tree!
84 Precedence Declarations (Cont.). The term "precedence declaration" is misleading. These declarations do not define precedence; they define conflict resolutions. Not quite the same thing! In our example, because we're dealing with a natural grammar, conflict resolution has the same effect as enforcing the precedence we want. But a more complicated grammar means more interactions between the various pieces of the grammar, and you might not get what you want in terms of enforcing precedence.
85 Precedence Declarations (Cont.). Fortunately, the tools provide means for printing out the parsing automaton, which allows you to see exactly how conflicts are being resolved and whether these are the resolutions you expected. It is recommended, when you are building a parser (especially a fairly complex one), that you examine the parsing automaton to make sure it's doing exactly what you expect.
86 Naïve SLR Parsing Algorithm. 1. Let M be the DFA for viable prefixes of G. 2. Let |x1 … xn $ be the initial configuration. 3. Repeat until the configuration is S|$: Let α|ω be the current configuration. Run M on the current stack α. If M rejects α, report a parsing error (the stack α is not a viable prefix). If M accepts α with items I, let a be the next input token: Shift if X → β.aγ ∈ I. Reduce if X → β. ∈ I and a ∈ Follow(X). Report a parsing error if neither applies.
87 Naïve SLR Parsing Algorithm. (The same algorithm, with one annotation: the "If M rejects α" check is not needed, since the bottom checks will always catch parse errors; we will never form an invalid stack.)
88 Notes. If there is a conflict in the last step, the grammar is not SLR(k). k is the amount of lookahead; in practice k = 1.
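The decision step of the naïve algorithm can be sketched on its own: given the item set where the DFA halts and one token of lookahead, pick shift, reduce, or error. This sketch is mine, not the slides'; the Follow sets are those of the example grammar, written out by hand.

```python
# Sketch: the SLR move selection from the naive algorithm. I is the item
# set where the DFA halts on the current stack; a is the next input token.
# Follow sets for E -> T + E | T, T -> int * T | int | (E), by hand.

FOLLOW = {"E": {")", "$"}, "T": {"+", ")", "$"}}

def slr_move(I, a):
    for (lhs, before, after) in I:
        if after != () and after[0] == a:            # shift if X -> b.a g in I
            return ("shift", a)
    for (lhs, before, after) in I:
        if after == () and a in FOLLOW.get(lhs, set()):
            return ("reduce", lhs, before)           # reduce if a in Follow(X)
    return ("error",)

# The LR(0)-conflicted state { T -> int., T -> int.*T } is now deterministic:
state = {("T", ("int",), ()), ("T", ("int",), ("*", "T"))}
```

With lookahead *, the state shifts (since * is not in Follow(T)); with lookahead $, it reduces by T → int, exactly as in the trace on the next slides.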
89-120 Extended Example of SLR Parsing. [These slides step through the parse of int * int$, alternating between the DFA diagram, with the halt state for the current stack highlighted, and the configuration table, which grows one row per move. Only the completed table is reproduced below.]

121 SLR Example (complete trace):

Configuration | DFA Halt State | Action
|int * int$ | 1 | shift
int |* int$ | 3 | * not in Follow(T): shift
int * |int$ | 11 | shift
int * int |$ | 3 | $ ∈ Follow(T): reduce T → int
int * T |$ | 4 | $ ∈ Follow(T): reduce T → int * T
T |$ | 5 | $ ∈ Follow(E): reduce E → T
E |$ | | accept
122 Notes. Rerunning the automaton at each step is wasteful: most of the work is repeated. Put another way, at each iteration of the algorithm the change occurs at the top of the stack, yet each iteration we rerun the whole DFA, which is unnecessary.
123 An Improvement. Remember the state of the automaton on each prefix of the stack. Change the stack to contain pairs (before, it just contained symbols): ⟨Symbol, DFA State⟩.
124 An Improvement (Cont.). For a stack ⟨sym1, state1⟩ … ⟨symn, staten⟩, staten is the final state of the DFA on sym1 … symn. Detail: the bottom of the stack is ⟨any, start⟩, where any is any dummy symbol (it doesn't matter what) and start is the start state of the DFA.
125 Goto Table. Define goto[i, A] = j if state_i →A state_j. goto is just the transition function of the DFA, written out as an array. It is one of two parsing tables (the other is shown a few slides further on).
126 Refined Parser Has 4 Possible Moves. Shift x: push ⟨a, x⟩ on the stack, where a is the current input and x is a DFA state. Reduce X → α: as before (pop the rhs elements off the stack and push the lhs, as a pair, onto the stack). Accept. Error.
127 Action Table (the second table). For each state s_i and terminal a: If s_i has item X → α.aβ and goto[i, a] = j, then action[i, a] = shift j. If s_i has item X → α. and a ∈ Follow(X) and X ≠ S', then action[i, a] = reduce X → α. If s_i has item S' → S., then action[i, $] = accept. Otherwise, action[i, a] = error. This table tells us which kind of move to make in every possible state; it is indexed by the state of the automaton and the next input symbol.
128 SLR Parsing Algorithm.
Let I = w$ be the initial input
Let j = 0
Let DFA state 1 have item S' → .S
Let stack = ⟨dummy, 1⟩
repeat
  case action[top_state(stack), I[j]] of
    shift k: push ⟨I[j++], k⟩
    reduce X → A: pop |A| pairs, push ⟨X, goto[top_state(stack), X]⟩
    accept: halt normally
    error: halt and report error
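The case statement translates almost line for line into a driver loop. In this sketch of mine, the action and goto tables are hand-built for a deliberately tiny grammar (S' → E, E → int) just so the fragment is runnable; a real parser generator builds the tables from the DFA as described on the previous slides.

```python
# Sketch: the table-driven SLR driver loop from the slide.

def slr_parse(tokens, action, goto_, start_state):
    stack = [("dummy", start_state)]      # stack of <symbol, DFA state> pairs
    j = 0
    while True:
        move = action.get((stack[-1][1], tokens[j]), ("error",))
        if move[0] == "shift":            # shift k: push <I[j++], k>
            stack.append((tokens[j], move[1]))
            j += 1
        elif move[0] == "reduce":         # pop |rhs| pairs, push the lhs
            lhs, rhs = move[1], move[2]
            del stack[len(stack) - len(rhs):]
            stack.append((lhs, goto_[(stack[-1][1], lhs)]))
        elif move[0] == "accept":
            return True
        else:
            raise SyntaxError("parse error at token %d" % j)

# Hand-built tables for S' -> E, E -> int.
# States: 1 = {S'->.E, E->.int}, 2 = {E->int.}, 3 = {S'->E.}
ACTION = {(1, "int"): ("shift", 2),
          (2, "$"): ("reduce", "E", ("int",)),
          (3, "$"): ("accept",)}
GOTO = {(1, "E"): 3}
```

Note how reduce first pops, then consults goto with the state newly exposed on top of the stack: exactly the top_state(stack) in the slide's pseudocode.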
129 Notes on SLR Parsing Algorithm. Note that the algorithm uses only the DFA states and the input; the stack symbols are never used! However, we still need the symbols for semantic actions: throwing away the stack symbols amounts to throwing away the program, which we'll obviously need for later stages of compilation!
130 More Notes. (There could be k tokens of lookahead, and thus k-token items here; the lookahead item need not be $.) Some common constructs are not SLR(1). LR(1) is more powerful; the main difference between SLR and LR is that LR builds the lookahead into the items. An LR(1) item is a pair: LR(0) item × lookahead. [T → .int * T, $] means: after seeing int * T, reduce if the lookahead is $. This is more accurate than just using follow sets. Take a look at the LR(1) automaton for your parser! (Actually, your parser is an LALR(1) automaton, a bit of an optimization over LR(1).)
Prof. Aiken, CS 143, Lecture 8.
More informationSLR parsers. LR(0) items
SLR parsers LR(0) items As we have seen, in order to make shiftreduce parsing practical, we need a reasonable way to identify viable prefixes (and so, possible handles). Up to now, it has not been clear
More informationLR Parsing LALR Parser Generators
Outline LR Parsing LALR Parser Generators Review of bottomup parsing Computing the parsing DFA Using parser generators 2 Bottomup Parsing (Review) A bottomup parser rewrites the input string to the
More informationLR Parsing LALR Parser Generators
LR Parsing LALR Parser Generators Outline Review of bottomup parsing Computing the parsing DFA Using parser generators 2 Bottomup Parsing (Review) A bottomup parser rewrites the input string to the
More informationLR Parsing Techniques
LR Parsing Techniques Introduction BottomUp Parsing LR Parsing as Handle Pruning ShiftReduce Parser LR(k) Parsing Model Parsing Table Construction: SLR, LR, LALR 1 BottomUP Parsing A bottomup parser
More informationLecture BottomUp Parsing
Lecture 14+15 BottomUp Parsing CS 241: Foundations of Sequential Programs Winter 2018 Troy Vasiga et al University of Waterloo 1 Example CFG 1. S S 2. S AyB 3. A ab 4. A cd 5. B z 6. B wz 2 Stacks in
More informationCSE P 501 Compilers. LR Parsing Hal Perkins Spring UW CSE P 501 Spring 2018 D1
CSE P 501 Compilers LR Parsing Hal Perkins Spring 2018 UW CSE P 501 Spring 2018 D1 Agenda LR Parsing Tabledriven Parsers Parser States ShiftReduce and ReduceReduce conflicts UW CSE P 501 Spring 2018
More informationCS 2210 Sample Midterm. 1. Determine if each of the following claims is true (T) or false (F).
CS 2210 Sample Midterm 1. Determine if each of the following claims is true (T) or false (F). F A language consists of a set of strings, its grammar structure, and a set of operations. (Note: a language
More informationMIT Parse Table Construction. Martin Rinard Laboratory for Computer Science Massachusetts Institute of Technology
MIT 6.035 Parse Table Construction Martin Rinard Laboratory for Computer Science Massachusetts Institute of Technology Parse Tables (Review) ACTION Goto State ( ) $ X s0 shift to s2 error error goto s1
More informationCompiler Design 1. BottomUP Parsing. Goutam Biswas. Lect 6
Compiler Design 1 BottomUP Parsing Compiler Design 2 The Process The parse tree is built starting from the leaf nodes labeled by the terminals (tokens). The parser tries to discover appropriate reductions,
More informationFormal Languages and Compilers Lecture VII Part 3: Syntactic A
Formal Languages and Compilers Lecture VII Part 3: Syntactic Analysis Free University of BozenBolzano Faculty of Computer Science POS Building, Room: 2.03 artale@inf.unibz.it http://www.inf.unibz.it/
More informationBottom Up Parsing. Shift and Reduce. Sentential Form. Handle. Parse Tree. Bottom Up Parsing 9/26/2012. Also known as ShiftReduce parsing
Also known as ShiftReduce parsing More powerful than top down Don t need left factored grammars Can handle left recursion Attempt to construct parse tree from an input string eginning at leaves and working
More informationA leftsentential form is a sentential form that occurs in the leftmost derivation of some sentence.
Bottomup parsing Recall For a grammar G, with start symbol S, any string α such that S α is a sentential form If α V t, then α is a sentence in L(G) A leftsentential form is a sentential form that occurs
More informationMonday, September 13, Parsers
Parsers Agenda Terminology LL(1) Parsers Overview of LR Parsing Terminology Grammar G = (Vt, Vn, S, P) Vt is the set of terminals Vn is the set of nonterminals S is the start symbol P is the set of productions
More informationBottom up parsing. The sentential forms happen to be a right most derivation in the reverse order. S a A B e a A d e. a A d e a A B e S.
Bottom up parsing Construct a parse tree for an input string beginning at leaves and going towards root OR Reduce a string w of input to start symbol of grammar Consider a grammar S aabe A Abc b B d And
More informationParsing. Handle, viable prefix, items, closures, goto s LR(k): SLR(1), LR(1), LALR(1)
TD parsing  LL(1) Parsing First and Follow sets Parse table construction BU Parsing Handle, viable prefix, items, closures, goto s LR(k): SLR(1), LR(1), LALR(1) Problems with SLR Aho, Sethi, Ullman, Compilers
More informationMODULE 14 SLR PARSER LR(0) ITEMS
MODULE 14 SLR PARSER LR(0) ITEMS In this module we shall discuss one of the LR type parser namely SLR parser. The various steps involved in the SLR parser will be discussed with a focus on the construction
More informationPART 3  SYNTAX ANALYSIS. F. Wotawa TU Graz) Compiler Construction Summer term / 309
PART 3  SYNTAX ANALYSIS F. Wotawa (IST @ TU Graz) Compiler Construction Summer term 2016 64 / 309 Goals Definition of the syntax of a programming language using context free grammars Methods for parsing
More informationCompilers. Bottomup Parsing. (original slides by Sam
Compilers Bottomup Parsing Yannis Smaragdakis U Athens Yannis Smaragdakis, U. Athens (original slides by Sam Guyer@Tufts) BottomUp Parsing More general than topdown parsing And just as efficient Builds
More informationLexical and Syntax Analysis. BottomUp Parsing
Lexical and Syntax Analysis BottomUp Parsing Parsing There are two ways to construct derivation of a grammar. TopDown: begin with start symbol; repeatedly replace an instance of a production s LHS with
More informationWednesday, September 9, 15. Parsers
Parsers What is a parser A parser has two jobs: 1) Determine whether a string (program) is valid (think: grammatically correct) 2) Determine the structure of a program (think: diagramming a sentence) Agenda
More informationParsers. What is a parser. Languages. Agenda. Terminology. Languages. A parser has two jobs:
What is a parser Parsers A parser has two jobs: 1) Determine whether a string (program) is valid (think: grammatically correct) 2) Determine the structure of a program (think: diagramming a sentence) Agenda
More informationWednesday, August 31, Parsers
Parsers How do we combine tokens? Combine tokens ( words in a language) to form programs ( sentences in a language) Not all combinations of tokens are correct programs (not all sentences are grammatically
More informationBottomup parsing. BottomUp Parsing. Recall. Goal: For a grammar G, withstartsymbols, any string α such that S α is called a sentential form
Bottomup parsing Bottomup parsing Recall Goal: For a grammar G, withstartsymbols, any string α such that S α is called a sentential form If α V t,thenα is called a sentence in L(G) Otherwise it is just
More informationEDAN65: Compilers, Lecture 06 A LR parsing. Görel Hedin Revised:
EDAN65: Compilers, Lecture 06 A LR parsing Görel Hedin Revised: 20170911 This lecture Regular expressions Contextfree grammar Attribute grammar Lexical analyzer (scanner) Syntactic analyzer (parser)
More informationshiftreduce parsing
Parsing #2 Bottomup Parsing Rightmost derivations; use of rules from right to left Uses a stack to push symbols the concatenation of the stack symbols with the rest of the input forms a valid bottomup
More informationParsers. Xiaokang Qiu Purdue University. August 31, 2018 ECE 468
Parsers Xiaokang Qiu Purdue University ECE 468 August 31, 2018 What is a parser A parser has two jobs: 1) Determine whether a string (program) is valid (think: grammatically correct) 2) Determine the structure
More informationSection A. A grammar that produces more than one parse tree for some sentences is said to be ambiguous.
Section A 1. What do you meant by parser and its types? A parser for grammar G is a program that takes as input a string w and produces as output either a parse tree for w, if w is a sentence of G, or
More informationIn One Slide. Outline. LR Parsing. Table Construction
LR Parsing Table Construction #1 In One Slide An LR(1) parsing table can be constructed automatically from a CFG. An LR(1) item is a pair made up of a production and a lookahead token; it represents a
More informationUNIT III & IV. Bottom up parsing
UNIT III & IV Bottom up parsing 5.0 Introduction Given a grammar and a sentence belonging to that grammar, if we have to show that the given sentence belongs to the given grammar, there are two methods.
More information3. Syntax Analysis. Andrea Polini. Formal Languages and Compilers Master in Computer Science University of Camerino
3. Syntax Analysis Andrea Polini Formal Languages and Compilers Master in Computer Science University of Camerino (Formal Languages and Compilers) 3. Syntax Analysis CS@UNICAM 1 / 54 Syntax Analysis: the
More informationUNITIII BOTTOMUP PARSING
UNITIII BOTTOMUP PARSING Constructing a parse tree for an input string beginning at the leaves and going towards the root is called bottomup parsing. A general type of bottomup parser is a shiftreduce
More informationSyntax Analysis. Amitabha Sanyal. (www.cse.iitb.ac.in/ as) Department of Computer Science and Engineering, Indian Institute of Technology, Bombay
Syntax Analysis (www.cse.iitb.ac.in/ as) Department of Computer Science and Engineering, Indian Institute of Technology, Bombay September 2007 College of Engineering, Pune Syntax Analysis: 2/124 Syntax
More informationLecture 14: Parser Conflicts, Using Ambiguity, Error Recovery. Last modified: Mon Feb 23 10:05: CS164: Lecture #14 1
Lecture 14: Parser Conflicts, Using Ambiguity, Error Recovery Last modified: Mon Feb 23 10:05:56 2015 CS164: Lecture #14 1 Shift/Reduce Conflicts If a DFA state contains both [X: α aβ, b] and [Y: γ, a],
More informationPrinciples of Programming Languages
Principles of Programming Languages h"p://www.di.unipi.it/~andrea/dida2ca/plp 14/ Prof. Andrea Corradini Department of Computer Science, Pisa Lesson 8! Bo;om Up Parsing Shi? Reduce LR(0) automata and
More informationCSE 401 Compilers. LR Parsing Hal Perkins Autumn /10/ Hal Perkins & UW CSE D1
CSE 401 Compilers LR Parsing Hal Perkins Autumn 2011 10/10/2011 200211 Hal Perkins & UW CSE D1 Agenda LR Parsing Tabledriven Parsers Parser States ShiftReduce and ReduceReduce conflicts 10/10/2011
More informationParser Generation. BottomUp Parsing. Constructing LR Parser. LR Parsing. Construct parse tree bottomup  from leaves to the root
Parser Generation Main Problem: given a grammar G, how to build a topdown parser or a bottomup parser for it? parser : a program that, given a sentence, reconstructs a derivation for that sentence 
More informationExample CFG. Lectures 16 & 17 BottomUp Parsing. LL(1) Predictor Table Review. Stacks in LR Parsing 1. Sʹ " S. 2. S " AyB. 3. A " ab. 4.
Example CFG Lectures 16 & 17 BottomUp Parsing CS 241: Foundations of Sequential Programs Fall 2016 1. Sʹ " S 2. S " AyB 3. A " ab 4. A " cd Matt Crane University of Waterloo 5. B " z 6. B " wz 2 LL(1)
More informationParsing III. CS434 Lecture 8 Spring 2005 Department of Computer Science University of Alabama Joel Jones
Parsing III (Topdown parsing: recursive descent & LL(1) ) (Bottomup parsing) CS434 Lecture 8 Spring 2005 Department of Computer Science University of Alabama Joel Jones Copyright 2003, Keith D. Cooper,
More informationContextfree grammars
Contextfree grammars Section 4.2 Formal way of specifying rules about the structure/syntax of a program terminals  tokens nonterminals  represent higherlevel structures of a program start symbol,
More informationLet us construct the LR(1) items for the grammar given below to construct the LALR parsing table.
MODULE 18 LALR parsing After understanding the most powerful CALR parser, in this module we will learn to construct the LALR parser. The CALR parser has a large set of items and hence the LALR parser is
More informationCS 4120 Introduction to Compilers
CS 4120 Introduction to Compilers Andrew Myers Cornell University Lecture 6: BottomUp Parsing 9/9/09 Bottomup parsing A more powerful parsing technology LR grammars  more expressive than LL can handle
More informationLL(k) Parsing. Predictive Parsers. LL(k) Parser Structure. Sample Parse Table. LL(1) Parsing Algorithm. Push RHS in Reverse Order 10/17/2012
Predictive Parsers LL(k) Parsing Can we avoid backtracking? es, if for a given input symbol and given nonterminal, we can choose the alternative appropriately. his is possible if the first terminal of
More informationCompiler Construction: Parsing
Compiler Construction: Parsing Mandar Mitra Indian Statistical Institute M. Mitra (ISI) Parsing 1 / 33 Contextfree grammars. Reference: Section 4.2 Formal way of specifying rules about the structure/syntax
More informationParsing Wrapup. Roadmap (Where are we?) Last lecture Shiftreduce parser LR(1) parsing. This lecture LR(1) parsing
Parsing Wrapup Roadmap (Where are we?) Last lecture Shiftreduce parser LR(1) parsing LR(1) items Computing closure Computing goto LR(1) canonical collection This lecture LR(1) parsing Building ACTION
More informationLR Parsing, Part 2. Constructing Parse Tables. An NFA Recognizing Viable Prefixes. Computing the Closure. GOTO Function and DFA States
TDDD16 Compilers and Interpreters TDDB44 Compiler Construction LR Parsing, Part 2 Constructing Parse Tables Parse table construction Grammar conflict handling Categories of LR Grammars and Parsers An NFA
More informationPrinciple of Compilers Lecture IV Part 4: Syntactic Analysis. Alessandro Artale
Free University of Bolzano Principles of Compilers Lecture IV Part 4, 2003/2004 AArtale (1) Principle of Compilers Lecture IV Part 4: Syntactic Analysis Alessandro Artale Faculty of Computer Science Free
More informationFormal Languages and Compilers Lecture VII Part 4: Syntactic A
Formal Languages and Compilers Lecture VII Part 4: Syntactic Analysis Free University of BozenBolzano Faculty of Computer Science POS Building, Room: 2.03 artale@inf.unibz.it http://www.inf.unibz.it/
More informationSyntax Analysis: Contextfree Grammars, Pushdown Automata and Parsing Part  4. Y.N. Srikant
Syntax Analysis: Contextfree Grammars, Pushdown Automata and Part  4 Department of Computer Science and Automation Indian Institute of Science Bangalore 560 012 NPTEL Course on Principles of Compiler
More informationLL(1) predictive parsing
LL(1) predictive parsing Informatics 2A: Lecture 11 Mary Cryan School of Informatics University of Edinburgh mcryan@staffmail.ed.ac.uk 10 October 2018 1 / 15 Recap of Lecture 10 A pushdown automaton (PDA)
More informationCompiler Construction 2016/2017 Syntax Analysis
Compiler Construction 2016/2017 Syntax Analysis Peter Thiemann November 2, 2016 Outline 1 Syntax Analysis Recursive topdown parsing Nonrecursive topdown parsing Bottomup parsing Syntax Analysis tokens
More informationConflicts in LR Parsing and More LR Parsing Types
Conflicts in LR Parsing and More LR Parsing Types Lecture 10 Dr. Sean Peisert ECS 142 Spring 2009 1 Status Project 2 Due Friday, Apr. 24, 11:55pm The usual lecture time is being replaced by a discussion
More informationSYNTAX ANALYSIS 1. Define parser. Hierarchical analysis is one in which the tokens are grouped hierarchically into nested collections with collective meaning. Also termed as Parsing. 2. Mention the basic
More informationLR Parsing. Leftmost and Rightmost Derivations. Compiler Design CSE 504. Derivations for id + id: T id = id+id. 1 ShiftReduce Parsing.
LR Parsing Compiler Design CSE 504 1 ShiftReduce Parsing 2 LR Parsers 3 SLR and LR(1) Parsers Last modifled: Fri Mar 06 2015 at 13:50:06 EST Version: 1.7 16:58:46 2016/01/29 Compiled at 12:57 on 2016/02/26
More informationBottom Up Parsing Handout. 1 Introduction. 2 Example illustrating bottomup parsing
Bottom Up Parsing Handout Compiled by: Nomair. Naeem dditional Material by: driel DeanHall and Brad Lushman his handout is intended to accompany material covered during lectures and is not consered a
More informationLR Parsing. Table Construction
#1 LR Parsing Table Construction #2 Outline Review of bottomup parsing Computing the parsing DFA Closures, LR(1) Items, States Transitions Using parser generators Handling Conflicts #3 In One Slide An
More informationShift. Reduce. Review: ShiftReduce Parsing. Bottomup parsing uses two actions: BottomUp Parsing II. ABC xyz ABCx yz. Lecture 8.
Rviw: ShiftRduc Parsing Bottomup parsing uss two actions: BottomUp Parsing II Lctur 8 Shift ABC xyz ABCx yz Rduc Cbxy ijk CbA ijk Prof. Aikn CS 13 Lctur 8 1 Prof. Aikn CS 13 Lctur 8 2 Rcall: h Stack
More information4. Lexical and Syntax Analysis
4. Lexical and Syntax Analysis 4.1 Introduction Language implementation systems must analyze source code, regardless of the specific implementation approach Nearly all syntax analysis is based on a formal
More informationCSE 130 Programming Language Principles & Paradigms Lecture # 5. Chapter 4 Lexical and Syntax Analysis
Chapter 4 Lexical and Syntax Analysis Introduction  Language implementation systems must analyze source code, regardless of the specific implementation approach  Nearly all syntax analysis is based on
More informationCSE302: Compiler Design
CSE302: Compiler Design Instructor: Dr. Liang Cheng Department of Computer Science and Engineering P.C. Rossin College of Engineering & Applied Science Lehigh University March 20, 2007 Outline Recap LR(0)
More informationHow do LL(1) Parsers Build Syntax Trees?
How do LL(1) Parsers Build Syntax Trees? So far our LL(1) parser has acted like a recognizer. It verifies that input token are syntactically correct, but it produces no output. Building complete (concrete)
More informationParsing. Roadmap. > Contextfree grammars > Derivations and precedence > Topdown parsing > Leftrecursion > Lookahead > Tabledriven parsing
Roadmap > Contextfree grammars > Derivations and precedence > Topdown parsing > Leftrecursion > Lookahead > Tabledriven parsing The role of the parser > performs contextfree syntax analysis > guides
More informationSyntax Analysis, V Bottomup Parsing & The Magic of Handles Comp 412
Midterm Exam: Thursday October 18, 7PM Herzstein Amphitheater Syntax Analysis, V Bottomup Parsing & The Magic of Handles Comp 412 COMP 412 FALL 2018 source code IR Front End Optimizer Back End IR target
More informationLR Parsing Techniques
LR Parsing Techniques BottomUp Parsing  LR: a special form of BU Parser LR Parsing as Handle Pruning ShiftReduce Parser (LR Implementation) LR(k) Parsing Model  k lookaheads to determine next action
More information4. Lexical and Syntax Analysis
4. Lexical and Syntax Analysis 4.1 Introduction Language implementation systems must analyze source code, regardless of the specific implementation approach Nearly all syntax analysis is based on a formal
More informationTop down vs. bottom up parsing
Parsing A grammar describes the strings that are syntactically legal A recogniser simply accepts or rejects strings A generator produces sentences in the language described by the grammar A parser constructs
More informationMore BottomUp Parsing
More BottomUp Parsing Lecture 7 Dr. Sean Peisert ECS 142 Spring 2009 1 Status Project 1 Back By Wednesday (ish) savior lexer in ~cs142/s09/bin Project 2 Due Friday, Apr. 24, 11:55pm My office hours 3pm
More informationAmbiguity. Grammar E E + E E * E ( E ) int. The string int * int + int has two parse trees. * int
Administrivia Ambiguity, Precedence, Associativity & opdown Parsing eam assignments this evening for all those not listed as having one. HW#3 is now available, due next uesday morning (Monday is a holiday).
More informationParsing III. (Topdown parsing: recursive descent & LL(1) )
Parsing III (Topdown parsing: recursive descent & LL(1) ) Roadmap (Where are we?) Previously We set out to study parsing Specifying syntax Contextfree grammars Ambiguity Topdown parsers Algorithm &
More informationLL(1) predictive parsing
LL(1) predictive parsing Informatics 2A: Lecture 11 John Longley School of Informatics University of Edinburgh jrl@staffmail.ed.ac.uk 13 October, 2011 1 / 12 1 LL(1) grammars and parse tables 2 3 2 / 12
More informationCS143 Handout 20 Summer 2011 July 15 th, 2011 CS143 Practice Midterm and Solution
CS143 Handout 20 Summer 2011 July 15 th, 2011 CS143 Practice Midterm and Solution Exam Facts Format Wednesday, July 20 th from 11:00 a.m. 1:00 p.m. in Gates B01 The exam is designed to take roughly 90
More information3. Parsing. Oscar Nierstrasz
3. Parsing Oscar Nierstrasz Thanks to Jens Palsberg and Tony Hosking for their kind permission to reuse and adapt the CS132 and CS502 lecture notes. http://www.cs.ucla.edu/~palsberg/ http://www.cs.purdue.edu/homes/hosking/
More informationChapter 4: LR Parsing
Chapter 4: LR Parsing 110 Some definitions Recall For a grammar G, with start symbol S, any string α such that S called a sentential form α is If α Vt, then α is called a sentence in L G Otherwise it is
More informationCSE P 501 Compilers. Parsing & ContextFree Grammars Hal Perkins Winter /15/ Hal Perkins & UW CSE C1
CSE P 501 Compilers Parsing & ContextFree Grammars Hal Perkins Winter 2008 1/15/2008 200208 Hal Perkins & UW CSE C1 Agenda for Today Parsing overview Context free grammars Ambiguous grammars Reading:
More informationBottomUp Parsing. Parser Generation. LR Parsing. Constructing LR Parser
Parser Generation Main Problem: given a grammar G, how to build a topdown parser or a bottomup parser for it? parser : a program that, given a sentence, reconstructs a derivation for that sentence 
More informationMIT Specifying Languages with Regular Expressions and ContextFree Grammars. Martin Rinard Massachusetts Institute of Technology
MIT 6.035 Specifying Languages with Regular essions and ContextFree Grammars Martin Rinard Massachusetts Institute of Technology Language Definition Problem How to precisely define language Layered structure
More informationCSCI312 Principles of Programming Languages
Copyright 2006 The McGrawHill Companies, Inc. CSCI312 Principles of Programming Languages! LL Parsing!! Xu Liu Derived from Keith Cooper s COMP 412 at Rice University Recap Copyright 2006 The McGrawHill
More informationMIT Specifying Languages with Regular Expressions and ContextFree Grammars
MIT 6.035 Specifying Languages with Regular essions and ContextFree Grammars Martin Rinard Laboratory for Computer Science Massachusetts Institute of Technology Language Definition Problem How to precisely
More informationCS 406/534 Compiler Construction Parsing Part I
CS 406/534 Compiler Construction Parsing Part I Prof. Li Xu Dept. of Computer Science UMass Lowell Fall 2004 Part of the course lecture notes are based on Prof. Keith Cooper, Prof. Ken Kennedy and Dr.
More informationChapter 4. Lexical and Syntax Analysis
Chapter 4 Lexical and Syntax Analysis Chapter 4 Topics Introduction Lexical Analysis The Parsing Problem RecursiveDescent Parsing BottomUp Parsing Copyright 2012 AddisonWesley. All rights reserved.
More informationCS5371 Theory of Computation. Lecture 8: Automata Theory VI (PDA, PDA = CFG)
CS5371 Theory of Computation Lecture 8: Automata Theory VI (PDA, PDA = CFG) Objectives Introduce Pushdown Automaton (PDA) Show that PDA = CFG In terms of descriptive power Pushdown Automaton (PDA) Roughly
More informationCS 164 Programming Languages and Compilers Handout 9. Midterm I Solution
Midterm I Solution Please read all instructions (including these) carefully. There are 5 questions on the exam, some with multiple parts. You have 1 hour and 20 minutes to work on the exam. The exam is
More informationCS453 : JavaCUP and error recovery. CS453 Shiftreduce Parsing 1
CS453 : JavaCUP and error recovery CS453 Shiftreduce Parsing 1 Shiftreduce parsing in an LR parser LR(k) parser Lefttoright parse Rightmost derivation Ktoken look ahead LR parsing algorithm using
More informationLR(0) Parsing Summary. LR(0) Parsing Table. LR(0) Limitations. A NonLR(0) Grammar. LR(0) Parsing Table CS412/CS413
LR(0) Parsing ummary C412/C41 Introduction to Compilers Tim Teitelbaum Lecture 10: LR Parsing February 12, 2007 LR(0) item = a production with a dot in RH LR(0) state = set of LR(0) items valid for viable
More informationCompilation 2012 ContextFree Languages Parsers and Scanners. Jan Midtgaard Michael I. Schwartzbach Aarhus University
Compilation 2012 Parsers and Scanners Jan Midtgaard Michael I. Schwartzbach Aarhus University ContextFree Grammars Example: sentence subject verb object subject person person John Joe Zacharias verb asked
More informationIntroduction to Parsing. Lecture 8
Introduction to Parsing Lecture 8 Adapted from slides by G. Necula Outline Limitations of regular languages Parser overview Contextfree grammars (CFG s) Derivations Languages and Automata Formal languages
More information