LR Parsing. Table Construction


#1 LR Parsing Table Construction

#2 Outline Review of bottom-up parsing; computing the parsing DFA (closures, LR(1) items, states, transitions); using parser generators; handling conflicts.

#3 In One Slide An LR(1) parsing table can be constructed automatically from a CFG. An LR(1) item is a production with a dot marking the parsing progress, plus a lookahead token; it represents a possible parser context. After we extend LR(1) items by closing them, they become LR(1) DFA states. Grammars can have shift/reduce or reduce/reduce conflicts. You can fix most conflicts with precedence and associativity declarations. LALR(1) tables are formed from LR(1) tables by merging states with the same core.

#4 Bottom-up Parsing (Review) A bottom-up parser rewrites the input string to the start symbol. The state of the parser is described as α ▸ γ, where α is a stack of terminals and non-terminals and γ is the string of terminals not yet examined. Initially: ▸ x1 x2 ... xn

#5 Shift and Reduce Actions (Review) Recall the CFG: E → int | E + ( E ). A bottom-up parser uses two kinds of actions. Shift pushes a terminal from the input onto the stack: E + ( ▸ int ) becomes E + ( int ▸ ). Reduce pops 0 or more symbols off the stack (a production RHS) and pushes a non-terminal (the production LHS) onto the stack: E + ( E + ( E ) ▸ ) becomes E + ( E ▸ ).

#6 Key Issue: When to Shift or Reduce? Idea: use a finite automaton (DFA) to decide when to shift or reduce. The input is the stack; the language consists of terminals and non-terminals. We run the DFA on the stack and examine the resulting state X and the token tok after ▸. If X has a transition labeled tok, then shift. If X is labeled with "A → β on tok", then reduce.
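To make the decision procedure concrete, here is a minimal Python sketch of it. The DFA transitions and reduce labels below are hand-built for a toy grammar S → ( S ) | x (with start production S' → S), not for the lecture's grammar; the point is only how the DFA is re-run over the stack and how the shift/reduce decision is made.

# Hand-built (assumed) tables for the toy grammar S -> ( S ) | x, start S' -> S
GOTO = {(0, '('): 2, (0, 'x'): 3, (0, 'S'): 1,
        (2, '('): 2, (2, 'x'): 3, (2, 'S'): 4,
        (4, ')'): 5}
REDUCE = {3: ('S', ('x',)),            # state labeled "reduce S -> x"
          5: ('S', ('(', 'S', ')'))}   # state labeled "reduce S -> ( S )"
ACCEPT_STATE = 1                       # the state containing S' -> S .

def run_dfa(stack):
    # Run the DFA over the stack of grammar symbols; return the resulting state X.
    state = 0
    for sym in stack:
        state = GOTO[(state, sym)]
    return state

def parse(tokens):
    stack, pos = [], 0
    while True:
        state, tok = run_dfa(stack), tokens[pos]
        if state == ACCEPT_STATE and tok == '$':
            return True                        # accept
        if state in REDUCE:                    # X is labeled "reduce A -> beta"
            lhs, rhs = REDUCE[state]
            del stack[len(stack) - len(rhs):]  # pop the production RHS
            stack.append(lhs)                  # push the production LHS
        elif (state, tok) in GOTO:             # X has a transition labeled tok
            stack.append(tok)                  # shift
            pos += 1
        else:
            return False                       # parse error

print(parse(['(', '(', 'x', ')', ')', '$']))   # True
print(parse(['(', 'x', '$']))                  # False (missing a closing parenthesis)

Real LR parsers avoid re-running the DFA from scratch by keeping DFA states on the stack, but the rescanning version matches the description on this slide.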

#7 LR(1) Parsing: An Example. (The slide shows the LR(1) parsing DFA for E → int | E + ( E ), with reduce labels "E → int on $, +", "E → int on ), +", "E → E + (E) on $, +", "E → E + (E) on ), +", and "accept on $"; the state diagram itself is not reproducible here.) The parse of int + (int) + (int):

▸ int + (int) + (int)$        shift
int ▸ + (int) + (int)$        reduce E → int
E ▸ + (int) + (int)$          shift (×3)
E + (int ▸ ) + (int)$         reduce E → int
E + (E ▸ ) + (int)$           shift
E + (E) ▸ + (int)$            reduce E → E + (E)
E ▸ + (int)$                  shift (×3)
E + (int ▸ )$                 reduce E → int
E + (E ▸ )$                   shift
E + (E) ▸ $                   reduce E → E + (E)
E ▸ $                         accept

End of review #8

#9 Key Issue: How is the DFA Constructed? The stack describes the context of the parse: what non-terminal we are looking for, what production RHS we are looking for, and what we have seen so far from that RHS.

#10 LR(1) Table Construction. Three hours later, you can finally parse E → E + E | int.

#11 Parsing Contexts Consider the parse of int + ( int ) + ( int ) at the point where the stack is E + ( ▸ int ) + ( int ). Context: we are looking for an E → E + ( • E ); the • marks where we are, and we have seen E + ( from the right-hand side. We are also looking for E → • int or E → • E + ( E ), having seen nothing of those right-hand sides yet. One DFA state describes several contexts.

#12 LR(1) Items An LR(1) item is a pair [X → α•β, a], where X → αβ is a production and a is a terminal (the lookahead terminal). LR(1) means 1 lookahead terminal. [X → α•β, a] describes a context of the parser: we are trying to find an X followed by an a, and we have α already on top of the stack, so we need to see next a prefix derived from βa.
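One possible encoding of LR(1) items in code (my own sketch, not something from the lecture): a production, a dot position, and a lookahead terminal.

from typing import NamedTuple, Tuple

class Item(NamedTuple):
    # LR(1) item [X -> alpha . beta, a]
    lhs: str                # X
    rhs: Tuple[str, ...]    # the full right-hand side alpha beta
    dot: int                # length of alpha, i.e. how much of the RHS is already on the stack
    lookahead: str          # a

    def after_dot(self):
        # beta: what still has to be derived from the input (followed by the lookahead)
        return self.rhs[self.dot:]

# [E -> E + ( . E ), +]: we have seen "E + (" and expect a prefix derived from "E ) +"
item = Item('E', ('E', '+', '(', 'E', ')'), dot=3, lookahead='+')
print(item.after_dot())    # ('E', ')')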

#13 Note The symbol ▸ was used before to separate the stack from the rest of the input: α ▸ γ, where α is the stack and γ is the remaining string of terminals. In LR(1) items, • is used to mark a prefix of a production RHS: [X → α•β, a], where β might contain non-terminals as well. In both cases the stack is on the left.

#14 Convention We add to our grammar a fresh new start symbol S and a production S → E, where E is the old start symbol. (No need to do this if E had only one production.) The initial parsing context contains S → • E, $: trying to find an S as a string derived from E$, with an empty stack.

#15 LR(1) Items (Cont.) In a context containing E → E + • ( E ), +, if ( follows, then we can perform a shift to a context containing E → E + ( • E ), +. In a context containing E → E + ( E ) •, +, we can perform a reduction with E → E + ( E ), but only if a + follows.

#16 LR(1) Items (Cont.) Consider a context with the item E → E + ( • E ), +. We expect next a string derived from E ) +. There are two productions for E: E → int and E → E + ( E ). We describe this by extending the context with two more items: E → • int, ) and E → • E + ( E ), ).

#17 The Closure Operation The operation of extending the context with items is called the closure operation.

Closure(Items) =
  repeat
    for each [X → α•Yβ, a] in Items
      for each production Y → γ
        for each b ∈ First(βa)
          add [Y → •γ, b] to Items
  until Items is unchanged
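Here is the same operation as a runnable Python sketch for the running grammar S → E, E → int | E + ( E ). Items are encoded as (lhs, rhs, dot, lookahead) tuples. To keep it short, the first_of helper assumes a grammar without ε-productions (true for this grammar); a real implementation would compute full FIRST sets.

GRAMMAR = {
    'S': [('E',)],
    'E': [('int',), ('E', '+', '(', 'E', ')')],
}

def first_of(symbols):
    # FIRST of a symbol string; assumes no epsilon-productions, so only the
    # first symbol matters. Left-recursive alternatives are skipped.
    sym = symbols[0]
    if sym not in GRAMMAR:                  # terminal
        return {sym}
    first = set()
    for rhs in GRAMMAR[sym]:
        if rhs[0] != sym:
            first |= first_of(rhs)
    return first

def closure(items):
    # Closure(Items), exactly as in the pseudocode above.
    items = set(items)
    while True:
        new = set()
        for (x, rhs, dot, a) in items:
            if dot < len(rhs) and rhs[dot] in GRAMMAR:   # item [X -> alpha . Y beta, a]
                y, beta = rhs[dot], rhs[dot + 1:]
                for gamma in GRAMMAR[y]:                 # each production Y -> gamma
                    for b in first_of(beta + (a,)):      # each b in First(beta a)
                        new.add((y, gamma, 0, b))        # add [Y -> . gamma, b]
        if new <= items:
            return items
        items |= new

# The start context Closure({ [S -> . E, $] }) -- compare with the next slide
for item in sorted(closure({('S', ('E',), 0, '$')})):
    print(item)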

#18 Constructing the Parsing DFA (1) Construct the start context: Closure({ S → •E, $ }) =
  S → •E, $
  E → •E+(E), $
  E → •int, $
  E → •E+(E), +
  E → •int, +
We abbreviate this as:
  S → •E, $
  E → •E+(E), $/+
  E → •int, $/+


#20 Constructing the Parsing DFA (2) An LR(1) DFA state is a closed set of LR(1) items (meaning we have performed Closure). The start state contains [S → •E, $]. A state that contains [X → α•, b] is labeled with "reduce with X → α on b". And now the transitions.

#21 The DFA Transitions A state State that contains [X → α•yβ, b] has a transition labeled y to a state that contains the items Transition(State, y); y can be a terminal or a non-terminal.

Transition(State, y) =
  Items ← ∅
  for each [X → α•yβ, b] ∈ State
    add [X → αy•β, b] to Items
  return Closure(Items)
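Putting Closure and Transition together gives a worklist construction of all the DFA states. The sketch below repeats the helpers from the previous snippet so that it runs on its own, under the same simplifying assumption (no ε-productions).

GRAMMAR = {'S': [('E',)],
           'E': [('int',), ('E', '+', '(', 'E', ')')]}

def first_of(symbols):
    sym = symbols[0]
    if sym not in GRAMMAR:
        return {sym}
    return set().union(*(first_of(rhs) for rhs in GRAMMAR[sym] if rhs[0] != sym))

def closure(items):
    items = set(items)
    while True:
        new = {(rhs[dot], gamma, 0, b)
               for (x, rhs, dot, a) in items
               if dot < len(rhs) and rhs[dot] in GRAMMAR
               for gamma in GRAMMAR[rhs[dot]]
               for b in first_of(rhs[dot + 1:] + (a,))}
        if new <= items:
            return frozenset(items)
        items |= new

def transition(state, y):
    # Transition(State, y): advance the dot over y in every item, then close.
    return closure({(x, rhs, dot + 1, a)
                    for (x, rhs, dot, a) in state
                    if dot < len(rhs) and rhs[dot] == y})

def build_dfa():
    start = closure({('S', ('E',), 0, '$')})
    states, edges, work = {start}, {}, [start]
    while work:
        state = work.pop()
        # y ranges over every symbol that appears right after a dot in this state
        for y in {rhs[dot] for (_, rhs, dot, _) in state if dot < len(rhs)}:
            target = transition(state, y)
            edges[(state, y)] = target
            if target not in states:
                states.add(target)
                work.append(target)
    return start, states, edges

start, states, edges = build_dfa()
print(len(states), 'LR(1) states')   # a dozen or so states for this small grammar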

#22–#34 LR(1) DFA Construction Example (built up incrementally over several slides; the growing state diagram is summarized here)

State 0 (start): S → •E, $   E → •E+(E), $/+   E → •int, $/+
State 1 (from 0 on int): E → int•, $/+   — labeled "E → int on $, +"
State 2 (from 0 on E): S → E•, $   E → E•+(E), $/+   — labeled "accept on $"
State 3 (from 2 on +): E → E+•(E), $/+
State 4 (from 3 on "("): E → E+(•E), $/+   E → •E+(E), )/+   E → •int, )/+
State 5 (from 4 on int): E → int•, )/+   — labeled "E → int on ), +"
State 6 (from 4 on E): E → E+(E•), $/+   E → E•+(E), )/+
… and so on

Q: Movie Music (420 / 842) In a 1995 Disney movie that has been uncharitably referred to as "Hokey-Hontas", the Stephen Schwartz lyrics "what I love most about rivers is: / you can't step in the same river twice" refer to the ideas of which Greek philosopher?

Q: Games (522 / 842) This 1982 arcade game features lance-wielding knights mounted on giant flying birds, dueling over a pit of lava. Destroying an enemy knight required ramming it such that your lance was higher than the enemy's.

#37 LR Parsing Tables. Notes Parsing tables (= the DFA) can be constructed automatically from a CFG. "You asked for a miracle, Theo. I give you the L-R-1." — Hans Gruber, Die Hard. But we still need to understand the construction to work with parser generators, e.g., they report errors in terms of sets of items. What kind of errors can we expect?


#39 Shift/Reduce Conflicts If a DFA state contains both [X → α•aβ, b] and [Y → γ•, a], then on input a we could either shift into state [X → αa•β, b], or reduce with Y → γ. This is called a shift/reduce conflict.

#40 Shift/Reduce Conflicts Typically due to ambiguities in the grammar. Classic example: the dangling else. S → if E then S | if E then S else S | OTHER. The DFA will have a state containing [S → if E then S •, else] and [S → if E then S • else S, x]. If else follows, then we can shift or reduce. The default (bison, CUP, etc.) is to shift, and the default behavior is what is needed in this case.

#41 More Shift/Reduce Conflicts Consider the ambiguous grammar E → E + E | E * E | int. We will have a state containing [E → E * • E, +] and [E → • E + E, +], and (after the transition on E) a state containing [E → E * E •, +] and [E → E • + E, +]. Again we have a shift/reduce conflict on input +. We need to reduce (* binds more tightly than +). Solution: declare the precedence of * and +.

#42 More Shift/Reduce Conflicts In bison, declare precedence and associativity:
  %left +
  %left *   // higher precedence
The precedence of a rule = that of its last terminal (see the bison manual for ways to override this default). A shift/reduce conflict is resolved with a shift if: no precedence is declared for either the rule or the terminal, or the input terminal has higher precedence than the rule, or the precedences are the same and the terminal is right-associative.
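A small Python sketch of this resolution rule (an illustration of the rule as stated here, not bison's actual implementation). The numbers mimic what %left/%right declarations induce: later declarations bind tighter, so * gets a higher precedence than +.

PREC = {'+': (1, 'left'), '*': (2, 'left')}   # as declared by %left + and %left *

def resolve(shift_terminal, rule_last_terminal):
    # shift_terminal: the lookahead we could shift.
    # rule_last_terminal: determines the rule's precedence (its last terminal).
    if shift_terminal not in PREC or rule_last_terminal not in PREC:
        return 'shift'                     # no precedence declared: default to shift
    t_prec, _ = PREC[shift_terminal]
    r_prec, r_assoc = PREC[rule_last_terminal]
    if t_prec > r_prec:
        return 'shift'                     # input terminal binds tighter than the rule
    if t_prec < r_prec:
        return 'reduce'                    # rule binds tighter than the input terminal
    return 'shift' if r_assoc == 'right' else 'reduce'   # tie: decided by associativity

print(resolve('+', '*'))   # reduce: conflict at [E -> E * E ., +], * binds tighter than +
print(resolve('+', '+'))   # reduce: same precedence and + is left-associative
print(resolve('*', '+'))   # shift:  conflict at [E -> E + E ., *], * binds tighter than +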

#43 Using Precedence to Solve S/R Conflicts Back to our example: the state with [E → E * E •, +] and [E → E • + E, +], reached on E from the state with [E → E * • E, +] and [E → • E + E, +]. We choose reduce on input + because the precedence of the rule E → E * E is higher than that of the terminal +.

#44 Using Precedence to Solve S/R Conflicts Same grammar as before: E → E + E | E * E | int. We will also have the state with [E → E + E •, +] and [E → E • + E, +], reached on E from the state with [E → E + • E, +] and [E → • E + E, +]. Now we also have a shift/reduce conflict on input +. We choose reduce because E → E + E and + have the same precedence and + is left-associative.

#45 Using Precedence to Solve S/R Conflicts Back to our dangling-else example: [S → if E then S •, else] and [S → if E then S • else S, x]. We can eliminate the conflict by declaring else with higher precedence than then, or just rely on the default shift action. But this starts to look like hacking the parser. Avoid overuse of precedence declarations, or you'll end up with unexpected parse trees. The kiss of death.

#46 Reduce/Reduce Conflicts If a DFA state contains both [X → α•, a] and [Y → β•, a], then on input a we don't know which production to reduce with. This is called a reduce/reduce conflict.

#47 Reduce/Reduce Conflicts Usually due to gross ambiguity in the grammar. Example: a sequence of identifiers, S → ε | id | id S. There are two derivations for the string id: S → id, and S → id S → id. How does this confuse the parser?

#48 More on Reduce/Reduce Conflicts Consider the two states (the second reached from the first on id):
  [S' → • S, $]   [S → •, $]   [S → • id, $]   [S → • id S, $]
  [S → id •, $]   [S → id • S, $]   [S → •, $]   [S → • id, $]   [S → • id S, $]
There is a reduce/reduce conflict on input $ in the second state, corresponding to the two derivations S' → S → id and S' → S → id S → id. Better rewrite the grammar: S → ε | id S.

Can someone learn this for me? #49

#50 Using Parser Generators Parser generators construct the parsing DFA given a CFG. They use precedence declarations and default conventions to resolve conflicts. The parser algorithm is the same for all grammars (and is provided as a library function). But most parser generators do not construct the DFA as described before. Why might that be?

#51 Using Parser Generators Parser generators construct the parsing DFA given a CFG. They use precedence declarations and default conventions to resolve conflicts. The parser algorithm is the same for all grammars (and is provided as a library function). But most parser generators do not construct the DFA as described before, because the LR(1) parsing DFA has thousands of states even for a simple language.

#52 LR(1) Parsing Tables are Big But many states are similar. E.g., state 1 = { E → int•, $/+ } (labeled "E → int on $, +") and state 5 = { E → int•, )/+ } (labeled "E → int on ), +"). Idea: merge the DFA states whose items differ only in the lookahead tokens. We say that such states have the same core. We obtain the merged state { E → int•, $/+/) }, labeled "E → int on $, +, )".

#53 The Core of a Set of LR Items Definition: the core of a set of LR items is the set of first components, without the lookahead terminals. Example: the core of { [X → α•β, b], [Y → γ•δ, d] } is { X → α•β, Y → γ•δ }.

#54 LALR States Consider for example the LR(1) states { [X → α•, a], [Y → β•, c] } and { [X → α•, b], [Y → β•, d] }. They have the same core and can be merged, and the merged state contains { [X → α•, a/b], [Y → β•, c/d] }. These are called LALR(1) states; LALR stands for LookAhead LR. There are typically 10x fewer LALR(1) states than LR(1) states.

#55 LALR(1) DFA Repeat until all states have distinct cores: choose two distinct states with the same core; merge the states by creating a new one with the union of all their items; point edges from predecessors to the new state; the new state points to all the previous successors. (The slide's diagram shows states A–F, where B and E have the same core and are merged into a single state BE, with the edges redirected accordingly.)
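A Python sketch of the core computation and the merge step, reusing the (lhs, rhs, dot, lookahead) item encoding from the earlier snippets. Redirecting the DFA edges to the merged states, as described above, is omitted here.

from collections import defaultdict

def core(state):
    # The core of a set of LR(1) items: drop the lookahead terminals.
    return frozenset((lhs, rhs, dot) for (lhs, rhs, dot, _) in state)

def merge_states(states):
    # Group LR(1) states with the same core; each group becomes one LALR(1) state
    # containing the union of the items.
    groups = defaultdict(set)
    for state in states:
        groups[core(state)] |= set(state)
    return [frozenset(items) for items in groups.values()]

# States 1 and 5 from slide #52: same core { E -> int . }, different lookaheads.
state1 = frozenset({('E', ('int',), 1, '$'), ('E', ('int',), 1, '+')})
state5 = frozenset({('E', ('int',), 1, ')'), ('E', ('int',), 1, '+')})
merged = merge_states([state1, state5])
print(len(merged))         # 1: the two states collapse into a single LALR(1) state
print(sorted(merged[0]))   # E -> int .  with lookaheads $, +, )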

#56 Example: LR(1) to LALR(1) (The slide shows the LR(1) DFA from the earlier example, states 0–11, next to the smaller LALR(1) DFA obtained by merging states with the same core: 1 with 5, 3 with 8, 4 with 9, 6 with 10, and 7 with 11. For example, states 1 and 5 merge into a single state labeled "E → int on $, +, )", and the reduce labels "E → E + (E) on $, +" and "E → E + (E) on ), +" merge into "E → E + (E) on $, +, )"; the accept-on-$ state is unchanged.)

#57 The LALR Parser Can Have Conflicts Consider for example the LR(1) states { [X → α•, a], [Y → β•, b] } and { [X → α•, b], [Y → β•, a] }, and the merged LALR(1) state { [X → α•, a/b], [Y → β•, a/b] }. It has a new reduce/reduce conflict. In practice such cases are rare.

#58 LALR vs. LR Parsing LALR languages are not natural; they are an efficiency hack on LR languages. Any reasonable programming language has an LALR(1) grammar. LALR(1) has become a standard for programming languages and for parser generators.

#59 A Hierarchy of Grammar Classes From Andrew Appel, Modern Compiler Implementation in Java

#60 Notes on Parsing A solid foundation: context-free grammars. A simple parser: LL(1). A more powerful parser: LR(1). An efficiency hack: LALR(1). LALR(1) parser generators. Now we move on to semantic analysis.

Take a bow, you survived! #61

#62 Supplement to LR Parsing Strange Reduce/Reduce Conflicts Due to LALR Conversion (from the bison manual)

#63 Strange Reduce/Reduce Conflicts Consider the grammar:
  S → P R ,
  P → T | NL : T
  N → id
  NL → N | N , NL
  R → T | N : T
  T → id
where P is a parameters specification, R a result specification, N a parameter or result name, T a type name, and NL a list of names.

#64 Strange Reduce/Reduce Conflicts In P, an id is an N when followed by "," or ":", and a T when followed by id. In R, an id is an N when followed by ":", and a T when followed by ",". This is an LR(1) grammar, but it is not LALR(1). Why? For obscure reasons.

#65 A Few LR(1) States
State 1: P → • T, id   P → • NL : T, id   NL → • N, :   NL → • N , NL, :   N → • id, :   N → • id, ","   T → • id, id
  on id → State 3: T → id •, id   N → id •, :   N → id •, ","
State 2: R → • T, ","   R → • N : T, ","   T → • id, ","   N → • id, :
  on id → State 4: T → id •, ","   N → id •, :
LALR merge of states 3 and 4 gives: T → id •, id/","   N → id •, :/"," — an LALR reduce/reduce conflict on ",".

#66 What Happened? Two distinct states were confused because they have the same core. Fix: add dummy productions to distinguish the two confused states. E.g., add R → id bogus, where bogus is a terminal not used by the lexer. This production will never be used during parsing, but it distinguishes R from P.

#67 A Few LR(1) States After the Fix
State 1: P → • T, id   P → • NL : T, id   NL → • N, :   NL → • N , NL, :   N → • id, :   N → • id, ","   T → • id, id
  on id → State 3: T → id •, id   N → id •, :   N → id •, ","
State 2: R → • T, ","   R → • N : T, ","   R → • id bogus, ","   T → • id, ","   N → • id, :
  on id → State 4: T → id •, ","   N → id •, :   R → id • bogus, ","
Different cores, so no LALR merging.

#68 Homework Today: WA2 was due. Thursday: Chapter 3.1–3.6 (optional Wikipedia article). Tuesday February 26: WA3 due. Wednesday February 27: PA3 due (Parsing!). Thursday February 28: Midterm 1 in class.