CS 406/534 Compiler Construction Parsing Part II LL(1) and LR(1) Parsing

CS 406/534 Compiler Construction Parsing Part II LL(1) and LR(1) Parsing Prof. Li Xu Dept. of Computer Science UMass Lowell Fall 2004 Part of the course lecture notes are based on Prof. Keith Cooper, Prof. Ken Kennedy and Dr. Linda Torczon's teaching materials at Rice University. All rights reserved.

Administrivia Lab 2 has been posted on the web Due on 11/14, in about 4 weeks Parser and Type Checker for NOTHING More about the lab in today's lecture Check the web for the lab handout and the NOTHING specification Non-trivial project, need to start early

Administrivia Midterm exam handed out today Due on 10/25 before the class Covers from Lecture 1 to today's lecture, Ch1 Intro through Ch3.4 Bottom-up parsing, no register allocation Closed-book, closed-notes, take-home Two-hour time limit Review lecture notes and textbook before you take the exam. Exercises are good practice. Sign the honor pledge

What We Did Last Time Introduction to parsing CFG, derivation, ambiguity, left recursion Predictive top-down parsing Recursive descent parsing

Today's Goals Table-driven LL(1) Parsing LR(1) Shift-Reduce Parsing Lab 2 Mechanics

Building Top-down Parsers Given an LL(1) grammar, and its FIRST & FOLLOW sets Emit a routine for each non-terminal A nest of if-then-else statements to check the alternate rhs's Each returns true on success and throws an error on failure Simple, working (perhaps ugly) code This automatically constructs a recursive-descent parser Another approach Use a table to encode the options Interpret the table with a skeleton, as we did in scanning

Table-driven LL(1) Parsing
Strategy
  Encode knowledge in a table
  Use a standard skeleton parser to interpret the table
Example
  The non-terminal Factor has three expansions: ( Expr ), Identifier, or Number
  The grammar:
    1   Goal   → Expr
    2   Expr   → Term Expr'
    3   Expr'  → + Term Expr'
    4          |  - Term Expr'
    5          |  ε
    6   Term   → Factor Term'
    7   Term'  → * Factor Term'
    8          |  / Factor Term'
    9          |  ε
    10  Factor → number
    11         |  id
    12         |  ( Expr )
  The table might look like (rows are non-terminal symbols, columns are terminal symbols; only the Factor row is shown, blank entries are errors):
    Factor:  Id. → 11,  Num. → 10,  ( → 12

Table-driven LL(1) Parsing Building the complete table Need a row for every NT & a column for every T Need a table-driven interpreter for the table

Table-driven LL(1) Parsing
Building the complete table
  Need a row for every NT & a column for every T
  Need an algorithm to build the table
Filling in TABLE[X, y], X ∈ NT, y ∈ T
  1. entry is the rule X → β, if y ∈ FIRST(β)
  2. entry is the rule X → ε, if y ∈ FOLLOW(X) and X → ε ∈ G
  3. entry is error if neither 1 nor 2 defines it
If any entry is defined multiple times, G is not LL(1)
This is the LL(1) table construction algorithm

LL(1) Expression Parser

  1   Goal   → Expr
  2   Expr   → Term Expr'
  3   Expr'  → + Term Expr'
  4          |  - Term Expr'
  5          |  ε
  6   Term   → Factor Term'
  7   Term'  → * Factor Term'
  8          |  / Factor Term'
  9          |  ε
  10  Factor → number
  11         |  id
  12         |  ( Expr )

  FIRST(Goal) = FIRST(Expr) = FIRST(Term) = FIRST(Factor) = { id, number, ( }
  FIRST(Expr') = { +, -, ε }
  FIRST(Term') = { *, /, ε }

  FOLLOW(Goal) = { EOF }
  FOLLOW(Expr) = { ), EOF }
  FOLLOW(Expr') = { ), EOF }
  FOLLOW(Term) = { +, -, ), EOF }
  FOLLOW(Term') = { +, -, ), EOF }
  FOLLOW(Factor) = { +, -, *, /, ), EOF }

LL(1) Expression Parsing Table
(one row per non-terminal; entries give the production to apply, all other entries are errors)

  Goal:    id → 1,   num → 1,   ( → 1
  Expr:    id → 2,   num → 2,   ( → 2
  Expr':   + → 3,    - → 4,     ) → 5,    EOF → 5
  Term:    id → 6,   num → 6,   ( → 6
  Term':   + → 9,    - → 9,     * → 7,    / → 8,    ) → 9,    EOF → 9
  Factor:  id → 11,  num → 10,  ( → 12
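The construction is small enough to script. The Java sketch below (not the course's code; class and variable names are illustrative, and it assumes Java 9+ for Map.of/Set.of) hard-codes the FIRST and FOLLOW sets from the previous slide and applies fill-in rules 1 and 2; using FIRST of the leading rhs symbol is enough here only because no rhs in this grammar begins with a nullable non-terminal. It should reproduce the table above, and any doubly-defined entry would flag the grammar as not LL(1).

import java.util.*;

// Sketch of the LL(1) table construction: TABLE[X,y] = A -> beta if y in FIRST(beta),
// or the epsilon rule for X if y in FOLLOW(X).  FIRST/FOLLOW are copied from the slide.
public class Ll1TableBuilder {
    // Productions 1..12 of the right-recursive expression grammar; an empty rhs means epsilon.
    static final String[] LHS = { "", "Goal", "Expr", "Expr'", "Expr'", "Expr'",
                                  "Term", "Term'", "Term'", "Term'", "Factor", "Factor", "Factor" };
    static final String[][] RHS = {
        {}, {"Expr"}, {"Term", "Expr'"}, {"+", "Term", "Expr'"}, {"-", "Term", "Expr'"}, {},
        {"Factor", "Term'"}, {"*", "Factor", "Term'"}, {"/", "Factor", "Term'"}, {},
        {"num"}, {"id"}, {"(", "Expr", ")"}
    };
    static final Set<String> NT = Set.of("Goal", "Expr", "Expr'", "Term", "Term'", "Factor");

    // FIRST (without epsilon) and FOLLOW sets, as listed on the previous slide.
    static final Map<String, Set<String>> FIRST = Map.of(
        "Goal", Set.of("id", "num", "("), "Expr", Set.of("id", "num", "("),
        "Term", Set.of("id", "num", "("), "Factor", Set.of("id", "num", "("),
        "Expr'", Set.of("+", "-"), "Term'", Set.of("*", "/"));
    static final Map<String, Set<String>> FOLLOW = Map.of(
        "Goal", Set.of("EOF"), "Expr", Set.of(")", "EOF"), "Expr'", Set.of(")", "EOF"),
        "Term", Set.of("+", "-", ")", "EOF"), "Term'", Set.of("+", "-", ")", "EOF"),
        "Factor", Set.of("+", "-", "*", "/", ")", "EOF"));

    public static void main(String[] args) {
        Map<String, Map<String, Integer>> table = new HashMap<>();
        for (int p = 1; p < RHS.length; p++) {
            String X = LHS[p];
            Set<String> lookaheads;
            if (RHS[p].length == 0) {
                lookaheads = FOLLOW.get(X);            // rule 2: X -> epsilon, use FOLLOW(X)
            } else if (NT.contains(RHS[p][0])) {
                lookaheads = FIRST.get(RHS[p][0]);     // rule 1, rhs starts with a non-terminal
            } else {
                lookaheads = Set.of(RHS[p][0]);        // rule 1, rhs starts with a terminal
            }
            for (String y : lookaheads) {
                Integer old = table.computeIfAbsent(X, k -> new HashMap<>()).put(y, p);
                if (old != null) System.out.println("conflict on [" + X + "," + y + "]: not LL(1)");
            }
        }
        // Expect the table on the slide above, e.g. TABLE[Expr', ')'] = 5 and TABLE[Factor, id] = 11.
        System.out.println(table);
    }
}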

LL(1) Skeleton Parser

  token ← next_token()
  push EOF onto Stack
  push the start symbol, S, onto Stack
  TOS ← top of Stack
  loop forever
    if TOS = EOF and token = EOF then
      break & report success                      // exit on success
    else if TOS is a terminal then
      if TOS matches token then
        pop Stack                                 // recognized TOS
        token ← next_token()
      else report error looking for TOS
    else                                          // TOS is a non-terminal
      if TABLE[TOS, token] is A → B1 B2 … Bk then
        pop Stack                                 // get rid of A
        push Bk, Bk-1, …, B1                      // in that order
      else report error expanding TOS
    TOS ← top of Stack
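For concreteness, here is a small Java sketch of the skeleton above, with the expression grammar and the LL(1) table from the earlier slide hard-coded; tokens are just word categories ("id", "num", "+", …), and identifiers such as Ll1Skeleton are illustrative, not part of the course materials.

import java.util.*;

// Table-driven LL(1) skeleton, specialized to the expression grammar above.
public class Ll1Skeleton {
    // Productions 1..12; an empty rhs stands for epsilon.
    static final String[][] RHS = {
        {}, {"Expr"}, {"Term", "Expr'"}, {"+", "Term", "Expr'"}, {"-", "Term", "Expr'"}, {},
        {"Factor", "Term'"}, {"*", "Factor", "Term'"}, {"/", "Factor", "Term'"}, {},
        {"num"}, {"id"}, {"(", "Expr", ")"}
    };
    static final Map<String, Map<String, Integer>> TABLE = new HashMap<>();
    static void entry(String nt, String t, int rule) {
        TABLE.computeIfAbsent(nt, k -> new HashMap<>()).put(t, rule);
    }
    static {   // the parsing table from the previous slide
        for (String t : List.of("id", "num", "(")) { entry("Goal", t, 1); entry("Expr", t, 2); entry("Term", t, 6); }
        entry("Expr'", "+", 3);  entry("Expr'", "-", 4);  entry("Expr'", ")", 5);  entry("Expr'", "EOF", 5);
        for (String t : List.of("+", "-", ")", "EOF")) entry("Term'", t, 9);
        entry("Term'", "*", 7);  entry("Term'", "/", 8);
        entry("Factor", "id", 11);  entry("Factor", "num", 10);  entry("Factor", "(", 12);
    }

    static boolean parse(List<String> words) {
        Deque<String> stack = new ArrayDeque<>();
        stack.push("EOF");
        stack.push("Goal");                               // push the start symbol
        Iterator<String> in = words.iterator();
        String token = in.next();
        while (true) {
            String tos = stack.peek();
            if (tos.equals("EOF") && token.equals("EOF")) return true;     // success
            if (!TABLE.containsKey(tos)) {                // TOS is a terminal
                if (!tos.equals(token)) return false;     // error looking for TOS
                stack.pop();                              // recognized TOS
                token = in.next();
            } else {                                      // TOS is a non-terminal
                Integer rule = TABLE.get(tos).get(token);
                if (rule == null) return false;           // error expanding TOS
                stack.pop();                              // get rid of A
                String[] b = RHS[rule];
                for (int i = b.length - 1; i >= 0; i--) stack.push(b[i]);  // push Bk, ..., B1
            }
        }
    }

    public static void main(String[] args) {
        // x - 2 * y tokenizes to id - num * id EOF; this should print true.
        System.out.println(parse(List.of("id", "-", "num", "*", "id", "EOF")));
    }
}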

x - 2 * y Example

The Big Picture Top-down parsers (LL(1), recursive descent) Start at the root of the parse tree and grow toward leaves Pick a production & try to match the input Bad pick may need to backtrack Some grammars are backtrack-free (predictive parsing) Bottom-up parsers (LR(1), operator precedence) Start at the leaves and grow toward root As input is consumed, encode possibilities in an internal state Start in a state valid for legal first tokens Bottom-up parsers handle a large class of grammars

Bottom-up Parsing
The point of parsing is to construct a derivation
A derivation consists of a series of rewrite steps
  S = γ0 ⇒ γ1 ⇒ γ2 ⇒ … ⇒ γn-1 ⇒ γn = sentence
Each γi is a sentential form
  If γ contains only terminal symbols, γ is a sentence in L(G)
  If γ contains ≥ 1 non-terminals, γ is a sentential form
To get γi from γi-1, expand some NT A ∈ γi-1 by using A → β
  Replace the occurrence of A ∈ γi-1 with β to get γi
  In a leftmost derivation, it would be the first NT A ∈ γi-1
A left-sentential form occurs in a leftmost derivation
A right-sentential form occurs in a rightmost derivation

Bottom-up Parsing
A bottom-up parser builds a derivation by working from the input sentence back toward the start symbol S
  S = γ0 ⇒ γ1 ⇒ γ2 ⇒ … ⇒ γn-1 ⇒ γn = sentence
To reduce γi to γi-1, match some rhs β against γi, then replace β with its corresponding lhs, A (assuming the production A → β)
In terms of the parse tree, this is working from leaves to root
  Nodes with no parent in a partial tree form its upper fringe
  Since each replacement of β with A shrinks the upper fringe, we call it a reduction
The parse tree need not be built, it can be simulated
  parse tree nodes = words + reductions

Finding Reductions
Consider the simple grammar
  1  Goal → a A B e
  2  A    → A b c
  3       |  b
  4  B    → d
and the input string abbcde

  Sentential Form    Next Reduction
                     Prod'n   Pos'n
  abbcde             3        2
  a A bcde           2        4
  a A de             4        3
  a A B e            1        4
  Goal               —        —

The trick is scanning the input and finding the next reduction
The mechanism for doing this must be efficient

Finding Reductions
The parser must find a substring β of the tree's frontier that
  matches some production A → β that occurs as one step in the rightmost derivation (i.e., β → A is a step in the reverse rightmost derivation)
Informally, we call this substring β a handle
Formally,
  A handle of a right-sentential form γ is a pair <A → β, k> where A → β ∈ P and k is the position in γ of β's rightmost symbol.
  If <A → β, k> is a handle, then replacing β at k with A produces the right-sentential form from which γ is derived in the rightmost derivation.
Because γ is a right-sentential form, the substring to the right of a handle contains only terminal symbols
  ⇒ the parser doesn't need to scan past the handle

Finding Reductions
Critical Insight
  If G is unambiguous, then every right-sentential form has a unique handle.
  If we can find those handles, we can build a derivation!
Sketch of Proof:
  1  G is unambiguous ⇒ the rightmost derivation is unique
  2  ⇒ a unique production A → β applied to derive γi from γi-1
  3  ⇒ a unique position k at which A → β is applied
  4  ⇒ a unique handle <A → β, k>
This all follows from the definitions

Example
The expression grammar
  1   Goal   → Expr
  2   Expr   → Expr + Term
  3          |  Expr - Term
  4          |  Term
  5   Term   → Term * Factor
  6          |  Term / Factor
  7          |  Factor
  8   Factor → number
  9          |  id
  10         |  ( Expr )

Handles for the rightmost derivation of x - 2 * y

  Prod'n   Sentential Form               Handle
  —        Goal                          —
  1        Expr                          1,1
  3        Expr - Term                   3,3
  5        Expr - Term * Factor          5,5
  9        Expr - Term * <id,y>          9,5
  7        Expr - Factor * <id,y>        7,3
  8        Expr - <num,2> * <id,y>       8,3
  4        Term - <num,2> * <id,y>       4,1
  7        Factor - <num,2> * <id,y>     7,1
  9        <id,x> - <num,2> * <id,y>     9,1

Handle-pruning, Bottom-up Parsers
The process of discovering a handle & reducing it to the appropriate left-hand side is called handle pruning
Handle pruning forms the basis for a bottom-up parsing method
To construct a rightmost derivation
  S = γ0 ⇒ γ1 ⇒ γ2 ⇒ … ⇒ γn-1 ⇒ γn = w
apply the following simple algorithm
  for i ← n to 1 by -1
    Find the handle <Ai → βi, ki> in γi
    Replace βi with Ai to generate γi-1
This takes 2n steps

Handle-pruning, Bottom-up Parsers
One implementation technique is the shift-reduce parser

  push INVALID
  token ← next_token()
  repeat until (top of stack = Goal and token = EOF)
    if the top of the stack is a handle A → β then    // reduce β to A
      pop |β| symbols off the stack
      push A onto the stack
    else if (token ≠ EOF) then                        // shift
      push token
      token ← next_token()
    else                                              // need to shift, but out of input
      report an error

How do errors show up?
  failure to find a handle
  hitting EOF & needing to shift (final else clause)
Either generates an error
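The toy Java program below replays this skeleton on x - 2 * y. It does not discover handles itself: it is handed the <production, position> handle sequence from the example two slides back as an oracle, which is exactly the information the LR(1) machinery later in the lecture computes automatically. All names are illustrative.

import java.util.*;

// Shift-reduce replay for x - 2 * y, with the handle sequence supplied up front.
public class ShiftReduceReplay {
    // The classic expression grammar from the example slide (productions 1..10).
    static final String[] LHS = { "", "Goal", "Expr", "Expr", "Expr",
                                  "Term", "Term", "Term", "Factor", "Factor", "Factor" };
    static final String[][] RHS = {
        {}, {"Expr"}, {"Expr", "+", "Term"}, {"Expr", "-", "Term"}, {"Term"},
        {"Term", "*", "Factor"}, {"Term", "/", "Factor"}, {"Factor"},
        {"num"}, {"id"}, {"(", "Expr", ")"}
    };
    // Handles for the rightmost derivation of x - 2 * y, in the order the parser applies them.
    static final int[][] HANDLES = { {9,1}, {7,1}, {4,1}, {8,3}, {7,3}, {9,5}, {5,5}, {3,3}, {1,1} };

    public static void main(String[] args) {
        List<String> stack = new ArrayList<>();                       // the $ marker is implicit
        Deque<String> input = new ArrayDeque<>(List.of("id", "-", "num", "*", "id"));
        int h = 0;
        while (true) {
            if (h < HANDLES.length && isHandleOnTop(stack, HANDLES[h])) {
                int prod = HANDLES[h][0];                             // reduce
                List<String> top = stack.subList(stack.size() - RHS[prod].length, stack.size());
                top.clear();                                          // pop |rhs| symbols
                stack.add(LHS[prod]);                                 // push the lhs
                System.out.println("reduce " + prod + "   stack = " + stack);
                h++;
            } else if (!input.isEmpty()) {
                stack.add(input.removeFirst());                       // shift
                System.out.println("shift      stack = " + stack);
            } else break;
        }
        System.out.println(stack.equals(List.of("Goal")) ? "accept" : "error");
    }

    // Oracle check: the handle's rhs sits on top of the stack, and its right end
    // is at the recorded position (here, the current stack depth).
    static boolean isHandleOnTop(List<String> stack, int[] handle) {
        int prod = handle[0], pos = handle[1];
        String[] beta = RHS[prod];
        if (stack.size() != pos || stack.size() < beta.length) return false;
        return stack.subList(stack.size() - beta.length, stack.size()).equals(Arrays.asList(beta));
    }
}

Running it prints the same sequence as the trace on the next slide: 5 shifts, 9 reduces, and an accept.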

Back to x - 2 * y

  Stack                    Input            Handle   Action
  $                        id - num * id    none     shift
  $ id                     - num * id       9,1      reduce 9
  $ Factor                 - num * id       7,1      reduce 7
  $ Term                   - num * id       4,1      reduce 4
  $ Expr                   - num * id       none     shift
  $ Expr -                 num * id         none     shift
  $ Expr - num             * id             8,3      reduce 8
  $ Expr - Factor          * id             7,3      reduce 7
  $ Expr - Term            * id             none     shift
  $ Expr - Term *          id               none     shift
  $ Expr - Term * id                        9,5      reduce 9
  $ Expr - Term * Factor                    5,5      reduce 5
  $ Expr - Term                             3,3      reduce 3
  $ Expr                                    1,1      reduce 1
  $ Goal                                    none     accept

  5 shifts + 9 reduces + 1 accept

1. Shift until the top of the stack is the right end of a handle
2. Find the left end of the handle & reduce

Example

  Stack                    Input            Action
  $                        id - num * id    shift
  $ id                     - num * id       reduce 9
  $ Factor                 - num * id       reduce 7
  $ Term                   - num * id       reduce 4
  $ Expr                   - num * id       shift
  $ Expr -                 num * id         shift
  $ Expr - num             * id             reduce 8
  $ Expr - Factor          * id             reduce 7
  $ Expr - Term            * id             shift
  $ Expr - Term *          id               shift
  $ Expr - Term * id                        reduce 9
  $ Expr - Term * Factor                    reduce 5
  $ Expr - Term                             reduce 3
  $ Expr                                    reduce 1
  $ Goal                                    accept

The corresponding parse tree:

  Goal
    Expr
      Expr
        Term
          Fact.
            <id,x>
      -
      Term
        Term
          Fact.
            <num,2>
        *
        Fact.
          <id,y>

Shift-Reduce Parsing
Shift-reduce parsers are easily built and easily understood
A shift-reduce parser has just four actions
  Shift — next word is shifted onto the stack
  Reduce — right end of handle is at top of stack; locate left end of handle within the stack; pop handle off stack & push appropriate lhs
  Accept — stop parsing & report success
  Error — call an error reporting/recovery routine
Accept & Error are simple
Shift is just a push and a call to the scanner
Reduce takes |rhs| pops & 1 push
Handle finding is key
  handle is on stack
  finite set of handles
  ⇒ use a DFA!
  If handle-finding requires state, put it in the stack ⇒ 2x work

An Important Lesson about Handles
To be a handle, a substring of a sentential form γ must have two properties:
  It must match the right-hand side β of some rule A → β
  There must be some rightmost derivation from the goal symbol that produces the sentential form γ with A → β as the last production applied
Simply looking for right-hand sides that match strings is not good enough
Critical Question: How can we know when we have found a handle without generating lots of different derivations?
  Answer: we use lookahead in the grammar along with tables produced as the result of analyzing the grammar.
  LR(1) parsers build a DFA that runs over the stack & finds them

LR(k) Items
The LR(1) table construction algorithm uses LR(1) items to represent valid configurations of an LR(1) parser
An LR(k) item is a pair [P, δ], where
  P is a production A → β with a • at some position in the rhs
  δ is a lookahead string of length k (words or EOF)
The • in an item indicates the position of the top of the stack
  [A → •βγ, a] means that the input seen so far is consistent with the use of A → βγ immediately after the symbol on top of the stack
  [A → β•γ, a] means that the input seen so far is consistent with the use of A → βγ at this point in the parse, and that the parser has already recognized β
  [A → βγ•, a] means that the parser has seen βγ, and that a lookahead symbol of a is consistent with reducing to A

LR(1) Table Construction
High-level overview
  1  Build the canonical collection of sets of LR(1) items, I
     a  Begin in an appropriate state, s0
        [S' → •S, EOF], along with any equivalent items
        Derive equivalent items as closure(s0)
     b  Repeatedly compute, for each sk and each X, goto(sk, X)
        If the set is not already in the collection, add it
        Record all the transitions created by goto()
     This eventually reaches a fixed point
  2  Fill in the table from the collection of sets of LR(1) items
The canonical collection completely encodes the transition diagram for the handle-finding DFA

The SheepNoise Grammar
We will use this grammar extensively in today's lecture
  1  Goal → SheepNoise
  2  SheepNoise → SheepNoise baa
  3             |  baa

Computing FIRST Sets
Define FIRST as
  If α ⇒* aβ, with a ∈ T and β ∈ (T ∪ NT)*, then a ∈ FIRST(α)
  If α ⇒* ε, then ε ∈ FIRST(α)
  Note: if α = Xβ and X cannot derive ε, then FIRST(α) = FIRST(X)
To compute FIRST
  Use a fixed-point method
  FIRST(A) ∈ 2^(T ∪ {ε})
  Loop is monotonic
  ⇒ Algorithm halts
Computation of FOLLOW uses FIRST, so build FIRST sets before FOLLOW sets

Computing FIRST Sets

  for each x ∈ T, FIRST(x) ← { x }
  for each A ∈ NT, FIRST(A) ← Ø
  while (FIRST sets are still changing)
    for each p ∈ P, of the form A → β
      if β is ε then
        FIRST(A) ← FIRST(A) ∪ { ε }
      else if β is B1 B2 … Bk then begin
        FIRST(A) ← FIRST(A) ∪ ( FIRST(B1) - { ε } )
        for i ← 1 to k-1 by 1 while ε ∈ FIRST(Bi)
          FIRST(A) ← FIRST(A) ∪ ( FIRST(Bi+1) - { ε } )
        if i = k-1 and ε ∈ FIRST(Bk)
          then FIRST(A) ← FIRST(A) ∪ { ε }
      end

For SheepNoise:
  FIRST(Goal) = { baa }
  FIRST(SN) = { baa }
  FIRST(baa) = { baa }
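A minimal Java sketch of this fixed-point computation, run on the right-recursive expression grammar so the output can be checked against the FIRST sets on the earlier LL(1) slide. It folds the slide's inner while-loop into an equivalent "scan the rhs until a non-nullable symbol" loop; "eps" stands for ε and the class name is illustrative.

import java.util.*;

// Fixed-point computation of FIRST for the right-recursive expression grammar.
public class FirstSets {
    static final String[] LHS = { "", "Goal", "Expr", "Expr'", "Expr'", "Expr'",
                                  "Term", "Term'", "Term'", "Term'", "Factor", "Factor", "Factor" };
    static final String[][] RHS = {
        {}, {"Expr"}, {"Term", "Expr'"}, {"+", "Term", "Expr'"}, {"-", "Term", "Expr'"}, {},
        {"Factor", "Term'"}, {"*", "Factor", "Term'"}, {"/", "Factor", "Term'"}, {},
        {"num"}, {"id"}, {"(", "Expr", ")"}
    };
    static final Set<String> NT = Set.of("Goal", "Expr", "Expr'", "Term", "Term'", "Factor");

    public static void main(String[] args) {
        Map<String, Set<String>> first = new HashMap<>();
        for (String A : NT) first.put(A, new HashSet<>());            // FIRST(A) <- empty
        boolean changing = true;
        while (changing) {                                            // classic fixed point
            changing = false;
            for (int p = 1; p < RHS.length; p++) {
                Set<String> rhsFirst = new HashSet<>();
                boolean allNullable = true;
                for (String sym : RHS[p]) {
                    // FIRST(x) = {x} for a terminal x
                    Set<String> fs = NT.contains(sym) ? first.get(sym) : Set.of(sym);
                    for (String t : fs) if (!t.equals("eps")) rhsFirst.add(t);
                    if (!fs.contains("eps")) { allNullable = false; break; }
                }
                if (allNullable) rhsFirst.add("eps");                 // empty rhs, or every symbol nullable
                if (first.get(LHS[p]).addAll(rhsFirst)) changing = true;
            }
        }
        // Expect: FIRST(Expr') = {+, -, eps}, FIRST(Term') = {*, /, eps},
        //         FIRST(Goal) = FIRST(Expr) = FIRST(Term) = FIRST(Factor) = {id, num, (}
        System.out.println(first);
    }
}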

Computing FOLLOW Sets

  FOLLOW(S) ← { EOF }
  for each A ∈ NT, FOLLOW(A) ← Ø
  while (FOLLOW sets are still changing)
    for each p ∈ P, of the form A → β1 β2 … βk
      FOLLOW(βk) ← FOLLOW(βk) ∪ FOLLOW(A)
      TRAILER ← FOLLOW(A)
      for i ← k down to 2
        if ε ∈ FIRST(βi) then
          FOLLOW(βi-1) ← FOLLOW(βi-1) ∪ ( FIRST(βi) - { ε } ) ∪ TRAILER
        else
          FOLLOW(βi-1) ← FOLLOW(βi-1) ∪ FIRST(βi)
          TRAILER ← Ø

For SheepNoise:
  FOLLOW(Goal) = { EOF }
  FOLLOW(SN) = { baa, EOF }
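The sketch below computes FOLLOW for the same expression grammar, using an equivalent right-to-left formulation of the TRAILER scheme above (walking each rhs from its last symbol toward its first) rather than the slide's exact pseudocode. The FIRST sets are hard-coded from the previous sketch; names are illustrative, and the expected output matches the FOLLOW sets on the earlier LL(1) slide.

import java.util.*;

// Fixed-point computation of FOLLOW, with a right-to-left trailer per production.
public class FollowSets {
    static final String[] LHS = { "", "Goal", "Expr", "Expr'", "Expr'", "Expr'",
                                  "Term", "Term'", "Term'", "Term'", "Factor", "Factor", "Factor" };
    static final String[][] RHS = {
        {}, {"Expr"}, {"Term", "Expr'"}, {"+", "Term", "Expr'"}, {"-", "Term", "Expr'"}, {},
        {"Factor", "Term'"}, {"*", "Factor", "Term'"}, {"/", "Factor", "Term'"}, {},
        {"num"}, {"id"}, {"(", "Expr", ")"}
    };
    static final Set<String> NT = Set.of("Goal", "Expr", "Expr'", "Term", "Term'", "Factor");
    static final Map<String, Set<String>> FIRST = Map.of(            // from the FIRST sketch
        "Goal", Set.of("id", "num", "("), "Expr", Set.of("id", "num", "("),
        "Term", Set.of("id", "num", "("), "Factor", Set.of("id", "num", "("),
        "Expr'", Set.of("+", "-", "eps"), "Term'", Set.of("*", "/", "eps"));

    static Set<String> firstOf(String sym) {                         // FIRST(x) = {x} for terminals
        return NT.contains(sym) ? FIRST.get(sym) : Set.of(sym);
    }

    public static void main(String[] args) {
        Map<String, Set<String>> follow = new HashMap<>();
        for (String A : NT) follow.put(A, new HashSet<>());
        follow.get("Goal").add("EOF");                               // FOLLOW(S) <- {EOF}
        boolean changing = true;
        while (changing) {
            changing = false;
            for (int p = 1; p < RHS.length; p++) {
                // TRAILER starts as FOLLOW(A) and is rebuilt while walking the rhs right to left.
                Set<String> trailer = new HashSet<>(follow.get(LHS[p]));
                for (int i = RHS[p].length - 1; i >= 0; i--) {
                    String sym = RHS[p][i];
                    if (NT.contains(sym)) {
                        if (follow.get(sym).addAll(trailer)) changing = true;
                        if (firstOf(sym).contains("eps")) {
                            for (String t : firstOf(sym)) if (!t.equals("eps")) trailer.add(t);
                        } else {
                            trailer = new HashSet<>(firstOf(sym));
                        }
                    } else {
                        trailer = new HashSet<>(Set.of(sym));        // a terminal resets the trailer
                    }
                }
            }
        }
        // Expect: FOLLOW(Expr) = FOLLOW(Expr') = {), EOF},
        //         FOLLOW(Term) = FOLLOW(Term') = {+, -, ), EOF},
        //         FOLLOW(Factor) = {+, -, *, /, ), EOF}, FOLLOW(Goal) = {EOF}
        System.out.println(follow);
    }
}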

Computing Closures
Closure(s) adds all the items implied by items already in s
  Any item [A → β•Bδ, a] implies [B → •τ, x] for each production with B on the lhs, and each x ∈ FIRST(δa)
  Since β•Bδ is valid, any way to derive βBδ is valid, too

The algorithm

  Closure( s )
    while ( s is still changing )
      ∀ items [A → β•Bδ, a] ∈ s
        ∀ productions B → τ ∈ P
          ∀ b ∈ FIRST(δa)              // δ might be ε
            if [B → •τ, b] ∉ s
              then add [B → •τ, b] to s

  Classic fixed-point method
  Halts because s ⊆ ITEMS
  Worklist version is faster
  Closure fills out a state

Example From SheepNoise
Initial step builds the item [Goal → •SheepNoise, EOF] and takes its closure()

Closure( [Goal → •SheepNoise, EOF] )

  Item                                      From
  [Goal → •SheepNoise, EOF]                 original item
  [SheepNoise → •SheepNoise baa, EOF]       1, δa is EOF
  [SheepNoise → •baa, EOF]                  1, δa is EOF
  [SheepNoise → •SheepNoise baa, baa]       2, δa is baa EOF
  [SheepNoise → •baa, baa]                  2, δa is baa EOF

Remember, this is the left-recursive SheepNoise; EaC shows the right-recursive version.

So, S0 is { [Goal → •SheepNoise, EOF], [SheepNoise → •SheepNoise baa, EOF], [SheepNoise → •baa, EOF], [SheepNoise → •SheepNoise baa, baa], [SheepNoise → •baa, baa] }

Computing Gotos
Goto(s, x) computes the state that the parser would reach if it recognized an x while in state s
  Goto( { [A → β•Xδ, a] }, X ) produces [A → βX•δ, a]   (easy part)
  Should also include closure( [A → βX•δ, a] )          (fill out the state)

The algorithm

  Goto( s, X )
    new ← Ø
    ∀ items [A → β•Xδ, a] ∈ s
      new ← new ∪ { [A → βX•δ, a] }
    return closure(new)

  Not a fixed-point method!
  Straightforward computation
  Uses closure()
  Goto() moves forward

Example from SheepNoise
S0 is { [Goal → •SheepNoise, EOF], [SheepNoise → •SheepNoise baa, EOF], [SheepNoise → •baa, EOF], [SheepNoise → •SheepNoise baa, baa], [SheepNoise → •baa, baa] }

Goto( S0, baa )
  Loop produces
    Item                         From
    [SheepNoise → baa•, EOF]     item 3 in S0
    [SheepNoise → baa•, baa]     item 5 in S0
  Closure adds nothing since • is at the end of the rhs in each item

In the construction, this produces S2 = { [SheepNoise → baa•, {EOF, baa}] }
  New, but obvious, notation for the two distinct items [SheepNoise → baa•, EOF] & [SheepNoise → baa•, baa]

Example from SheepNoise
S0: { [Goal → •SheepNoise, EOF], [SheepNoise → •SheepNoise baa, EOF], [SheepNoise → •baa, EOF], [SheepNoise → •SheepNoise baa, baa], [SheepNoise → •baa, baa] }

S1 = Goto(S0, SheepNoise) = { [Goal → SheepNoise•, EOF], [SheepNoise → SheepNoise•baa, EOF], [SheepNoise → SheepNoise•baa, baa] }

S2 = Goto(S0, baa) = { [SheepNoise → baa•, EOF], [SheepNoise → baa•, baa] }

S3 = Goto(S1, baa) = { [SheepNoise → SheepNoise baa•, EOF], [SheepNoise → SheepNoise baa•, baa] }

Building the Canonical Collection
Start from s0 = closure( [S' → •S, EOF] )
Repeatedly construct new states, until all are found

The algorithm

  s0 ← closure( [S' → •S, EOF] )
  S ← { s0 }
  k ← 1
  while ( S is still changing )
    ∀ sj ∈ S and ∀ x ∈ ( T ∪ NT )
      sk ← goto(sj, x)
      record sj → sk on x
      if sk ∉ S then
        S ← S ∪ { sk }
        k ← k + 1

  Fixed-point computation
  Loop adds to S
  S ⊆ 2^ITEMS, so S is finite
  Worklist version is faster

Example from SheepNoise
Starts with S0
  S0: { [Goal → •SheepNoise, EOF], [SheepNoise → •SheepNoise baa, EOF], [SheepNoise → •baa, EOF], [SheepNoise → •SheepNoise baa, baa], [SheepNoise → •baa, baa] }

Iteration 1 computes
  S1 = Goto(S0, SheepNoise) = { [Goal → SheepNoise•, EOF], [SheepNoise → SheepNoise•baa, EOF], [SheepNoise → SheepNoise•baa, baa] }
  S2 = Goto(S0, baa) = { [SheepNoise → baa•, EOF], [SheepNoise → baa•, baa] }

Iteration 2 computes
  S3 = Goto(S1, baa) = { [SheepNoise → SheepNoise baa•, EOF], [SheepNoise → SheepNoise baa•, baa] }

Nothing more to compute, since • is at the end of every item in S3.
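Putting closure(), goto(), and the collection loop together, the Java sketch below rebuilds the four SheepNoise states S0–S3 shown above. It assumes Java 16+ (records), and because no SheepNoise symbol derives ε, FIRST(δa) reduces to FIRST of the first symbol of δa; all names (Item, Lr1Collection, …) are illustrative, not from the course materials.

import java.util.*;

// Canonical collection of sets of LR(1) items for the SheepNoise grammar.
public class Lr1Collection {
    // 1: Goal -> SheepNoise   2: SheepNoise -> SheepNoise baa   3: SheepNoise -> baa
    static final String[]   LHS = { "", "Goal", "SheepNoise", "SheepNoise" };
    static final String[][] RHS = { {}, {"SheepNoise"}, {"SheepNoise", "baa"}, {"baa"} };
    static final Set<String> NT = Set.of("Goal", "SheepNoise");
    static final Map<String, Set<String>> FIRST = Map.of(
        "Goal", Set.of("baa"), "SheepNoise", Set.of("baa"), "baa", Set.of("baa"), "EOF", Set.of("EOF"));

    record Item(int prod, int dot, String la) {          // [A -> beta . gamma, a]
        public String toString() {
            List<String> rhs = new ArrayList<>(Arrays.asList(RHS[prod]));
            rhs.add(dot, "•");
            return "[" + LHS[prod] + " -> " + String.join(" ", rhs) + ", " + la + "]";
        }
    }

    static Set<Item> closure(Set<Item> s) {
        Set<Item> result = new HashSet<>(s);
        boolean changing = true;
        while (changing) {                               // classic fixed point
            changing = false;
            for (Item it : new ArrayList<>(result)) {
                String[] rhs = RHS[it.prod()];
                if (it.dot() >= rhs.length || !NT.contains(rhs[it.dot()])) continue;
                String B = rhs[it.dot()];
                // FIRST(delta a): delta is what follows B; fall back to the lookahead if delta is empty
                String next = it.dot() + 1 < rhs.length ? rhs[it.dot() + 1] : it.la();
                for (int p = 1; p < RHS.length; p++)
                    if (LHS[p].equals(B))
                        for (String b : FIRST.get(next))
                            if (result.add(new Item(p, 0, b))) changing = true;
            }
        }
        return result;
    }

    static Set<Item> gotoSet(Set<Item> s, String x) {    // advance the dot over x, then close
        Set<Item> moved = new HashSet<>();
        for (Item it : s)
            if (it.dot() < RHS[it.prod()].length && RHS[it.prod()][it.dot()].equals(x))
                moved.add(new Item(it.prod(), it.dot() + 1, it.la()));
        return moved.isEmpty() ? moved : closure(moved);
    }

    public static void main(String[] args) {
        Set<Item> s0 = closure(Set.of(new Item(1, 0, "EOF")));   // [Goal -> . SheepNoise, EOF]
        List<Set<Item>> states = new ArrayList<>(List.of(s0));
        for (int i = 0; i < states.size(); i++)                  // worklist over the growing collection
            for (String x : List.of("SheepNoise", "baa", "Goal")) {   // SheepNoise first, to match the slide numbering
                Set<Item> next = gotoSet(states.get(i), x);
                if (next.isEmpty()) continue;
                if (!states.contains(next)) states.add(next);
                System.out.println("goto(S" + i + ", " + x + ") = S" + states.indexOf(next));
            }
        for (int i = 0; i < states.size(); i++)
            System.out.println("S" + i + " = " + states.get(i));  // expect the four states S0..S3 above
    }
}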

LR(1) Parsers
How does this LR(1) stuff work?
  Unambiguous grammar ⇒ unique rightmost derivation
  Keep upper fringe on a stack
    All active handles include top of stack (TOS)
    Shift inputs until TOS is right end of a handle
  Language of handles is regular (finite)
    Build a handle-recognizing DFA
    ACTION & GOTO tables encode the DFA
  To match a subterm, invoke the subterm DFA & leave the old DFA's state on the stack
  Final state in DFA ⇒ a reduce action
    New state is GOTO[state at TOS (after pop), lhs]
    For SN, this takes the DFA to s1

[Control DFA for SN: s0 --SheepNoise--> s1, s0 --baa--> s2 (reduce action), s1 --baa--> s3 (reduce action)]

Building LR(1) Parsers How do we generate the ACTION and GOTO tables? Use the grammar to build a model of the DFA Use the model to build ACTION & GOTO tables If construction succeeds, the grammar is LR(1) The Big Picture Model the state of the parser Use two functions goto( s, X ) and closure( s ) goto() is analogous to move() in the subset construction closure() adds information to round out a state Build up the states and transition functions of the DFA Use this information to fill in the ACTION and GOTO tables

Mechanics of Lab 2 Implement a front-end for NOTHING Build lexer and parser Type checker and context-sensitive analysis Generate the AST, collect program statistics using the AST Use tools to automate your parser generation ANTLR can create lexer, parser and tree generators from an ANTLR grammar file Lex/yacc, flex/bison, jlex/cup also possible Massage the NOTHING grammar to work with the tools Embed action code in the tool spec file

NOTHING A small PASCAL-like language for compiler implementation practice Data types are int, float, char and one-dimensional arrays Two-level program scopes: main, subprogram Assignment, conditional and other statements Arithmetic and logical expressions for NOTHING data types

NOTHING Parser Massage the NOTHING grammar for use with tools Left-recursion Ambiguity Operator precedence Syntax error handling and recovery Informative compiler error messages

Type Checking Semantic analysis using type information Collect type info for variables Report context-sensitive type errors

AST Tree Build the AST during parsing Record token info with each tree node Separate tree walker over the AST to collect program statistics

ANTLR Expression Parser

  class CalcScanner extends Lexer;
  INT     : ('1'..'9')('0'..'9')* | '0';
  LPAREN  : '(';
  RPAREN  : ')';
  WS      : ' ' | '\t' { _ttype = Token.SKIP; };
  NEWLINE : '\n' { newline(); };
  PLUS    : '+';
  MINUS   : '-';
  TIMES   : '*';
  DIVIDES : '/';

  class CalcParser extends Parser;
  file   : (expr NEWLINE)+;
  expr   : term ((PLUS | MINUS) term)*;
  term   : factor ((TIMES | DIVIDES) factor)*;
  factor : INT | LPAREN expr RPAREN;

Embed Action Code

  file {int e;}
    : (e=expr NEWLINE {System.out.println("Result: " + e);})+;

  expr returns [int result=0] {int t; boolean is_add;}
    : t=term {result = t;}
      ((PLUS {is_add = true;} | MINUS {is_add = false;})
       t=term {if (is_add) result += t; else result -= t;}
      )*;

The generated parser method begins:

  public final int expr() throws RecognitionException, TokenStreamException {
      int result = 0;
      int t;
      boolean is_add;

Embed Action Code

  term returns [int result=0] {int f; boolean is_mult;}
    : f=factor {result = f;}
      ((TIMES {is_mult = true;} | DIVIDES {is_mult = false;})
       f=factor {if (is_mult) result *= f; else result /= f;}
      )*;

  factor returns [int result=0] {int e;}
    : i:INT {result = Integer.parseInt(i.getText());}
    | LPAREN e=expr RPAREN {result = e;}
    ;

Build AST Tree

  class CalcParser extends Parser;
  options { buildAST = true; }

  file {currentAST.root = astFactory.create(null, "null");}
    :! (e:expr NEWLINE
        { #file = #(astFactory.create(EXPR, "expr"), file, e); })+;

  expr   : term ((PLUS^ | MINUS^) term)*;
  term   : factor ((TIMES^ | DIVIDES^) factor)*;
  factor : INT | LPAREN! expr RPAREN!;

Tree Walker

  file {int e;}
    : #(EXPR file e=expr {System.out.println("expr: " + e);}) NULL;

  expr returns [int r=0] {int a, b;}
    : #(PLUS a=expr b=expr) {r = a + b;}
    | #(MINUS a=expr b=expr) {r = a - b;}
    | #(TIMES a=expr b=expr) {r = a * b;}
    | #(DIVIDES a=expr b=expr) {r = a / b;}
    | i:INT {r = Integer.parseInt(i.getText());}
    ;

Summary Table-driven LL(1) Parsing LR(1) Shift-Reduce Parsing Lab 2 Mechanics

Next Class Wrap-up of parsing Context-sensitive analysis Attribute grammars