Syntax Analysis, III Comp 412

Syntax Analysis, III (Comp 412, Fall 2018; Chapter 3 in EaC2e)

Note: the algorithm for removal of indirect left recursion has been updated to match EaC3e (3/2018).

Midterm Exam: Thursday, October 18, 7 PM, Herzstein Amphitheater.

[Figure: compiler structure, source code → Front End → IR → Optimizer → IR → Back End → target code]

Copyright 2018, Keith D. Cooper & Linda Torczon, all rights reserved. Students enrolled in Comp 412 at Rice University have explicit permission to make copies of these materials for their personal use. Faculty from other educational institutions may use these materials for nonprofit educational purposes, provided this copyright notice is preserved.

The Classic Expression Grammar (Review)

    0  Goal   → Expr
    1  Expr   → Expr + Term
    2          | Expr - Term
    3          | Term
    4  Term   → Term * Factor
    5          | Term / Factor
    6          | Factor
    7  Factor → ( Expr )
    8          | number
    9          | id

    Classic Expression Grammar, Left-recursive Version

This left-recursive expression grammar encodes standard algebraic precedence and left-associativity (which corresponds to left-to-right evaluation). We refer to this grammar as either the classic expression grammar or the canonical expression grammar; both have the same acronym, CEG.

Top-down Parsing (Review): The Algorithm

A top-down parser starts with the root of the parse tree. The root node is labeled with the goal symbol of the grammar.

Construct the root node of the parse tree, then repeat until the lower fringe of the parse tree matches the input string:
  1. At a node labeled with nonterminal A, select a production with A on its LHS and, for each symbol on its RHS, construct the appropriate child.
  2. When a terminal symbol is added to the fringe and it does not match the corresponding input symbol, backtrack.
  3. Find the next node to be expanded (a node whose label is in NT).

The key is selecting the right production in step 1. That choice should be guided by the input string.

Top-down Parsing with the CEG (Review)

If the parser always expands the leftmost Expr with rule 1, the parse can grow indefinitely:

    Rule | Sentential Form              | Input
     --  | Goal                         | x - 2 * y
     0   | Expr                         | x - 2 * y
     1   | Expr + Term                  | x - 2 * y
     1   | Expr + Term + Term           | x - 2 * y
     1   | Expr + Term + Term + Term    | x - 2 * y
     1   | and so on ...                | x - 2 * y

This expansion consumes no input and does not terminate. A wrong choice of expansion leads to non-termination, and non-termination is a bad property for a parser to have. The parser must make the right choice.
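
The failing expansion above is easy to replay in a few lines. The following minimal sketch (not from the slides; the rule encoding, function name, and step limit are illustrative assumptions) rewrites the leftmost Expr with rule 1 over and over, showing that the sentential form grows without consuming any input.

    # Replay the runaway expansion: always rewrite the leftmost "Expr"
    # with rule 1, Expr -> Expr + Term. Illustrative sketch only.
    RULE_1 = ["Expr", "+", "Term"]

    def expand_leftmost(form, limit=4):
        """Apply rule 1 to the leftmost Expr, up to `limit` steps."""
        for step in range(1, limit + 1):
            i = form.index("Expr")                      # leftmost Expr
            form = form[:i] + RULE_1 + form[i + 1:]
            print(f"step {step}: {' '.join(form)}")
        print("... and so on; no input is ever consumed")

    expand_leftmost(["Expr"])
    # step 1: Expr + Term
    # step 2: Expr + Term + Term
    # step 3: Expr + Term + Term + Term
    # step 4: Expr + Term + Term + Term + Term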

Recursion in the CEG (Review)

The problems with unbounded expansion arise from left recursion in the CEG (the left-recursive grammar shown above). A grammar is left recursive if some LHS symbol can derive, in one or more steps, a sentential form that starts with that same symbol.

Left recursion is incompatible with top-down, leftmost derivations. A leftmost derivation matches the scanner's left-to-right scan. We need a technique to transform left recursion into right recursion (and, possibly, the reverse). Similarly, right recursion is incompatible with rightmost derivations.

Eliminating Left Recursion (Review)

To remove left recursion, we can transform the grammar. Consider a grammar fragment of the form

    Fee → Fee α
         |  β

where neither α nor β starts with Fee. The language is β followed by zero or more α's. We can rewrite this fragment as

    Fee → β Fie
    Fie → α Fie
         |  ε

where Fie is a new nonterminal. The new grammar defines the same language as the old grammar, using only right recursion.

New idea: the ε-production adds a reference to the empty string.
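
The direct transformation is mechanical enough to express in a few lines of code. Here is a small sketch (an assumption of this note, not the book's code): a nonterminal's alternatives are lists of symbols, [] encodes ε, and the new nonterminal is named with a prime suffix rather than Fie.

    def eliminate_direct(nt, rhss):
        """Rewrite  A -> A alpha | beta  as  A -> beta A',  A' -> alpha A' | eps."""
        recursive = [r[1:] for r in rhss if r and r[0] == nt]     # the "alpha" parts
        other     = [r     for r in rhss if not r or r[0] != nt]  # the "beta" parts
        if not recursive:
            return {nt: rhss}                                     # no direct left recursion
        prime = nt + "'"
        return {
            nt:    [beta + [prime] for beta in other],
            prime: [alpha + [prime] for alpha in recursive] + [[]],   # [] is epsilon
        }

    # Fee -> Fee a | b   becomes   Fee -> b Fee',  Fee' -> a Fee' | eps
    print(eliminate_direct("Fee", [["Fee", "a"], ["b"]]))
    # {'Fee': [['b', "Fee'"]], "Fee'": [['a', "Fee'"], []]}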

Eliminating Left Recursion (Review)

The expression grammar contains two cases of left recursion:

    Expr → Expr + Term          Term → Term * Factor
         |  Expr - Term               |  Term / Factor
         |  Term                      |  Factor

Applying the transformation yields

    Expr  → Term Expr'          Term  → Factor Term'
    Expr' → + Term Expr'        Term' → * Factor Term'
          |  - Term Expr'             |  / Factor Term'
          |  ε                        |  ε

These fragments use only right recursion.

Eliminating Left Recursion (Review)

Substituting them back into the grammar yields

    0  Goal   → Expr
    1  Expr   → Term Expr'
    2  Expr'  → + Term Expr'
    3          | - Term Expr'
    4          | ε
    5  Term   → Factor Term'
    6  Term'  → * Factor Term'
    7          | / Factor Term'
    8          | ε
    9  Factor → ( Expr )
    10         | number
    11         | id

    Right-recursive expression grammar

This grammar is correct, if somewhat counter-intuitive. A top-down parser will terminate using it. A top-down parser may need to backtrack with it. It is left associative, as was the original.

Eliminating Left Recursion

The goal: no LHS symbol should be able to derive, in one or more steps, a sentential form that starts with that same LHS symbol.

The two-production transformation eliminates immediate left recursion. What about more general, indirect left recursion?

The general algorithm to eliminate left recursion:

    arrange the NTs into some order A_0, A_1, ..., A_n
    for i ← 1 to n
        for s ← 0 to i - 1
            replace each production A_i → A_s γ with
                A_i → δ_1 γ | δ_2 γ | ... | δ_k γ,
            where A_s → δ_1 | δ_2 | ... | δ_k are all the current productions for A_s
        eliminate any immediate left recursion on A_i using the direct transformation

The algorithm assumes that the initial grammar has no cycles (A_i ⇒+ A_i) and no ε-productions.

(EaC2e numbers the nonterminals from 1 to n, so the first iteration of its inner loop never executes.)
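
The algorithm translates almost line for line into code. Below is a sketch under stated assumptions: the grammar is an ordered dict mapping each nonterminal to a list of right-hand sides (lists of symbols), [] encodes ε, the input has no cycles and no ε-productions, and new nonterminals get a prime suffix. The name remove_left_recursion is illustrative, not from EaC2e.

    def remove_left_recursion(grammar):
        nts = list(grammar)                              # A_0, A_1, ..., A_n in order
        g = {nt: [list(r) for r in rules] for nt, rules in grammar.items()}
        for i, a_i in enumerate(nts):
            for s in range(i):                           # replace A_i -> A_s gamma
                a_s = nts[s]
                new_rules = []
                for rhs in g[a_i]:
                    if rhs and rhs[0] == a_s:
                        new_rules += [delta + rhs[1:] for delta in g[a_s]]
                    else:
                        new_rules.append(rhs)
                g[a_i] = new_rules
            # eliminate any immediate left recursion on A_i (direct transformation)
            rec   = [r[1:] for r in g[a_i] if r and r[0] == a_i]
            other = [r     for r in g[a_i] if not r or r[0] != a_i]
            if rec:
                prime = a_i + "'"
                g[a_i]   = [beta + [prime] for beta in other]
                g[prime] = [alpha + [prime] for alpha in rec] + [[]]   # [] is epsilon
        return g

New nonterminals are appended after the original order, so the outer loop never revisits them; this mirrors the observation on the next slide that they carry only right recursion on themselves.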

Eliminating Left Recursion

How does this algorithm work?
  1. Impose an arbitrary order on the nonterminals.
  2. The outer loop cycles through the NTs in that order.
  3. The inner loop ensures that a production expanding A_i cannot directly derive a nonterminal A_s at the start of its RHS, for s < i.
  4. The last step in the outer loop converts any direct recursion on A_i to right recursion, using the transformation shown earlier.
  5. New nonterminals are added at the end of the order and have no left recursion; they will have right recursion on themselves.

Invariant: at the start of the i-th outer loop iteration, for all k < i, no production that expands A_k begins with a nonterminal A_s, for s < k.

Eliminating Indirect Left Recursion

Example grammar (subscripts indicate the imposed order: Start is A_0, A is A_1, B is A_2):

    0  Start → A
    1  A     → B a
    2         | a
    3  B     → A b

This grammar generates a (ba)*. The indirect left recursion runs through A: A ⇒ B a ⇒ A b a. The general algorithm from the previous slides applies.

Eliminating Indirect Recursion

The general algorithm is a diagonalization, similar to LU decomposition. Represent the grammar as a matrix whose rows are LHS nonterminals and whose columns are nonterminals appearing in the RHS: if element [i,j] is k, then NT_i has a production with NT_j in the k-th position of its RHS.

For the original grammar (NT_0 = Start, NT_1 = A, NT_2 = B):

            NT_0  NT_1  NT_2
    NT_0     0     1     0
    NT_1     0     0     1
    NT_2     0     1     0

Diagonal elements represent self-recursion. The upper triangle represents forward references in the order; the lower triangle represents backward references in the order. The inner loop systematically zeroes the lower triangle. The last step of the outer loop eliminates the 1s on the diagonal.

Eliminating Indirect Left Recursion

Applying the algorithm to the example grammar:

    i \ s |     0            1              2
      1   |  no action   no iteration   no iteration
      2   |  no action   see below      no iteration

For B_2 (i = 2) and A_1 (s = 1):
  First, it rewrites rule 3 as  B_2 → B_2 a b | a b.
  Next, it uses the rule for immediate left recursion to rewrite this pair of rules as
      B_2 → a b C_3
      C_3 → a b C_3 | ε

The algorithm iterates through the possible backward references. It handles indirect left recursion with forward substitution of the RHS for the LHS, and it handles direct left recursion with the two-production transformation.
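
The two steps above can be replayed directly in code. This is an illustrative sketch (the variable names and list encoding are assumptions of this note), mirroring the substitution and the direct transformation for B with A.

    # Step-by-step replay of i = 2 (B) with s = 1 (A); [] encodes epsilon.
    A_rules = [["B", "a"], ["a"]]            # A -> B a | a
    B_rules = [["A", "b"]]                   # B -> A b

    # Step 1: forward-substitute A's right-hand sides into B -> A b
    B_rules = [delta + ["b"] for delta in A_rules]       # B -> B a b | a b

    # Step 2: direct transformation on the immediate left recursion in B
    rec   = [r[1:] for r in B_rules if r[0] == "B"]      # the single alpha: ['a', 'b']
    other = [r     for r in B_rules if r[0] != "B"]      # the single beta:  ['a', 'b']
    B_rules = [beta + ["C"] for beta in other]           # B -> a b C
    C_rules = [alpha + ["C"] for alpha in rec] + [[]]    # C -> a b C | eps

    print(B_rules)   # [['a', 'b', 'C']]
    print(C_rules)   # [['a', 'b', 'C'], []]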

Eliminating Indirect Left Recursion: Before and After

    Original Grammar              Transformed Grammar
    0  Start → A                  0  Start → A
    1  A     → B a                1  A     → B a
    2         | a                 2         | a
    3  B     → A b                3  B     → a b C
                                  4  C     → a b C
                                  5         | ε

The transformed grammar (still) generates a (ba)*.

Eliminating Indirect Recursion

The general algorithm is a diagonalization, similar to LU decomposition. For the original grammar, the matrix was

            NT_0  NT_1  NT_2
    NT_0     0     1     0
    NT_1     0     0     1
    NT_2     0     1     0

After the transformation, the matrix is upper triangular (NT_3 is the new nonterminal C):

            NT_0  NT_1  NT_2  NT_3
    NT_0     0     1     0     0
    NT_1     0     0     1     0
    NT_2     0     0     0     3
    NT_3     0     0     0     3

Eliminating Indirect Left Recursion: Before and After

The original and transformed grammars appear above; the transformed grammar generates a (ba)*. The transformed grammar is still not predictively parsable. What does "predictively parsable" even mean?

Predictive Parsing

If a top-down parser picks the wrong production, it may need to backtrack. The alternative is to look ahead in the input and use context to pick the correct production.

How much lookahead is needed for a context-free grammar? In general, an arbitrarily large amount; parsing an arbitrary CFG requires something like the Cocke-Younger-Kasami (CYK) algorithm or Earley's algorithm. Fortunately, large subclasses of CFGs can be parsed with limited lookahead, and most programming language constructs fall in those subclasses. Among the interesting subclasses are the LL(1) and LR(1) grammars. We will focus, for now, on LL(1) grammars and predictive parsing.

Predictive Parsing

Basic idea: given A → α | β, the parser should be able to choose between α and β.

FIRST sets: for some RHS α in G, define FIRST(α) as the set of tokens that can appear as the first symbol in some string that derives from α. That is, x ∈ FIRST(α) iff α ⇒* x γ, for some γ.

We will defer the problem of how to compute FIRST sets for the moment and assume that they are given.

Predictive Parsing

Basic idea: given A → α | β, the parser should be able to choose between α and β, using the FIRST sets defined above.

The Predictive, or LL(1), Property: if A → α and A → β both appear in the grammar, we would like

    FIRST(α) ∩ FIRST(β) = ∅

This would allow the parser to make a correct choice with a lookahead of exactly one symbol!

This rule is intuitive. Unfortunately, it is not correct, because it does not handle ε rules. See the next slide.

Predictive Parsing

What about ε-productions? They complicate the definition of the predictive, or LL(1), property.

If A → α and A → β and ε ∈ FIRST(α), then we need to ensure that FIRST(β) is also disjoint from FOLLOW(A), where FOLLOW(A) is the set of terminal symbols that can appear immediately after A in some sentential form.

Define FIRST+(A → α) as
    FIRST(α) ∪ FOLLOW(A), if ε ∈ FIRST(α)
    FIRST(α),             otherwise

Note that FIRST+ is defined over productions, not nonterminals.

Then, a grammar is LL(1) iff A → α and A → β implies that

    FIRST+(A → α) ∩ FIRST+(A → β) = ∅
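
Given FIRST and FOLLOW sets (the slides defer how to compute them), the LL(1) test itself is a small set computation. The sketch below is illustrative; the function names and the "eps" marker are assumptions of this note.

    EPS = "eps"                                  # marker for the empty string

    def first_plus(first_alpha, follow_A):
        """FIRST+(A -> alpha): add FOLLOW(A) when alpha can derive epsilon."""
        return first_alpha | follow_A if EPS in first_alpha else set(first_alpha)

    def is_ll1_for(nt_alternatives):
        """True if the FIRST+ sets of one NT's alternatives are pairwise disjoint."""
        seen = set()
        for fp in nt_alternatives:
            if seen & fp:
                return False
            seen |= fp
        return True

    # The case from the next slide: both alternatives for A have FIRST+ = { a }
    print(is_ll1_for([{"a"}, {"a"}]))            # False: not LL(1)

    # The LL(1) grammar for a (ba)* shown later: B -> b a B | eps, FOLLOW(B) = { eof }
    alts = [first_plus({"b"}, {"eof"}), first_plus({EPS}, {"eof"})]
    print(is_ll1_for(alts))                      # True: the two sets are disjoint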

The LL(1) Property

Our transformed grammar for a (ba)* fails the test:

    0  Start → A
    1  A     → B a
    2         | a
    3  B     → a b C
    4  C     → a b C
    5         | ε

Both alternatives for A derive strings that begin with a (A ⇒ B a ⇒ a b C a, and A ⇒ a), so

    FIRST+(A → B a) = { a }
    FIRST+(A → a)   = { a }

These sets are identical, rather than disjoint. So, left-recursion elimination did not produce a grammar with the LL(1) condition, that is, a grammar that would be predictively parsable.

The LL(1) Property

Can we write an LL(1) grammar for a (ba)*?

    Transformed Grammar           LL(1) Grammar for a (ba)*
    0  Start → A                  0  Start → a B
    1  A     → B a                1  B     → b a B
    2         | a                 2         | ε
    3  B     → a b C
    4  C     → a b C
    5         | ε

What If My Grammar Is Still Not LL(1)?

Can we transform a non-LL(1) grammar into an LL(1) grammar? In general, the answer is no. In some cases, however, the answer is yes.

Assume a grammar G with productions A → α β_1 and A → α β_2. If α derives anything other than ε, then

    FIRST+(A → α β_1) ∩ FIRST+(A → α β_2) ≠ ∅

and the grammar is not LL(1).

If we pull the common prefix α into a separate production, we may make the grammar LL(1):

    A  → α A'
    A' → β_1
    A' → β_2

Now, if FIRST+(A' → β_1) ∩ FIRST+(A' → β_2) = ∅, G may be LL(1).

What If My Grammar Is Not LL(1)? Left Factoring

    for each NT A:
        find the longest prefix α common to 2 or more alternatives for A
        if α ≠ ε then
            replace all of the productions
                A → α β_1 | α β_2 | α β_3 | ... | α β_n | γ
            with
                A  → α A' | γ
                A' → β_1 | β_2 | β_3 | ... | β_n
    repeat until no NT has alternative right-hand sides with a common prefix

This transformation makes some grammars into LL(1) grammars. There are languages for which no LL(1) grammar exists.
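
One pass of this transformation for a single nonterminal can be sketched as follows. The encoding (alternatives as lists of symbols, [] for ε) and the helper names are assumptions of this note, not the book's code; a driver would repeat the pass until nothing changes.

    def longest_common_prefix(rhss):
        prefix = []
        for symbols in zip(*rhss):               # compare position by position
            if len(set(symbols)) == 1:
                prefix.append(symbols[0])
            else:
                break
        return prefix

    def left_factor(nt, rhss):
        """One left-factoring pass for nt; returns {nt or new NT: alternatives}."""
        groups = {}
        for r in rhss:                           # group alternatives by first symbol
            groups.setdefault(r[0] if r else None, []).append(r)
        result, fresh = {nt: []}, 0
        for first, alts in groups.items():
            if first is None or len(alts) < 2:
                result[nt] += alts               # nothing to factor in this group
                continue
            alpha = longest_common_prefix(alts)
            fresh += 1
            new_nt = nt + "'" * fresh            # A', A'', ...
            result[nt].append(alpha + [new_nt])              # A  -> alpha A'
            result[new_nt] = [a[len(alpha):] for a in alts]  # A' -> beta_i (maybe eps)
        return result

    # Factor -> name | name [ ArgList ] | name ( ArgList )   (next slide's example)
    print(left_factor("Factor", [["name"],
                                 ["name", "[", "ArgList", "]"],
                                 ["name", "(", "ArgList", ")"]]))
    # {'Factor': [['name', "Factor'"]],
    #  "Factor'": [[], ['[', 'ArgList', ']'], ['(', 'ArgList', ')']]}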

Left Factoring Example

Consider a short grammar for subscripted identifiers:

    0  Factor   → name
    1            | name [ ArgList ]
    2            | name ( ArgList )
    3  ArgList  → Expr MoreArgs
    4  MoreArgs → , Expr MoreArgs
    5            | ε

To choose between expanding by production 0, 1, or 2, a top-down parser must look beyond the name to see the next word:

    FIRST+(0) = FIRST+(1) = FIRST+(2) = { name }

Left factoring productions 0, 1, and 2 fixes this problem: the common prefix is name, so introduce a new production Factor → name Arguments and expand Arguments into the three alternatives. Left factoring cannot fix all non-LL(1) grammars.

Left Factoring Example

After left factoring:

    0  Factor   → name Args
    1  Args     → [ ArgList ]
    2            | ( ArgList )
    3            | ε
    4  ArgList  → Expr MoreArgs
    5  MoreArgs → , Expr MoreArgs
    6            | ε

Clearly, FIRST+(1) and FIRST+(2) are disjoint. If neither ( nor [ can follow a Factor, then FIRST+(3) will be disjoint from them, too. This tiny grammar now has the LL(1) property.

Left factoring can transform some non-LL(1) grammars into LL(1) grammars. There are languages for which no LL(1) grammar exists; see exercise 3.12 in EaC2e.

Predictive Parsing

Given a grammar that has the LL(1) property, we can write a simple routine to recognize an instance of each LHS. The code is patterned, simple, and fast.

Consider A → β_1 | β_2 | β_3, with FIRST+(A → β_i) ∩ FIRST+(A → β_j) = ∅ if i ≠ j:

    /* find an A */
    if (current_word ∈ FIRST+(A → β_1))
        find a β_1 and return true
    else if (current_word ∈ FIRST+(A → β_2))
        find a β_2 and return true
    else if (current_word ∈ FIRST+(A → β_3))
        find a β_3 and return true
    else
        report an error and return false

Grammars that have the LL(1) property are called predictive grammars, because the parser can predict the correct expansion at each point in the parse. Parsers that capitalize on the LL(1) property are called predictive parsers. One kind of predictive parser is the recursive descent parser.

Of course, there is more detail to "find a β_i"; typically it is a recursive call to another small routine (see pp. 108-111 in EaC2e).

Recursive Descent Parsing

Recall the expression grammar after transformation (the right-recursive CEG shown earlier). This grammar leads to a parser that has six mutually recursive routines:

  1. Goal
  2. Expr
  3. EPrime
  4. Term
  5. TPrime
  6. Factor

Each routine recognizes an RHS for that NT. The term "descent" refers to the direction in which the parse tree is built. This grammar has the LL(1) property.

Recursive Descent Parsing (Procedural)

A couple of routines from the expression parser:

    Goal( )
        token ← next_token( );
        if (Expr( ) = true & token = EOF)
            then next compilation step;
            else report syntax error;
                 return false;

    Expr( )
        if (Term( ) = false)
            then return false;
            else return EPrime( );

    Factor( )
        if (token = number) then
            token ← next_token( );
            return true;
        else if (token = identifier) then
            token ← next_token( );
            return true;
        else if (token = lparen) then
            token ← next_token( );
            if (Expr( ) = true & token = rparen)
                then token ← next_token( );
                     return true;
            // fall out of if statement
        report syntax error;       // looking for number, identifier, or (, and found
        return false;              // token instead, or failed to find Expr or ) after (

EPrime, Term, and TPrime follow the same basic lines (Figure 3.10, EaC2e).

Recursive Descent (Review from last lecture)

Page 111 in EaC2e sketches a recursive descent parser for the RR CEG: one routine per NT; check each RHS by checking each symbol; includes the ε-productions.

    Main( )                          /* Goal → Expr */
        word ← NextWord( );
        if (Expr( ))
            then if (word = eof) then report success;
                                 else Fail( );

    Fail( )
        report syntax error;
        attempt error recovery or exit;

    Expr( )                          /* Expr → Term Expr' */
        if (Term( ))
            then return EPrime( );
            else Fail( );

    EPrime( )                        /* Expr' → + Term Expr' */
                                     /* Expr' → - Term Expr' */
        if (word = + or word = -) then begin;
            word ← NextWord( );
            if (Term( ))
                then return EPrime( );
                else Fail( );
        end;
        else if (word = ) or word = eof)       /* Expr' → ε */
            then return true;
            else Fail( );

    Term( )                          /* Term → Factor Term' */
        if (Factor( ))
            then return TPrime( );
            else Fail( );

    TPrime( )                        /* Term' → * Factor Term' */
                                     /* Term' → / Factor Term' */
        if (word = * or word = /) then begin;
            word ← NextWord( );
            if (Factor( ))
                then return TPrime( );
                else Fail( );
        end;
        else if (word = + or word = - or word = ) or word = eof)   /* Term' → ε */
            then return true;
            else Fail( );

    Factor( )                        /* Factor → ( Expr ) */
        if (word = ( ) then begin;
            word ← NextWord( );
            if (not Expr( )) then Fail( );
            if (word ≠ )) then Fail( );
            word ← NextWord( );
            return true;
        end;
        else if (word = num or word = name)    /* Factor → num  */
            then begin;                        /* Factor → name */
                word ← NextWord( );
                return true;
            end;
        else Fail( );
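
For readers who want to run the idea, here is a compact Python rendering of the same recursive-descent recognizer. It is a sketch, not the book's code: the token kinds, the regular-expression scanner, and the class layout are assumptions made for this note, and it only recognizes input rather than building a tree.

    import re

    TOKEN = re.compile(r"\s*(?:(\d+)|([A-Za-z_]\w*)|(\S))")

    def tokenize(text):
        """Yield (kind, lexeme): 'num', 'name', single-char operators, then 'eof'."""
        for num, name, other in TOKEN.findall(text):
            if num:
                yield ("num", num)
            elif name:
                yield ("name", name)
            else:
                yield (other, other)             # '+', '-', '*', '/', '(', ')'
        yield ("eof", "")

    class Parser:
        def __init__(self, text):
            self.words = tokenize(text)
            self.word = next(self.words)         # one word of lookahead

        def next_word(self):
            self.word = next(self.words)

        def fail(self, where):
            raise SyntaxError(f"syntax error in {where} at {self.word!r}")

        def goal(self):                          # Goal -> Expr
            self.expr()
            if self.word[0] != "eof":
                self.fail("Goal")
            return True

        def expr(self):                          # Expr -> Term Expr'
            self.term()
            self.eprime()

        def eprime(self):                        # Expr' -> + Term Expr' | - Term Expr' | eps
            if self.word[0] in ("+", "-"):
                self.next_word()
                self.term()
                self.eprime()
            elif self.word[0] not in (")", "eof"):
                self.fail("Expr'")

        def term(self):                          # Term -> Factor Term'
            self.factor()
            self.tprime()

        def tprime(self):                        # Term' -> * Factor Term' | / Factor Term' | eps
            if self.word[0] in ("*", "/"):
                self.next_word()
                self.factor()
                self.tprime()
            elif self.word[0] not in ("+", "-", ")", "eof"):
                self.fail("Term'")

        def factor(self):                        # Factor -> ( Expr ) | num | name
            kind = self.word[0]
            if kind == "(":
                self.next_word()
                self.expr()
                if self.word[0] != ")":
                    self.fail("Factor")
                self.next_word()
            elif kind in ("num", "name"):
                self.next_word()
            else:
                self.fail("Factor")

    print(Parser("x - 2 * y").goal())            # True

Note how each ε-case simply returns when the lookahead word is in the corresponding FOLLOW set, exactly as in the pseudocode above.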

Top-Down Recursive Descent Parser

At this point, you almost have enough information to build a top-down recursive-descent parser:
  - You need a right-recursive grammar that meets the LL(1) condition.
  - You can use left factoring to eliminate common prefixes.
  - You can transform direct left recursion into right recursion.
  - You need a general algorithm to handle indirect left recursion.
  - You still need algorithms to construct FIRST and FOLLOW.

Next parsing lecture: algorithms for FIRST and FOLLOW; an LL(1) skeleton parser, table, and table-construction algorithm.

Next lecture: Local Register Allocation and Lab 2. Read the materials on local allocation from the projects page of the web site.