Context-sensitive Analysis. Copyright 2003, Keith D. Cooper, Ken Kennedy & Linda Torczon, all rights reserved.

Beyond Syntax

There is a level of correctness that is deeper than grammar.

    fie(a,b,c,d)
      int a, b, c, d;
    { ... }

    fee() {
      int f[3], g[0], h, i, j, k;
      char *p;
      fie(h, i, "ab", j, k);
      k = f * i + j;
      h = g[17];
      printf("<%s,%s>.\n", p, q);
      p = 10;
    }

What is wrong with this program? (Let me count the ways ...)
- declared g[0], used g[17]
- wrong number of arguments to fie()
- "ab" is not an int
- wrong dimension on the use of f
- undeclared variable q
- 10 is not a character string

All of these are deeper than syntax. To generate code, we need to understand the program's meaning!

Beyond Syntax

To generate code, the compiler needs to answer many questions:
- Is x a scalar, an array, or a function? Is x declared? Are there names that are not declared? Declared but not used?
- Which declaration of x does each use reference?
- Is the expression x * y + z type-consistent?
- In a[i,j,k], does a have three dimensions?
- Where can z be stored? (register, local, global, heap, static)
- In f ← 15, how should 15 be represented?
- How many arguments does fie() take? What about printf()?
- Does *p reference the result of a malloc()?
- Do p and q refer to the same memory location?
- Is x defined before it is used?

These questions are beyond a CFG.

Beyond Syntax

These questions are part of context-sensitive analysis:
- Answers depend on values, not parts of speech
- Questions and answers involve non-local information
- Answers may involve computation

How can we answer these questions?
- Use formal methods: context-sensitive grammars? attribute grammars? (attributed grammars)
- Use ad-hoc techniques: symbol tables, ad-hoc code (action routines)

In scanning and parsing, formalism won; it is a different story here.

Beyond Syntax

Telling the story:
- The attribute grammar formalism is important: it succinctly makes many points clear and sets the stage for actual, ad-hoc practice.
- The problems with attribute grammars motivate practice: non-local computation and the need for centralized information.
- Some folks still argue for attribute grammars: knowledge is power; information is immunization.

We will cover attribute grammars, then move on to ad-hoc ideas.

Attribute Grammars

What is an attribute grammar?
- A context-free grammar augmented with a set of rules
- Each symbol in the derivation has a set of values, or attributes
- The rules specify how to compute a value for each attribute

Example grammar:

    Number → Sign List
    Sign   → + | -
    List   → List Bit | Bit
    Bit    → 0 | 1

This grammar describes signed binary numbers. We would like to augment it with rules that compute the decimal value of each valid input string.

Examples

[Parse trees for the two running examples: for -1, Number derives Sign and List, with Sign deriving - and List deriving a single Bit deriving 1; for -101, the List expands to List Bit at each level and the Bits derive 1, 0, and 1.]

We will use these two examples throughout the lecture.

Attribute Grammars

Add rules to compute the decimal value of a signed binary number:

    Number → Sign List      List.pos ← 0
                            if Sign.neg
                              then Number.val ← - List.val
                              else Number.val ← List.val
    Sign → +                Sign.neg ← false
    Sign → -                Sign.neg ← true
    List → Bit              Bit.pos ← List.pos
                            List.val ← Bit.val
    List0 → List1 Bit       List1.pos ← List0.pos + 1
                            Bit.pos ← List0.pos
                            List0.val ← List1.val + Bit.val
    Bit → 0                 Bit.val ← 0
    Bit → 1                 Bit.val ← 2^Bit.pos

Attributes by symbol: Number has val; Sign has neg; List has pos and val; Bit has pos and val.
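
The following Java sketch (not from the slides; class and method names are illustrative) shows one way to encode these rules directly over a parse tree for the signed-binary-number grammar. The pos attribute is inherited (passed down), while neg and val are synthesized (computed bottom-up).

    // A minimal sketch of evaluating the signed-binary-number attribute grammar
    // over an explicit parse tree. pos is inherited; val and neg are synthesized.
    class BitNode {                        // Bit -> 0 | 1
        int pos;                           // inherited attribute
        final char bit;
        BitNode(char bit) { this.bit = bit; }
        long val() { return bit == '0' ? 0 : 1L << pos; }  // Bit.val <- 0 or 2^Bit.pos
    }

    abstract class ListNode {
        int pos;                           // inherited attribute
        abstract long val();               // synthesized attribute
    }

    class SingleList extends ListNode {    // List -> Bit
        final BitNode bit;
        SingleList(BitNode bit) { this.bit = bit; }
        long val() { bit.pos = pos; return bit.val(); }     // Bit.pos <- List.pos
    }

    class PairList extends ListNode {      // List0 -> List1 Bit
        final ListNode rest; final BitNode last;
        PairList(ListNode rest, BitNode last) { this.rest = rest; this.last = last; }
        long val() {
            rest.pos = pos + 1;            // List1.pos <- List0.pos + 1
            last.pos = pos;                // Bit.pos   <- List0.pos
            return rest.val() + last.val();
        }
    }

    class NumberNode {                     // Number -> Sign List
        final boolean neg; final ListNode list;   // neg plays the role of Sign.neg
        NumberNode(boolean neg, ListNode list) { this.neg = neg; this.list = list; }
        long val() {
            list.pos = 0;                  // List.pos <- 0
            long v = list.val();
            return neg ? -v : v;           // Number.val <- (-)List.val
        }
    }

    class SignedBinaryDemo {
        public static void main(String[] args) {
            // Hand-built parse tree for the running example -101
            ListNode list = new PairList(new PairList(new SingleList(new BitNode('1')),
                                                      new BitNode('0')),
                                         new BitNode('1'));
            System.out.println(new NumberNode(true, list).val());  // prints -5
        }
    }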

Back to the Examples

Rules plus a parse tree imply an attribute dependence graph. For -1:

    Number.val ← - List.val
    Sign.neg ← true
    List.pos ← 0
    List.val ← Bit.val
    Bit.pos ← List.pos
    Bit.val ← 2^Bit.pos

One possible evaluation order:
1. List.pos
2. Sign.neg
3. Bit.pos
4. Bit.val
5. List.val
6. Number.val

Other orders are possible. Knuth suggested a data-flow model for evaluation: independent attributes first, then the others in order as their input values become available. The evaluation order must be consistent with the attribute dependence graph.

Back to the Examples

[Fully attributed parse tree for -101: Sign has neg: true; the List nodes carry pos 0, 1, 2 and val 5, 4, 4; the Bit nodes carry pos 0, 1, 2 and val 1, 0, 4; Number.val is -5.]

Back to the Examples

[Attribute dependence graph drawn over the attributed parse tree for -101.]

This is the complete attribute dependence graph for -101. It shows the flow of all attribute values in the example. Some values flow downward (inherited attributes); some flow upward (synthesized attributes). A rule may use attributes in the parent, children, or siblings of a node.

The Rules of the Game

- Attributes are associated with nodes in the parse tree
- Rules are value assignments associated with productions
- Each attribute is defined once, using local information
- Identical terms in a production are labeled for uniqueness (e.g., List0 and List1)
- The rules and the parse tree define an attribute dependence graph, which must be non-circular

This produces a high-level, functional specification.

Synthesized attribute: depends on values from children.
Inherited attribute: depends on values from siblings and parent.

Using Attribute Grammars

Attribute grammars can specify context-sensitive actions:
- Take values from the syntax
- Perform computations with values
- Insert tests, logic, ...

Synthesized attributes:
- Use values from children and from constants
- S-attributed grammars evaluate in a single bottom-up pass
- Good match to LR parsing

Inherited attributes:
- Use values from the parent, constants, and siblings
- Directly express context (can rewrite to avoid them)
- Thought to be more natural
- Not easily evaluated at parse time

We want to use both kinds of attributes.

Evaluation Methods

Dynamic, dependence-based methods (see the sketch below):
- Build the parse tree
- Build the dependence graph
- Topologically sort the dependence graph
- Define attributes in topological order

Rule-based methods (treewalk):
- Analyze the rules at compiler-generation time
- Determine a fixed (static) ordering
- Evaluate nodes in that order

Oblivious methods (passes, dataflow):
- Ignore the rules and the parse tree
- Pick a convenient order (at design time) and use it
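
As a rough illustration of the dynamic, dependence-based approach, here is a Java sketch (not from the slides; the Attr class and its fields are assumptions made for the example). It topologically sorts the attribute instances by their dependences, evaluates each rule as its inputs become available, and reports circularity if the sort cannot complete.

    import java.util.*;
    import java.util.function.Supplier;

    // Sketch of dynamic, dependence-based attribute evaluation: each attribute
    // instance records the attributes it depends on and a rule (Supplier) that
    // computes its value once those inputs are available.
    class Attr {
        final String name;
        final List<Attr> dependsOn = new ArrayList<>();
        Supplier<Long> rule;          // the semantic rule for this attribute
        Long value;                   // filled in during evaluation
        Attr(String name) { this.name = name; }
    }

    class DependenceEvaluator {
        // Topologically sort the attribute instances, then evaluate in that order.
        static void evaluate(List<Attr> attrs) {
            Map<Attr, Integer> pending = new HashMap<>();   // unevaluated dependences
            Map<Attr, List<Attr>> users = new HashMap<>();  // reverse edges
            Deque<Attr> ready = new ArrayDeque<>();
            for (Attr a : attrs) {
                pending.put(a, a.dependsOn.size());
                for (Attr d : a.dependsOn)
                    users.computeIfAbsent(d, k -> new ArrayList<>()).add(a);
                if (a.dependsOn.isEmpty()) ready.add(a);    // independent attributes first
            }
            int done = 0;
            while (!ready.isEmpty()) {
                Attr a = ready.remove();
                a.value = a.rule.get();                     // inputs are already available
                done++;
                for (Attr u : users.getOrDefault(a, List.of()))
                    if (pending.merge(u, -1, Integer::sum) == 0) ready.add(u);
            }
            if (done != attrs.size())                       // leftovers mean a circular instance
                throw new IllegalStateException("circular attribute dependence graph");
        }

        public static void main(String[] args) {
            // Dependence graph for the -1 example:
            // List.pos, Sign.neg, Bit.pos, Bit.val, List.val, Number.val
            Attr listPos = new Attr("List.pos"), signNeg = new Attr("Sign.neg");
            Attr bitPos = new Attr("Bit.pos"), bitVal = new Attr("Bit.val");
            Attr listVal = new Attr("List.val"), numVal = new Attr("Number.val");
            listPos.rule = () -> 0L;
            signNeg.rule = () -> 1L;                        // true, encoded as 1
            bitPos.dependsOn.add(listPos);  bitPos.rule = () -> listPos.value;
            bitVal.dependsOn.add(bitPos);   bitVal.rule = () -> 1L << bitPos.value;
            listVal.dependsOn.add(bitVal);  listVal.rule = () -> bitVal.value;
            numVal.dependsOn.addAll(List.of(signNeg, listVal));
            numVal.rule = () -> signNeg.value == 1 ? -listVal.value : listVal.value;
            evaluate(List.of(listPos, signNeg, bitPos, bitVal, listVal, numVal));
            System.out.println(numVal.name + " = " + numVal.value);   // Number.val = -1
        }
    }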

Back to the Example

[Parse tree for -101, before any attributes are evaluated.]

Back to the Example

[The parse tree for -101 with empty attribute slots: neg on Sign, pos and val on each List and Bit node, val on Number.]

Back to the Example

Inherited attributes: [the pos values are filled in top-down; the List nodes get pos 0, 1, and 2, and each Bit node copies the pos of its parent List.]

Back to the Example

Synthesized attributes: [Sign.neg is true, and the Bit values are computed bottom-up: 4, 0, and 1.]

Back to the Example

Synthesized attributes: [the List values 4, 4, and 5 propagate upward, and Number.val becomes -5.]

Back to the Example

If we show the computation and then peel away the parse tree ...

[The same annotated tree for -101, with the parse tree fading out to leave only the attribute values and the edges between them.]

Back to the Example

[All that is left is the attribute dependence graph for -101.]

This succinctly represents the flow of values in the problem instance. The dynamic methods sort this graph to find independent values, then work along the graph edges. The rule-based methods try to discover good orders by analyzing the rules. The oblivious methods ignore the structure of this graph. The dependence graph must be acyclic.

Circularity

We can only evaluate acyclic instances.
- We can prove that some grammars can only generate instances with acyclic dependence graphs.
- The largest such class is the strongly non-circular grammars (SNC).
- SNC grammars can be tested in polynomial time.
- Failing the SNC test is not conclusive.

Many evaluation methods discover circularity dynamically, which is a bad property for a compiler to have. SNC grammars were first defined by Kennedy & Warren.

The Realist's Alternative

Ad-hoc syntax-directed translation:
- Associate a snippet of code with each production
- At each reduction, the corresponding snippet runs
- Allowing arbitrary code provides complete flexibility, including the ability to do tasteless and bad things

To make this work:
- We need names for the attributes of each symbol on the lhs and rhs. Typically, one attribute is passed through the parser, plus arbitrary code (structures, globals, statics, ...). Yacc introduced $$, $1, $2, ..., $n, numbered left to right.
- We need an evaluation scheme. This fits nicely into the LR(1) parsing algorithm. (A sketch of the value-stack idea follows.)
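
To make the $$ / $1 / $2 convention concrete, here is a rough Java sketch (illustrative only; not from the slides or any particular parser generator) of a value stack maintained alongside the parser's state stack. On each reduction, the action pops one value per rhs symbol and pushes the result as $$.

    import java.util.ArrayDeque;
    import java.util.Deque;

    // Sketch: the value stack an LR driver keeps in parallel with its state stack.
    // rhs[0]..rhs[n-1] play the roles of $1..$n; the pushed result is $$.
    class ValueStack {
        private final Deque<Object> values = new ArrayDeque<>();

        void shift(Object tokenValue) {            // on shift: push the token's value
            values.push(tokenValue);
        }

        // On reduce A -> X1 ... Xn: pop n values, run the action, push $$.
        void reduce(int rhsLength, Action action) {
            Object[] rhs = new Object[rhsLength];
            for (int i = rhsLength - 1; i >= 0; i--) rhs[i] = values.pop();
            values.push(action.run(rhs));          // $$
        }

        Object result() { return values.peek(); }

        interface Action { Object run(Object[] rhs); }
    }

    class SdtDemo {
        public static void main(String[] args) {
            // Simulate the reductions for "3 + 4" with a rule Expr -> Expr '+' Term:
            ValueStack vs = new ValueStack();
            vs.shift(3);                           // value of the left operand
            vs.shift("+");
            vs.shift(4);                           // value of the right operand
            // Action for Expr -> Expr '+' Term:  $$ = $1 + $3
            vs.reduce(3, rhs -> (Integer) rhs[0] + (Integer) rhs[2]);
            System.out.println(vs.result());       // prints 7
        }
    }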

Reality

Most parsers are based on this ad-hoc style of context-sensitive analysis.

Advantages:
- Addresses the shortcomings of the AG paradigm
- Efficient, flexible

Disadvantages:
- Must write the code with little assistance
- The programmer deals directly with the details

Most parser generators support a yacc-like notation.

Typical Uses

Building a symbol table:
- Enter declaration information as it is processed
- At the end of the declaration syntax, do some post-processing
- Use the table to check errors as parsing progresses

Simple error checking / type checking (assumes the table is global):
- Define before use: look up on each reference
- Dimension, type, ...: check as encountered
- Type conformability of an expression: bottom-up walk
- Procedure interfaces are harder: build a representation for the parameter list and types, create a list of call sites to check, then check offline or handle the cases for arbitrary orderings

(A small symbol-table sketch follows.)
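
A minimal sketch of the symbol-table idea, assuming a single global table and a define-before-use rule. The class and method names here are illustrative, not taken from the slides.

    import java.util.HashMap;
    import java.util.Map;

    // Sketch: a global symbol table driven by parser actions.
    // declare() runs when a declaration is reduced; lookup() runs on each use.
    class SymbolTable {
        record Symbol(String name, String type, int dimensions) {}

        private final Map<String, Symbol> table = new HashMap<>();

        void declare(String name, String type, int dimensions) {
            if (table.containsKey(name))
                System.err.println("error: '" + name + "' redeclared");
            else
                table.put(name, new Symbol(name, type, dimensions));
        }

        // Define-before-use check: every reference must find a prior declaration.
        Symbol lookup(String name) {
            Symbol s = table.get(name);
            if (s == null)
                System.err.println("error: '" + name + "' used before declaration");
            return s;
        }
    }

    class SymbolTableDemo {
        public static void main(String[] args) {
            SymbolTable st = new SymbolTable();
            st.declare("g", "int", 1);   // e.g., int g[0];
            st.lookup("g");              // fine
            st.lookup("q");              // error: 'q' used before declaration
        }
    }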

Is This Really Ad-hoc?

Relationship between practice and attribute grammars.

Similarities:
- Both rules and actions are associated with productions
- Application order is determined by the tools, not the author
- (Somewhat) abstract names for symbols

Differences:
- Actions are applied as a unit; that is not true for AG (attribute grammar) rules
- Anything goes in ad-hoc actions; AG rules are functional
- AG rules are higher level than ad-hoc actions

Limitations

Forced to evaluate in a given order: postorder.
- Left to right only
- Bottom up only

Implications:
- Declarations before uses
- Context information cannot be passed down. How do you know what rule you were called from within? Example: we cannot pass the bit position down from the right.
- Could you use globals? In this case we could only count the position from the left, which is not much help (and it requires initialization).

Alternative Strategy: Visitor

- Build an abstract syntax tree
- Use tree-walk routines
- Use the visitor design pattern to add functionality

[Class diagram of an ANTLR4-generated visitor hierarchy (antlr4 visitor, Hello.g4): ParseTreeVisitor<T> declares methods such as T visitExp(...) and T visitFactor(...); TypeCheckVisitor<T> and AnalysisVisitor<T> override those same visit methods.]

AST Builder Visitor

    class AstBuilderVisitor extends HelloBaseVisitor<SCNode> {
        @Override
        public SCNode visitExp(HelloParser.ExpContext ec) {
            if (ec.lexp().size() == 2) {
                SCNode lhs = ec.lexp(0).accept(this);   // "this" refers to the current visitor!
                SCNode rhs = ec.lexp(1).accept(this);
                String op = ec.op.getText();
                return new SCExp(lhs, rhs, op);
            }
            ...
        }
        ...
    }

    class SCExp extends SCNode {
        SCLexp lhs;
        SCLexp rhs;
        String op;
    }

Grammar:

    exp : lexp op='>' lexp
        | lexp op='<' lexp ;
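
For completeness, a hedged sketch of how such a visitor is typically driven. It assumes ANTLR4-generated HelloLexer and HelloParser classes for Hello.g4; the names follow ANTLR's generation conventions but do not appear on the slides.

    import org.antlr.v4.runtime.CharStreams;
    import org.antlr.v4.runtime.CommonTokenStream;

    class AstBuilderDriver {
        public static void main(String[] args) {
            // Lex and parse the input, then walk the resulting parse tree with
            // the visitor to build the abstract syntax tree.
            HelloLexer lexer = new HelloLexer(CharStreams.fromString("a < b"));
            HelloParser parser = new HelloParser(new CommonTokenStream(lexer));
            SCNode ast = new AstBuilderVisitor().visit(parser.exp());
            System.out.println(ast);
        }
    }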

Summary: Strategies for Context-Sensitive Analysis

Attribute grammars
- Pros: formal, powerful, can deal with propagation strategies
- Cons: too many copy rules, no global tables, works on the parse tree

Postorder code execution
- Pros: simple and functional, can be specified in the grammar (Yacc), does not require the parse tree
- Cons: rigid evaluation order, no context inheritance

Generalized tree walk
- Pros: full power and generality, operates on the abstract syntax tree (using the Visitor pattern)
- Cons: requires specific code for each tree node type, more complicated