CS5363 Final Review. cs5363 1



Programming language implementation
- Programming languages: tools for describing data and algorithms
  - Instruct machines what to do
  - Communicate between computers and programmers
  - Different programming languages: FORTRAN, Pascal, C, C++, Java, Lisp, Scheme, ML, ...
- Compilers/translators
  - Translate programming languages to machine languages
  - Translate one programming language to another
- Interpreters
  - Interpret the meaning of programs and perform the operations accordingly

Objectives of compilers
- Fundamental principles
  - A compiler shall preserve the meaning of the input program --- it must be correct; translation should not alter the original meaning
  - A compiler shall do something of value, e.g., optimize the performance of the input application
- Overall structure: source program -> front end -> IR -> optimizer (mid end) -> IR -> back end -> target program

Front end
- Source program: for (w = 1; w < 100; w = w * 2);
- Input: a stream of characters: f, o, r, (, w, =, 1, ;, w, <, 1, 0, 0, ...
- Scanning --- convert the input to a stream of words (tokens): for ( w = 1 ; w < 100 ; ...
- Parsing --- discover the syntax/structure of sentences:
    forstmt: for ( expr1 ; expr2 ; expr3 ) stmt
    expr1: localvar(w) = integer(1)
    expr2: localvar(w) < integer(100)
    expr3: localvar(w) = expr4
    expr4: localvar(w) * integer(2)
    stmt: ;

Lexical analysis/scanning
- Called by the parser each time a new token is needed
- Each token has a type and an optional value
- Regular expression: compact description of the composition of tokens
  - Alphabet Σ: the set of characters that make up tokens
  - A regular expression over Σ can be the empty string, a symbol s, or (α), αβ, α|β, or α*, where α and β are regular expressions
- Finite automata
  - Include an alphabet Σ, a set of states S (with a start state s0 and a set of final states F), and a transition function δ
  - DFA: δ: S × Σ -> S;  NFA: δ: S × Σ -> power(S)
- Regular expressions and finite automata: describing and recognizing an input language
  - From RE to NFA to DFA
  - Examples: comments, identifiers, integers, floating point numbers, ...
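As a concrete illustration of the DFA side of this, here is a minimal sketch (names and the helper structure are mine, not from the slides) of simulating a DFA for the token class of unsigned integers, i.e., the regular expression d d* where d is any digit:

```python
# Sketch: a DFA recognizing unsigned integers (d d*).
# State 0 = start (nothing read yet); state 1 = accepting (one or more digits).
def make_integer_dfa():
    start, accept = 0, 1
    def delta(state, ch):
        if ch.isdigit():
            return accept      # any digit moves to / stays in the accepting state
        return None            # no transition defined: the input is rejected
    return start, {accept}, delta

def accepts(word):
    start, finals, delta = make_integer_dfa()
    state = start
    for ch in word:
        state = delta(state, ch)
        if state is None:
            return False
    return state in finals

print(accepts("100"))   # True
print(accepts("1a0"))   # False
print(accepts(""))      # False: at least one digit is required
```

A real scanner would run many such automata merged into one DFA and return the longest match as the next token.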

Context-free grammar
- Describes how to recursively compose programs/sentences from tokens: loops, statements, expressions, declarations, ...
- A context-free grammar is a tuple (T, NT, S, P)
- BNF: each production has the format A ::= B (or A -> B), where A is a single non-terminal and B is a sequence of terminals and non-terminals
- Using a CFG to describe regular expressions:
    n ::= d n | d
    d ::= 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9
- Given a CFG G = (T, NT, P, S), a sentence s belongs to L(G) if there is a derivation from S to s
  - Derivation: top-down replacement of non-terminals; each replacement follows a production rule
  - Left-most vs. right-most derivations
  - Example: derivations for 5 + 15 * 20
      e => e * e => e + e * e => 5 + e * e => 5 + 15 * e => 5 + 15 * 20
      e => e + e => 5 + e => 5 + e * e => 5 + 15 * e => 5 + 15 * 20
- Writing grammars for languages, e.g., the set of balanced parentheses
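For the balanced-parentheses example mentioned above, a minimal recognizer sketch (the function names are mine) for the grammar B ::= ( B ) B | ε shows how a CFG turns directly into recursive code:

```python
# Sketch: recursive-descent recognizer for  B ::= ( B ) B | epsilon
def parse_B(s, i=0):
    """Consume one B starting at position i; return the index just past it."""
    if i < len(s) and s[i] == '(':
        i = parse_B(s, i + 1)            # the inner B
        if i >= len(s) or s[i] != ')':
            raise SyntaxError("expected ')'")
        return parse_B(s, i + 1)         # the trailing B
    return i                             # epsilon: consume nothing

def balanced(s):
    try:
        return parse_B(s) == len(s)      # B must consume the entire input
    except SyntaxError:
        return False

print(balanced("(()())"))   # True
print(balanced("(()"))      # False
```

Each production alternative becomes a branch, and each non-terminal on the right-hand side becomes a recursive call, which is exactly the recursive-descent idea revisited on the next slide.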

Parse trees and abstract syntax trees
- Parse tree: graphical representation of a derivation
  - Parent: left-hand side of a production; children: right-hand side of the production
- A grammar is syntactically ambiguous if some program has multiple parse trees
  - Rewriting an ambiguous grammar: identify the source of ambiguity, restrict the applicability of some productions
  - Standard rewrite for defining associativity and precedence of operators
- Abstract syntax tree (AST): condensed form of the parse tree
  - Operators and keywords do not appear as leaves
  - Chains of single productions may be collapsed
- Example for 5 + 15 * 20:
    Parse tree:                 Abstract syntax tree:
         e                              +
       / | \                           / \
      e  +  e                         5   *
      |   / | \                          / \
      5  e  *  e                       15   20
         |     |
        15     20
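A minimal sketch (the node classes are illustrative, not from the slides) of the AST above as code: operators become interior nodes, literals become leaves, and the higher-precedence * sits deeper in the tree than +:

```python
# Sketch: AST nodes for the expression 5 + 15 * 20.
class Num:
    def __init__(self, v): self.v = v
    def eval(self): return self.v

class BinOp:
    def __init__(self, op, left, right):
        self.op, self.left, self.right = op, left, right
    def eval(self):
        l, r = self.left.eval(), self.right.eval()
        return l + r if self.op == '+' else l * r

# + is the root; the higher-precedence * is its right child
ast = BinOp('+', Num(5), BinOp('*', Num(15), Num(20)))
print(ast.eval())   # 305
```

Evaluating the tree bottom-up respects precedence automatically, which is why later phases (type checking, code generation) work on the AST rather than the parse tree.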

Top-down and bottom-up parsing
- Top-down parsing: start from the starting non-terminal; try to find a left-most derivation
  - Recursive descent parsing and LL(k) predictive parsers
  - Grammar transformations: eliminating left-recursion and left-factoring
  - Building LL(1) parsers: compute First for each production and Follow for each non-terminal
- Bottom-up parsing: start from the input string; try to reduce the input string to the starting non-terminal
  - Equivalent to the reverse of a right-most derivation
  - Right-sentential forms and their handles
  - Shift-reduce parsing and LR(k) parsers
  - The meaning of LR(1) items; building the DFA for handle pruning; the canonical LR(1) collection
  - How to build an LR(1) parse table and how to interpret it
- Top-down vs. bottom-up parsers: which is better?
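The First-set computation mentioned above is an iterative fixed-point algorithm. A minimal sketch (the example grammar E -> T E', E' -> + T E' | ε, T -> id is mine):

```python
# Sketch: computing First sets for LL(1) construction.
# Keys of `grammar` are non-terminals; everything else is a terminal.
EPS = 'eps'
grammar = {
    'E':  [['T', "E'"]],
    "E'": [['+', 'T', "E'"], [EPS]],
    'T':  [['id']],
}

def first_sets(grammar):
    first = {nt: set() for nt in grammar}
    changed = True
    while changed:                                 # iterate to a fixed point
        changed = False
        for nt, prods in grammar.items():
            for prod in prods:
                for sym in prod:
                    if sym not in grammar:         # terminal (or EPS): add and stop
                        if sym not in first[nt]:
                            first[nt].add(sym); changed = True
                        break
                    before = len(first[nt])
                    first[nt] |= first[sym] - {EPS}
                    if len(first[nt]) != before:
                        changed = True
                    if EPS not in first[sym]:      # sym cannot vanish: stop here
                        break
                else:                              # every symbol can derive EPS
                    if EPS not in first[nt]:
                        first[nt].add(EPS); changed = True
    return first

print(first_sets(grammar))
# First(E) = {id}, First(E') = {+, eps}, First(T) = {id}
```

Follow sets are computed by a similar fixed-point pass; together they fill in the LL(1) parse table.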

Intermediate representation
- Source program: for (w = 1; w < 100; w = w * 2);
- Parsing --- convert the input tokens to IR
- Abstract syntax tree --- structure of the program:
    forstmt
    ├─ assign: Lv(w) = int(1)
    ├─ less:   Lv(w) < int(100)
    ├─ assign: Lv(w) = mult(Lv(w), int(2))
    └─ emptystmt
- Context-sensitive analysis --- the surrounding environment
  - Symbol table: information about symbols, e.g., w: local variable, has type int, allocated to a register
  - At least one symbol table for each scope

Context-sensitive analysis
- Attribute grammar (syntax-directed definition)
  - Associate a collection of attributes with each grammar symbol
  - Define actions to evaluate attribute values during parsing
  - Synthesized and inherited attributes
- Dependences in attribute evaluation
  - Annotated parse tree and attribute dependence graph
  - Bottom-up parsing and L-attributed evaluation
  - Translation scheme: define attribute evaluation within the parsing of grammar symbols
- Type checking
  - Basic types and compound types
  - Types of variables and expressions
  - Type environment (symbol table)
  - Type system, type checking, and type conversion
  - Compile-time vs. runtime type checking
  - Type checking and type inference
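A minimal sketch (the tuple encoding of the AST and all names are illustrative) of compile-time type checking with a type environment: the type of each expression is synthesized bottom-up, and variables are looked up in the environment:

```python
# Sketch: type checking expressions against a type environment (symbol table).
def type_of(expr, env):
    kind = expr[0]
    if kind == 'int':                       # ('int', 5) -- integer literal
        return 'int'
    if kind == 'var':                       # ('var', 'w') -- look up the symbol table
        return env[expr[1]]
    if kind == '+':                         # ('+', e1, e2) -- both operands must be int
        if type_of(expr[1], env) == type_of(expr[2], env) == 'int':
            return 'int'
        raise TypeError("operands of + must both be int")
    if kind == '<':                         # comparisons yield bool
        if type_of(expr[1], env) == type_of(expr[2], env) == 'int':
            return 'bool'
        raise TypeError("operands of < must both be int")
    raise ValueError("unknown expression kind")

env = {'w': 'int'}
print(type_of(('<', ('var', 'w'), ('int', 100)), env))   # bool
```

The same traversal structure extends to compound types, implicit conversions, and type inference by changing what each case returns.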

Variation of IR
- IR: intermediate language between source and target
  - Source-level IR vs. machine-level IR
  - Graphical IR vs. linear IR
- Mapping names/storage to variables
- Translating from the source language to IR --- syntax-directed translation
- IRs for the purpose of program analysis:
  - Control-flow graph
  - Dependence graph
  - Static single assignment (SSA)

Execution model of programs
- Procedural abstraction: scope and storage management
  - Nested blocks and namespaces
  - Scoping rules: static/lexical vs. dynamic scoping
  - Local vs. global variables
  - Parameter passing: pass-by-value vs. pass-by-reference
- Activation records for blocks and functions: what are the necessary fields?
- The simplified memory model
  - Runtime stack, heap, and code space
  - Program pointer and activation record pointer
- Allocating activation records on the stack: how to set up the activation record?
- Allocating variables in memory
  - Base address and offset; local vs. static/global variables
  - Coordinates of variables: nesting level of the variable's scope
  - Access links and the global display

Mid end --- improving the code
- Original code:
    int j = 0, k;
    while (j < 500) { j = j + 1; k = j * 8; a[k] = 0; }
- Improved code:
    int k = 0;
    while (k < 4000) { k = k + 8; a[k] = 0; }
- Program analysis --- recognize optimization opportunities
  - Data-flow analysis: where data are defined and used
  - Dependence analysis: when operations can be reordered
- Transformations --- improve target program speed or space
  - Redundancy elimination
  - Improve data movement and instruction parallelization

Data-flow analysis
- Program analysis: statically examines the input computation to ensure safety and profitability of optimizations
- Data-flow analysis: reason about the flow of values on the control-flow graph
  - Forward vs. backward flow problems
  - Define the domain of analysis; build the control-flow graph
  - Define a set of data-flow equations at each basic block
  - Evaluate the local data-flow sets at each basic block
  - Iteratively modify the result at each basic block until reaching a fixed point
  - Traversal order of basic blocks: (reverse) postorder
  - Examples: available expression analysis, live variable analysis, reaching definition analysis, dominator analysis
- SSA (static single assignment)
  - Two rules that must be satisfied
  - Insertion of φ functions; rewriting from SSA back to normal code
  - Computing dominance relations and dominance frontiers
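As a worked instance of the iterative scheme above, here is a minimal sketch (the three-block CFG and its Use/Def sets are illustrative) of live-variable analysis, a backward problem with equations LiveOut(B) = ∪ LiveIn(S) over successors S, and LiveIn(B) = Use(B) ∪ (LiveOut(B) - Def(B)):

```python
# Sketch: iterative live-variable analysis on a small CFG.
# B1: i := 0; s := 0        B2 (loop): s := s + i; i := i + 1; if i < n goto B2
# B3: return s
cfg_succ = {'B1': ['B2'], 'B2': ['B2', 'B3'], 'B3': []}
use  = {'B1': set(), 'B2': {'i', 'n', 's'}, 'B3': {'s'}}   # read before written
defs = {'B1': {'i', 's'}, 'B2': {'i', 's'}, 'B3': set()}

def live_variables(succ, use, defs):
    live_in  = {b: set() for b in succ}
    live_out = {b: set() for b in succ}
    changed = True
    while changed:                           # iterate until a fixed point
        changed = False
        for b in succ:
            out = set().union(*(live_in[s] for s in succ[b])) if succ[b] else set()
            inn = use[b] | (out - defs[b])
            if inn != live_in[b] or out != live_out[b]:
                live_in[b], live_out[b], changed = inn, out, True
    return live_in, live_out

live_in, live_out = live_variables(cfg_succ, use, defs)
print(live_in['B1'])    # {'n'}: only n is live on entry; i and s are defined first
```

Visiting blocks in reverse postorder of the reversed CFG would converge in fewer passes, but any order reaches the same fixed point.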

Scope of optimization
- Local methods: applicable only to single basic blocks
- Superlocal methods: operate on extended basic blocks (EBBs) B1, B2, B3, ..., Bm, where Bi is the single predecessor of B(i+1)
- Regional methods: operate beyond EBBs, e.g., on loops and conditionals
- Global (intraprocedural) methods: operate on an entire procedure (subroutine)
- Whole-program (interprocedural) methods: operate on the entire program
- Example (s0 and s1 form an EBB):
        i := 0
    s0: if i < 50 goto s1
        goto s2
    s1: t1 := b * 2
        a := a + t1
        goto s0
    s2: ...

Program optimizations
- Redundant expression elimination
- Value numbering
  - Simulate runtime evaluation of the instruction sequence
  - Use an integer number to uniquely identify each runtime value
  - Map each expression to a value number
  - Scope of optimization: local, EBB, dominator-based
- Global redundancy elimination
  - Find the available expressions at the entry of each basic block
  - Remove expressions that are redundant
  - Renaming of variables changes the availability of expressions
- Dead code elimination
  - Mark instructions that are necessary to the evaluation of the program; remove expressions whose results are never used
  - Computing control dependence among basic blocks
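A minimal sketch (the instruction encoding is illustrative) of local value numbering: each runtime value gets an integer number, and an expression whose (operator, operand-number) key has already been seen is redundant:

```python
# Sketch: local value numbering over a basic block of three-address
# instructions (dst, op, src1, src2). Returns the redundant destinations.
def value_number(block):
    vn = {}          # variable/constant -> value number
    table = {}       # (op, vn1, vn2) -> value number of the result
    counter = 0
    redundant = []
    def number_of(x):
        nonlocal counter
        if x not in vn:
            vn[x] = counter
            counter += 1
        return vn[x]
    for dst, op, a, b in block:
        key = (op, number_of(a), number_of(b))
        if key in table:
            redundant.append(dst)     # same op on the same values: redundant
            vn[dst] = table[key]
        else:
            table[key] = counter      # a genuinely new runtime value
            vn[dst] = counter
            counter += 1
    return redundant

block = [('t1', '+', 'a', 'b'),
         ('t2', '+', 'a', 'b'),       # recomputes a + b: redundant
         ('t3', '*', 't1', 'c')]
print(value_number(block))   # ['t2']
```

Note that keying on value numbers rather than names is what lets this survive reassignments; extending the table's lifetime along EBBs or the dominator tree gives the superlocal and dominator-based variants.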

Back end --- code generation
- Memory management
  - Every variable must be allocated a memory location
  - Addresses are stored in symbol tables during translation
- Instruction selection
  - Assembly language of the target machine
  - Abstract assembly (three/two-address code)
- Register allocation
  - Most instructions must operate on registers
  - Values in registers are faster to access
- Instruction scheduling
  - Reorder instructions to enhance parallelism/pipelining in processors

Example of code generation
- Code for w <- w * 2 * x * y * z in ILOC:
    loadai  rarp, @w => rw     // load w
    loadi   2        => r2     // constant 2 into r2
    loadai  rarp, @x => rx     // load x
    loadai  rarp, @y => ry     // load y
    loadai  rarp, @z => rz     // load z
    mult    rw, r2   => rw     // rw <- w * 2
    mult    rw, rx   => rw     // rw <- w * 2 * x
    mult    rw, ry   => rw     // rw <- w * 2 * x * y
    mult    rw, rz   => rw     // rw <- w * 2 * x * y * z
    storeai rw => rarp, @w     // write rw back to w
- ILOC: Intermediate Language for an Optimizing Compiler, similar to the assembly language of a simple RISC machine

Machine code generation
- Assigning storage: register or memory
  - Every expression e must have a type that determines the size/meaning of its value, and a location to store its value (e.place)
  - A variable may require permanent storage: non-local variables or variables that might be aliased
- Translating to three-address code
  - Different code shapes may have different efficiency
- Translating expressions
  - Mixed-type expressions --- implicit type conversion
  - Arithmetic vs. boolean expressions; short-circuit translation
  - Translating variable accesses, arrays, and function calls
- Translating control-flow statements
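A minimal sketch (the tuple AST and label names are illustrative) of short-circuit translation: a boolean expression is compiled to jumping code with a true target and a false target, so in `x and y` the second operand is only evaluated when the first is true:

```python
# Sketch: short-circuit translation of boolean expressions to jumping code.
counter = 0
def new_label():
    global counter
    counter += 1
    return f"L{counter}"

def translate_bool(expr, true_lbl, false_lbl, code):
    """Emit code such that control reaches true_lbl iff expr is true."""
    kind = expr[0]
    if kind == 'lt':                       # ('lt', 'a', 'b')
        code.append(f"if {expr[1]} < {expr[2]} goto {true_lbl}")
        code.append(f"goto {false_lbl}")
    elif kind == 'and':                    # ('and', e1, e2)
        mid = new_label()
        translate_bool(expr[1], mid, false_lbl, code)   # e1 false: skip e2 entirely
        code.append(f"{mid}:")
        translate_bool(expr[2], true_lbl, false_lbl, code)

code = []
translate_bool(('and', ('lt', 'a', 'b'), ('lt', 'c', 'd')), 'Ltrue', 'Lfalse', code)
print('\n'.join(code))
```

An `or` case is symmetric (the first operand's false target becomes the label before the second operand), and `not` simply swaps the two targets.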

Register allocation and assignment
- Values in registers are easier and faster to access than memory
  - Reserve a few registers for memory access; efficiently utilize the rest of the general-purpose registers
- Register allocation: at each program point, select a set of values to reside in registers
- Register assignment: pick a specific register for each value, subject to hardware constraints
- Register-to-register vs. memory-to-memory model
- Local register allocation: top-down vs. bottom-up
- Graph-coloring based register allocation
  - Construct global live ranges
  - Build the interference graph
  - Coalesce live ranges to eliminate register copying
  - Rank all live ranges based on spilling cost
  - Color the interference graph
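To make the coloring step concrete, here is a minimal greedy sketch (simpler than the simplify-and-spill coloring scheme the slides refer to; the graph is illustrative): live ranges that interfere must receive different registers, and a range with no free color must be spilled:

```python
# Sketch: greedy coloring of an interference graph with k registers.
def color(interference, k):
    """interference: live range -> set of interfering live ranges.
    Returns a register assignment, or None if some range must be spilled."""
    assignment = {}
    # Color the most constrained (highest-degree) live ranges first.
    for lr in sorted(interference, key=lambda v: -len(interference[v])):
        taken = {assignment[n] for n in interference[lr] if n in assignment}
        free = [r for r in range(k) if r not in taken]
        if not free:
            return None        # no register left: spill this live range
        assignment[lr] = free[0]
    return assignment

# a interferes with b and c; b and c never live simultaneously, so they share.
graph = {'a': {'b', 'c'}, 'b': {'a'}, 'c': {'a'}}
alloc = color(graph, 2)
print(alloc)
```

With two registers this succeeds (b and c share a register); with one register it fails, which is exactly when the allocator would pick the cheapest live range to spill and retry.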

Instruction selection
- Table-based instruction selectors
  - Create a description of the target machine; use a back-end generator to produce a pattern-matching table
- AST tiling: pattern-based instruction selection through a tree grammar
  - Bottom-up walk of the AST: for each node n, find all applicable tree patterns and select the one with the lowest cost
- Peephole optimization
  - Use a simple scheme to translate IR to machine code
  - Discover local improvements by examining short sequences of adjacent operations: expand -> simplify -> match

Instruction scheduling
- Dependence/precedence graph G = (N, E)
  - Each node n ∈ N is a single operation, with type(n) and delay(n)
  - An edge (n1, n2) ∈ E indicates that n2 uses the result of n1 as an operand
  - What about anti-dependences?
  - G is acyclic within each basic block
- Given a dependence graph G = (N, E), a schedule S maps each node n ∈ N to the cycle number in which n is issued
  - Each schedule S must be well-formed, correct, and feasible
- Critical path: the longest path in the dependence graph
- List scheduling: greedy heuristic for scheduling the operations in a single basic block
  - Build the dependence graph (rename to avoid anti-dependences)
  - Assign a priority to each operation n (the length of the longest latency path from n to the end)
  - Iteratively select the highest-priority ready operation and schedule it
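The three list-scheduling steps above can be sketched as follows (a one-issue machine and an illustrative three-operation block; names are mine): priorities are longest latency paths to the end, and each cycle the highest-priority ready operation is issued:

```python
# Sketch: list scheduling for a single basic block on a one-issue machine.
def list_schedule(nodes, delay, edges):
    succ = {n: [] for n in nodes}
    pred = {n: [] for n in nodes}
    for a, b in edges:                     # edge (a, b): b uses the result of a
        succ[a].append(b)
        pred[b].append(a)

    prio = {}                              # longest latency path from n to the end
    def priority(n):
        if n not in prio:
            prio[n] = delay[n] + max((priority(s) for s in succ[n]), default=0)
        return prio[n]

    done_at = {}                           # cycle at which each result is available
    schedule = {}
    cycle = 1
    while len(schedule) < len(nodes):
        ready = [n for n in nodes if n not in schedule
                 and all(p in done_at and done_at[p] <= cycle for p in pred[n])]
        if ready:
            n = max(ready, key=priority)   # greedy: highest priority first
            schedule[n] = cycle
            done_at[n] = cycle + delay[n]
        cycle += 1                         # one issue slot per cycle
    return schedule

nodes = ['load_a', 'load_b', 'add']
delay = {'load_a': 3, 'load_b': 3, 'add': 1}
edges = [('load_a', 'add'), ('load_b', 'add')]
print(list_schedule(nodes, delay, edges))
```

Here both loads issue back to back in cycles 1 and 2, and the add waits until cycle 5 when the later load's result is available; with a second issue slot per cycle the add could start at cycle 4.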