MIDTERM EXAM (Solutions)


Total Score: 100, Max. Score: 83, Min. Score: 26, Avg. Score: 57.3

1. (10 pts.) List all major categories of programming languages, outline their definitive characteristics and give an example of a language from each category.

Imperative Languages (C/C++, Pascal, Ada, Fortran): The imperative languages (also called procedural) are a straightforward reflection of the von Neumann architecture. The program represents an explicit sequence of operators; the main features are variables, assignment statements and iteration, and the programmer has to ensure the correct control flow. A variety of data structures is supported.

Functional Languages (LISP, Scheme): In the functional languages, all computations are represented as function calls with appropriate parameters; the data structures are usually limited to lists (without reducing the computational power).

Logic (Programming) Languages (Prolog): The computation is represented by a process of logical inference (i.e., proving a statement in the form of a logical formula) based on facts and rules. It can also be viewed as the evaluation of predicate expressions.

Markup/Hybrid Languages (HTML, LaTeX): The main purpose of the markup languages is to describe the visual (or spatial) layout of the output. These languages usually do not have arithmetic and/or procedural features.

2. (15 pts.) In the figure below, fill in the blank rectangles. That is, correctly identify the components of the compilation process that are not named. Explain their role in the compilation process.

The unnamed components, from top to bottom, are: lexical analyzer, syntax analyzer, and symbol table (aka table of names).

The lexical analyzer scans the input (the program), identifies the lexical elements (lexemes), and categorizes them according to the language specification. It is usually a finite automaton, using regular expressions (state diagrams). It can be viewed as a subroutine of the parser (syntax analyzer) which supplies one lexeme at a time.

The syntax analyzer has to assure that the program conforms to the syntactic rules of the language (usually represented as a Context-Free Grammar), and produce the parse tree reflecting the structure of the program. As discussed in class, and in the text, the lexical and syntax analyzers can be viewed as a single component, depending on the implementation.

The symbol table contains the information about the symbols (lexemes) used in the program. These might be variables, constants, operators, user-defined names, etc. It further serves as a basis for generating portable code.

3. (15 pts.) With reference to the grammar of the arithmetic expressions discussed in class, write the grammar of the logical formulae. Assume that the logical variables are a, b, c, t(rue), f(alse). Further, assume the only logical operations are conjunction (∧), disjunction (∨) and negation (¬). The conjunction and the disjunction have the same priority and are left associative; the negation has higher priority than both conjunction and disjunction. Also, remember that the negation is a unary operation. Further, sub-formulae enclosed in brackets have highest priority. Give the derivation and the parse tree for the formula ¬p ∧ (q ∨ r).

The grammar is:

E → E ∧ T | E ∨ T | T
T → ¬F | F
F → (E) | a | b | c | p | q | r | t | f

Note that here we need three additional variables (p, q, r) to represent the specific expression. The (leftmost) derivation is:

E ⇒ E ∧ T ⇒ T ∧ T ⇒ ¬F ∧ T ⇒ ¬p ∧ T ⇒ ¬p ∧ F ⇒ ¬p ∧ (E) ⇒ ¬p ∧ (E ∨ T) ⇒ ¬p ∧ (T ∨ T) ⇒ ¬p ∧ (F ∨ T) ⇒ ¬p ∧ (q ∨ T) ⇒ ¬p ∧ (q ∨ F) ⇒ ¬p ∧ (q ∨ r)

The parse tree can be easily constructed on the basis of the derivation, and is not given here.

4. (10 pts.) Compute the weakest precondition for the given set of statements and postcondition.

x = 3 * ( 2 * y + a )
y = 2 * x - 1
{y > 5}

We first consider the second (last) statement and its postcondition. We have:

2 * x - 1 > 5
x > 3

Thus, {x > 3} is the weakest precondition of the second statement, and we have:

x = 3 * ( 2 * y + a ) {x > 3}
y = 2 * x - 1 {y > 5}

Now we consider the first statement:

3 * ( 2 * y + a ) > 3
2 * y + a > 1
y > (1 - a) / 2

Therefore, the weakest precondition for the two given statements is: {y > (1 - a) / 2}

5. (15 pts.) Describe the recursive descent method. For the grammar rule A → B#C write a recursive descent procedure. Assume the procedures lex() and error() are given. Assume that # is a terminal symbol in the given grammar.

The recursive descent method is used to parse (context-free) grammars from the LL class. These grammars should be free of direct left recursion; if a grammar is not, left-recursion elimination should be applied to the rules of the non-terminals that are directly left-recursive. Further, the productions of each non-terminal should be pairwise disjoint, meaning that all the terminal strings produced from a given non-terminal should be distinguishable (in terms of the productions used in their derivation) by their first terminal symbol only; left factoring can be applied to achieve this. If the above conditions are met, then the recursive descent parser consists of subroutines, one for each non-terminal symbol in the grammar, each parsing the language of that non-terminal.

For the given grammar rule, the recursive descent procedure is (in C notation):

void A() {
    B();
    if (nexttoken == '#') {
        lex();      /* consume the terminal # */
        C();
        return;
    } else
        error();
}

6. (10 pts.) For the grammar of the arithmetic expressions discussed in class:

1. E → E + T
2. E → T
3. T → T * F
4. T → F
5. F → (E)
6. F → a

an LR-parser is built using the shift-reduce algorithm and the following table (not reproduced here). What are the next states of the input and the stack, if their current states are *a$ and 0E1+6F3, respectively? Explain your answer in full.

The next states of the input and the stack are *a$ and 0E1+6T9, respectively.

To illustrate the reasoning, consider the current symbol in the input, *, and the current symbol/state on top of the stack, 3. We have to look up the Action table, and Action[*,3] = R4. Therefore, we have to reduce, using production 4 of the grammar. The length of (the body of) production 4 is 1, so 2 top entries of the stack (3 and F) are popped. Then the symbol T is pushed onto the stack, as it is the head of production 4. Further, we have to determine the new state to be pushed onto the stack. For this, we refer to the Goto table: since Goto[6,T] = 9, we push 9 onto the stack. No input symbol is consumed in a reduction step, so the input stays the same.

7. (10 pts.) Explain the concept of scope. Outline the difference between static and dynamic scoping. For the given program determine which variables are visible and what are their values at the point *** assuming dynamic scoping.

program scopezz(input,output);
var A, B: integer;

procedure p1();
var I, J, K: integer;
begin
  for I := 1 to 10 do
    B := B + I;
  K := I + J
end;

procedure p2();
var I, J, K: integer;
begin
  J := A + B;
  p1();
  K := J + I;
  I := K + J
end;

begin
  A := 10;
  B := -5;
  p2();
  (* point *** *)
  writeln(A+B);
  writeln(A-B)
end.

The scope of a name in a program is the range of all statements (in the program) where this name is visible, i.e., where a valid reference to it can occur. Static scoping depends on the textual layout of the program alone: a name is visible if it is a local name or if it is defined in one of the (static) ancestors of the current program unit. The static parent of a unit is the unit that immediately encloses it in the program. Under dynamic scoping, a name is visible if it is declared by the time it is referenced; the valid declaration is the last one for that name. Or, as it is said in the book, dynamic scoping depends on the sequence of the calls made during the program execution.

Under dynamic scoping rules, all 5 variables are visible at point ***, and their values are:

A = 10: the value of A is not changed in any statement after its initialization.
B = 50: the value of B changes from the initial -5 in the loop inside p1().
I = 20: the value of I changes many times, for the last time in the last line of p2().
J = 5: J is assigned a value only once, at the beginning of p2().
K = 15: the last assignment statement for K is in p2(), and I = 10, J = 5 there.

8. (15 pts.) Define the concept of binding. Discuss the differences between static and dynamic binding.

A binding is an association between an attribute and an entity, for example between a variable and its value. When a variable is initialized, a binding of the variable to its value occurs. If the binding occurs before runtime AND it does not change during the program execution (runtime), it is a static binding. If the binding occurs during runtime OR if it changes during runtime, it is a dynamic binding.

As discussed in class, it is important to understand that some of the attributes of a name can be bound statically and others dynamically. For example, a variable that changes its value during program execution, but not its type (which is enforced in strongly typed languages), can be statically bound to its type but dynamically bound to its value.