Lecture 6: The Declarative Kernel Language Machine. September 13th, 2011

Lecture Outline
Computations contd
Execution of Non-Freezable Statements on the Abstract Machine
  The skip Statement
  The Sequential Composition Statement
  The local Statement
  The Variable-Variable Binding Statement
  The Value-Creation Statement
  Summary
Execution of Freezable Statements on the Abstract Machine
  The if Statement
  The case Statement
  The Procedure Application Statement
Oz Oz
  Implementing Oz in Oz
Summary

Computations contd
How does a computation proceed? Let statement0 be a program.[1] The abstract machine is initialized as follows:
1. Create an empty environment: E0 = {}.
2. Create a semantic statement: s0 = (statement0, E0).
3. Create a semantic stack and push onto it the initial semantic statement: S0 = [s0].
4. Create an empty single assignment store: σ0 = {}.
5. Create the initial state of the abstract machine: M0 = (S0, σ0).
[1] A program is a statement, possibly a sequence of statements; recall the grammar for DSKL.

Computations contd
How does a computation proceed? (contd) Start the machine in the initial execution state M0 as described above, and use the following algorithm:
1. If, in the current execution state Mi = (Si, σi), the semantic stack Si is empty, stop and report success.
2. Otherwise:
   2.1 take the first (topmost) semantic statement, si,1 = (statementi,1, Ei,1), from the stack;
   2.2 execute statementi,1 using, if necessary, mappings from Ei,1 and bindings from σi;
   2.3 update the state of the semantic stack from Si to Si+1 by removing the topmost semantic statement (the one just executed) and pushing new semantic statements onto it if needed;
   2.4 update the state of the single assignment store from σi to σi+1 by adding new variables and performing bindings, if needed;
   2.5 in case of problems, stop and report failure.
3. Update the state of the machine from Mi to Mi+1 = (Si+1, σi+1), and go to step 1.

Execution of Non-Freezable Statements
The skip statement
Before: ([s1, s2, s3, ...], σ), where s1 = (skip, E1)
After: ([s2, s3, ...], σ)
Example (the skip statement):
1. ([(skip, {})], {})
2. ([], {})

Execution of Non-Freezable Statements contd
Sequential composition
Before: ([s1, s2, s3, ...], σ), where s1 = (statement1.1 statement1.2, E1)
After: ([s1.1, s1.2, s2, s3, ...], σ), where s1.1 = (statement1.1, E1) and s1.2 = (statement1.2, E1)
Example (sequential composition):
1. ([(skip skip, {})], {})
2. ([(skip, {}), (skip, {})], {})
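
For completeness, applying the skip rule twice more runs this example to termination:
3. ([(skip, {})], {})
4. ([], {}), and the machine stops, reporting success.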

Execution of Non-Freezable Statements contd Exercise Show how the computation proceeds if the state is: ( [(skip skip skip, E)], σ )

Execution of Non-Freezable Statements contd
The local statement
Before: ([s1, s2, s3, ...], σ), where s1 = (local id1 in statement1 end, E1)
After: ([s1', s2, s3, ...], σ'), where
  s1' = (statement1, E1')
  σ' = σ ∪ {v}
  E1' = E1 + {id1 → v}
  v is a new, unbound variable (v ∉ σ)
Example (the local statement):
1. ([(local X in skip end, {})], {})
2. ([(skip, {X → v1})], {v1})

Execution of Non-Freezable Statements contd
Variable-variable binding (naive version)
Before: ([s1, s2, s3, ...], σ), where s1 = (id1 = id2, E1)
After: ([s2, s3, ...], σ'), where σ' = σ ∪ {E1(id1) = E1(id2)}
The rule is fine if both E1(id1) and E1(id2) are unbound in σ: the variables are made equivalent, and both remain unbound in σ'.
Example (variable-variable binding):
1. ([(X = Y, {X → v1, Y → v2})], {v1, v2})
2. ([], {v1, v2, v1 = v2})
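
The same behaviour can be observed directly in Oz; a small snippet for the interactive environment (not from the slides):

declare X Y in
X = Y        % variable-variable binding: X and Y become equivalent
Y = 42       % binding Y now also determines X
{Browse X}   % displays 42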

Execution of Non-Freezable Statements contd
Other situations are possible. Let v1 = E1(id1) and v2 = E1(id2); then:
1. If v1 is unbound but v2 is bound in σ: make v1 equivalent to v2, as above, or bind v1 directly to the value of v2.
2. If both v1 and v2 are bound, unify them, recursively if necessary:[2] perform additional bindings, if needed; report a failure if, at any step in the recursive process, two values do not unify; otherwise, optionally make v1 and v2 equivalent in σ (logically useless, but may have practical effects).[3]
Note: It is still unspecified what should happen if additional bindings are needed and yet the values of v1 and v2 are non-unifiable. See Ch. 13 in CTMCP for more details on this issue.
[2] Complex values (e.g., nested records) require unification to proceed recursively on their components.
[3] E.g., next time v1 and v2 are compared, there is no need to examine their content.

Execution of Non-Freezable Statements contd
Example (variable-variable binding):
1. ([(X = Y, {X → v1, Y → v2})], {v1 = tree(leaf leaf), v2 = tree(v3 bird), v3})
2. ([], {v1 = tree(leaf leaf), v2 = tree(leaf bird), v3 = leaf})
A unification failure is reported: leaf ≠ bird (but v3 becomes bound).
Note: If we allow additional bindings even when there is a unification failure, the bindings that are actually made depend on the order in which the components of complex values are unified.[4]
Example (variable-variable binding):
1. ([(X = Y, {X → v1, Y → v2})], {v1 = tree(leaf leaf), v2 = tree(bird v3), v3})
2. ([], {v1 = tree(leaf leaf), v2 = tree(bird v3), v3})
A unification failure is reported: leaf ≠ bird (and v3 remains unbound).
[4] If the semantics of the language does not make it clear, then it is an implementation detail.

Execution of Non-Freezable Statements contd
Value creation (naive version)
Before: ([s1, s2, s3, ...], σ), where s1 = (id1 = value1, E1)
After: ([s2, s3, ...], σ'), where σ' = σ ∪ {E1(id1) = o}, and o is an object (a value) that results from evaluating the expression value1 within the context of E1 and σ.
Note: The rule does not account for the more complex case where E1(id1) is an already bound variable; in such cases we perform unification, and the scheme is analogous to that for variable-variable binding of already bound variables.
Example (value creation):
1. ([(X = tuple(Y X), {X → v1, Y → v2})], {v1, v2})
2. ([], {v1 = tuple(v2 v1), v2})

Execution of Non-Freezable Statements contd
The semantics for value-creation statements given in CTMCP differs slightly from the above:
1. a new, unbound variable (say, vx) is created;
2. the value is actually created (the memory allocated and initialized), and vx is bound to the value;
3. the variable that id1 is mapped to is bound to vx.
Example (value creation):
1. ([(X = tuple(Y X), {X → v1, Y → v2})], {v1, v2})
2. ([], {v1 = vx, vx = tuple(v2 v1), v2})
The difference between these two variants is inessential, and we can safely adopt the earlier, more memory-efficient scheme.

Execution of Non-Freezable Statements contd
Example (a simple kernel language program):
local X in
   local Y in
      X = Y
      Y = true
   end
end

Execution of Non-Freezable Statements contd
Example (execution of non-freezable statements contd):
1. ([(local X in local Y in X=Y Y=true end end, {})], {})
2. ([(local Y in X=Y Y=true end, {X → v1})], {v1})
3. ([(X=Y Y=true, {X → v1, Y → v2})], {v1, v2})
4. ([(X=Y, {X → v1, Y → v2}), (Y=true, {X → v1, Y → v2})], {v1, v2})
5. ([(Y=true, {X → v1, Y → v2})], {v1, v2, v1 = v2})
6. ([], {v1 = true, v2 = true, v1 = v2})
Note: Following CTMCP, the final state of the machine would look like ([], {v1 = v3, v2 = v3, v1 = v2, v3 = true}).

Execution of Non-Freezable Statements contd
Value creation: procedure values
Before: ([s1, s2, s3, ...], σ), where
  s1 = (id1 = value1, E1)
  value1 = proc {$ idp1 ... idpn} statementp end
  idp1, ..., idpn are formal parameters
After: ([s2, s3, ...], σ'), where
  σ' = σ ∪ {E1(id1) = c}
  c = (dc, Ec) is a closure (a procedure object, procedure value): a pair of a procedure definition dc and a closure environment Ec
  dc corresponds, in some form,[5] to value1
  Ec = E1 restricted to {idc1, idc2, ...}, where the idci are the free identifiers of statementp (the identifiers in statementp which are not among idp1, ..., idpn)
[5] As the actual source code, as in CTMCP; as some intermediate internal form, e.g., a partial parse tree; or as compiled intermediate or target code.

Execution of Non-Freezable Statements contd
Free identifiers
An identifier is free in a statement if the statement does not declare it.
Note: An identifier can have both free and bound (non-free) occurrences in a statement.
Example (free identifiers):
X is free in the following statements:
  local Y in Y = X + 1 end
  Y = proc {$ Z} Z = X + 1 end
X is not free in the following statements:
  local X in X = Y + 1 end
  Y = proc {$ X} Z = X + 1 end
X is both free and bound in the following statement:
  X = proc {$ X} Y = X + 1 end

Execution of Non-Freezable Statements contd
Example (procedure values):
1. ([(P = proc {$ A} A=B end, {P → v1, A → v2, B → v3})], {v1, v2, v3})
2. ([], {v1 = (proc {$ A} A=B end, {B → v3}), v2, v3})
Note: Ec = E1 restricted to {B} = {B → v3}
Example (procedure values contd):
1. ([(P = proc {$ A} A=P end, {P → v1, A → v2, B → v3})], {v1, v2, v3})
2. ([], {v1 = (proc {$ A} A=P end, {P → v1}), v2, v3})
Note: Ec = E1 restricted to {P} = {P → v1}

Execution of Non-Freezable Statements contd The skip statement, summary The semantic statement is: (skip, E) The execution rule is: Leave the rest of the stack and the store unchanged.

Execution of Non-Freezable Statements contd
The sequence statement, summary
The semantic statement is: (statement1 statement2, E)
The execution rule is:
1. Push (statement2, E) onto the stack.
2. Push (statement1, E) onto the stack (so that it is executed first).
Leave the store unchanged.

Execution of Non-Freezable Statements contd
The local statement, summary
The semantic statement is: (local id in statement end, E)
The execution rule is:
1. Allocate a new variable vi in the store.
2. Create E' by adding the mapping id → vi to E.
3. Push (statement, E') onto the stack.

Execution of Non-Freezable Statements contd
The variable-variable binding statement, summary
The semantic statement is: (id1 = id2, E)
The execution rule is:
1. Bind E(id1) and E(id2) in the store.[6]
Leave the rest of the stack unchanged.
[6] If both E(id1) and E(id2) are already bound, then their values must be unified. The result is either a successful unification or a runtime error.

Execution of Non-Freezable Statements contd
The value-creation statement, summary
The semantic statement is: (id = value, E)
The execution rule is:
1. Compute the value denoted by value; for any free identifier id' occurring in value, use E(id') in the computation.
2. Bind E(id) to the computed value.[7]
Leave the rest of the stack unchanged.
[7] If E(id) is already bound, its value must be unified with the computed value. The result is either a successful unification or a runtime error.

Execution of Freezable Statements contd
The if statement
The semantic statement is: (if id then statement1 else statement2 end, E)
The execution rule is:
1. Check E(id).
   If E(id) is unbound, stop.
   If E(id) is bound to true, push (statement1, E) onto the stack.
   If E(id) is bound to false, push (statement2, E) onto the stack.
   Otherwise, raise an error.
Leave the store unchanged.

Execution of Freezable Statements contd
Example (the if statement):
1. ([(if X then Y=1 else Y=2 end, {X → v1, Y → v2})], {v1 = true, v2})
2. ([(Y=1, {X → v1, Y → v2})], {v1 = true, v2})
3. ([], {v1 = true, v2 = 1})
Example (the if statement):
1. ([(if X then Y=1 else Y=2 end, {X → v1, Y → v2})], {v1, v2 = 3})
Note: The computation suspends, since E(X) is unbound. If the computation were to resume, a unification failure would be reported (but computation cannot resume in this model; we need concurrency).

Execution of Freezable Statements contd
The case statement
The semantic statement is: (case id0 of literal(feature1: id1 ... featuren: idn) then statement1 else statement2 end, E)
The execution rule is:
1. Check E(id0).
   If E(id0) is unbound, stop.
   If E(id0) is bound to a record with the label literal and an arity of length n, containing exactly the features feature1, ..., featuren, then push onto the stack the semantic statement (statement1, E'), where E' is created by adding to E the mapping idi → E(id0).featurei for i = 1, ..., n.
   Otherwise, push (statement2, E) onto the stack.
Leave the store unchanged.

Execution of Freezable Statements contd
Example (the case statement)
We shall examine step by step an execution of the following program:
local X in
   local V in
      X = x(f:V)
      case X of x(f:X) then
         V = X % (1)
      else skip end
   end
end
(1) What is the content of X? What is the content of V? Is there a unification failure? Is this statement executed at all?

Execution of Freezable Statements contd
Example (the case statement contd):
1. ([(local X in ... end, {})], {})
2. ([(local V in ... end, {X → v1})], {v1})
3. ([(X=x(f:V) case ... end, {X → v1, V → v2})], {v1, v2})
4. ([(X=x(f:V), {X → v1, V → v2}), ...], {v1, v2})
5. ([(case X of x(f:X) ... end, {X → v1, V → v2})], {v1 = x(f:v2), v2})
6. ([(V=X, {X → v2, V → v2})], {v1 = x(f:v2), v2})
7. ([], {v1 = x(f:v2), v2})
Note: Observe how the mapping for X changes from v1 in (5) to v2 in (6).

Execution of Freezable Statements contd
The procedure application statement
The semantic statement is: ({id0 id1 ... idn}, E)
The execution rule is:
1. Check E(id0).
   If E(id0) is unbound, stop.
   If E(id0) is bound to a closure (pc, Ec) whose enclosed procedure definition has the form proc {$ idp1 ... idpn} statementp end, then push onto the stack the semantic statement (statementp, E'), where E' is created from Ec by adding the mappings idpi → E(idi) for i = 1, ..., n.
   Otherwise, raise an error.
Leave the store unchanged.
Note: This mechanism of passing arguments to procedures is typically called call by name or call by reference (the terminology is not consistent; we return to this issue later).
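
To see why this argument passing behaves like passing references, note that the formal parameter is mapped to the very store variable denoted by the actual parameter, so the callee can bind the caller's variable. A small real-Oz snippet (for the interactive environment; not from the slides):

declare P X in
P = proc {$ A} A = 5 end
{P X}
{Browse X}   % displays 5: A and X denote the same store variable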

Execution of Freezable Statements contd Formal and actual parameters Formal parameter: An identifier in a procedure definition. Actual parameter: An identifier in a procedure application. Note: It is common to call formal parameters parameters, and actual parameters arguments. Example (Formal and actual parameters) P = proc {$ A B} A = B + 1 end {P X Y} A and B are formal parameters in a definition. X and Y are actual parameters in an application.

Execution of Freezable Statements contd
Example (procedure call)
We shall examine step by step an execution of the following program:[8]
local P in
   P = proc {$ X}
      local T in
         T = X == 0
         if T then skip
         else
            local Y in
               Y = X - 1
               {P Y}
            end
         end
      end
   end
   local Y in
      Y = 2
      {P Y}
   end
end
[8] It's going to be boring, isn't it?

Execution of Freezable Statements contd
Example (procedure call contd):
1. ([(local P in ... end, {})], {})
2. ([(P = proc {$ X} ... end local Y in ... end, {P → v1})], {v1})
3. ([(P = proc {$ X} ... end, {P → v1}), ...], {v1})
4. ([(local Y in ... end, {P → v1})], {v1 = (proc ... end, {P → v1})})
5. ([(Y = 2 {P Y}, {P → v1, Y → v2})], {v1 = (...), v2})
6. ([(Y = 2, {P → v1, Y → v2}), ...], {v1 = (...), v2})
7. ([({P Y}, {P → v1, Y → v2})], {v1 = (...), v2 = 2})
8. ([(local T in ... end, {P → v1, X → v2})], {v1 = (...), v2 = 2})
9. ([(T = X == 0 if ... end, {..., X → v2, T → v3})], {..., v2 = 2, v3})
10. ([(T = X == 0, {..., X → v2, T → v3}), ...], {..., v2 = 2, v3})
11. ([(if T ... end, {..., T → v3})], {..., v3 = false})
12. ([(local Y in ... end, {...})], {...})
13. Homework: do the rest; try not to get lost...

The Abstract Kernel Language Machine
"Artificial intelligence meets natural stupidity." (Drew McDermott)
"Real stupidity beats artificial intelligence every time." (Terry Pratchett)

Oz Oz Oz in Oz
We have had a look at the Declarative Sequential Kernel Language (DSKL), a small subset of Oz:
- We have defined its syntax: the form of source-code constructs.
- We have defined its semantics (the meaning of programs in DSKL) by means of a simple operational semantics: rules for the execution of statements on an abstract machine.
- We have seen a few convenient syntactic shortcuts and linguistic abstractions that lead from the kernel language to a practical language.
- We have seen a few examples of use of the language, explained with handwaving semantics where necessary.
We are now able to utilize this knowledge and build a metalinguistic abstraction: an interpreter for (a fragment of) Oz implemented in Oz.

Oz Oz Oz in Oz contd
Interpretation
To interpret a program given as source code, we need to:
1. scan the code to produce a sequence of tokens;[9]
2. parse the list of tokens to produce a parse tree;
3. process the parse tree, performing actions corresponding to its parts in the correct order.
Here we shall implement only the last part (interpretation), assuming that we will have programs in an already parsed form.
We can separately build a scanner and a parser (e.g., using the Gump tool); we shall return to this later.[10] Instead, we will play the role of a scanner and a parser ourselves, and hand-code parse trees: not much fun, but perfectly doable.
[9] In practice, lexing and tokenizing are done in one pass over the code.
[10] Please read http://www.mozart-oz.org/documentation/gump/index.html if you're interested.

Oz Oz Oz in Oz contd
How does the interpreter work?
The interpreter is a procedure that:
- takes as input a parse tree corresponding to a program;
- initializes the virtual machine;
- iteratively processes the input, moving the machine from state to state.
Note: Each state of the machine is completely specified by the content of its semantic stack and single assignment store.
We will use the following template:
fun {Interpret Program}
   fun {Iterate Stack Store}
      ...
   end
in
   {Iterate <initial stack> <initial store>}
end

Oz Oz Oz in Oz contd
Inside Interpret we define an internal function Iterate. The code above can be translated into a (semi)kernel form as follows:
local Interpret in
   Interpret = proc {$ Program Result}
      local Iterate InitialStack InitialStore in  % [11]
         Iterate = proc {$ Stack Store Result}
            ...
         end
         InitialStack = ...
         InitialStore = ...
         {Iterate InitialStack InitialStore Result}
      end
   end
   {Interpret Program}  % [12]
end
Note: We implement a subset of Oz in Oz, but the implementation language could be any other language: C, Java, etc.
[11] Multiple declaration: not in the kernel language.
[12] Program is a free identifier here.

Oz Oz Oz in Oz contd
Representation
Before we implement anything, we need to settle:
- the encoding of parse trees;
- the structure of the semantic stack;
- the structure of the single assignment store.
Parse trees are trees; we have already seen that trees can be easily encoded using nested records, so let's adopt this solution:
- Each parsed statement is a partial parse tree represented as a record.[13]
- Each parsed statement is of a certain type, and may have some content.
- We can encode the type in the record's label.
- We can encode the content in the record's fields.
[13] The whole program is parsed to a statement which is the complete parse tree.

Oz Oz Oz in Oz contd
Example (parse trees as nested records)
This (trivial) program:
skip
can be (trivially) encoded as:
'skip'
Note: We need to quote the skip; otherwise there would be a syntax error.

Oz Oz Oz in Oz contd
Example (parse trees as nested records contd)
This program:
local X in
   local Y in
      X = Y
      Y = 2
   end
end
can be encoded as:[14]
loc(id:'X'
    stat:loc(id:'Y'
             stat:seq(1:uni(id:'X' val:id('Y'))
                      2:uni(id:'Y' val:int(2)))))
Note: We need to quote the identifiers; they are identifiers in the implemented language and should not be confused with identifiers in the implementation language.
[14] The features 1, 2, etc. are optional, and we will omit them in further discussion.

Oz Oz Oz in Oz contd
Example (parse trees as nested records contd)
This partial program:
X = 1
Y = 2
Z = z(x:X y:Y)
can be encoded as:
seq(uni(id:'X' val:int(1))
    seq(uni(id:'Y' val:int(2))
        uni(id:'Z' val:rec(lbl:z fts:[x y] ids:['X' 'Y']))))
Record values are encoded (in the parse tree) as records with fields named lbl, for storing the label; fts, for storing the features (as a list); and ids, for storing the identifiers (as a list) used to create the record value in the program.

Oz Oz Oz in Oz contd
We will assume that record value expressions are parsed so that the list of features is sorted lexicographically, and the list of identifiers is sorted in the corresponding order.
Example (record values)
This record value expression:
tree(right:X left:Y)
is parsed into this:
rec(lbl:tree fts:[left right] ids:['Y' 'X'])
and not into any of these:
rec(lbl:tree fts:[right left] ids:['X' 'Y'])
rec(lbl:tree fts:[left right] ids:['X' 'Y'])

Oz Oz Oz in Oz contd
Example (parse trees as nested records contd)
This partial program:
case X of label(feature:Value) then
   Y = 1
else
   Y = 0
end
can be encoded as:
patm(id:'X'
     patt:rec(lbl:label fts:[feature] ids:['Value'])
     cons:uni(id:'Y' val:int(1))
     alt:uni(id:'Y' val:int(0)))

Oz Oz Oz in Oz contd
During the execution of programs, variables are kept in the single assignment store.
- We can represent variables the easy way: as Oz variables.[15]
- We can represent the store as a list of variables.
An environment is a set of mappings from identifiers to variables.
- We can represent an environment as a list of pairs, each consisting of an identifier and a variable index. The identifiers correspond to the program code, and the indices correspond to the positions of the variables in the store list.
- For simplicity, adding mappings to an environment is realized as adding them at the front of the list; a mapping placed earlier in an environment hides (shadows) mappings for the same identifier placed later in the same environment.
[15] In case of a different implementation language, we might need to provide a complete implementation of dataflow variables from scratch.

Oz Oz Oz in Oz contd
Example (environments)
The following list:
['X'#1 'Y'#3 'Var'#7 'X'#2]
represents an environment E, where:
E = {X → v1, Y → v3, Var → v7}
Note: The mapping X → v2 is shadowed by the mapping X → v1, placed earlier in the environment list.
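
With this representation, looking up an identifier in an environment simply scans the list for the first matching pair. A minimal sketch (the helper name Lookup is our own; it is not taken from ozi.oz):

fun {Lookup Env Id}
   case Env
   of I#Index|Rest then
      if I == Id then Index   % the first match wins, so earlier mappings shadow later ones
      else {Lookup Rest Id} end
   [] nil then
      raise unboundIdentifier(Id) end
   end
end

For the environment above, {Lookup ['X'#1 'Y'#3 'Var'#7 'X'#2] 'X'} returns 1, the index of the shadowing entry; the corresponding store variable can then be fetched with, e.g., {List.nth Store 1}.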

Oz Oz Oz in Oz contd
Example (single assignment store)
The following list:
[_ 1 record(feature:value)]
represents the single assignment store σ with the following content:
{v1, v2 = 1, v3 = record(feature:value)}
Note: An unbound variable in the implementation language represents an unbound variable in the implemented language.

Oz Oz Oz in Oz contd
Semantic statements are pairs of (parsed) statements and environments. We can represent semantic statements as two-field records.
Example (semantic statements)
The following record:
semstat(stat:seq(uni(id:'X' val:int(1))
                 uni(id:'X' val:id('Y')))
        env:['X'#1 'Y'#1])
represents the semantic statement s, where:
s = (X = 1 X = Y, {X → v1, Y → v1})
Note: Semantic statements are stored on the semantic stack. We can represent the semantic stack as a list of semantic statements.

Oz Oz Oz in Oz contd
When we start the virtual machine with a (parsed) program, we:
- create a new, empty environment, represented as the empty list;
- create a semantic statement containing the parsed statement and the initial environment;
- represent the initial semantic stack as a list containing that semantic statement;
- represent the initial single assignment store as the empty list.

Oz Oz Oz in Oz contd
Example (states of the virtual machine)
An execution of the following program:
local X in X = 1 end
starts in a state specified by:
Stack = [semstat(stat:loc(id:'X' stat:uni(id:'X' val:int(1))) env:nil)]
Store = nil
and ends in a state specified by:
Stack = nil
Store = [1]

Oz Oz Oz in Oz contd
How does Iterate work?
Iterate tries to pop a semantic statement from the semantic stack by pattern matching, and then tests the parsed statement:
fun {Iterate Stack Store}
   case Stack
   of semstat(stat:Stat env:Env)|Stack then
      case Stat
      of ... then ...
      [] ... then ...
      ...
      else error(Stat)
      end
   else true
   end
end

Oz Oz Oz in Oz contd How does Iterate work? If there is a semantic statement on the stack, the semantic statement is decomposed into a statement and an environment, and the computation proceeds with the stack and store updated according to the content of the semantic statement. Otherwise (the stack is empty), there is nothing to do, and the computation has been completed and success is reported.

Oz Oz Oz in Oz contd
Example (the skip statement)
case Stat
of 'skip' then {Iterate Stack Store}
If the statement is a skip, there is nothing to do, and the computation proceeds with the rest of the stack and the store unchanged.[16]
[16] Note that after the initial pattern matching (case Stack ...), the identifier Stack points to the rest of the stack, not including the just-popped skip.

Oz Oz Oz in Oz contd
Example (the seq statement)
case Stat ...
[] seq(Stat1 Stat2) then
   {Iterate semstat(stat:Stat1 env:Env)|semstat(stat:Stat2 env:Env)|Stack
    Store}
If the statement is a seq, the sequence statement is split into two parts, which are pushed onto the stack with the same environment, and the computation proceeds with the store unchanged.

Oz Oz Oz in Oz contd
Example (the loc statement)
case Stat ...
[] loc(id:Id stat:Stat) then
   Index = {List.length Store} + 1
in
   {Iterate semstat(stat:Stat env:(Id#Index)|Env)|Stack
    {List.append Store [_]}}
If the statement is a loc, a new variable is appended to the store, and the statement inside the loc is pushed onto the stack with the environment extended with a mapping from the identifier to the variable.
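
The slides show only the skip, seq, and loc cases. For the binding statement uni, a possible handling in the same style is sketched below; this is our own sketch, not code from ozi.oz, and it reuses the Lookup helper sketched earlier and fetches store variables with List.nth:

case Stat ...
[] uni(id:Id val:Val) then
   V = {List.nth Store {Lookup Env Id}}   % the store variable denoted by Id
in
   case Val
   of id(Id2) then                        % variable-variable binding: unify the two store variables
      V = {List.nth Store {Lookup Env Id2}}
   [] int(N) then                         % value creation for integer constants
      V = N
   else skip end                          % other value forms (records, procedures) are omitted here
   {Iterate Stack Store}

Because store variables are plain Oz variables, the two bindings above directly reuse the unification of the implementation language, exactly as the slides suggest.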

Oz Oz Oz in Oz contd
Try it!
The file ozi.oz contains an implementation of the interpreter that supports a subset of DSKL, except for procedure creation and application, arithmetics, etc.
Load the file ozi-test.oz into the OPI and execute it. It will load the interpreter and execute a few test programs (statements are executed with some delay, to let you see how the computation proceeds).
... or implement it in another language (Oz in Java? Oz in C++? Oz in Ruby? Oz in dc?).

Oz Oz Oz in Oz contd
While implementing Oz in Oz, we made use of the declarative variables included in the implementation language to implement declarative variables in the implemented language. How can we implement declarative variables in a language that does not have such a feature?
The file code/oz/kernel/variable.java contains a naive implementation of dataflow variables (yes, in Java).[17] You can compile and try the code as follows:[18]
$ javac oz/kernel/*.java
$ java oz.kernel.test
You can also test the compiled Java classes in, e.g., Jython or Scala as follows:[19]
$ jython variable-test.jy
$ scala variable-test.scala
[17] Among other problems, the implementation is not thread-safe, which we can ignore for the moment, as we focus on sequential programming.
[18] Or type make javac java.
[19] Or type make jython or make scala.

Summary
This time: Execution of statements on the DSKL abstract machine. Demonstration of an implementation of a subset of the DSKL machine.
Next time: Memory management. Scope.
Homework: Read Ch. 2 in CTMCP (focus on the semantics of the kernel language).
Pensum: All of today's slides, except for the implementation of Oz in Oz.
Further reading: Ch. 4 of Structure and Interpretation of Computer Programs (SICP) by Abelson and Sussman shows how Scheme can be implemented in Scheme.