CS 4110 Programming Languages & Logics
Lecture 35: Typed Assembly Language
1 December 2014

Announcements

Foster office hours today, 4-5pm
Optional Homework 10 out
Final practice problems out this week

Schedule

One and a half weeks ago: Typed Assembly Language
Last Monday: Polymorphism, Stack Types
Today: Compilation

Certified Compilation

An automatic way to generate certifiable low-level code
Type safety implies memory and security properties

Tiny: the source language
Typed Assembly Language: the compiler's target
Optimized Typed Assembly Language: the optimizer's target

The Tiny Language

Features:
  Integer expressions
  Conditionals
  Recursive functions
  Function pointers (but no closures)
  A strong, static type system

Example program:
  letrec fun fact (n:int) : int =
    if n = 0 then 1 else n * fact (n - 1)
  in fact (42)

Tiny Type System

Expressions:

  Φ ⊢ x : Φ(x)        Φ ⊢ f : Φ(f)        Φ ⊢ n : int

  Φ ⊢ e1 : int    Φ ⊢ e2 : int
  -----------------------------
  Φ ⊢ e1 + e2 : int

  Φ ⊢ e1 : τ1 → τ2    Φ ⊢ e2 : τ1
  --------------------------------
  Φ ⊢ e1 e2 : τ2

  Φ ⊢ e1 : int    Φ ⊢ e2 : τ    Φ ⊢ e3 : τ
  ------------------------------------------
  Φ ⊢ if e1 = 0 then e2 else e3 : τ

  Φ ⊢ e1 : τ1    Φ, x : τ1 ⊢ e2 : τ2
  -----------------------------------
  Φ ⊢ let x = e1 in e2 : τ2
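
As a rough illustration (not part of the lecture), these expression rules transcribe almost directly into an OCaml checker. The AST constructors, the Env module, and the typeof function below are assumptions of this sketch, not definitions from the course:

  (* Sketch of a checker for Tiny expressions; names are illustrative. *)
  type ty = TInt | TArrow of ty * ty

  type exp =
    | Var of string                 (* x or f *)
    | Num of int                    (* n *)
    | Add of exp * exp              (* e1 + e2 *)
    | App of exp * exp              (* e1 e2 *)
    | If0 of exp * exp * exp        (* if e1 = 0 then e2 else e3 *)
    | Let of string * exp * exp     (* let x = e1 in e2 *)

  module Env = Map.Make (String)    (* the context Φ *)

  let rec typeof (phi : ty Env.t) (e : exp) : ty =
    match e with
    | Var x -> Env.find x phi                      (* Φ ⊢ x : Φ(x) *)
    | Num _ -> TInt                                (* Φ ⊢ n : int *)
    | Add (e1, e2) ->
        (match typeof phi e1, typeof phi e2 with
         | TInt, TInt -> TInt
         | _ -> failwith "+ expects integer operands")
    | App (e1, e2) ->
        (match typeof phi e1 with
         | TArrow (t1, t2) when typeof phi e2 = t1 -> t2
         | _ -> failwith "ill-typed application")
    | If0 (e1, e2, e3) ->
        let t = typeof phi e2 in
        if typeof phi e1 = TInt && typeof phi e3 = t then t
        else failwith "ill-typed conditional"
    | Let (x, e1, e2) ->
        typeof (Env.add x (typeof phi e1) phi) e2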

Tiny Type System

Declarations:

  Φ, x : τ1 ⊢ e : τ2
  ---------------------------------------------
  Φ ⊢ fun f(x : τ1) : τ2 = e : (f : τ1 → τ2)

Programs:

  Φ = f1 : τ11 → τ12, ..., fk : τk1 → τk2
  Φ ⊢ di : (fi : τi1 → τi2)    Φ ⊢ e : int
  -----------------------------------------
  ⊢ letrec d1 ... dk in e

Exercise: check that fact is well typed.

Type-preserving Compilation

Most compilers consist of a series of transformations. In our compiler, each step will propagate types.

A transformation consists of two sub-translations:
  from source types to target types
  from source terms to target terms

After each step, we can typecheck the intermediate output code to detect errors in the compiler. The key correctness property will be that the transformations preserve types (as well as semantics).
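
A minimal sketch of what one such pass might look like as an OCaml signature; the module and value names here are hypothetical, not from the lecture:

  module type PASS = sig
    type source_ty
    type source_tm
    type target_ty
    type target_tm

    (* the two sub-translations *)
    val trans_ty : source_ty -> target_ty
    val trans_tm : source_tm -> target_tm

    (* re-typecheck the pass's output to catch compiler bugs *)
    val typecheck : target_tm -> target_ty option
  end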

Type Translation

T[[ ]] maps Tiny types to TAL types.

Integers:
  T[[int]] = int

Functions: the caller pushes the argument and return address on the stack; the callee pops the return address and argument and places the result in ra.

  T[[τ1 → τ2]] = ∀ρ.{sp : K[[τ2, ρ]] :: T[[τ1]] :: ρ}

  where
    K[[τ, σ]] = {sp : σ, ra : T[[τ]]}
    K[[τ]]    = {ra : T[[τ]]}
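
For instance (a worked instance under these definitions, not a slide from the lecture), the type of a one-argument integer function such as fact unfolds to:

  T[[int → int]] = ∀ρ.{sp : K[[int, ρ]] :: int :: ρ}
                 = ∀ρ.{sp : {sp : ρ, ra : int} :: int :: ρ}

That is, on entry the stack holds the return address (a code pointer expecting the result in ra and the remaining stack ρ) above the integer argument.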

Expression Translation

E[[ ]] maps Tiny expressions (really, typing derivations) to labeled code blocks.

We use the stack extensively: to evaluate an expression, we evaluate its subexpressions and push them onto the stack, and we return the final value in ra.

M maps expression variables to stack offsets.
I(M) increments the offset associated with each variable in M.
Overall, the translation uses just two registers, ra and rt, plus the stack.

The shape of the translation is E[[e]] M,σ = J, where J is a sequence of typed, labeled blocks. We write Lf for the TAL label of function f.

Expression Translation

Expression variables:
  E[[x]] M,σ =
    sld ra, M(x)

Function variables:
  E[[f]] M,σ =
    mov ra, Lf

Integers:
  E[[n]] M,σ =
    mov ra, n

Addition:
  E[[e1 + e2]] M,σ =
    E[[e1]] M,σ
    push ra
    E[[e2]] I(M), int::σ
    pop rt
    add ra, rt, ra
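
As a small worked instance (not on the slides), applying these rules to the expression 1 + 2 gives:

  E[[1 + 2]] M,σ =
    mov ra, 1          % E[[1]] M,σ
    push ra
    mov ra, 2          % E[[2]] I(M), int::σ
    pop rt
    add ra, rt, ra     % ra := 1 + 2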

Expression Translation

Application:
  E[[e1 e2]] M,σ =
    E[[e1]] M,σ
    push ra
    E[[e2]] I(M), T[[τ1 → τ2]]::σ
    pop rt
    push ra
    push Lr[ρ]
    jmp rt[σ]
    Lr: ∀ρ. K[[τ2, σ]]

  where Φ ⊢ e1 : τ1 → τ2 and Lr is fresh
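
A brief sanity check (not spelled out on the slides): just before the jmp, the stack has shape Lr[ρ] :: v :: σ, where v is the value of e2 and Lr[ρ] has type K[[τ2, σ]]. Instantiating the callee's stack variable with σ makes its expected stack K[[τ2, σ]] :: T[[τ1]] :: σ, which matches, so the call typechecks.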

Expression Translation

Conditional:
  E[[if e1 = 0 then e2 else e3]] M,σ =
    E[[e1]] M,σ
    bneq ra, Lelse[ρ]
    E[[e2]] M,σ
    jmp Lend[ρ]
    Lelse: ∀ρ.{sp : σ}
    E[[e3]] M,σ
    jmp Lend[ρ]
    Lend: ∀ρ. K[[τ, σ]]

  where Φ ⊢ e2 : τ and Φ ⊢ e3 : τ and Lelse and Lend are fresh

Declaration Translation

Function:
  F[[fun f (x : τ1) : τ2 = e]] =
    Lf: T[[τ1 → τ2]]
    E[[e]] x:=2, K[[τ2, ρ]]::T[[τ1]]::ρ
    pop rt
    sfree 1
    jmp rt
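
As a tiny worked instance (the identity function here is illustrative, not from the lecture):

  F[[fun id (x : int) : int = x]] =
    Lid: T[[int → int]]
    sld ra, 2        % E[[x]] with x at offset 2
    pop rt           % pop the return address
    sfree 1          % discard the argument
    jmp rt           % return with the result in ra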

Program Translation

Program:
  P[[letrec d1 ... dk in e]] =
    F[[d1]]
    ...
    F[[dk]]
    Lmain: ∀ρ.{sp : K[[int, ρ]] :: ρ}
    E[[e]] ∅, K[[int, ρ]]::ρ
    pop rt
    jmp rt

To run the program, push a return address on the stack and jump to Lmain. The result is returned in ra.

Factorial

  Lfact: ∀ρ.{sp : K[[int]] :: int :: ρ}
    sld ra, 2
    bneq ra, Lelse[ρ]
    mov ra, 1
    jmp Lend[ρ]

  Lelse: ∀ρ.{sp : K[[int]] :: int :: ρ}
    sld ra, 2
    push ra
    mov ra, Lfact
    push ra
    sld ra, 4
    push ra
    mov ra, 1
    pop rt
    sub ra, rt, ra
    pop rt
    push ra
    push Lr[ρ]
    jmp rt[int :: K[[int, ρ]] :: int :: ρ]

  Lr: ∀ρ.{sp : K[[int]] :: int :: ρ, ra : int}
    pop rt
    mul ra, rt, ra
    jmp Lend[ρ]

  Lend: ∀ρ.{sp : K[[int]] :: int :: ρ, ra : int}
    pop rt
    sfree 1
    jmp rt

Compiler Properties

Theorem: if ⊢ P then there exists Ψ such that ⊢ P[[P]] : Ψ.

Proof by induction on ⊢ P ...

Main idea: we don't have to trust the compiler, because we can typecheck the assembly code it emits as output!

Optimizations

Almost any compiler will produce better code than ours! (But how many compilers fit on four slides?)

The type system makes it possible to implement many standard optimizations to generate better code:
  Instruction selection and scheduling
  Register allocation
  Common subexpression elimination
  Redundant load/store elimination
  Strength reduction
  Loop-invariant removal
  Tail calls
  and others...

Types do not interfere with the most common optimizations.

Tail-call Optimization

A crucial optimization for functional languages
Applies when the final operation in a function f is a call to another function g
Instead of having f push the return address onto the stack and engage in the normal calling sequence, f pops all of its temporary values, jumps directly to g, and never returns!
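
For example (an illustration in Tiny syntax, not from the slides, assuming some function g is in scope): in

  fun f (x:int) : int = g (x + 1)

the call to g is a tail call, so f can jump to g directly; in

  fun h (x:int) : int = 1 + g x

it is not, because h still has to perform the addition after g returns.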

Tail-call Optimization

Unoptimized code:

  Lf: ∀ρ.{sp : K[[τ, ρ]] :: τf :: ρ, ra : τg}
    salloc 2              % set up stack frame
    sst Lr                % push return address
    sst ra, 2             % push argument
    jmp Lg[τ :: τf :: ρ]

  Lr: ∀ρ.{sp : τ :: τf :: ρ, ra : τ}
    pop rt                % pop return address
    sfree 1               % discard f's argument
    jmp rt                % return

Optimized code:

  Lf: ∀ρ.{sp : K[[τ, ρ]] :: τf :: ρ, ra : τg}
    sst ra, 2
    jmp Lg[ρ]             % g returns to f's caller