Dynamic Types
15-312, Spring 2017
March 21, 2017


Announcements

* Homework 4 will be released shortly. Most of it is new, so it's hard to tell how hard we made it. Please start early!
* Look at what I made over Spring Break! https://github.com/jez/vim-better-sml
  New features:
  * Get the type of the variable under the cursor
  * Jump to the definition of a variable
  * Show type errors when they occur
  * Warn when a variable is unused
* Come to office hours with midterm questions.

Questions

* What are the differences between dynamic typing and static typing?
* How do JIT compilers work?

Dynamic Types & Static Types

An important thing to keep in mind: dynamic types are a mode of use of static types. When we talk about dynamic types, we're really talking about using a normal, static type system in a special way: with only one type. Every expression has this type.
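To make the "one type" idea concrete, here is a minimal Python sketch (the names `num`, `fun`, and `cast`, and the pair representation, are illustrative choices of mine, not from the notes): every value of the dynamic language inhabits a single type, represented as a tagged pair, and all type checking happens at run time when we untag.

```python
# Sketch: "dynamic typing is static typing with one type".
# Every value is a (tag, payload) pair; the single static type is
# "tagged pair". Names and representation are illustrative only.

def num(n):
    """Tag a number."""
    return ("num", n)

def fun(f):
    """Tag a function."""
    return ("fun", f)

def cast(d, tag):
    """Untag a value, checking its tag at run time."""
    if d[0] != tag:
        raise TypeError(f"expected {tag}, got {d[0]}")
    return d[1]

# Every expression evaluates to a tagged pair; checks happen at run time.
double = fun(lambda d: num(2 * cast(d, "num")))
result = cast(cast(double, "fun")(num(21)), "num")  # 42
```

Statically, everything here has one type (a pair); the `cast` calls are where the "typing" actually happens, and where it can fail at run time.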

We touched on this topic in lecture, but let's revisit it: what are the tradeoffs associated with dynamic types?

Pro: Dynamic types have an easy safety proof.
Con: The safety proof doesn't give you nearly as many guarantees.

Pro:
Con: Runtime tagging and untagging are slow, especially in loops.

Pro: Omitting type annotations makes certain code easier to write.
Con: Type annotations are not possible, meaning they can't serve as documentation to readers.

Pro: You can worry less about the types of arguments, because functions are generally liberal in what they accept.
Con: Thinking about the types of expressions is a powerful programming tool for structuring code.

Of course, there are other axes I've omitted here. Feel free to consider more. Of these, I find that the Cons of dynamic typing outweigh the Pros (and even that some of the Pros are in fact non-Pros). So the question still remains: why dynamic types at all?

My guess is that it's easier to implement dynamic types than it is to implement ML modules. When you're designing a practical programming language, you want generics and polymorphism. The type-theoretic way to do this is to implement modules, which is hard. You also probably want to implement type inference on modules, which is harder.[1] Alternatively, you can just give up and implement dynamic typing.[2]

From a usability perspective, once you have type inference the code looks nearly identical in SML compared with, say, Python, so the conciseness of the code doesn't factor into the argument. Rather, it's that type inference as SML does it is actually non-trivial. The only remaining Pro is that people believe having to think about the types when writing the code is a hindrance, rather than a tool. This is a classic example of a tradeoff between a short-term and a long-term gain:

* the initial cost of learning how types work, vs.
* the long-term payoff from using types to reason about your code.

How highly do you value this tradeoff?
[1] On the contrary, type inference for PCF (or similar languages) is rather straightforward. See for example https://github.com/jozefg/hm.
[2] When designing languages, many people focus first on the concrete syntax of their language, rather than its semantics. To see this, look at any canonical compilers textbook: the first subject covered is parsing, and usually for a disproportionately long time. In this class, one of the main lessons is that language design should value semantics above syntax.

DPCF & HPCF

In our discussions of dynamic types, we have two choices:

* treat all expressions as dyns
* treat dyn as yet another type that can be added to any other language

DPCF is the former. It has the same computational power as PCF, but all expressions create dyns:

    d ::= x | z | s(d) | ifz d { z ↪ d | s(x) ↪ d }
        | λx. d | d d | fix x is d | num[n]

Another language we can consider is HPCF. It is just PCF, but we add in a type dyn and operations on that type:

    τ ::= nat | τ → τ | dyn
    l ::= num | fun
    e ::= x | z | s(e) | ifz e { z ↪ e | s(x) ↪ e }
        | λx : τ. e | e e | fix x : τ is e
        | l! e       (tag)
        | e @ l      (cast)
        | l? e       (is-a?)

Optimizing Dynamic Code

Let's consider the plus function in DPCF.

    // Note how DPCF requires no type annotations...
    // everything's a dyn!
    fix plus is λn. λm.
      ifz n {
        z ↪ m
      | s n' ↪ s (plus n' m)
      }

If we were to dynamically evaluate this function, we would see a lot of runtime checks for things like is_num and is_fun. These checks are compounded by the fact that we have to run them on every recursive call, often for things whose types we already know.
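To see just how many checks a dynamic evaluator performs, here is a Python sketch of plus over tagged values (the `("num", k)` representation and the `cast`/`CHECKS` names are illustrative, not from the notes). Every recursive call re-checks a tag we already checked one level up.

```python
# Sketch: how a dynamic evaluator runs DPCF's plus. Every value
# carries a tag, and every recursive call re-checks tags that are
# already known. Representation is illustrative only.

CHECKS = 0  # count of runtime tag checks performed

def cast(d, tag):
    """Untag a value, checking its tag at run time (counted)."""
    global CHECKS
    CHECKS += 1
    if d[0] != tag:
        raise TypeError(f"expected {tag}, got {d[0]}")
    return d[1]

def plus(n, m):
    """DPCF-style plus: n and m are tagged ('num', k) values."""
    k = cast(n, "num")  # is_num check on *every* call
    if k == 0:
        return m
    # untag the recursive result, re-tag the successor
    return ("num", 1 + cast(plus(("num", k - 1), m), "num"))

result = plus(("num", 3), ("num", 4))
```

Adding 3 and 4 performs seven tag checks: one on the way down at each of the four calls, and one on the way back up at each of the three non-zero levels, even though after the first check the argument is known to be a num.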

Let's optimize this function by translating it to HPCF, and then moving around some of the checks that we know are no-ops.

    // In HPCF, we have to annotate types, and we have
    // to introduce dyns with ! and eliminate them with @
    fix plus : dyn is
      fun! λn : dyn. fun! λm : dyn.
        ifz (n @ num) {
          z ↪ m
        | s n' ↪ num! (s (((((plus @ fun) (num! n')) @ fun) m) @ num))
        }

But we know that plus is actually always a dyn -> dyn -> dyn. Let's rewrite our code to make use of this fact:

    fix plus : dyn -> dyn -> dyn is
      λn : dyn. λm : dyn.
        ifz (n @ num) {
          z ↪ m
        | s n' ↪ num! (s ((plus (num! n') m) @ num))
        }

But wait, we know that every recursive invocation of plus essentially uses it like nat -> nat -> nat. Let's make that optimization:

    fix plus : nat -> nat -> nat is
      λn : nat. λm : nat.
        ifz n {
          z ↪ m
        | s n' ↪ s (plus n' m)
        }

Hmm... this looks suspiciously like normal PCF code. We don't use any of our new HPCF constructs here! However, we've changed the type. External call sites expect that plus is a dyn, so let's wrap it back up as such:

    let
      val fastplus =
        fix plus : nat -> nat -> nat is
          λn : nat. λm : nat.
            ifz n {
              z ↪ m
            | s n' ↪ s (plus n' m)
            }
    in
      fun! λx : dyn. fun! λy : dyn.
        num! (fastplus (x @ num) (y @ num))
    end
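The same boundary trick can be sketched in Python (the tagged-pair representation and the names `fastplus`/`plus` are illustrative): the inner loop works on plain, untagged values, and tagging/untagging happens exactly once at the edges.

```python
# Sketch of the final HPCF optimization: hoist all tag checks to the
# boundary, and run the recursive inner loop on untagged values.
# Representation and names are illustrative only.

def cast(d, tag):
    """Untag a value, checking its tag at run time."""
    if d[0] != tag:
        raise TypeError(f"expected {tag}, got {d[0]}")
    return d[1]

def fastplus(n, m):
    """The nat -> nat -> nat version: plain ints, zero tag checks."""
    return m if n == 0 else 1 + fastplus(n - 1, m)

def plus(x, y):
    """The dyn-typed wrapper: untag each argument once, tag the
    result once. External callers still see tagged values."""
    return ("num", fastplus(cast(x, "num"), cast(y, "num")))

result = plus(("num", 3), ("num", 4))
```

Adding n and m now costs two tag checks and one tag, total, instead of a check at every level of the recursion.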

And voilà. In the process of optimizing the code, we really just re-wrote it with types. This allowed us to get rid of the excessive tagging and untagging operations, so our inner loop runs really fast.

This is how JIT compilers work. They start out by interpreting your code, but after a while they take note when a function is only ever called with certain types of arguments. When this happens, they compile the function in question to native (i.e., typed) code, and dispatch to the compiled code instead of the interpreted code.

To learn more: https://hacks.mozilla.org/2017/02/a-crash-course-in-just-in-time-jit-compilers/
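A greatly simplified Python sketch of that dispatch strategy (the threshold, the call counter, and the function names are all illustrative; real JITs also profile per call site and insert guards so they can deoptimize if the type assumption ever breaks):

```python
# Toy sketch of JIT-style dispatch: interpret a function while
# profiling it, then switch to a "compiled" specialization once the
# call pattern looks stable. Greatly simplified and illustrative only.

def interp_plus(x, y):
    """Slow path: check tags on every call, like an interpreter."""
    assert x[0] == "num" and y[0] == "num"
    return ("num", x[1] + y[1])

def compiled_plus(a, b):
    """Fast path: works on untagged ints; checks were hoisted out.
    (A real JIT would guard this assumption and deoptimize on failure.)"""
    return a + b

THRESHOLD = 3  # how many interpreted calls before "compiling"
calls = 0

def plus(x, y):
    """Dispatch: interpret until the function looks monomorphic,
    then jump to specialized code, untagging only at the boundary."""
    global calls
    calls += 1
    if calls <= THRESHOLD:
        return interp_plus(x, y)               # interpreted path
    return ("num", compiled_plus(x[1], y[1]))  # "compiled" path

results = [plus(("num", i), ("num", 1)) for i in range(5)]
```

The first three calls run through the interpreter; the remaining calls dispatch to the specialized code, exactly as in the HPCF rewrite above.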