Where We Are Source Code Lexical Analysis Syntax Analysis Semantic Analysis IR Generation IR Optimization Code Generation Optimization Machine Code


What is IR Generation? Intermediate Representation Generation. The final phase of the compiler front-end. Goal: Translate the program into the format expected by the compiler back-end. Generated code need not be optimized; that's handled by later passes. Generated code need not be in assembly; that can also be handled by later passes.

Why Do IR Generation? Simplify certain optimizations. Machine code has many constraints that inhibit optimization. (Such as?) Working with an intermediate language makes optimizations easier and clearer. Have many front-ends into a single back-end. gcc can handle C, C++, Java, Fortran, Ada, and many other languages. Each front-end translates source to the GENERIC language. Have many back-ends from a single front-end. Do most optimization on intermediate representation before emitting code targeted at a single machine.

Designing a Good IR IRs are like type systems: they're extremely hard to get right. Need to balance the needs of the high-level source language and the low-level target language. Too high level: can't optimize certain implementation details. Too low level: can't use high-level knowledge to perform aggressive optimizations. Often have multiple IRs in a single compiler.

Architecture of gcc Source Code → AST → GENERIC → High GIMPLE → SSA → Low GIMPLE → RTL → Machine Code

Another Approach: High-Level IR Examples: Java bytecode, CPython bytecode, LLVM IR, Microsoft CIL. Retains high-level program structure. (Try playing around with javap vs. a disassembler.) Allows for compilation on many target machines. Allows for JIT compilation or interpretation.

Runtime Environments

An Important Duality Programming languages contain high-level structures: Functions Objects Exceptions Dynamic typing Lazy evaluation (etc.) The physical computer only operates in terms of several primitive operations: Arithmetic Data movement Control jumps

Runtime Environments We need to come up with a representation of these high-level structures using the low-level structures of the machine. A runtime environment is a set of data structures maintained at runtime to implement these high-level structures: e.g. the stack, the heap, the static area, virtual function tables, etc. It depends strongly on the features of both the source and target languages (e.g. compiler vs. cross-compiler). Our IR generator will depend on how we set up our runtime environment.

Data Representations What do different types look like in memory? The machine typically supports only limited types: Fixed-width integers: 8-bit, 16-bit, 32-bit; signed, unsigned, etc. Floating point values: 32-bit, 64-bit, 80-bit IEEE 754. How do we encode our object types using these types?

Encoding Primitive Types Primitive integral types (byte, char, short, int, long, unsigned, uint16_t, etc.) typically map directly to the underlying machine type. Primitive real-valued types (float, double, long double) typically map directly to underlying machine type. Pointers typically implemented as integers holding memory addresses. Size of integer depends on machine architecture; hence 32-bit compatibility mode on 64-bit machines.

Encoding Arrays C-style arrays: Elements laid out consecutively in memory. Arr[0] Arr[1] Arr[2]... Arr[n-1] Java-style arrays: Elements laid out consecutively in memory with size information prepended. n Arr[0] Arr[1] Arr[2]... Arr[n-1] D-style arrays: Elements laid out consecutively in memory; array variables store pointers to first and past-the-end elements. Arr[0] Arr[1] Arr[2]... Arr[n-1] First Past-End (Which of these works well for Decaf?)


Encoding Multidimensional Arrays Often represented as an array of arrays. Shape depends on the array type used. C-style arrays: int a[3][2]; is laid out as three size-2 arrays back to back: a[0][0] a[0][1] a[1][0] a[1][1] a[2][0] a[2][1]. How do you know where to look for an element in an array like this?

Encoding Multidimensional Arrays Often represented as an array of arrays. Shape depends on the array type used. Java-style arrays: int[][] a = new int[3][2]; the outer array stores its length (3) and references a[0], a[1], a[2], each pointing to a separate inner array that stores its own length (2) followed by its elements: 2 a[0][0] a[0][1], 2 a[1][0] a[1][1], 2 a[2][0] a[2][1].

Encoding Functions Many questions to answer: What does the dynamic execution of functions look like? Where is the executable code for functions located? How are parameters passed in and out of functions? Where are local variables stored? The answers strongly depend on what the language supports.

Review: The Stack Function calls are often implemented using a stack of activation records (or stack frames). Calling a function pushes a new activation record onto the stack. Returning from a function pops the current activation record from the stack. Questions: Why does this work? Does this always work?

Activation Trees An activation tree is a tree structure representing all of the function calls made by a program on a particular execution. Depends on the runtime behavior of a program; can't always be determined at compile-time. (The static equivalent is the call graph). Each node in the tree is an activation record. Each activation record stores a control link to the activation record of the function that invoked it.

Activation Trees
int main() { Fib(3); }
int Fib(int n) { if (n <= 1) return n; return Fib(n - 1) + Fib(n - 2); }
Running main grows the tree one activation at a time: main calls Fib (n = 3); Fib (n = 3) calls Fib (n = 2) and then Fib (n = 1); Fib (n = 2) calls Fib (n = 1) and Fib (n = 0).

An activation tree is a spaghetti stack.

The runtime stack is an optimization of this spaghetti stack.

Why Can We Optimize the Stack? Once a function returns, its activation record cannot be referenced again. We don't need to store old nodes in the activation tree. Every activation record has either finished executing or is an ancestor of the current activation record. We don't need to keep multiple branches alive at any one time. These are not always true!

Breaking Assumption 1 Once a function returns, its activation record cannot be referenced again. Any ideas on how to break this? One option: Closures
function CreateCounter() { var counter = 0; return function() { counter++; return counter; } }

Closures
function CreateCounter() { var counter = 0; return function() { counter++; return counter; } }
function MyFunction() { f = CreateCounter(); print(f()); print(f()); }
Tracing MyFunction: the call to CreateCounter creates an activation record holding counter = 0, then returns a function <fn> that still references that record. The first call f() increments counter to 1 and prints 1; the second call increments it to 2 and prints 2. CreateCounter's activation record must therefore outlive the call that created it.

Control and Access Links The control link of a function is a pointer to the function that called it. Used to determine where to resume execution after the function returns. The access link of a function is a pointer to the activation record in which the function was created. Used by nested functions to determine the location of variables from the outer scope.

Closures and the Runtime Stack Languages supporting closures typically do not have a runtime stack. Activation records are typically dynamically allocated and garbage collected. Interesting exception: GNU C (a gcc extension) allows for nested functions, but uses a runtime stack. Behavior is undefined if a nested function accesses data from its enclosing function once that function returns. (Why?)

Breaking Assumption 2 Every activation record has either finished executing or is an ancestor of the current activation record. Any ideas on how to break this? One idea: Coroutines
def downfrom(n):
    while n > 0:
        yield n
        n = n - 1

Coroutines
def downfrom(n):
    while n > 0:
        yield n
        n = n - 1

def myfunc():
    for i in downfrom(3):
        print i

Tracing myfunc: the call downfrom(3) creates an activation record with n = 3 that is suspended and resumed rather than destroyed. Each yield hands a value back to myfunc (3, then 2, then 1), which prints it; each resumption decrements n in the still-live downfrom record. Neither record is an ancestor of the other while both are alive; control ping-pongs between them until n reaches 0 and the loop in myfunc terminates.

Coroutines A subroutine is a function that, when invoked, runs to completion and returns control to the calling function. Master/slave relationship between caller/callee. A coroutine is a function that, when invoked, does some amount of work, then returns control to the calling function. It can then be resumed later. Peer/peer relationship between caller/callee. Subroutines are a special case of coroutines.

Coroutines and the Runtime Stack Coroutines often cannot be implemented with a runtime stack alone. What if a function has multiple coroutines running alongside it? Few languages support coroutines, though some do (Python, for example).

So What? Even a concept as fundamental as the stack is actually quite complex. When designing a compiler or programming language, you must keep in mind how your language features influence the runtime environment. Always be critical of the languages you use!