Do Extraterrestrials Use Functional Programming? Manuel M T Chakravarty, University of New South Wales» Straight to next slide [15min Question (λ); 20min Methodology; 15min Application] mchakravarty · TacticalGrace
This talk will be in three parts. (1) Discussing the essence of functional programming: what makes FP tick? (2) How do FP principles influence software development? I will propose a development methodology for FP. (3) A look at a concrete development project where we applied this methodology.»» Let's start with the question. Part 1: The Question
» <Read question> * To visit us, they would need to be at an advanced technological level, with a deep understanding of science. * They won't speak any of humanity's languages, though. So how do we establish a common basis? Do Extraterrestrials Use Functional Programming?
π? E = mc² * How to communicate? * Common idea: universal principles may help establish a basis: universal constants or universal laws.
M, N ::= x | λx.M | M N [Alonzo Church, Alan Turing] * Computer languages? Agree on a common language of computation? * In 1936, Alonzo Church introduced the lambda calculus: <Explain lambda calculus> * Could it serve as a common language? Like a computational Esperanto? * There are also other calculi/machines. Most famous: Turing machines. Which would aliens pick?»» Let's look at how they are related.
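The slide only shows the grammar, but the calculus can express arbitrary computation with nothing beyond functions. As an illustrative sketch (my example, not the talk's), Church numerals encode natural numbers as pure lambda terms, and arithmetic falls out of function application:

```haskell
-- Church numerals: a natural number n is a function that applies
-- its first argument n times to its second.
type Church a = (a -> a) -> a -> a

zero, one :: Church a
zero _ z = z
one  s z = s z

-- Successor: apply s one extra time.
suc :: Church a -> Church a
suc n s z = s (n s z)

-- Addition and multiplication, purely by composing applications.
add :: Church a -> Church a -> Church a
add m n s z = m s (n s z)

mul :: Church a -> Church a -> Church a
mul m n s = m (n s)

-- Convert back to an ordinary Int by counting applications.
toInt :: Church Int -> Int
toInt n = n (+ 1) 0

main :: IO ()
main = print (toInt (add (suc one) (mul (suc one) (suc one))))  -- 2 + 2*2 = 6
```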
Lambda Calculus Turing Machine M, N ::= x | λx.M | M N By-product of a study of the foundation and expressive power of mathematics. * The lambda calculus and Turing machines have the same origin. * Beginning of the 20th century: a group of famous mathematicians was interested in formalising the foundation of mathematics.»» This led to an important question
Is there a solution to the Entscheidungsproblem? No! David Hilbert * Challenge posed by David Hilbert, 1928: the Entscheidungsproblem (decision problem) * Church & Turing, 1936: no solution exists, shown using the lambda calculus & Turing machines.»» So what is the Entscheidungsproblem?
Is there an algorithm to decide whether a given statement is provable from a set of axioms using the rules of first-order logic? * In other words: given a world & a set of fixed rules in that world, check whether the world has a particular property.»» This, in turn, leads to the question
How do you prove that an algorithm does not exist? * Just because we cannot solve the challenge doesn't mean it is unsolvable. * We need a systematic way to rigorously prove that a solution is impossible.»» Church & Turing proceeded as follows
(1) Define a universal language or abstract machine. (2) Show that the desired algorithm cannot be expressed in the language. * In 1936, the concept of an algorithm remained to be formally defined
Universal language Church-Turing thesis Lambda Calculus Turing Machine M, N ::= x | λx.M | M N * Two steps <Explain> * Church & Turing used: (1) lambda terms, (2) Turing machines * Hypothesis: both are universal, ie, any algorithmically computable function can be expressed»» They showed
Computational Power Lambda Calculus = Turing Machine M, N ::= x | λx.M | M N * Any program expressible in one is expressible in the other.»» However,
Generality Lambda Calculus Turing Machine M, N ::= x | λx.M | M N * Lambda calculus: embodies the concept of (functional) *abstraction* * Functional abstraction is only one embodiment of an underlying, more general concept.»» This is important, as
Generality increases if a discovery is independently made in a variety of contexts.» Read the statement. * If a concept transcends one application, its generality increases.»» This is the case for the lambda calculus
Simply typed lambda calculus (lambda calculus with monotypes) * Firstly, the lambda calculus (no polytypes)»» Mathematicians Haskell Curry & William Howard discovered: it is structurally equivalent to
Intuitionistic propositional logic (constructive logic)»» Later, Joachim Lambek found: they correspond to
Curry-Howard-Lambek correspondence: Simply typed lambda calculus ~ Intuitionistic propositional logic ~ Cartesian closed categories (a structure from category theory) * Three independently discovered artefacts share the same structure! * This implies an equivalence between programming & proving.»» The upshot of all this
Alonzo Church didn't invent the lambda calculus; he discovered it.» Read the statement. * Just like Isaac Newton didn't invent the Law of Gravity, but discovered it.»» Getting back to our extraterrestrials
M, N ::= x | λx.M | M N * Lambda calculus: a fundamental, inevitable, universal notion of computation. * In all likelihood, extraterrestrials know about it, just like they will know π.
* Is all this simply an academic curiosity? * Does it impact the practical use of FLs?»» It is crucial for FLs. So what?
Elm Racket Clojure Scheme SASL SISAL Scala Agda F# Clean OCaml Haskell J Standard ML Erlang Id LISP ISWIM λ Hope Miranda FP * FLs: pragmatic renderings of the lambda calculus, with syntactic sugar etc for convenience. * Important application: compilation via extended lambda calculi as ILs (eg, GHC)»» Moreover, central language features
Higher-order functions & closures · Purity · Immutable structures · Types · Well-defined semantics. Central language features of FLs have their origin in the lambda calculus: * HO functions & closures: lambda * Purity & immutable structures: functional semantics * Types & semantics: logic & Curry-Howard
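To make the feature list concrete, here is a minimal Haskell sketch (my own example, not from the talk) showing two of the listed features: a higher-order function taking a function as argument, and a pure transformation that leaves its input untouched.

```haskell
import Data.Char (toUpper)

-- Higher-order function & closure: 'applyTwice' takes a function
-- as its argument and composes it with itself.
applyTwice :: (a -> a) -> a -> a
applyTwice f = f . f

-- Purity & immutability: 'map' builds a new list; the argument
-- string is never modified, so the result depends only on the input.
shout :: String -> String
shout = map toUpper

main :: IO ()
main = do
  print (applyTwice (+ 3) 10)  -- 16
  putStrLn (shout "lambda")    -- LAMBDA
```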
Language features: Immutable structures; Higher-order functions & closures; Types; Well-defined semantics; Purity. Practical advantages: Concurrency & parallelism; Meta programming; Reuse; Strong isolation; Safety; Formal reasoning. * Language features lead to practical advantages * Some examples: <explain where they come from>»» Nevertheless, we can gain even more from the foundation of FP than these advantages
Part 2 From Language to Methodology * Part 1: FP derives from natural, fundamental concept of computation... *...which is the root of language conveniences and practical advantages.»» We want to take that concept one step further
Functional programming as a development methodology, not just a language category.» We want to use <read the statement>. * Use the principles of the lambda calculus for a software development methodology. [Engineering is based on science. This is the science of programming/software.]»» To do this
The key to functional software development is a consistent focus on properties.» We need to realise that <read the statement> * These can be "logical properties" or "mathematical properties".»» More precisely,
Properties: Rigorous, formal or semi-formal specifications. Cover one or more aspects of a program. Leverage the mathematics of the lambda calculus. * Properties are rigorous and precise. (NB: a PL is a formal notation.) * We are not talking about specifying the entire behaviour of an application. (Type signatures are properties.) * In one way or another, they leverage the formal foundation of the lambda calculus.»» Let's look at some examples
Well-known property: A pure function is fully specified by a mapping of argument to result values.» Read the statement. * Means: if you know the arguments, you know the result. * (1) Nothing else influences the result; (2) the function doesn't do anything but provide the result. * This is semi-formal, but easy to formalise.
Types are properties map :: (a -> b) -> [a] -> [b] eval :: Expr t -> t +-comm : {n m : ℕ} → n + m ≡ m + n * map: well known * eval: type-safe evaluator with GADTs * Agda lemma: commutativity of addition»» Types are not just for statically typed languages
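The `eval :: Expr t -> t` signature is the type-safe evaluator idiom: the type index guarantees that only well-typed expressions can be built. A minimal runnable sketch with GADTs (the particular constructors are my own illustrative choices):

```haskell
{-# LANGUAGE GADTs #-}

-- The index 't' ties each expression to the type of its value,
-- so ill-typed terms are rejected by the Haskell type checker.
data Expr t where
  IntLit   :: Int  -> Expr Int
  BoolLit  :: Bool -> Expr Bool
  Add      :: Expr Int  -> Expr Int -> Expr Int
  LessThan :: Expr Int  -> Expr Int -> Expr Bool
  If       :: Expr Bool -> Expr t   -> Expr t -> Expr t

-- No runtime type checks and no error cases are needed:
-- the signature 'Expr t -> t' is itself the property.
eval :: Expr t -> t
eval (IntLit n)     = n
eval (BoolLit b)    = b
eval (Add a b)      = eval a + eval b
eval (LessThan a b) = eval a < eval b
eval (If c t e)     = if eval c then eval t else eval e

main :: IO ()
main = print (eval (If (LessThan (IntLit 1) (IntLit 2))
                       (Add (IntLit 3) (IntLit 4))
                       (IntLit 0)))  -- 7
```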
Racket (Scheme dialect) The Process: [..] 2. Write down a signature, [..] * HTDP encourages the use of function signatures as part of the design process. * It also uses data definitions (reminiscent of data type definitions) * Racket also supports checked "contracts"
Logic formulas -- QuickCheck prop_union s1 (s2 :: Set Int) = (s1 `union` s2) ==? (toList s1 ++ toList s2) * In formal specifications * But also useful for testing: QuickCheck, a popular specification-based testing framework»» And as the last example of a property
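The slide's property uses the QuickCheck library, which generates random inputs. As a dependency-free sketch of the same idea (with a toy `Set` type of my own, not the slide's actual library), properties can be stated as Bool-valued functions and checked over sample inputs:

```haskell
import Data.List (nub, sort)

-- A toy set: a sorted, duplicate-free list (a stand-in for a real Set type).
type Set a = [a]

fromList :: Ord a => [a] -> Set a
fromList = nub . sort

union :: Ord a => Set a -> Set a -> Set a
union s1 s2 = fromList (s1 ++ s2)

-- Properties in the QuickCheck spirit, checked here over hand-picked
-- samples instead of randomly generated ones.
propComm :: Set Int -> Set Int -> Bool
propComm s1 s2 = union s1 s2 == union s2 s1

propAbsorb :: Set Int -> Set Int -> Bool
propAbsorb s1 s2 = union s1 (union s1 s2) == union s1 s2

main :: IO ()
main = print (and [ p a b | (a, b) <- samples, p <- [propComm, propAbsorb] ])
  where
    samples = [ (fromList x, fromList y)
              | x <- [[], [1, 2, 2], [3, 1]], y <- [[], [2, 3]] ]
```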
Algebraic and categorical structures class Monad m where (>>=) :: m a -> (a -> m b) -> m b; return :: a -> m a -- Laws: return a >>= k == k a; m >>= return == m; m >>= (\x -> k x >>= h) == (m >>= k) >>= h * Monads: categorical structures that need to obey certain laws. * Think of them as API patterns.
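The three laws can be checked concretely for any particular monad. A small sketch for `Maybe` (the continuations `k` and `h` are arbitrary examples of mine; any functions of those types would do):

```haskell
-- Example continuations for checking the monad laws on Maybe.
k, h :: Int -> Maybe Int
k x = Just (x + 1)
h x = if x > 0 then Just (x * 2) else Nothing

-- Each law as a Bool-valued function over a sample value and action.
leftId, rightId, assoc :: Int -> Maybe Int -> Bool
leftId a _  = (return a >>= k) == k a
rightId _ m = (m >>= return) == m
assoc _ m   = (m >>= (\x -> k x >>= h)) == ((m >>= k) >>= h)

main :: IO ()
main = print (and [ law a m | a   <- [0, 3]
                            , m   <- [Nothing, Just 5]
                            , law <- [leftId, rightId, assoc] ])
```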
Example of an uncompromising pursuit of properties: I/O in Haskell * Now that we have seen some examples of properties, ...»» ...let's look at an example of guiding a design by properties
(not really Haskell) readname = let firstname = readstring () in let surname = readstring () in firstname ++ " " ++ surname Haskell is a non-strict language * Read two strings from stdin and combine them. * In which order will firstname and surname be read? * Non-strict (or lazy) language: compute when needed»» The problem with I/O, as the following compiler optimisations demonstrate
Common subexpression elimination readname = let firstname = readstring () in let surname = firstname -- was: readstring () in firstname ++ " " ++ surname * Two occurrences of the same lambda term must have the same meaning.
Reordering readname = let surname = readstring () in let firstname = readstring () in firstname ++ " " ++ surname -- the two bindings have been swapped * No data dependency between the two bindings
Dead code elimination readname = let firstname = readstring () in let surname = readstring () in firstname -- surname is unused * If a binding is not used, we should be able to eliminate it. * 1988: the Haskell language committee faced the problem of a mismatch between non-strictness and I/O»» They saw two options
Option ❶ Destroy purity: Prohibit those code transformations. Enforce strict top-to-bottom evaluation of let bindings. Not a good idea!» <Explain>
WG 2.8, 1992: Keep purity! Preserve those code transformations. We want local reasoning. Think about concurrency. [This is not the real committee, but a large part.] * They didn't want to give up this property. * Non-strictness kept them honest.»» This left them with the second option
Option ❷ Continuation-based & Stream-based I/O»» I don't want to explain them in detail, but here is an example
readname :: [Response] -> ([Request], String) readname ~(Str firstname : ~(Str surname : _)) = ([ReadChan stdin, ReadChan stdin], firstname ++ " " ++ surname) readname :: FailCont -> StrCont -> Behaviour readname abort succ = readchan stdin abort (\firstname -> readchan stdin abort (\surname -> succ (firstname ++ " " ++ surname))) * A rather inconvenient programming model * For lack of a better idea, Haskell 1.0 to 1.2 used continuation-based and stream-based I/O»» Can't we do any better?
What are the properties of I/O, of general stateful operations? * Let's take a step back.» Can we use properties to understand the nature of I/O?»» Let's characterise what stateful (imperative) computing is about
State changing function: Arguments, State -> Result, State' * In addition to arguments and result... * ...state is threaded through.»» In the case of I/O
Arguments Result I/O function * The state is the whole world»» How can we formalise this
Arguments Result I/O function Monad! Eugenio Moggi * Categorical semantics of impure language features: properties of impure features * Lambda calculus with impure features * Characterise the meaning of effects»» How can we use that to write functional programs?
Eugenio Moggi * Moggi's semantics is based on the lambda calculus * So, it ought to translate to FLs»» Finally, we can write our example program properly
Eugenio Moggi, Philip Wadler class Monad m where (>>=) :: m a -> (a -> m b) -> m b; return :: a -> m a -- Laws: return a >>= k == k a; m >>= return == m; m >>= (\x -> k x >>= h) == (m >>= k) >>= h instance Monad IO where ... * Moggi's semantics is based on the lambda calculus * So, it ought to translate to FLs»» Finally, we can write our example program properly
(Real Haskell!) readname :: IO String readname = do firstname <- readstring; surname <- readstring; return (firstname ++ " " ++ surname) * Development oriented at properties * The solution has an impact well beyond Haskell I/O»» Functional software development usually doesn't mean resorting to abstract maths
Part 3 Applying the Methodology * So far, we saw that the genesis of FP revolved around working with and exploiting logical & mathematical properties.»» To get a feel for using such properties, let us look at a concrete development effort, where we used properties in many flavours to attack a difficult problem
Case study in functional software development: Pure data parallelism»» Good parallel programming environments are important because of
Ubiquitous parallelism multicore CPU multicore GPU * Today, parallelism is everywhere! <Explain>»» We would like a parallel programming environment meeting the following goals
Goal ➀ Exploit parallelism of commodity hardware easily: Performance is important, but productivity is more important. * We are not aiming at supercomputers * Ordinary applications cannot afford the resources that go into the development of HPC apps.»» To this end
Goal ➁ Semi-automatic parallelism: The programmer supplies a parallel algorithm, but no explicit concurrency (no concurrency control, no races, no deadlocks). * Not fully automatic: computers cannot parallelise algorithms & sequential algorithms are inefficient on parallel hardware. * Explicit concurrency is hard, non-modular, and error-prone.»» How can properties help us to achieve these two goals?
Three property-driven methods: Types; State minimisation; Combinators & embedded languages. Types: track purity, generate array representations, guide optimisations. State minimisation: localised state transformers, immutable structures. Combinators: parallelisable aggregate array operations, exploit algebraic properties, restricted language for special hardware.
Ubiquitous parallelism multicore CPU multicore GPU»» What kind of code do we want to write for parallel hardware?
smvm :: SparseMatrix -> Vector -> Vector smvm sm v = [: sumP (dotP sv v) | sv <- sm :] [Diagram: a sparse matrix sm (non-zero elements 2, 1.5, 5, 3, 4, 1, 7, 6.5) multiplied by a dense vector v, with one Σ (dot product) per sparse row]
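The `[: … :]` brackets are Data Parallel Haskell's parallel array comprehensions. The same algorithm rendered with ordinary lists (sequential, and with concrete type definitions of my own choosing) is a one-liner:

```haskell
-- A sparse matrix as rows of (column index, value) pairs;
-- a dense vector as a plain list.
type SparseMatrix = [[(Int, Float)]]
type Vector       = [Float]

-- One dot product per sparse row: DPH's
--   [: sumP (dotP sv v) | sv <- sm :]
-- becomes an ordinary list comprehension.
smvm :: SparseMatrix -> Vector -> Vector
smvm sm v = [ sum [ x * (v !! i) | (i, x) <- row ] | row <- sm ]

main :: IO ()
main = print (smvm [[(0, 2), (2, 3)], [(1, 4)]] [1, 2, 3])  -- [11.0,8.0]
```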
Types Types ensure purity; purity ensures non-interference. * Functions that are not of monadic type are pure. * Pure functions can execute in any order, also in parallel. => No concurrency control needed [Properties pay off: Types.]»» But we need more than a convenient notation
High performance * Performance is not the only goal, but it is a major goal. * Explain fluid flow.»» We can get good performance
Jos Stam's Fluid Flow Solver * Repa (blue) is on 7 CPU cores (two quad-core Xeon E5405 CPUs @ 2 GHz, 64-bit) * Accelerate (green) is on a Tesla T10 processor (240 cores @ 1.3 GHz) * Repa talk: Ben Lippmeier, Thursday before lunch * Accelerate talk: Trevor McDonell, Friday before lunch
How do we achieve high performance from purely functional code?»» This presents an inherent tension
Performance: unboxed, mutable arrays; C-like loops. Parallelism & Optimisations: pure functions.»» We resolve this tension with local state
Types (Pure) map :: (Shape sh, Source r a) => (a -> b) -> Array r sh a -> Array D sh b * We use a library of pure, parallel, aggregate operations * In Repa, types guide array representations»» Despite the pure interface, some combinators are internally impure
Local state / State minimisation / Combinators: Allocate mutable array → Initialise destructively → Freeze! <Explain> * Program transformations and parallelisation at the pure level * Then, unfold and optimise the imperative program * The type system helps to get this right * Fusion
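The allocate / initialise destructively / freeze pattern is directly expressible in standard Haskell with the `ST` monad. A small sketch of the idiom (not Repa's internals, just the pattern the slide names):

```haskell
import Data.Array.ST (newArray, runSTUArray, writeArray)
import Data.Array.Unboxed (UArray, elems)

-- Mutation is confined to the scope of runSTUArray: the ST type
-- guarantees no mutable state escapes, so 'squares' is pure outside.
squares :: Int -> UArray Int Int
squares n = runSTUArray $ do
  arr <- newArray (0, n - 1) 0                           -- allocate mutable array
  mapM_ (\i -> writeArray arr i (i * i)) [0 .. n - 1]    -- initialise destructively
  return arr                                             -- runSTUArray freezes it

main :: IO ()
main = print (elems (squares 5))  -- [0,1,4,9,16]
```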
Special hardware 12 THREADS (Core i7 970 CPU) vs 24,576 THREADS (NVIDIA GF100 GPU) * Straightforward code generation is not suitable for all architectures»» GPUs are highly parallel, but also restricted in which operations are efficient
GPUs don't like: SIMD divergence (conditionals); Recursion; Function pointers; Automatic garbage collection. * We won't compile all of Haskell to GPUs anytime soon.
Embedded language dotpAcc :: Vector Float -> Vector Float -> Acc (Scalar Float) dotpAcc xs ys = let xs' = use xs; ys' = use ys in fold (+) 0 (zipWith (*) xs' ys') -- Acc marks embedded computations; use embeds values * We special-purpose compile embedded code.
Functional programming is fundamental to computing. Functional software development is property-driven development. [Icons: types, state, languages]
Thank you!
Images from http://wikipedia.org http://openclipart.org http://dx.doi.org/10.1145/1238844.1238856