Lecture 21: Relational Programming II. November 15th, 2011


Lecture Outline
Relational Programming cont'd
The Relational Model of Computation
Programming with choice and Solve
Search Strategies
Relational Parsing
Summary

Relational Programming cont'd
How about Append: is it relational?

Example (Procedure Append)

   proc {Append List1 List2 List3}
      case List1
      of nil then List3 = List2
      [] Head|Tail then List4 in
         List3 = Head|List4
         {Append Tail List2 List4}
      end
   end

Append is not relational: it will work if List2 or List3 (or both) are (partially) unbound; it won't work if List1 is unbound or is a partial list, because of blocking during pattern matching. (The smarter Append, the one with pattern matching involving both List1 and List2 (see lecture 5), wouldn't work if List1 or List2 (or both) were partial lists.)

Relational Programming cont'd
We can try to improve Append using, e.g., exceptions.

Example (Procedure Append, improved) code/append-improved.oz

   proc {Append List1 List2 List3}
      try
         List1 = nil
         List2 = List3
      catch _ then
         Head Tail1 Tail3 in
         List1 = Head|Tail1
         List3 = Head|Tail3
         {Append Tail1 List2 Tail3}
      end
   end

The idea is to try to unify List1 with nil and then List2 with List3, or to do the recursive call when the unification fails: instead of branching with the blocking case, we branch with try/catch. Unfortunately, this version doesn't handle correctly the case where List1 is (partially) unbound.

Relational Programming cont'd
Example (Procedure Append improved, cont'd)

   {Browse {Append [1] [2 3] $}}   % prints [1 2 3]
   {Browse {Append [1] $ [1 2 3]}} % prints [2 3]

   {Browse List2#List3}
   {Append [1] List2 List3}        % prints List2#List3, updates to List2#(1|List2)
   List3 = [1 2 3]                 % updates to [2 3]#(1|2|3|nil)

   {Browse {Append $ [2 3] [1 2 3]}} % fails at nil = _|_

If List1 is unbound, it becomes immediately bound to nil. This binding is not undone if List2 = List3 fails. (Recall that when searching for a catch clause we unwind the stack, but bindings are not undone.) For Append to be relational, we need a tool for undoing bindings, or for performing multiple independent tentative bindings of the same variable.

Relational Programming cont'd
In Prolog, implementing the relational append is trivial.

Example (Relational append, Prolog) code/append.pl

   append([], List, List).
   append([Head|Tail1], List, [Head|Tail2]) :-
      append(Tail1, List, Tail2).

   consult(append).              % load the program
   append([1], [2,3], List).     % List = [1,2,3]
   append([1], Tail, [1,2,3]).   % Tail = [2,3]
   append(Head, [2,3], [1,2,3]). % Head = [1]

append is specified declaratively, by means of two patterns. If the query matches any of the patterns, substitutions are made and either a solution is suggested, or the search continues recursively. If the query does not match any pattern (or if previously found solutions were rejected by the user), failure is reported.

Relational Programming cont'd
Example (Relational Append with or) code/append-or.oz

   proc {Append List1 List2 List3}
      or List1 = nil
         List2 = List3
      [] Head Tail1 Tail3 in
         List1 = Head|Tail1
         List3 = Head|Tail3
         {Append Tail1 List2 Tail3}
      end
   end

The operator or is used to specify alternative computations, each performed within its own computation space. The alternative bindings of List1 (to nil or to Head|Tail1) are done within two separate spaces, and are not visible to each other. When one of the computations succeeds, its bindings are merged ("imported") into the enclosing computation space. (Note: unlike in many other languages, or is not a logical operator in Oz; orelse is.)

Relational Programming cont'd
Example (Relational Append with or, cont'd)

   {Browse {Append [1] [2 3] $}}     % prints [1 2 3]
   {Browse {Append [1] $ [1 2 3]}}   % prints [2 3]
   {Browse {Append $ [2 3] [1 2 3]}} % prints [1]

The or-based Append behaves relationally. Try it! Execute append-test.oz (try append.pl as well). Note: the operator or is not explained in the book nor in CTMCP; or is out of pensum!

Relational Programming cont'd
We can have still more fun with the Prolog append.

Example (append, Prolog)

   consult(append).
   findall(solution(List1, List2),
           append(List1, List2, [1,2,3]),
           Solutions).

findall finds all solutions to a query. The query is append(List1, List2, [1,2,3]), and asks for two lists that, when appended, yield [1,2,3]. Each solution is reported in the form solution(List1, List2). Solutions is a list of all solutions.
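The set of solutions findall returns here is just the set of ways to split [1,2,3] in two. As a cross-check only (this Python snippet is ours, not part of the lecture code), the same pairs can be enumerated directly, in the order Prolog's depth-first search finds them:

```python
def append_solutions(result):
    """All (list1, list2) pairs with list1 + list2 == result,
    first splitting at the front, as Prolog's append does."""
    return [(result[:i], result[i:]) for i in range(len(result) + 1)]

print(append_solutions([1, 2, 3]))
# [([], [1, 2, 3]), ([1], [2, 3]), ([1, 2], [3]), ([1, 2, 3], [])]
```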

Relational Programming cont'd
In Oz, we can't do that with the or-based Append, because or commits as soon as only one alternative can succeed; it cannot enumerate several. But we can do it with the non-deterministic choice statement choice.

Example (Relational Append with choice) code/append-choice.oz

   proc {Append List1 List2 List3}
      choice List1 = nil
         List2 = List3
      [] Head Tail1 Tail3 in
         List1 = Head|Tail1
         List3 = Head|Tail3
         {Append Tail1 List2 Tail3}
      end
   end

Unlike the or-based Append, this version can report all successful computations done in separate spaces; but we need some extra functionality for this to work.

Relational Programming cont'd
Example (Relational Append with choice, cont'd)

   Solutions = {SolveAll
                fun {$}
                   List1 List2 in
                   {Append List1 List2 [1 2 3]}
                   solution(List1 List2)
                end}

SolveAll is an analogue of Prolog's findall. {Append List1 List2 [1 2 3]} is the query (with List1 and List2 explicitly declared, unlike in Prolog). solution(List1 List2) is the form in which each solution is reported. Solutions is a variable that SolveAll binds to a list of all possible solutions. Try it! Execute append-test.oz.

The Relational Model of Computation
The relational model of computation covered in Ch. 9 is an extension of the declarative sequential model of computation from Ch. 2 with non-deterministic choice statements and failure statements. We introduce new syntax, but explain it only with hand-waving semantics. The relational model of computation is based on the broader model of computation used in constraint programming. The constraint-based model of computation is explained in Ch. 12 of CTMCP, and is out of pensum. You need a rough, intuitive understanding of the relational model, with no details on how computation spaces are created and used.

The Relational Model of Computation cont'd
The syntax of the kernel language is extended as follows.

The choice and fail statements:

   statement ::= choice statement { [] statement } end
   statement ::= fail

choice sets a point of non-deterministic choice between a number of possible execution paths. A choice statement freezes the current thread; results from the child computation spaces are examined and collected by a search engine. If fail is executed by any thread in a computation space, the computation in that space fails (a unification failure has the same effect).

The Relational Model of Computation cont'd
To execute a program containing a choice statement, the code has to be wrapped into a no-parameter function and passed as an argument to a one-parameter function implementing a search strategy. Chapter 12 of CTMCP describes a depth-first search strategy, implemented in the function Solve (included as solve.oz in the code directory for this lecture). Solve takes as input a no-parameter function that specifies the problem to be solved. Solve returns a lazy list of solutions; laziness is essential here, because a problem may, in general, have an infinite number of different solutions. The convenience function SolveOne calls Solve and forces the head of the solutions list; it then returns the first solution found by Solve. The convenience function SolveAll calls Solve and forces the entire solutions list; its output is the forced list of all solutions.

Programming with choice and Solve
We can use choice to implement a function that non-deterministically returns a single digit.

Example (Picking a digit) code/digit.oz

   fun {Digit}
      choice 0 [] 1 [] 2 [] 3 [] 4 [] 5 [] 6 [] 7 [] 8 [] 9 end
   end

   {Browse {SolveOne Digit}} % prints [0]
   {Browse {SolveAll Digit}} % prints [0 1 2 3 4 5 6 7 8 9]

The function Digit is trivial, but such functions will be useful when we implement more complicated things, e.g., a parser. ("Non-deterministically" here means: when executed inside a program passed to a search engine; the order in which different values are returned is not, in principle, fixed within the function, but depends on the actual implementation.)
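To get a feel for what SolveOne and SolveAll deliver, here is a Python analogy in which a "relational program" is a generator function and the engine merely forces it. This is an illustration with names of our own choosing; the real Solve manipulates computation spaces and returns a lazy list, for which generator laziness stands in here.

```python
def digit():
    # choice 0 [] 1 [] ... [] 9 end, rendered as alternatives tried in order
    yield from range(10)

def solve_one(program):
    """Force only the first solution, like SolveOne (result in a list, as in Oz)."""
    return [next(program())]

def solve_all(program):
    """Force every solution, like SolveAll."""
    return list(program())

print(solve_one(digit))  # [0]
print(solve_all(digit))  # [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
```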

Programming with choice and Solve
A similar program could easily be written in Prolog.

Example (Picking a digit, Prolog) code/digit.pl

   digit(0). digit(1). digit(2). digit(3). digit(4).
   digit(5). digit(6). digit(7). digit(8). digit(9).

   consult(digit).
   findall(Digit, digit(Digit), Digits). % Digits = [0,1,2,3,...]

Programming with choice and Solve cont'd
Digit is specialized: it returns a digit. But its domain is hard-coded in the body of the function; it would be nice to parameterize it away.

Example (Generic non-deterministic pick) code/pick.oz

   fun {Pick List}
      Head|Tail = List in
      choice Head [] {Pick Tail} end
   end

   fun {Digit} {Pick {Enumerate 0 9}} end

   {Browse {SolveAll Digit}} % prints [0 1 2 3 4 5 6 7 8 9]

Given a list, Pick unifies the list with two unbound variables wrapped into a cons record. If the unification fails, Pick is done (no solution can be produced). Otherwise, Pick chooses between the head of the list and a value recursively picked from the tail.

Programming with choice and Solve cont'd
Again, this is easy to do in Prolog.

Example (Generic non-deterministic pick, Prolog) code/pick.pl

   pick(Head, [Head|Tail]).
   pick(Item, [_|Tail]) :- pick(Item, Tail).

   digit(Digit) :- pick(Digit, [0,1,2,3,4,5,6,7,8,9]).

   findall(Digit, digit(Digit), Digits). % Digits = [0,1,2,3,...]

Given a variable Item and a list [Head|Tail], pick either unifies Item with Head or recursively unifies it with some element of the tail. digit unifies a variable with whatever pick picks from the list of digits.

Programming with choice and Solve cont'd
We can use Digit (and Pick) for some more interesting tasks.

Example (Pairs of one-digit numbers summing up to 10) code/ten.oz

   fun {Ten}
      N#M = {Digit}#{Digit} in
      N + M = 10
      ten(N M)
   end

   {Browse {SolveAll Ten}} % prints [ten(1 9) ten(2 8) ten(3 7) ...]

The function Ten calls {Digit} twice to pick two arbitrary one-digit integers, unifies their sum with 10, and returns the integers wrapped into a record if the unification succeeds. Try it! Execute ten-test.oz.
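The generate-and-test shape of Ten can be mimicked in Python, with nested generators standing in for the two choice-created search branches (an illustration only, not the Mozart machinery):

```python
def digit():
    # choice 0 [] 1 [] ... [] 9 end, as iteration over the alternatives
    yield from range(10)

def ten():
    # two independent picks; the failing unification N + M = 10 yields nothing
    for n in digit():
        for m in digit():
            if n + m == 10:
                yield ("ten", n, m)

print(list(ten())[:3])  # [('ten', 1, 9), ('ten', 2, 8), ('ten', 3, 7)]
```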

Programming with choice and Solve cont'd
Consider the following problem of computing change: given a finite set of coin nominals n1, ..., nm, find all possible ways to give change for a specific amount a of money, using as many coins as you like.

Example (Computing change)

   nominals: nickel (5 cents), dime (10 cents), quarter (25 cents)
   amount:   25 cents
   change:   5 nickels
             3 nickels and 1 dime
             1 nickel and 2 dimes
             1 quarter

Programming with choice and Solve cont'd
Example (Computing change) code/change.oz

   fun {Change Amount Purse}
      if Amount > 0 then
         Coin = {Purse}
         Rest = {Change Amount-Coin Purse}
      in
         {Sum Rest} + Coin = Amount
         Coin|Rest
      else nil end
   end

The function Change takes an amount to be changed and a coin generator. If the amount is not positive, it returns an empty list (no-coin change, or change impossible); otherwise, it picks one coin, computes the change for the rest of the amount, asserts that the coin and the rest of the change add up to the amount, and returns the coin together with the rest.

Programming with choice and Solve cont'd
Example (Computing change, alternative) code/change-alternative.oz

   fun {Change Amount Purse}
      choice
         Amount = 0
         nil
      [] Amount > 0 = true
         Coin = {Purse}
         Rest = {Change Amount-Coin Purse}
      in
         Coin|Rest
      end
   end

The alternative version of Change makes two choices: either the amount is zero, and the change is no coins; or the amount is positive and, given a coin, it is possible to change the rest; the coin is then returned together with the rest of the change. If the amount is negative, the branch of the search simply fails (change impossible).

Programming with choice and Solve cont'd
Example (Computing change, cont'd)

   Nickel = 5  Dime = 10  Quarter = 25
   fun {Purse} {Pick [Nickel Dime Quarter]} end
   {Browse {SolveAll fun {$} {Change Quarter Purse} end}}

The two versions of Change give identical results, but the results list is redundant; some of its elements represent the same change and differ only in the order of coin nominals:

   [[5 5 5 5 5] [5 5 5 10] [5 5 10 5] [5 10 5 5] [5 10 10]
    [10 5 5 5] [10 5 10] [10 10 5]]

Try it! Execute change-test.oz.

Programming with choice and Solve cont'd
Example (Computing change, search space)

(Figure: search tree rooted at [], level by level: [5], [10]; [5 5], [5 10], [10 5], [10 10]; [5 5 5], [5 5 10], [5 10 5], [5 10 10], [10 5 5], [10 5 10], [10 10 5]; [5 5 5 5], [5 5 5 10], [5 5 10 5], [5 10 5 5], [10 5 5 5]; [5 5 5 5 5]. More nodes are examined than shown; only nodes on successful paths appear.)

The search is redundant, because after having chosen the second nominal, we can choose the first one again. If many more nominals were available, the search space would be exponentially larger.

Programming with choice and Solve cont'd
We can prevent the redundancy by demanding that a nominal that has been skipped over in a search branch can never be chosen again.

Example (Non-redundant change computation) code/change-nonredundant.oz

   fun {Change Amount Nominals}
      choice
         Amount = 0
         nil
      [] Amount > 0 = true
         Coin|Coins = Nominals
      in
         choice
            Rest = {Change Amount-Coin Nominals} in
            Coin|Rest
         [] {Change Amount Coins}
         end
      end
   end

Note: unlike in the case where Purse was used, if the first nominal on the list is not used, the recursive call uses only the remaining nominals (second option in the inner choice). Try it! Execute change-test.oz.
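The same skip discipline can be expressed in Python with a generator that either uses the first nominal (keeping it available) or drops it for good. This is a sketch of the idea, not of the Oz semantics; with nominals [5, 10] and amount 25 it produces exactly the three non-redundant solutions, in depth-first order:

```python
def change(amount, nominals):
    """All non-redundant ways to give `amount`: use the first nominal
    (it stays available), or skip it and never choose it again."""
    if amount == 0:
        yield []                                 # change complete
    elif amount > 0 and nominals:
        coin = nominals[0]
        for rest in change(amount - coin, nominals):
            yield [coin] + rest                  # use the first nominal
        yield from change(amount, nominals[1:])  # skip it forever

print(list(change(25, [5, 10])))  # [[5, 5, 5, 5, 5], [5, 5, 5, 10], [5, 10, 10]]
```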

Programming with choice and Solve cont'd
Example (Non-redundant change, search space)

(Figure: search tree rooted at [], level by level: [5]; [5 5], [5 10]; [5 5 5], [5 10 10]; [5 5 5 5], [5 5 5 10]; [5 5 5 5 5].)

The search is no longer redundant, because after having chosen the second nominal, we cannot choose the first one any more.

Search Strategies
The Solve we have used so far implements the depth-first search strategy. Whenever a node does not correspond to a final solution, further choices are made, and the node's children are examined before the node's siblings. Depth-first search is the default strategy built into a Prolog interpreter (there are ways to make Prolog perform, e.g., breadth-first search). The relational model of computation in Oz uses depth-first search, but only because Solve is implemented this way. It is possible to reimplement Solve so that it uses, e.g., breadth-first search, or even lets the user choose the strategy when Solve is called.
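The strategy dependence can be made concrete with an explicit frontier of nodes to explore: a stack gives depth-first order, a FIFO queue breadth-first. The following Python sketch (ours; the real Solve works on computation spaces) runs the non-redundant change search under either strategy:

```python
from collections import deque

def solve_change(amount, nominals, strategy="depth"):
    """Non-redundant change search; the frontier discipline picks the strategy."""
    frontier = deque([(amount, [], nominals)])
    solutions = []
    while frontier:
        amt, coins, noms = frontier.pop() if strategy == "depth" else frontier.popleft()
        if amt == 0:
            solutions.append(coins)
            continue
        # children: spend noms[i]; from then on only noms[i:] may be chosen
        children = [(amt - n, coins + [n], noms[i:])
                    for i, n in enumerate(noms) if amt - n >= 0]
        if strategy == "depth":
            frontier.extend(reversed(children))  # first child ends up on top of the stack
        else:
            frontier.extend(children)
    return solutions

print(solve_change(25, [5, 10], "depth"))    # [[5, 5, 5, 5, 5], [5, 5, 5, 10], [5, 10, 10]]
print(solve_change(25, [5, 10], "breadth"))  # [[5, 10, 10], [5, 5, 5, 10], [5, 5, 5, 5, 5]]
```

Note how the same tree yields the two solution orders discussed on the following slides.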

Search Strategies cont'd
Example (Non-redundant change, depth-first search)

(Figure: the non-redundant search tree with nodes numbered in visit order: 1: [5], 2: [5 5], 3: [5 5 5], 4: [5 5 5 5], 5: [5 5 5 5 5], 6: [5 5 5 10], 7: [5 10], 8: [5 10 10].)

Complete solutions are found at nodes 5, 6, and 8, in this order.

Search Strategies cont'd
Example (Non-redundant change, breadth-first search)

(Figure: the same tree with nodes numbered in breadth-first visit order: 1: [5], 2: [5 5], 3: [5 10], 4: [5 5 5], 5: [5 10 10], 6: [5 5 5 5], 7: [5 5 5 10], 8: [5 5 5 5 5].)

Complete solutions are found at nodes 5, 7, and 8, in this order.

Search Strategies cont'd
The reimplemented Solve takes an additional argument, a flag denoting the strategy to be used.

Example (Computing change, depth- or breadth-first)

   {Browse {SolveAll depth fun {$} {Change Quarter Nominals} end}}
   {Browse {SolveAll breadth fun {$} {Change Quarter Nominals} end}}

Depth-first search: [[5 5 5 5 5] [5 5 5 10] [5 10 10]]. Breadth-first search: [[5 10 10] [5 5 5 10] [5 5 5 5 5]]. (For some reason the implementation of Solve used for demonstration reverses the order of the last two solutions; the implementation is taken from http://cs.calstatela.edu/wiki/index.php/courses/cs 460/Fall 2005/SolveAll.)

Relational Parsing One of the principal motivations for the development of Prolog was to be able to build parsers for artificial and natural languages by specifying the parse rules in a declarative rather than imperative way. The syntax (macrosyntax, as opposed to microsyntax, the lexical structure of lexemes) of a language is typically specified using a grammar. The idea is to write programs that closely resemble grammars in their structure (up to syntactic details of the notation), and have an interpreter use such programs to parse lists of tokens obtained from a tokenizer.

Relational Parsing cont'd
A parser takes as input a linear sequence of tokens (classified lexemes), and returns, in general, a non-linear branching structure, the parse tree. We shall develop a simple syntax checker for a small subset of the Scheme programming language. A syntax checker is a stripped-down version of a parser; it takes as input a sequence of tokens, and checks whether the sequence is valid according to the grammar, i.e., whether it is parsable. Instead of a parse tree, a syntax checker returns true for syntactically valid token sequences, and false for invalid ones.

Relational Parsing cont'd
Example (Grammar for a subset of Scheme)

   program             ::= { instruction }
   instruction         ::= variable definition | function definition | expression
   variable definition ::= ( DEFINE identifier expression )
   function definition ::= ( DEFINE ( identifier identifier ) expression )
   expression          ::= identifier | number | application
   identifier          ::= ID(λ)
   number              ::= NUM(λ)
   application         ::= ( identifier expression )

DEFINE, ID, and NUM are token class indicators. Tokens of the latter two classes are reported (by the tokenizer) together with the corresponding lexemes (λ).

Relational Parsing cont'd
We can represent tokens as records and atoms (no-content records), and programs as lists of records.

Example (Tokenized programs)

   (define a 1) is tokenized to
      ['(' define id(a) num(1) ')']
   (define (f x) x) is tokenized to
      ['(' define '(' id(f) id(x) ')' id(x) ')']
   (define (g x) (f (f x))) is tokenized to
      ['(' define '(' id(g) id(x) ')' '(' id(f) '(' id(f) id(x) ')' ')' ')']

Relational Parsing cont'd
For each rule in the grammar, we implement a procedure (or function) that takes as input a sequence of tokens, tries to parse from the sequence a construct corresponding to the rule it implements, returns the remaining tokens on success, and fails otherwise.

Example (Parsing an identifier)

   proc {Identifier Tokens Rest}
      Tokens = id(_)|Rest
   end

Identifier unifies the head of Tokens with an identifier record (ignoring the lexeme), and unifies Rest with the tail of Tokens.

Relational Parsing cont'd
Example (Parsing an expression)

   fun {Expression Tokens}
      choice {Identifier Tokens}
      [] {Number Tokens}
      [] {Application Tokens}
      end
   end

Expression non-deterministically tries to parse an identifier, a number, or an application construct from Tokens, and returns the remaining tokens received from any of the nested calls.

Relational Parsing cont'd
Example (Parsing variable definitions)

   fun {VariableDefinition Tokens}
      Rest Rest2 in
      Tokens = '('|define|Rest
      ')'|Rest2 = {Expression {Identifier Rest}}
      Rest2
   end

VariableDefinition unifies the first two tokens in Tokens with '(' and define, parses an identifier from the remaining tokens, parses an expression from what Identifier returns, then parses ')' from what Expression returns and returns the remaining tokens.

Relational Parsing cont'd
Example (Parsing function definitions)

   fun {FunctionDefinition Tokens}
      Rest Rest2 Rest3 in
      Tokens = '('|define|'('|Rest
      ')'|Rest2 = {Identifier {Identifier Rest}}
      ')'|Rest3 = {Expression Rest2}
      Rest3
   end

FunctionDefinition works like VariableDefinition, just that the parsed construct is more complex.

Relational Parsing cont'd
Example (Parsing sequences of tokens)

   fun {Sequence Tokens}
      choice Tokens = nil
         nil
      [] {Sequence {Instruction Tokens}}
      end
   end

Sequence non-deterministically chooses between Tokens being an empty list and Tokens being a list containing an instruction possibly followed by further tokens.

Relational Parsing cont'd
Example (Parsing programs)

   fun {Check Tokens}
      fun {Sequence} ... end
      fun {Instruction} ... end
      ...
   in
      {SolveAll fun {$} {Sequence Tokens nil} true end} \= nil
   end

Check uses SolveAll to find all solutions to the parse problem, and returns true if the solutions list is non-empty. The function passed to SolveAll succeeds if Tokens can be parsed as a valid program with no trailing tokens; on success, true is returned. (Any other value could be used; in particular, it could be a parse tree.)
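The rule-per-procedure scheme translates readily into other languages. As an illustration only (the token representation and all names below are ours, not the lecture's), here is a Python sketch in which generator alternation plays the role of choice, each parser yields the remaining tokens, and an empty remainder signals a complete parse:

```python
def lit(sym):
    # parse one literal token, e.g. "(" or "define"
    def parse(toks):
        if toks and toks[0] == sym:
            yield toks[1:]
    return parse

def tok(cls):
    # parse one classified token, e.g. ("id", "a") or ("num", 1)
    def parse(toks):
        if toks and isinstance(toks[0], tuple) and toks[0][0] == cls:
            yield toks[1:]
    return parse

identifier = tok("id")
number = tok("num")

def seq(*parsers):
    # run parsers in order, threading the remaining tokens through
    def parse(toks):
        if not parsers:
            yield toks
        else:
            for rest in parsers[0](toks):
                yield from seq(*parsers[1:])(rest)
    return parse

def alt(*parsers):
    # non-deterministic choice: try every alternative (cf. choice)
    def parse(toks):
        for p in parsers:
            yield from p(toks)
    return parse

def expression(toks):
    yield from alt(identifier, number, application)(toks)

def application(toks):
    yield from seq(lit("("), identifier, expression, lit(")"))(toks)

variable_definition = seq(lit("("), lit("define"), identifier, expression, lit(")"))
function_definition = seq(lit("("), lit("define"), lit("("),
                          identifier, identifier, lit(")"), expression, lit(")"))
instruction = alt(variable_definition, function_definition, expression)

def sequence(toks):
    yield toks                       # the empty sequence of instructions
    for rest in instruction(toks):
        yield from sequence(rest)    # an instruction, then the rest

def check(tokens):
    # valid iff some parse consumes every token
    return any(rest == [] for rest in sequence(tokens))

print(check(["(", "define", ("id", "a"), ("num", 1), ")"]))  # True
```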

Relational Parsing contd Try it! Execute checker-test.oz. Try it! Examine parser.oz and execute parser-test.oz to see how to easily extend the code to have a parser (which returns a list of parse trees) rather than a checker (which returns just true or false).

Automated Translation? From the Bible, Matthew 26:41: The spirit is willing, but the flesh is weak. From an automated English to Russian to English military translation machine: The vodka is good, but the meat is rotten. Note: Parsing is not all.

Lecture Outline
Relational Programming cont'd
The Relational Model of Computation
Programming with choice and Solve
Search Strategies
Relational Parsing
Summary

Summary
This time: relational programming.
Next time: object-oriented programming.
Homework: reading Secs. 9.1-9.3 in CTMCP.
Pensum: you should have a rough understanding of today's material, without details of non-Oz code.
Further reading: Sterling and Shapiro, The Art of Prolog.