Lecture Notes on Monadic Logic Programming

15-816: Linear Logic
Frank Pfenning
Lecture 20

We have discussed both forward and backward chaining at length; in this lecture we address the question of how they can be combined effectively. The central idea is the use of a monad to separate forward chaining from backward chaining. Monads originated in category theory and have been used successfully in functional programming. In logic programming, they were first proposed in their present incarnation in the LolliMon language [LPPW05].

1 Monads

We introduce a new type {A} (sometimes written as ○A in logic or T A in category theory). It is weaker than truth in the sense that A ⊢ {A}, but not the other way around. It may look like the dual to !A, but as we will see in a later lecture, this is not quite the case.

In order to explain the meaning of {A} judgmentally we need a new judgment, A lax, which is somehow a counterpart of A pers. But it will only appear explicitly on the right-hand side of a sequent. This is the opposite of A pers, which only appears explicitly on the left-hand side. The judgment A lax is weaker than truth, so it can be derived from truth:¹

    Δ ⊢ A true
    ----------- lax
    Δ ⊢ A lax

¹ In this lecture we will uniformly write A true for ephemeral truth, be it on the left-hand or the right-hand side of a sequent.

The connection in the other direction is given by a new cut rule.

    Δ ⊢ A lax    Δ', A true ⊢ C lax
    -------------------------------- cut{}
    Δ, Δ' ⊢ C lax

It expresses that we can view A lax as representing an ephemeral truth, but only while we try to prove the lax truth of C.

Originally, in lax logic [FM97], {A} was conceived to stand for the truth of A under some constraints, where the constraints remain completely abstract. It is true only if the constraints are satisfied. We can see this in the two rules above. The rule lax allows the empty (or always true) constraint. The cut{} rule expresses that if A is true under some constraint, then the second premise may assume it is true, but only while we are trying to prove C under some constraint. In essence, the cut{} rule relies on the conjunction of the constraints in the two derivations.

The two rules match in the sense that if the first premise of the cut{} was derived using the lax rule, then the cut{} is immediately reduced to a cut. This illustrates, however, that all cut and left rules of the sequent calculus we have introduced so far must allow the succedent of the sequent to be either A true (which we have generally written as just A) or A lax. This step is analogous to systematically adding persistent resources Γ to each rule already in the calculus.

Once the judgmental principles are settled, it is almost trivial to internalize A lax as a proposition.

    Δ ⊢ A lax                  Δ, A true ⊢ C lax
    -------------- {}R         -------------------- {}L
    Δ ⊢ {A} true               Δ, {A} true ⊢ C lax

2 Cut and Identity

Let's check the cut reduction:

    Δ ⊢ A lax                    Δ', A true ⊢ C lax
    -------------- {}R           --------------------- {}L
    Δ ⊢ {A} true                 Δ', {A} true ⊢ C lax
    ------------------------------------------------- cut
    Δ, Δ' ⊢ C lax

        =⇒R

    Δ ⊢ A lax    Δ', A true ⊢ C lax
    -------------------------------- cut{}
    Δ, Δ' ⊢ C lax

In the broader proof of the admissibility of cut, when considering a cut{}

    Δ ⊢ A lax    Δ', A true ⊢ C lax
    -------------------------------- cut{}
    Δ, Δ' ⊢ C lax

we push up the cut into the inference in the proof of the first premise, keeping the second premise fixed. The only way this strategy does not apply is if the first premise is the lax rule, with the premise of A true. But then we can just reduce it to an ordinary cut of the two premises. This is reminiscent of the way cut! always pushed up the cut into the second premise, until a copy rule was encountered. The lax judgment affords a simplification since there is no copying.

In the proof of the identity expansion the steps are more or less forced:

    {A} true ⊢ {A} true   id{A}

        =⇒E

    A true ⊢ A true   idA
    ---------------------- lax
    A true ⊢ A lax
    ---------------------- {}L
    {A} true ⊢ A lax
    ---------------------- {}R
    {A} true ⊢ {A} true

Because of the judgmental transition, we need three rule applications: two propositional, and one judgmental. We'll have to remember this when considering focusing.

3 Examples: Unit and Bind

In functional programming, monads [Mog91] are presented via two functions, unit and bind:

    unit : A → {A}
    bind : {A} → (A → {B}) → {B}

These characterize lax logic axiomatically, although we do not pursue this here.

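As an aside that is not part of the original notes, these two operations are exactly what Haskell programmers know as return and (>>=). A minimal sketch, using Maybe as a stand-in for the abstract monad { }:

    unit :: a -> Maybe a
    unit = Just                    -- unit : A -> {A}

    bind :: Maybe a -> (a -> Maybe b) -> Maybe b
    bind (Just x) k = k x          -- bind : {A} -> (A -> {B}) -> {B}
    bind Nothing  _ = Nothing

    -- Chaining two computations in the monad; evaluates to Just 4.
    example :: Maybe Int
    example = unit 3 `bind` \x -> unit (x + 1)

In the linear setting the arrows become ⊸, which is what the sequent derivations below establish.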
But we can prove linear versions of them in the sequent calculus. First unit (omitting the standard true judgments as usual, but keeping lax explicit):

    A ⊢ A   id
    ------------ lax
    A ⊢ A lax
    ------------ {}R
    A ⊢ {A}
    ------------ ⊸R
    ⊢ A ⊸ {A}

bind is only slightly more complicated:

                        B ⊢ B   id
                        ------------- lax
                        B ⊢ B lax
        A ⊢ A   id      ------------- {}L
                        {B} ⊢ B lax
    --------------------------------- ⊸L
    A, A ⊸ {B} ⊢ B lax
    --------------------------------- {}L
    {A}, A ⊸ {B} ⊢ B lax
    --------------------------------- {}R
    {A}, A ⊸ {B} ⊢ {B}
    --------------------------------- ⊸R (twice)
    ⊢ {A} ⊸ (A ⊸ {B}) ⊸ {B}

This demonstrates that monads are compatible with linearity.

4 Focusing

We recall that the identity expansion shown above starts with the {}R rule. This is a hint that the lax modality should be considered negative, and its right rule should be invertible. For focused sequents we then have the following right rule:

    Δ ⊢ A lax
    ----------- {}R
    Δ ⊢ {A}

Correspondingly, the left rule should be applied in focus. The question is whether we can retain focus on A.

    Δ, [A] ⊢ C lax
    ------------------ {}L?
    Δ, [{A}] ⊢ C lax

Because there is an implicit judgmental transition (from {A} true to A lax and then to A true) we suspect we may need to lose focus.

To construct a counterexample it is always useful to reconsider the identity, so let's try to construct a focused proof of {P} ⊢ {P}.

    [P] ⊢ P lax
    ----------------- {}L?
    [{P}] ⊢ P lax
    ----------------- focusL
    {P} ⊢ P lax
    ----------------- {}R
    {P} ⊢ {P}

Notice that at this point we fail, because [P] ⊢ γ only succeeds if γ = P true. So, as suspected, we need to lose focus, and the correct rule is

    Δ, A ⊢ C lax
    ------------------ {}L
    Δ, [{A}] ⊢ C lax

Finally, we consider the A lax judgment. On the left, it does not occur because the judgment itself should be considered positive. It occurs on the right, but represents the end of an inversion phase, waiting for focusing to occur. This means we have the additional rule

    Δ ⊢ [A]
    ----------- focR{}
    Δ ⊢ A lax

As before, the focusing rules are restricted. If we are just modeling chaining (allowing inversion anywhere), the rule is restricted to the case where there is no focus in Δ. In focusing, Δ must also be stable, that is, consist entirely of negative propositions and positive atoms.

5 Polarization

As a step towards the logic programming implementation, we now polarize the propositions, including the lax modality. This will tell us where inversion and focusing phases begin and end, and it will restrict us to a fragment that has a sensible operational interpretation. We have considered these before; the crucial issue is how to polarize the lax modality. We already saw that the proposition {A} is negative, but the judgment hiding underneath, A lax, is positive. We say that { } is negative on the outside and positive on the inside. Like the exponential modality, it is a polarity-changing connective.

    Negative Props.  A⁻ ::= P⁻ | A₁⁺ ⊸ A₂⁻ | A₁⁻ & A₂⁻ | ∀x:τ. A⁻ | {A⁺}
    Positive Props.  A⁺ ::= P⁺ | A₁⁺ ⊗ A₂⁺ | 1 | A₁⁺ ⊕ A₂⁺ | 0 | ↓A⁻ | !A⁻ | ∃x:τ. A⁺

The inclusion of negative propositions in the positive ones is often written ↓A⁻, where ↓ is like a modality. However, it does not change the logical meaning, just the focusing behavior of the propositions.

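Purely as an illustration, and not part of the original notes, this grammar can be transcribed into two mutually recursive datatypes; the constructor names are invented and the first-order quantifiers are elided. A sketch in Haskell:

    type Atom = String

    data Neg                  -- negative propositions A⁻
      = NAtom Atom            -- P⁻
      | Lolli Pos Neg         -- A⁺ ⊸ A⁻
      | With Neg Neg          -- A⁻ & A⁻
      | Lax Pos               -- {A⁺}

    data Pos                  -- positive propositions A⁺
      = PAtom Atom            -- P⁺
      | Tensor Pos Pos        -- A⁺ ⊗ A⁺
      | One                   -- 1
      | Plus Pos Pos          -- A⁺ ⊕ A⁺
      | Zero                  -- 0
      | Down Neg              -- ↓A⁻, inclusion of negatives
      | Bang Neg              -- !A⁻

A forward-chaining clause A⁺ ⊸ {B⁺} from the next section, for instance, is then a Neg of the form Lolli a (Lax b).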
6 Forward and Backward Chaining

We have two forms of stable sequents: one represents forward chaining, the other backward chaining.

    Δ ⊢ P⁻ true     Backward Chaining  (Δ stable)
    Δ ⊢ C⁺ lax      Forward Chaining   (Δ stable)

Notice in particular that we do not have sequents Δ ⊢ C⁺ true, because the only way that positives are included in negatives is through the lax modality.

A prototypical clause that participates in backward chaining has the form

    A⁺ ⊸ P⁻

Focusing on such a clause will only succeed if the succedent is P⁻ true. If it is C⁺ lax, focusing will fail because there is no rule

    Δ, [P⁻] ⊢ C⁺ lax    (fails)

Conversely, a prototypical clause that participates in forward chaining has the form

    A⁺ ⊸ {B⁺}

Focusing on such a clause will only succeed if the succedent is C⁺ lax. If it is P⁻ true it will fail, because there is no rule

    Δ, [{B⁺}] ⊢ P⁻ true    (fails)

If the conclusion is indeed C⁺ lax, then focusing on the clause will never fail at the end, since we have the general transition

    Δ, B⁺ ⊢ C⁺ lax
    -------------------- {}L
    Δ, [{B⁺}] ⊢ C⁺ lax

which initiates an inversion phase.

How can we move between forward and backward chaining? When we decide to focus on a succedent C⁺ lax

    Δ ⊢ [C⁺]
    ------------ focR{}
    Δ ⊢ C⁺ lax

we break down [C⁺] until we reach a negative, and this may break down all the way to an atom, initiating backward chaining. So in this case, we may leave forward chaining altogether. Similarly, if we are focused on

    Δ, [A⁺ ⊸ {B⁺}] ⊢ C⁺ lax

then a subgoal will be

    Δ' ⊢ [A⁺]

which may devolve into a negative atom, starting backward chaining. In this case, the backward chaining is an auxiliary subgoal that enables a forward-chaining step to proceed.

During backward chaining, we switch to forward chaining if the succedent is {A} in the inversion phase of backward chaining.

    Δ ⊢ A lax
    ----------- {}R
    Δ ⊢ {A}

This may also happen when working on a subgoal during backward chaining.

7 Example: Testing Bipartiteness

We develop a small implementation of testing whether a graph is bipartite, that is, can be colored with two colors such that no two adjacent vertices have the same color.

The algorithm we want to implement is quite simple. In each phase we traverse a connected component of the graph. If there is no uncolored node, we are finished and can check if there are two adjacent nodes with the same color. If not, pick an arbitrary node and color it an arbitrary color. Then we iterate: pick a node that has not yet been colored and has an adjacent colored node, and give it the opposite color of its neighbor. When no such node exists any more, we move to the next phase, again arbitrarily picking an uncolored node.

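Before turning to the CLF encoding, here is a rough functional rendering of this loop structure. It is not part of the original notes; the names and the graph representation are invented, and the edge list is assumed to be symmetric, as in the CLF program below. A sketch in Haskell:

    import qualified Data.Map as M

    type Vertex = Int
    data Color = A | B deriving (Eq, Show)

    opposite :: Color -> Color
    opposite A = B
    opposite B = A

    -- Greedy two-coloring as described above: repeatedly pick an uncolored
    -- node with a colored neighbor and give it the opposite color; when no
    -- such node is left, seed the next component with color A.
    twoColor :: [Vertex] -> [(Vertex, Vertex)] -> M.Map Vertex Color
    twoColor vs edges = go (M.fromList [(v, Nothing) | v <- vs])
      where
        neighbors v = [y | (x, y) <- edges, x == v]
        colored m v = case M.lookup v m of Just (Just c) -> Just c; _ -> Nothing
        go m =
          case [(v, c) | (v, Nothing) <- M.toList m,
                         Just c <- map (colored m) (neighbors v)] of
            (v, c) : _ -> go (M.insert v (Just (opposite c)) m)  -- inner loop
            [] -> case [v | (v, Nothing) <- M.toList m] of
                    v : _ -> go (M.insert v (Just A) m)          -- next phase
                    []    -> M.map (maybe A id) m                -- all colored

    -- The graph is not bipartite iff some edge joins two same-colored nodes.
    notBipartite :: [Vertex] -> [(Vertex, Vertex)] -> Bool
    notBipartite vs edges = any sameColor edges
      where
        m = twoColor vs edges
        sameColor (x, y) = M.lookup x m == M.lookup y m

For example, notBipartite [1,2,3] [(1,2),(2,1),(2,3),(3,2),(3,1),(1,3)] is True (a triangle), while dropping the (3,1) and (1,3) edges makes it False. The inner loop of go corresponds to the forward-chaining rules cb and ca below, seeding a new component to the clause nbp/nil, and the final check to nbp/fail.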
We start with some type declarations in CLF.

    vertex : type.
    edge : vertex -> vertex -> type.

    color : type.
    a : color.
    b : color.

We have two predicates: node x, of which we initially have a linear one for every vertex x in the graph, and a persistent edge x y for every edge in the graph. We assume the edge relation is already saturated to be symmetric. Then the forward-chaining rules that constitute the inner loop, propagating the color constraints, would be:

    clr : vertex -> color -> type.
    node : vertex -> type.

    cb : node X * !edge X Y * !clr Y a -o {!clr X b}.
    ca : node X * !edge X Y * !clr Y b -o {!clr X a}.

These rules consume node x, so that each node is colored exactly once.

Next we need to implement the outer loop. We assume we are given a list of vertices,

    nil : vlist.
    cons : vertex -> vlist -> vlist.

and have to assume each one into the context. This happens in a backward-chaining program. Here the previously mentioned convention, that backward-chaining clauses are written as A ⊸ B and forward-chaining ones as A ⊸ {B}, comes into effect.

    nbp : vlist -> type.

    nbp/cons : nbp (cons X L) o- (node X -o nbp L).

We use the name nbp (short for "not bipartite") because we will succeed if we find a contradiction. Once the context has been initialized, we have to pick a node, assign it an arbitrary color (say a), and then use the two rules above until we have reached quiescence.

    nbp/nil : nbp nil o- node X * (!clr X a -o {nbp nil}).

This rule switches from backward chaining to forward chaining once the persistent assumption !clr x a has been made. When we have reached quiescence, we recurse (subgoal nbp nil), picking another node x that has not yet been colored. At the end, once all nodes have been colored, we look for a contradiction: adjacent nodes with the same color.

    nbp/fail : nbp nil o- !clr X C * !edge X Y * !clr Y C.

We can improve this program by making the node predicate affine. Recall that affine resources (marked with the affine modality @) are used at most once. This allows us to short-circuit the algorithm and exit as soon as we have found two adjacent nodes with the same color. The complete program is below.

    vertex : type.
    edge : vertex -> vertex -> type.

    color : type.
    a : color.
    b : color.

    clr : vertex -> color -> type.
    node : vertex -> type.

    cb : @node X * !edge X Y * !clr Y a -o {!clr X b}.
    ca : @node X * !edge X Y * !clr Y b -o {!clr X a}.

    vlist : type.
    nil : vlist.
    cons : vertex -> vlist -> vlist.

    nbp : vlist -> type.

    nbp/fail : nbp nil o- !clr X C * !edge X Y * !clr Y C.
    nbp/nil  : nbp nil o- @node X * (!clr X a -o {nbp nil}).
    nbp/cons : nbp (cons X L) o- (@node X -o nbp L).

References

[FM97]    M. Fairtlough and M. V. Mendler. Propositional lax logic. Information and Computation, 137(1):1-33, August 1997.

[LPPW05]  Pablo López, Frank Pfenning, Jeff Polakow, and Kevin Watkins. Monadic concurrent linear logic programming. In A. Felty, editor, Proceedings of the 7th International Symposium on Principles and Practice of Declarative Programming (PPDP'05), pages 35-46, Lisbon, Portugal, July 2005. ACM Press.

[Mog91]   Eugenio Moggi. Notions of computation and monads. Information and Computation, 93(1):55-92, 1991.