
Scribe Notes: 2/13/2013
Presenter: Tony Schnider
Scribe: Nate Stender
Topic: Soft Constraints (Ch. 9 of the CP handbook)

Soft Constraints

Motivation
Soft constraints are used:
1. When we seek the best (optimal) solution to a problem,
2. When there are probabilistic constraints, or
3. When a problem is over-constrained (we seek to satisfy as many constraints as possible).

In order to find the optimal solution, the entire search space must be searched. This problem is at least as hard as a classical CSP (a classical CSP can always be represented by one with soft constraints). The speaker discussed two types of frameworks for soft constraints and two types of algorithms:
A. Specific frameworks: fuzzy & weighted
B. Local consistency
C. Generic frameworks: semiring-based & valued
D. Systematic search

A1. Fuzzy Sets
In classical set theory, a value either belongs to a set or it does not. In a fuzzy set, we ask "How much does this value belong to this set?" The membership value can be 0 (does not belong), 1 (fully belongs), or anything in between (some partial degree of belonging).

A fuzzy constraint assigns a membership degree to every possible tuple in a relation. The function that determines the degree of membership for a value/tuple is called a membership function. Thus, in fuzzy CSPs, the weights are assigned to the individual tuples of a relation. Fuzzy constraints are combined using conjunctive combination (e.g., combining two relations R_V and R_W). This combination amounts to taking the Cartesian product of the two relations and choosing, for each combined tuple, the minimum value among the membership functions.
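As a concrete illustration, the min-based conjunctive combination can be sketched in a few lines of Python. The constraints, values, and degrees below are invented for the example, and the two constraints are assumed to have disjoint scopes (a real join would also match shared variables):

```python
from itertools import product

def combine_fuzzy(c1, c2):
    """Conjunctively combine two fuzzy constraints, each given as a dict
    {tuple-of-values: membership degree} over disjoint variables:
    take the Cartesian product and keep the minimum degree per tuple."""
    return {t1 + t2: min(d1, d2)
            for (t1, d1), (t2, d2) in product(c1.items(), c2.items())}

# Hypothetical unary fuzzy constraints on variables X and Y.
cx = {('a',): 1.0, ('b',): 0.4}
cy = {('a',): 0.7, ('b',): 0.9}

combined = combine_fuzzy(cx, cy)
print(combined[('a', 'a')])  # min(1.0, 0.7) -> 0.7
print(combined[('b', 'b')])  # min(0.4, 0.9) -> 0.4
```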

Question from Daniel G.: Why do we want the minimum value when combining fuzzy constraints?
Answer: Because we want to find the weakest link and keep it for the constraint. For example, if we want to maximize the happiness of all people, it is not fair to average, because then some people would end up very happy and some very unhappy. We want to maximize the minimum happiness.

The satisfaction rate of an assignment is determined by the least satisfied constraint (the minimum membership degree). Any assignment with a non-zero satisfaction rate is considered a solution, and the solution with the maximum satisfaction rate is optimal.

Advantages:
o Good for applications where the system is only as good as its weakest link.
o A fuzzy CSP can fully represent a classical CSP.
o The comparison operator (min) can be changed to represent other situations.

Disadvantages:
o Two "optimal" solutions can tie even though one is better on specific constraints (the comparison is blinded by the minimum of both). A remedy is to sort the individual constraint values and break ties in that order.

A2. Weighted Constraints
A weighted constraint network (WCSP) is a CSP with weighted constraints. The cost of an assignment in a WCSP is the number of violated constraints. The optimal solution is a complete assignment with minimal cost (and thus the fewest violated constraints). k-weighted constraints are weighted constraints with the addition of an upper bound k on the acceptable cost of a solution. By adding this limit, k-weighted CSPs can be tested for consistency, instead of just taking the best solution.

B. Local Consistency in Soft Constraints
Local consistency is more difficult in soft CSPs than in (crisp) CSPs because a violated constraint does not indicate inconsistency. Thus, values can be pruned only when they are node inconsistent, not when they are arc inconsistent. A value is node consistent if its cost is below an upper limit ⊤. A value is arc consistent if it is node consistent and it has at least one support in its neighboring variable, i.e., a value such that the cost of the corresponding tuple in the constraint equals a lower limit.
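A minimal sketch of these two checks on a toy weighted network follows. The cost tables and the limits are illustrative assumptions, not part of the notes:

```python
TOP = 4      # upper limit: costs at or above this are unacceptable
BOTTOM = 0   # lower limit: the cost a supporting tuple must have

# Hypothetical unary costs and one binary constraint between X and Y.
unary_x = {'a': 0, 'b': 3}
unary_y = {'a': 1, 'b': 0}
binary_xy = {('a', 'a'): 0, ('a', 'b'): 2, ('b', 'a'): 1, ('b', 'b'): 0}

def node_consistent(unary, value):
    """A value is node consistent if its unary cost is below TOP."""
    return unary[value] < TOP

def arc_consistent(unary, value, binary, neighbor_values, first):
    """Node consistent, plus at least one support: a neighboring value
    whose tuple cost in the binary constraint equals BOTTOM."""
    if not node_consistent(unary, value):
        return False
    tuples = ((value, w) if first else (w, value) for w in neighbor_values)
    return any(binary[t] == BOTTOM for t in tuples)

print(node_consistent(unary_x, 'b'))                              # True (3 < 4)
print(arc_consistent(unary_x, 'a', binary_xy, ['a', 'b'], True))  # True: ('a','a') costs 0
```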

Question by Robert: Is there ever a time when the lower limit is greater than 0?
Answer: Not sure, but for representation purposes you could always normalize it to 0 by subtracting its value from each weight.

In order to enforce arc consistency, we push the costs of binary constraints onto the unary constraints. To do this, the minimum cost of the binary constraint is added to the unary constraint of one of the participating variables and then subtracted from the binary constraint. Adding and subtracting constraint weights have a special meaning in k-weighted CSPs:

a ⊕ b = a + b  if a + b < k,  and  k  if a + b ≥ k
a ⊖ b = a − b  if a ≠ k,      and  k  if a = k

In this way, the cost of binary constraints is passed to the unary constraints, and thus node consistency can be used to enforce arc consistency. One problem that can arise from this method is that the order of evaluation (node pruning) may produce different structures in the CSP.

Figure 1: Example of different CSP structures due to different orders of evaluation

Question by Nate: Are there heuristics for selecting the order of pruning?
Answer: None mentioned in the paper, but work may have been done on this topic since the paper was published.
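The two bounded operations and the cost-projection step might be sketched as follows. The bound k and the cost tables are invented for the example; real W-AC implementations track supports incrementally rather than rescanning:

```python
K = 4  # the unacceptable-cost bound of the k-weighted CSP

def plus(a, b):
    """Bounded addition: costs saturate at K."""
    return a + b if a + b < K else K

def minus(a, b):
    """Bounded subtraction: K stays K (assumes b <= a)."""
    return K if a == K else a - b

def project(binary, unary, values_i, values_j):
    """Push cost from a binary constraint onto the unary constraint of
    its first variable: for each of that variable's values, move the
    minimum cost over the second variable's values."""
    for vi in values_i:
        m = min(binary[(vi, vj)] for vj in values_j)
        unary[vi] = plus(unary[vi], m)
        for vj in values_j:
            binary[(vi, vj)] = minus(binary[(vi, vj)], m)

binary = {('a', 'a'): 1, ('a', 'b'): 2, ('b', 'a'): 0, ('b', 'b'): 3}
unary = {'a': 0, 'b': 1}
project(binary, unary, ['a', 'b'], ['a', 'b'])
print(unary)   # {'a': 1, 'b': 1}: the minimum cost 1 moved onto value 'a'
print(binary)  # {('a','a'): 0, ('a','b'): 1, ('b','a'): 0, ('b','b'): 3}
```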

The W-AC3 and W-AC2001 algorithms are similar to their hard-CSP counterparts. They have an added step of making variables arc consistent by forcing support for arc consistency. In addition, variables must be rechecked for node consistency whenever they are modified.

C. Generic Soft Constraint Frameworks
The goal of developing generic frameworks for soft CSPs is to allow:
1. The development of generic algorithms that work on all types of soft constraints, instead of having to create a new one for each type, and
2. Easy comparison between the frameworks.
For soft CSPs, two main generic frameworks have been developed: semiring-based constraints and valued constraints.

Before we discuss semirings, some terminology from algebra (answering the instructor's question):

A group (S,*) is a set S with an operation * that satisfies the following conditions:
o Closure (x,y ∈ S ⟹ x*y ∈ S)
o Associativity ((x*y)*z = x*(y*z))
o Identity (∃e ∈ S such that x*e = e*x = x)
o Invertibility (∀x ∈ S, ∃y ∈ S s.t. x*y = y*x = e)

An Abelian group (S,*) (or commutative group) is a group that is also commutative (x*y = y*x).

A ring (S,+,*) is a set S with two operations + and * such that:
o (S,+) is an Abelian group (+ is closed, associative, invertible, commutative, and has an identity element)
o * is associative ((x*y)*z = x*(y*z))
o * distributes over +, that is: x*(y+z) = (x*y)+(x*z)
o Often, * is also required to have an identity element

A semiring (S,+,*) is like a ring except that:
o + is not required to be invertible
o * has an annihilator

C1. c-Semirings
A c-semiring (constraint semiring) is a tuple ⟨E, +s, ×s, 0, 1⟩ with the following properties:
o E is a set of satisfaction degrees, with 0 ∈ E and 1 ∈ E
o Operator +s is associative, commutative, and idempotent; is closed in E; has 0 as a neutral element and 1 as an annihilator
o Operator ×s is associative and commutative; is closed in E; has 1 as a neutral element and 0 as an annihilator
o ×s distributes over +s
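To make the definition concrete, here is a small sketch that checks the c-semiring axioms on sample elements of two classic instances: the fuzzy semiring ⟨[0,1], max, min, 0, 1⟩ and the weighted semiring ⟨{0,…,k}, min, bounded +, k, 0⟩. This is a finite sample check, not a proof, and the sample sets are chosen arbitrarily:

```python
from itertools import product

k = 4
# fuzzy:    E = [0,1],  +s = max, xs = min,          0-element = 0.0, 1-element = 1.0
# weighted: E = {0..k}, +s = min, xs = bounded sum,  0-element = k,   1-element = 0
fuzzy = dict(plus=max, times=min, zero=0.0, one=1.0,
             sample=[0.0, 0.3, 0.7, 1.0])
weighted = dict(plus=min, times=lambda a, b: min(a + b, k), zero=k, one=0,
                sample=[0, 1, 2, k])

def check(s):
    """Verify the c-semiring axioms over all triples from the sample."""
    p, t = s['plus'], s['times']
    for a, b, c in product(s['sample'], repeat=3):
        assert p(a, b) == p(b, a) and p(a, a) == a       # +s commutative, idempotent
        assert t(a, b) == t(b, a)                        # xs commutative
        assert t(t(a, b), c) == t(a, t(b, c))            # xs associative
        assert t(a, p(b, c)) == p(t(a, b), t(a, c))      # xs distributes over +s
        assert p(a, s['zero']) == a and p(a, s['one']) == s['one']   # +s: 0 neutral, 1 annihilator
        assert t(a, s['one']) == a and t(a, s['zero']) == s['zero']  # xs: 1 neutral, 0 annihilator

check(fuzzy)
check(weighted)
print("both instances satisfy the c-semiring axioms on the samples")
```

Note that in the weighted instance the ordering is reversed: lower cost is better, so min plays the role of +s and the saturated sum plays the role of ×s.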

c-Semiring operators
×s is used to combine preferences. Let:
o R_C be a completely satisfied relation
o R_U be a completely unsatisfied relation
o R_l be a relation with satisfaction degree l
o val_s(R) be the preference level of relation R
Then:
  val_s(R_C) ×s val_s(R_U) = 0
  val_s(R_C) ×s val_s(R_l) = l
  val_s(R_l) ×s val_s(R_l) = l

+s is used to order preference levels. Let a and b be two preference levels:
  a +s b = a  ⟺  b ≤s a
  a +s b = b  ⟺  a ≤s b
  a +s b = c, with c ≠ a and c ≠ b  ⟺  a and b are incomparable

+s induces a partial ordering over preferences, specifically a lattice where a +s b is the least upper bound (LUB) of {a,b}. Preference levels in a c-semiring can thus be partially ordered by their level of satisfaction, and a partial ordering allows multiple criteria to be optimized. A partial ordering can be represented using a Hasse diagram, as seen in Figure 2; in a Hasse diagram, the higher a node is, the more satisfied it is.

Figure 2: Hasse diagram

A semiring constraint network is a tuple ⟨X, D, C, S⟩ where:
o X, D: same as in classical CSPs
o C is a set of soft constraints
o S = ⟨E, +s, ×s, 0, 1⟩ is a c-semiring

Question by Choueiry: What is the difference between a c-semiring and a semiring?
Answer: The "c" stands for constraints; the extra requirements on +s (idempotence, 0 as neutral element, 1 as annihilator) are what induce the ordering described above.

C2. Valued Constraints
Valued constraints are an alternative framework to semiring-based constraints that use a valuation structure instead of a semiring.

A valuation structure is a tuple ⟨E, ≤v, ⊛, ⊥, ⊤⟩ where:
o E is a set of values totally ordered by ≤v
o ⊥ is the minimum element of E
o ⊤ is the maximum element of E
o ⊛ is an operator which:
  - is associative, commutative, and monotonic
  - is closed in E
  - has ⊥ as a neutral element and ⊤ as an annihilator
Monotonicity means that ⊛ respects the total ordering: a ≤v b ⟹ (a ⊛ c) ≤v (b ⊛ c).

Valuation structures are totally ordered, and thus they are less expressive than semirings, which can be partially ordered. A valuation structure can therefore always be represented by a semiring, while a semiring can be represented by a valuation structure only if the semiring is totally ordered.

C3. Operations on Soft Constraints
Soft constraints can be combined (joined) without information loss. In a join, every possible tuple resulting from the combination is generated, and its membership value is the minimum of the combined tuples' values. When a soft constraint is projected over a variable (i.e., the variable is eliminated from the constraint), the membership value of each remaining tuple is the maximum over the equivalent eliminated tuples.

C4. Soft Constraint Solutions
Projecting the combination of all soft constraints down to the empty scope yields the maximum membership value, which is the network's consistency level. As long as its preference level is above 0, a complete assignment is acceptable. The optimal solution is one such that no other assignment has a better consistency level.

Question by Fikayo: Is it possible to have negative numbers in the tuples?
Answer: No, not directly, as this would break the minimum operation having 0 as the annihilator.
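The join (min) and projection (max) operations can be sketched as follows, using tiny invented fuzzy constraints and computing the network's consistency level by projecting everything out:

```python
from itertools import product

def join(c1, vars1, c2, vars2):
    """Join two soft constraints (dicts tuple -> degree) with disjoint
    scopes: generate all combined tuples, keep the minimum degree."""
    combined = {t1 + t2: min(d1, d2)
                for (t1, d1), (t2, d2) in product(c1.items(), c2.items())}
    return combined, vars1 + vars2

def project_out(c, scope, var):
    """Eliminate `var` from the constraint: keep the maximum degree
    among tuples that agree on the remaining variables."""
    i = scope.index(var)
    out = {}
    for t, d in c.items():
        rest = t[:i] + t[i + 1:]
        out[rest] = max(out.get(rest, 0), d)
    return out, scope[:i] + scope[i + 1:]

# Hypothetical unary constraints on X and Y.
cx = {('a',): 0.8, ('b',): 0.4}
cy = {('a',): 0.6, ('b',): 1.0}

c, scope = join(cx, ['X'], cy, ['Y'])
for v in ['X', 'Y']:
    c, scope = project_out(c, scope, v)
print(c[()])  # consistency level of this tiny network: 0.8
```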

D. Search in Soft CSPs
There are two basic categories of search in CSPs:
1. Systematic search explores the entire search space and is guaranteed to find the optimal solution, but it is usually too costly to perform.
2. Local search is more feasible because it limits the resources used (usually time), but it may not find the best solution.
Hybrid approaches can be used to draw some benefit from both.

A basic form of systematic search is depth-first branch-and-bound (DFBB). DFBB explores the search space in a depth-first manner while keeping track of an upper and a lower bound. At any point in the search, if a partial assignment's lower bound is greater than the upper bound (assuming a minimization task), search backtracks.

Most improvements to DFBB focus on generating better lower-bound estimates. One such method is partial forward checking (PFC), which generates a tighter bound by adding information from unassigned variables. PFC can be further improved by applying directional arc inconsistency (DAI) to include costs from constraints between completely unassigned variables. One caveat of DAI is that it requires a static variable ordering.
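A minimal DFBB sketch on a toy weighted CSP follows. The variables, domains, and not-equal constraints are invented, and the lower bound is simply the cost of constraints whose variables are already assigned (no forward checking, static variable ordering):

```python
def dfbb(variables, domains, cost_fn, assignment=None, best=(float('inf'), None)):
    """Depth-first branch-and-bound for a minimization task.
    cost_fn(assignment) must be a lower bound on the cost of any
    completion of the partial assignment."""
    assignment = assignment or {}
    lb = cost_fn(assignment)
    if lb >= best[0]:                      # lower bound meets the upper bound: prune
        return best
    if len(assignment) == len(variables):  # complete assignment: new incumbent
        return (lb, dict(assignment))
    var = variables[len(assignment)]       # static variable ordering
    for val in domains[var]:
        assignment[var] = val
        best = dfbb(variables, domains, cost_fn, assignment, best)
        del assignment[var]
    return best

# Toy WCSP: cost = number of violated not-equal constraints among assigned pairs.
constraints = [('X', 'Y'), ('Y', 'Z')]
def cost(a):
    return sum(1 for u, v in constraints if u in a and v in a and a[u] == a[v])

print(dfbb(['X', 'Y', 'Z'], {'X': [0], 'Y': [0, 1], 'Z': [0, 1]}, cost))
# -> (0, {'X': 0, 'Y': 1, 'Z': 0})
```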