20070412 Chap5 1
Chapter 5
Constraint Satisfaction Problems
Constraint Satisfaction Problems (CSPs)
Standard search problem: the state is a "black box" that can be represented by an arbitrary data structure, accessed only by the problem-specific routines: the successor function, heuristic function, and goal test.
Constraint satisfaction problem: the state and goal test conform to a standard, structured, and very simple representation.
Constraint Satisfaction Problems (cont.-1)
A CSP is defined by a set of variables X1, X2, …, Xn, with values from domains D1, D2, …, Dn, and a set of constraints C1, C2, …, Cm specifying allowable combinations of values for subsets of variables.
A state is defined by an assignment of values to some or all of the variables, {Xi = vi, Xj = vj, …}.
Constraint Satisfaction Problems (cont.-2)
Consistent (or legal) assignment: an assignment that does not violate any constraints.
Complete assignment: one in which every variable is mentioned.
Solution: a complete assignment that satisfies all the constraints.
Some CSPs also require a solution that maximizes an objective function.
Example: Map-Coloring
Variables: WA, NT, Q, NSW, V, SA, T
Domains: Di = {red, green, blue}
Constraints: adjacent regions must have different colors,
e.g. WA ≠ NT, or (WA, NT) ∈ {(red, green), (red, blue), (green, red), (green, blue), …}
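The map-coloring CSP above can be written down directly as data. A minimal Python sketch (the adjacency encoding and the function name are my own; only the regions and colors come from the slide):

```python
# Adjacency of the Australian regions (T, Tasmania, has no neighbors).
neighbors = {
    "WA": {"NT", "SA"},
    "NT": {"WA", "SA", "Q"},
    "SA": {"WA", "NT", "Q", "NSW", "V"},
    "Q": {"NT", "SA", "NSW"},
    "NSW": {"Q", "SA", "V"},
    "V": {"SA", "NSW"},
    "T": set(),
}
domains = {v: {"red", "green", "blue"} for v in neighbors}

def consistent(assignment):
    """True iff no constraint is violated: adjacent assigned regions differ in color."""
    return all(assignment[a] != assignment[b]
               for a in assignment for b in neighbors[a] if b in assignment)
```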
Example: Map-Coloring (cont.)
Solutions: assignments satisfying all constraints,
e.g. {WA = red, NT = green, Q = red, NSW = green, V = red, SA = blue, T = green}
Constraint Graph
• It is helpful to visualize a CSP as a constraint graph: nodes are variables, and arcs show constraints.
• CSP benefits:
- standard representation pattern
- generic goal and successor functions
- generic heuristics (no domain-specific expertise needed)
Varieties of CSPs
Discrete variables
- finite domains of size d ⇒ O(d^n) complete assignments
e.g. 8-queens: Q1, …, Q8, each with domain {1, 2, 3, 4, 5, 6, 7, 8}
- infinite domains (integers, strings, etc.) need a constraint language, since we cannot enumerate all allowed combinations of values
e.g. job scheduling: Job1, which takes 5 days, must precede Job3:
StartJob1 + 5 ≤ StartJob3
Continuous variables
e.g. Hubble Space Telescope observations
- linear constraints are solvable in polynomial time by linear programming methods
Varieties of Constraints
Unary constraints involve a single variable,
e.g. SA ≠ green
Binary constraints involve pairs of variables,
e.g. SA ≠ WA
Higher-order constraints involve 3 or more variables,
e.g. cryptarithmetic column constraints
Every higher-order, finite-domain constraint can be reduced to a set of binary constraints if enough auxiliary variables are introduced (Exercise 5.11).
We deal only with binary constraints in this chapter.
Varieties of Constraints (cont.)
Absolute vs. preference constraints
Preference constraints can often be encoded as costs on individual variable assignments,
e.g. red is better than green; Prof. X might prefer teaching in the morning, whereas Prof. Y prefers teaching in the afternoon.
Example: Cryptarithmetic (TWO + TWO = FOUR)
Variables: F, T, U, W, R, O, X1, X2, X3
Domains: {0, 1, 2, 3, 4, 5, 6, 7, 8, 9}
Constraints:
Alldiff(F, T, U, W, R, O)
O + O = R + 10·X1
X1 + W + W = U + 10·X2
X2 + T + T = O + 10·X3
X3 = F
where X1, X2, and X3 are auxiliary carry variables
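These column constraints can be checked mechanically. A small sketch (the function name is mine; the digits tested below come from one known solution, 734 + 734 = 1468):

```python
def crypt_ok(F, T, U, W, R, O, X1, X2, X3):
    """Check the constraints of TWO + TWO = FOUR: Alldiff plus the column sums with carries."""
    return (len({F, T, U, W, R, O}) == 6          # Alldiff(F, T, U, W, R, O)
            and O + O == R + 10 * X1              # units column
            and X1 + W + W == U + 10 * X2         # tens column
            and X2 + T + T == O + 10 * X3         # hundreds column
            and X3 == F)                          # the final carry is F
```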
Standard Search Formulation (Incremental)
Initial state: the empty assignment, { }
Successor function: assign a value to an unassigned variable that does not conflict with current assignment
Goal test: the current assignment is complete.
Path cost: a constant cost for every step.
Backtracking Search for CSPs
In all CSPs, variable assignments are commutative:
[WA = red then NT = green] is the same as [NT = green then WA = red].
We therefore need only consider assignments to a single variable at each node, so there are d^n leaves (d: domain size, n: number of variables).
Backtracking search: a form of depth-first search for CSPs with single-variable assignments.
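A minimal recursive backtracking solver for binary CSPs, sketched in Python (the fixed variable order and the generic `conflict` predicate are my simplifications, not the book's exact BACKTRACKING-SEARCH pseudocode):

```python
def backtracking_search(variables, domains, conflict):
    """Depth-first search over single-variable assignments.
    conflict(x, vx, y, vy) is True when that pair of assignments violates a constraint."""
    def backtrack(assignment):
        if len(assignment) == len(variables):
            return assignment                     # complete and consistent
        var = next(v for v in variables if v not in assignment)
        for value in domains[var]:
            if not any(conflict(var, value, w, wv) for w, wv in assignment.items()):
                assignment[var] = value
                result = backtrack(assignment)
                if result is not None:
                    return result
                del assignment[var]               # undo, try the next value
        return None                               # triggers backtracking in the caller
    return backtrack({})
```

A usage example is the map-coloring CSP: pass the regions, their color domains, and a conflict test that flags equal colors on adjacent regions.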
Backtracking Search for CSPs (cont.)
Backtracking Example
Backtracking Example (cont.-1)
Backtracking Example (cont.-2)
Backtracking Example (cont.-3)
Improving backtracking efficiency
The improvements below introduce heuristics. General-purpose methods can give huge gains in speed:
- Which variable should be assigned next?
- In what order should its values be tried?
- Can we detect inevitable failure early?
- Can we take advantage of problem structure?
Some Key Questions of Backtracking Search
Variable and value ordering: which variable should be assigned next, and in what order should its values be tried?
Propagating information through constraints: what are the implications of the current variable assignments for the other unassigned variables?
Intelligent backtracking: when a path fails, that is, when a state is reached in which a variable has no legal values, can the search avoid repeating this failure in subsequent paths?
Minimum Remaining Values
Minimum remaining values (MRV) heuristic, also called the most constrained variable or fail-first heuristic:
- choose the variable with the fewest legal values
var ← SELECT-UNASSIGNED-VARIABLE(VARIABLES[csp], assignment, csp)
After WA = red and NT = green, SA is assigned next rather than Q.
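Assuming current domains are maintained per variable (as forward checking would do), MRV selection is a one-liner; this sketch is illustrative, with names of my choosing:

```python
def mrv(assignment, domains):
    """Return the unassigned variable with the fewest remaining legal values."""
    return min((v for v in domains if v not in assignment),
               key=lambda v: len(domains[v]))
```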
Degree heuristic
Degree heuristic (most constraining variable)
- tie-breaker among the most constrained variables
- choose the variable with the most constraints on the remaining variables
SA (degree 5) is assigned first.
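A sketch of the degree heuristic over the constraint graph (names are mine):

```python
def degree(assignment, neighbors):
    """Return the unassigned variable constraining the most other unassigned variables."""
    return max((v for v in neighbors if v not in assignment),
               key=lambda v: sum(1 for w in neighbors[v] if w not in assignment))
```

On the Australia graph with nothing assigned, this picks SA, whose degree is 5.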
Least Constraining Value
After WA = red and NT = green, Q is assigned red. Blue would be a bad choice, since it eliminates the last legal value left for Q's neighbor, SA.
• Least constraining value heuristic
- try to leave the maximum flexibility for subsequent variable assignments
- prefer the value that rules out the fewest choices for the neighboring variables in the constraint graph
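A sketch of least-constraining-value ordering (the names and the domains-dict encoding are mine):

```python
def lcv_order(var, assignment, domains, neighbors):
    """Order var's values so the value eliminating the fewest neighbor choices comes first."""
    def rules_out(value):
        # count how many unassigned neighbors would lose this value
        return sum(value in domains[n]
                   for n in neighbors[var] if n not in assignment)
    return sorted(domains[var], key=rules_out)
```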
Constraint Propagation
Propagating the implications of a constraint on one variable onto the other variables:
- forward checking
- arc consistency (a stronger method)
Forward Checking
Idea: keep track of the remaining legal values for unassigned variables, and terminate search when any variable has no legal values.
Whenever a variable X is assigned, the forward checking process looks at each unassigned variable Y that is connected to X by a constraint and deletes from Y's domain any value that is inconsistent with the value chosen for X.
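A sketch of one forward-checking step, assuming domains are stored as sets (returning an undo list is one common way to support backtracking; the names are mine):

```python
def forward_check(var, value, assignment, domains, neighbors):
    """Delete `value` from each unassigned neighbor's domain.
    Returns the list of prunings for later undoing, or None if a domain is wiped out."""
    pruned = []
    for y in neighbors[var]:
        if y not in assignment and value in domains[y]:
            domains[y].discard(value)
            pruned.append((y, value))
            if not domains[y]:
                for w, x in pruned:               # restore before reporting failure
                    domains[w].add(x)
                return None
    return pruned
```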
Forward Checking (cont.-1)
Forward Checking (cont.-2)
After WA = red:
NT can no longer be red
SA can no longer be red
Forward Checking (cont.-3)
After WA = red and Q = green
With MRV, NT = blue and SA = blue
Forward Checking (cont.-4)
After WA = red, Q = green, and V = blue, SA is left with no legal values.
Forward checking has detected that the partial assignment is inconsistent with the constraints, so backtracking can occur.
Forward Checking (cont.-5)
Forward checking propagates information from assigned to unassigned variables, but does not provide early detection for all failures.
After WA = red and Q = green:
with MRV, NT = blue and SA = blue, but they cannot both be blue.
Example: 4-Queens Problem
Domains: X1 = {1,2,3,4}, X2 = {1,2,3,4}, X3 = {1,2,3,4}, X4 = {1,2,3,4}
[4-Queens slides copied from B.J. Dorr CMSC 421 course on AI]
Example: 4-Queens Problem (cont.-1)
Domains: X1 = {1,2,3,4}, X2 = {1,2,3,4}, X3 = {1,2,3,4}, X4 = {1,2,3,4}
Example: 4-Queens Problem (cont.-2)
Domains: X1 = {1,2,3,4}, X2 = {3,4}, X3 = {2,4}, X4 = {2,3}
Example: 4-Queens Problem (cont.-3)
Domains: X1 = {1,2,3,4}, X2 = {3,4}, X3 = {2,4}, X4 = {2,3}
Example: 4-Queens Problem (cont.-4)
Domains: X1 = {1,2,3,4}, X2 = {3,4}, X3 = { }, X4 = {2}
Example: 4-Queens Problem (cont.-5)
Domains: X1 = {2,3,4}, X2 = {1,2,3,4}, X3 = {1,2,3,4}, X4 = {1,2,3,4}
Example: 4-Queens Problem (cont.-6)
Domains: X1 = {2,3,4}, X2 = {4}, X3 = {1,3}, X4 = {1,3,4}
Example: 4-Queens Problem (cont.-7)
Domains: X1 = {2,3,4}, X2 = {4}, X3 = {1,3}, X4 = {1,3,4}
Example: 4-Queens Problem (cont.-8)
Domains: X1 = {2,3,4}, X2 = {4}, X3 = {1}, X4 = {1,3}
Example: 4-Queens Problem (cont.-9)
Domains: X1 = {2,3,4}, X2 = {4}, X3 = {1}, X4 = {1,3}
Example: 4-Queens Problem (cont.-10)
Domains: X1 = {2,3,4}, X2 = {4}, X3 = {1}, X4 = {3}
Example: 4-Queens Problem (cont.-11)
Domains: X1 = {2,3,4}, X2 = {4}, X3 = {1}, X4 = {3}
Arc Consistency
Simplest form of propagation makes each arc consistent.
X → Y is consistent iff, for every value x of X, there is some allowed value y of Y.
Applying arc consistency results in earlier detection of inconsistencies than pure forward checking provides.
Arc Consistency (cont.-1)
SA → NSW is consistent iff SA = blue and NSW = red
Arc Consistency (cont.-2)
NSW → SA is consistent iff
NSW = red and SA = blue, or NSW = blue and SA = ??? (no supporting value)
The arc can be made consistent by removing blue from NSW.
Arc Consistency (cont.-3)
V → NSW
The arc was made consistent by removing blue from NSW.
If X loses a value, the neighbors of X need to be rechecked: here, red is removed from V.
Arc Consistency (cont.-4)
SA → NT: SA's domain becomes empty.
Arc consistency detects failure earlier than forward checking.
It can be run as a preprocessor or after each assignment, and is repeated until no inconsistency remains.
Arc Consistency Algorithm
The time complexity O(n^2 d^3) can be reduced to O(n^2 d^2), but arc consistency cannot detect all failures in polynomial time.
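The AC-3 algorithm behind the slide's figure can be sketched as follows (the generic `conflict` predicate encoding is my choice):

```python
from collections import deque

def ac3(domains, neighbors, conflict):
    """Make every arc consistent; returns False as soon as some domain becomes empty.
    conflict(x, vx, y, vy) is True when that pair violates the constraint on arc (x, y)."""
    queue = deque((x, y) for x in neighbors for y in neighbors[x])
    while queue:
        x, y = queue.popleft()
        removed = False
        for vx in list(domains[x]):
            if all(conflict(x, vx, y, vy) for vy in domains[y]):
                domains[x].discard(vx)            # vx has no support in y's domain
                removed = True
        if removed:
            if not domains[x]:
                return False                      # inconsistency detected
            queue.extend((z, x) for z in neighbors[x] if z != y)
    return True
```

On the slides' failing case (WA = red and Q = green), AC-3 empties a domain and reports the inconsistency that forward checking misses.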
k-consistency
A CSP is k-consistent if, for any k−1 variables and for any consistent assignment to those variables, a consistent value can always be assigned to any k-th variable.
k = 1: node consistency
k = 2: arc consistency
k = 3: path consistency
Strong k-consistency
A graph is strongly k-consistent if
- it is k-consistent, and
- it is also (k−1)-consistent, (k−2)-consistent, … all the way down to 1-consistent.
This is ideal, since a solution can then be found in time O(nd) instead of O(n^2 d^3).
Yet there is no free lunch: any algorithm for establishing n-consistency must take time exponential in n in the worst case.
Further improvements
Checking special constraints
- Checking the Alldiff(…) constraint:
if m variables are involved and only n distinct values remain possible, with m > n, the constraint cannot be satisfied.
e.g. after {WA = red, NSW = red}, the 3 variables SA, NT, and Q can take only the 2 values {green, blue}.
- Checking the Atmost(…) constraint:
bounds propagation for larger value domains,
e.g. Atmost(10, PA1, PA2, PA3, PA4)
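The slide's counting test for Alldiff can be sketched directly (the function name is mine):

```python
def alldiff_satisfiable(vars_, domains):
    """Simple Alldiff test: m variables with fewer than m distinct
    available values cannot all take different values."""
    available = set().union(*(domains[v] for v in vars_))
    return len(vars_) <= len(available)
```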
Further improvements (cont.-1)
Intelligent backtracking
- The standard form is chronological backtracking, i.e. trying a different value for the preceding variable.
- A more intelligent approach backtracks to the conflict set: the set of variables that caused the failure, i.e. the set of previously assigned variables that are connected to X by constraints.
Backjumping moves back to the most recent element of the conflict set.
Forward checking can be used to determine the conflict set.
Local Search for CSPs(Iterative algorithms)
Hill-climbing and simulated annealing typically use a complete-state representation (i.e., all variables assigned).
To apply them to CSPs:
- allow states with unsatisfied constraints
- operators: reassign variable values
- variable selection: randomly select any conflicted variable
- value selection: min-conflicts heuristic, i.e. select the new value that results in the minimum number of conflicts with other variables
Min-Conflict Algorithm
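The MIN-CONFLICTS figure is not reproduced above; here is an illustrative Python sketch for n-queens, assuming the usual one-queen-per-column formulation (random tie-breaking and the step limit are my choices, not taken from the slide):

```python
import random

def min_conflicts_queens(n, max_steps=10000, seed=1):
    """Min-conflicts local search for n-queens: rows[c] is the row of the queen in column c."""
    rng = random.Random(seed)
    rows = [rng.randrange(n) for _ in range(n)]

    def conflicts(col, row):
        """Number of other queens attacking square (col, row)."""
        return sum(rows[c] == row or abs(rows[c] - row) == abs(c - col)
                   for c in range(n) if c != col)

    for _ in range(max_steps):
        conflicted = [c for c in range(n) if conflicts(c, rows[c]) > 0]
        if not conflicted:
            return rows                           # solution: no queen is attacked
        col = rng.choice(conflicted)              # randomly pick a conflicted variable
        best = min(conflicts(col, r) for r in range(n))
        rows[col] = rng.choice([r for r in range(n)
                                if conflicts(col, r) == best])
    return None                                   # give up after max_steps
```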
A 4-queens Problem
States: 4 queens in 4 columns (4^4 = 256 states)
Operators: move a queen within its column
Goal test: no attacks
Evaluation: h(n) = number of attacks
A 4-queens Problem (cont.)
Assume one queen in each column. Which row does each one go in?
Variables: Q1, Q2, Q3, Q4
Domains: Di = {1, 2, 3, 4}
Constraints:
Qi ≠ Qj (cannot be in the same row)
|Qi − Qj| ≠ |i − j| (nor on the same diagonal)
Translate each constraint into the set of allowable values for its variables.
E.g., the values for (Q1, Q2) are (1, 3), (1, 4), (2, 4), (3, 1), (4, 1), (4, 2).
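The allowable-pairs table can be computed rather than listed by hand; this sketch reproduces the (Q1, Q2) pairs above:

```python
def queens_ok(i, qi, j, qj):
    """Binary n-queens constraint: queens in columns i and j share no row or diagonal."""
    return qi != qj and abs(qi - qj) != abs(i - j)

# Enumerate the allowable values for (Q1, Q2) over the domain {1, 2, 3, 4}.
allowed_q1_q2 = [(a, b) for a in range(1, 5) for b in range(1, 5)
                 if queens_ok(1, a, 2, b)]
```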
An 8-queens Problem
Algorithm chosen: the min-conflicts heuristic repair method
Algorithm characteristics:
- repairs inconsistencies in the current configuration
- selects a new value for a variable that results in the minimum number of conflicts with other variables
Detailed Steps
1. One by one, find the number of conflicts between the inconsistent variable and the other variables.
Detailed Steps (cont.-1)
2. Choose the one with the smallest number of conflicts to make a move.
Detailed Steps (cont.-2)
3. Repeat the previous steps until all the inconsistent variables have been assigned a proper value.
Performance of min-conflicts
Given a random initial state, min-conflicts can solve n-queens in almost constant time for arbitrary n with high probability (e.g., n = 10,000,000).
The same appears to be true for any randomly generated CSP, except in a narrow range of the ratio of constraints to variables.
Problem Structure
How can the problem structure help to find a solution quickly?
Subproblem identification is important:
- coloring Tasmania and coloring the mainland are independent subproblems
- subproblems are identifiable as connected components of the constraint graph
This improves performance.
Problem Structure (cont.)
Completely independent subproblems are delicious, then, but rare.
Suppose each subproblem has c variables out of n in total.
With decomposition, the solution cost is n/c · d^c, which is linear in n.
Without decomposition, the solution cost is d^n, which is exponential in n.
e.g. with n = 80, d = 2, c = 20:
2^80 ≈ 4 billion years at 10 million nodes/sec
4 · 2^20 = 0.4 sec at 10 million nodes/sec
Tree-Structured CSPs
In most cases the subproblems of a CSP are connected as a tree.
Any tree-structured CSP can be solved in time linear in the number of variables.
Theorem: if the constraint graph has no loops, the CSP can be solved in O(n d^2) time.
(Compare to general CSPs, where the worst-case time is O(d^n).)
Algorithm for Tree-Structured CSPs
1. Choose a variable as root, order variables from root to leaves, such that every node’s parent precedes it in the ordering.
2. For j from n down to 2, apply RemoveInconsistent(Parent(Xj), Xj)
3. For j from 1 to n, assign Xj consistently with Parent(Xj)
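The three steps can be sketched in Python (the `parent` map and the generic `conflict` predicate are my encoding choices; the backward pruning pass plays the role of RemoveInconsistent):

```python
def solve_tree_csp(order, parent, domains, conflict):
    """order: variables root-to-leaves; parent maps every non-root variable to its parent.
    conflict(x, vx, y, vy) is True when that pair violates the constraint on edge (x, y)."""
    # Step 2 (backward): make each parent consistent with its child, leaves first.
    for child in reversed(order[1:]):
        p = parent[child]
        domains[p] = {vp for vp in domains[p]
                      if any(not conflict(p, vp, child, vc) for vc in domains[child])}
        if not domains[p]:
            return None                           # no solution
    # Step 3 (forward): every surviving parent value now has a consistent child value.
    assignment = {}
    for var in order:
        if var in parent:
            pv = assignment[parent[var]]
            assignment[var] = next(v for v in domains[var]
                                   if not conflict(parent[var], pv, var, v))
        else:
            assignment[var] = next(iter(domains[var]))   # root: any remaining value
    return assignment
```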
Nearly Tree-Structured CSPs
To reduce the constraint graph to a tree:
- by removing nodes
- by collapsing nodes
Nearly Tree-Structured CSPs (cont.-1)
Idea: assign values to some variables so that the remaining variables form a tree.
Assume we assign {SA = x}, where SA forms the cycle cutset, and remove from the domains of the other variables any values that are inconsistent with x.
The selected value for SA could be the wrong one, so we have to try all of them.
Nearly Tree-Structured CSPs (cont.-2)
This approach, called cutset conditioning, is worthwhile if the cycle cutset is small.
Finding the smallest cycle cutset is NP-hard, but approximation algorithms exist.
Nearly Tree-Structured CSPs (cont.-3)
• Tree decomposition of the constraint graph into a set of connected subproblems.
• Each subproblem is solved independently.
• The resulting solutions are then combined.
• Necessary requirements:
- Every variable appears in at least one of the subproblems.
- If two variables are connected in the original problem, they must appear together in at least one subproblem.
- If a variable appears in two subproblems, it must appear in every subproblem along the path connecting those subproblems.