Fixed-size Minimax for Committee Elections:
Approximation and Local Search Heuristics

Rob LeGrand, Washington University in St. Louis
Evangelos Markakis, University of Toronto
Aranyak Mehta, IBM Almaden Research Center

COMSOC ’06, 6 December 2006
Electing a committee from approval ballots

n = 5 candidates, m = 6 ballots:
11110, 00011, 00111, 00001, 10111, 01111
(e.g., the ballot 00011 approves of candidates 4 and 5)

• What’s the best committee of size k = 2?
Sum of Hamming distances

With winner set 11000 (k = 2 winners), the Hamming distances to the ballots
11110, 00011, 00111, 00001, 10111, 01111
are 2, 4, 5, 3, 4, 4, for a sum of 22.
Fixed-size minisum

With winner set 00011 (k = 2 winners), the distances to the ballots
11110, 00011, 00111, 00001, 10111, 01111
are 4, 0, 1, 1, 2, 2, for a sum of 10.

• Minisum elects the winner set with the smallest sumscore
• Easy to compute (pick the k candidates with the most approvals)
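The “pick the candidates with the most approvals” rule can be sketched in a few lines. This is a hypothetical helper, not code from the paper; ballots are 0/1 strings over the n candidates.

```python
def fixed_size_minisum(ballots, k):
    """Illustrative sketch: elect the k candidates with the most approvals.

    This minimizes the sum of Hamming distances over all weight-k
    committees, since each candidate's contribution to the sum is
    independent of the others.
    """
    n = len(ballots[0])
    approvals = [sum(int(b[i]) for b in ballots) for i in range(n)]
    # Take the k most-approved candidates (ties broken by index).
    top = set(sorted(range(n), key=lambda i: -approvals[i])[:k])
    return "".join("1" if i in top else "0" for i in range(n))

ballots = ["11110", "00011", "00111", "00001", "10111", "01111"]
print(fixed_size_minisum(ballots, 2))  # → 00011, as on the slide
```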
Maximum Hamming distance

The same winner set 00011 (k = 2 winners) has distances 4, 0, 1, 1, 2, 2 to the ballots: sum = 10, but max = 4.
Fixed-size minimax

With winner set 00110 (k = 2 winners), the distances to the ballots
11110, 00011, 00111, 00001, 10111, 01111
are 2, 2, 1, 3, 2, 2: sum = 12, max = 3.

• Minimax elects the winner set with the smallest maxscore
• Harder to compute?

[Brams, Kilgour & Sanver, ’04]
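On small instances, fixed-size minimax can still be found by brute force over all size-k committees. The sketch below is illustrative only (the function names are ours); its exhaustive search is exponential in general, consistent with the hardness results that follow.

```python
from itertools import combinations

def hamming(a, b):
    """Hamming distance between two equal-length 0/1 strings."""
    return sum(x != y for x, y in zip(a, b))

def maxscore(committee, ballots):
    """Largest Hamming distance from the committee to any ballot."""
    return max(hamming(committee, b) for b in ballots)

def fsm_brute_force(ballots, k):
    """Exhaustively check every weight-k committee; return one
    minimizing the maxscore."""
    n = len(ballots[0])
    best = None
    for chosen in combinations(range(n), k):
        c = "".join("1" if i in chosen else "0" for i in range(n))
        if best is None or maxscore(c, ballots) < maxscore(best, ballots):
            best = c
    return best

ballots = ["11110", "00011", "00111", "00001", "10111", "01111"]
w = fsm_brute_force(ballots, 2)
print(w, maxscore(w, ballots))  # → 00110 3, as on the slide
```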
Complexity

• Endogenous minimax = EM = BSM(0, n): NP-hard [Frances & Litman, ’97]
• Bounded-size minimax = BSM(k1, k2): NP-hard (generalization of EM)
• Fixed-size minimax = FSM(k) = BSM(k, k): NP-hard (this paper)
Approximability

• EM = BSM(0, n): has a PTAS* [Li, Ma & Wang, ’99]
• BSM(k1, k2): no known PTAS; has a 3-approx. (this paper)
• FSM(k) = BSM(k, k): no known PTAS; has a 3-approx. (this paper)

* Polynomial-Time Approximation Scheme: an algorithm with approximation ratio 1 + ε that runs in time polynomial in the input size and exponential in 1/ε
Approximating FSM

Ballots: 00111, 00001, 10111, 01111, 00011, 11110; k = 2 winners.

1. Choose a ballot arbitrarily (say, 00111).
2. Coerce it to size k (e.g., 00111 → 00011, dropping one approval).

The outcome is this k-completed ballot.
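The two-step procedure above can be sketched as follows. `k_complete` is a hypothetical helper name, and which approvals it drops (or adds) is an arbitrary choice; any k-completion supports the guarantee proved next.

```python
def k_complete(ballot, k):
    """Coerce a 0/1 ballot to exactly k ones by flipping as few bits as
    possible.  Which bits to flip is arbitrary; here we drop rightmost
    approvals and add leftmost non-approvals."""
    bits = list(ballot)
    ones = [i for i, b in enumerate(bits) if b == "1"]
    zeros = [i for i, b in enumerate(bits) if b == "0"]
    while len(ones) > k:          # too many approvals: drop some
        bits[ones.pop()] = "0"
    while len(ones) < k:          # too few approvals: add some
        i = zeros.pop(0)
        bits[i] = "1"
        ones.append(i)
    return "".join(bits)

def fsm_3_approx(ballots, k):
    """Choose a ballot arbitrarily (here, the first) and k-complete it."""
    return k_complete(ballots[0], k)

ballots = ["00111", "00001", "10111", "01111", "00011", "11110"]
print(fsm_3_approx(ballots, 2))  # → 00110, one valid k-completion of 00111
```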
Approximation ratio ≤ 3

Let OPT be the optimal maxscore; here the optimal FSM set is 00110, with OPT = 3.

• Every ballot is within distance OPT of the optimal set 00110: the distances to 00111, 00001, 10111, 01111, 00011, 11110 are 1, 3, 2, 2, 2, 2.
• The chosen ballot 00111 is itself within OPT of the optimal set (distance 1).
• The k-completed ballot 00011 is within OPT of the chosen ballot 00111 (distance 1): since the optimal set also has weight k, coercing the chosen ballot to a nearest weight-k vector cannot move it farther than OPT.

By the triangle inequality, every ballot is therefore within OPT + OPT + OPT = 3·OPT of the k-completed ballot.
Better in practice?

• So far, we can guarantee a winner set no more than 3 times as bad as the optimal.
  – Nice in theory . . .
• How can we do better in practice?
  – Try local search
Local search approach for FSM

1. Start with some c ∈ {0,1}^n of weight k.
2. In c, swap up to r 0-bits with 1-bits in such a way that minimizes the maxscore of the result.
3. Repeat step 2 until maxscore(c) is unchanged n times.
4. Take c as the solution.

Example (ballots 00111, 00001, 10111, 01111, 00011, 11110; k = 2, r = 1):

• Start with c = 01001, maxscore 4.
• Its neighbors 11000, 10001, 01100, 01010, 00011, 00101 have maxscores 5, 4, 4, 4, 4, 4; move to c = 01010 (maxscore 4).
• The neighbors of 01010 are 11000, 10010, 01100, 01001, 00011, 00110, with maxscores 5, 4, 4, 4, 4, 3; move to c = 00110 (maxscore 3), which no further swap improves.
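Steps 1–4 can be sketched as below. This is a simplified, deterministic variant (our own naming): ties are broken in favor of the current c, so it stops at the first plateau rather than wandering across equal-maxscore neighbors the way the slides’ version can.

```python
from itertools import combinations

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def maxscore(c, ballots):
    return max(hamming(c, b) for b in ballots)

def local_search_fsm(ballots, k, start, r=1):
    """Sketch of the local search: repeatedly swap up to r 0-bits with
    1-bits of c to minimize the maxscore; stop once the maxscore has
    been unchanged for n consecutive rounds."""
    n = len(start)
    c = start
    unchanged = 0
    while unchanged < n:
        old = maxscore(c, ballots)
        best = c
        ones = [i for i, b in enumerate(c) if b == "1"]
        zeros = [i for i, b in enumerate(c) if b == "0"]
        for t in range(1, r + 1):
            for out in combinations(ones, t):
                for into in combinations(zeros, t):
                    bits = list(c)
                    for i in out:
                        bits[i] = "0"
                    for i in into:
                        bits[i] = "1"
                    cand = "".join(bits)
                    if maxscore(cand, ballots) < maxscore(best, ballots):
                        best = cand
        c = best
        unchanged = unchanged + 1 if maxscore(c, ballots) == old else 0
    return c

ballots = ["00111", "00001", "10111", "01111", "00011", "11110"]
print(local_search_fsm(ballots, 2, "01010"))  # → 00110 (maxscore 3)
```

Starting from 01010 this finds the improving swap to 00110 directly; the slides’ randomized tie-breaking is what lets the search cross the maxscore-4 plateau from a start like 01001.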
Specific FSM heuristics

• Two parameters:
  – where to start vector c:
    1. a fixed-size-minisum solution
    2. a k-completion of a ballot (3-approx.)
    3. a random set of k candidates
    4. a k-completion of a ballot with the highest maxscore
  – radius of neighborhood r: 1 and 2
Heuristic evaluation

• Real-world ballots from the GTS 2003 council election
• Found the exact minimax solution
• Ran each heuristic 5000 times
• Compared the exact minimax solution with the heuristics to find realized approximation ratios
  – example: maxscore of the solution found = 15, maxscore of the exact solution = 14, ratio = 15/14 ≈ 1.0714
• We also performed experiments using ballots generated according to random distributions (see paper)
Average approx. ratios found

starting point        radius = 1   radius = 2
fixed-size minisum    1.0012       1.0000
3-approx.             1.0017       1.0000
random set            1.0057       1.0000
highest-maxscore      1.0059       1.0000

performance on GTS ’03 election data
(n = 24 candidates, k = 12 winners, m = 161 ballots)
Largest approx. ratios found

starting point        radius = 1   radius = 2
fixed-size minisum    1.0714       1.0000
3-approx.             1.0714       1.0000
random set            1.0714       1.0000
highest-maxscore      1.0714       1.0000

performance on GTS ’03 election data
(n = 24 candidates, k = 12 winners, m = 161 ballots)
Conclusions from all experiments

• All heuristics perform near-optimally
  – highest ratio found: 1.2
  – highest average ratio < 1.04
• When the radius is larger, performance improves and running time increases
• The fixed-size-minisum starting point performs best overall (with our 3-approx. a close second)
Manipulating FSM

Sincere ballots: 00110, 00011, 00111, 00001, 10111, 01111; k = 2 winners.
FSM winner: 00011, with distances 2, 0, 1, 1, 2, 2 to the ballots (max = 2).

• Voters are sincere
• Another optimal solution: 00101
Manipulating FSM

The first voter insincerely reports 11110 instead of the sincere 00110; the profile becomes 11110, 00011, 00111, 00001, 10111, 01111, and the FSM winner becomes 00110, with distances 2, 2, 1, 3, 2, 2 (max = 3).

• The manipulating voter realizes the ideal outcome: the winner set 00110 is at distance 0 from the voter’s sincere ballot
Nonmanipulable “FSM”?
Electing a set found using our 3-approximation for FSM gives a nonmanipulable procedure:
• For the voters whose ballots are not chosen, voting insincerely cannot affect the outcome
• For the voter whose ballot is chosen, the outcome will be one of the sets of size k closest to the voter’s wishes
Conclusions

• BSM and FSM are NP-hard
• Both can be approximated with ratio 3
• Polynomial-time local search heuristics perform well in practice
  – some retain the ratio-3 guarantee
• Exact FSM can be manipulated
• Our 3-approximation for FSM is nonmanipulable
Future work

• Investigate the weighted version of minimax [Brams, Kilgour & Sanver, ’06]
• What is the best approximation ratio for FSM achievable in polynomial time? (Is there a PTAS?)
• What is the nonmanipulable FSM approximation algorithm with the best ratio?

Thanks!