Page 1: Deterministic Discrepancy Minimization

1/17

Deterministic Discrepancy Minimization

Nikhil Bansal (TU Eindhoven)Joel Spencer (NYU)

Page 2: Deterministic Discrepancy Minimization


Combinatorial Discrepancy

Universe: U = [1,…,n]. Subsets: S1, S2, …, Sm.

Problem: Color elements red/blue so each subset is colored as evenly as possible.

CS: Computational Geometry, Comb. Optimization, Monte-Carlo simulation, Machine learning, Complexity, Pseudo-Randomness, …Math: Dynamical Systems, Combinatorics, Mathematical Finance, Number Theory, Ramsey Theory, Algebra, Measure Theory, …

[Figure: elements of the universe covered by overlapping sets S1, S2, S3, S4]

Page 3: Deterministic Discrepancy Minimization


General Set System

Universe: U= [1,…,n] Subsets: S1,S2,…,Sm

Find $\chi: [n] \to \{-1,+1\}$ to minimize $\|\chi\|_\infty = \max_j |\chi(S_j)|$, where $\chi(S_j) = \sum_{i \in S_j} \chi(i)$.

For simplicity consider m=n henceforth.

Page 4: Deterministic Discrepancy Minimization


Simple Algorithm

Random: Color each element i independently as x(i) = +1 or -1 with probability ½ each.

Thm: Discrepancy = $O(\sqrt{n \log n})$

Pf: For each set, expect $O(\sqrt{n})$ discrepancy. Standard tail bounds: $\Pr[\,|\sum_{i \in S} x(i)| \geq c\sqrt{n}\,] \approx e^{-c^2}$

Union bound + choose $c \approx \sqrt{\log n}$

Analysis tight: Random actually incurs $\Omega(\sqrt{n \log n})$.
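The random coloring above can be illustrated with a short sketch; the random set instance below is a hypothetical test harness, not from the talk:

```python
import math
import random

def random_coloring(n, rng):
    # Color each element independently +1 or -1 with probability 1/2.
    return [rng.choice((-1, 1)) for _ in range(n)]

def discrepancy(sets, x):
    # Max over sets of |sum of colors of its elements|.
    return max(abs(sum(x[i] for i in S)) for S in sets)

# Tiny experiment (hypothetical instance): with m = n random sets, the
# random coloring typically lands near the O(sqrt(n log n)) bound.
rng = random.Random(0)
n = 400
sets = [[i for i in range(n) if rng.random() < 0.5] for _ in range(n)]
x = random_coloring(n, rng)
print(discrepancy(sets, x), math.sqrt(n * math.log(n)))
```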

Page 5: Deterministic Discrepancy Minimization


Better Colorings Exist! [Spencer '85] (“Six standard deviations suffice”): There always exists a coloring with discrepancy $\leq 6\sqrt{n}$.

Tight: Cannot beat $0.5\sqrt{n}$ (Hadamard matrix, “orthogonal” sets)

Inherently non-constructive proof (pigeonhole principle on exponentially large universe)

Challenge: Can we find such a coloring algorithmically? (Certain natural algorithms do not work.)

Conjecture [Alon-Spencer]: May not be possible.

Page 6: Deterministic Discrepancy Minimization


Algorithmic Results

[Bansal 10]: Efficient (randomized) algorithm for Spencer’s result.

Technique: SDPs (new rounding idea). Uses several SDPs over time, guided by the non-constructive method.

General: geometric problems, the Beck–Fiala setting, the k-permutation problem, pseudo-approximation of discrepancy, …

Thm: Deterministic Algorithm for Spencer’s (and other) results.

Page 7: Deterministic Discrepancy Minimization

This Talk

Given: a matrix A (rows = the sets $S_j$) and variables $x_1, \ldots, x_n$.

Goal: Round each $x_i$ to -1 or +1, minimizing the error $|(Ax)_j|$ for each row j.

Chernoff: Error = $O(\sqrt{n \log n})$. Spencer: Error = $O(\sqrt{n})$.


Page 8: Deterministic Discrepancy Minimization

Derandomizing Chernoff (pessimistic estimators, exponential moments, hyperbolic cosine rule, …)

Round one variable at a time; each sign is chosen to keep an exponential-moment potential small.
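A minimal sketch of such a derandomization via the hyperbolic-cosine potential and the method of conditional expectations; the step parameter `lam` and helper names are standard textbook choices, not taken from the talk:

```python
import math

def derandomized_coloring(sets, n):
    # Method of conditional expectations: assign x_i in {-1,+1} one at a
    # time, choosing the sign that minimizes the potential
    #   sum_j cosh(lam * s_j),   where s_j is the partial sum of set j.
    # This pessimistic estimator grows by at most a cosh(lam) factor per
    # step, which yields discrepancy O(sqrt(n log m)) deterministically.
    m = max(len(sets), 1)
    lam = math.sqrt(2.0 * math.log(2.0 * m) / max(n, 1))
    member = [[] for _ in range(n)]          # sets containing element i
    for j, S in enumerate(sets):
        for i in S:
            member[i].append(j)
    sums = [0.0] * len(sets)                 # partial sums s_j
    x = [0] * n
    for i in range(n):
        cost = {s: sum(math.cosh(lam * (sums[j] + s)) for j in member[i])
                for s in (+1, -1)}
        x[i] = +1 if cost[+1] <= cost[-1] else -1
        for j in member[i]:
            sums[j] += x[i]
    return x
```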


Page 9: Deterministic Discrepancy Minimization

The Problem


Such approaches cannot get rid of the extra $\sqrt{\log n}$ factor.

(Chooser-Pusher games: lower bounds apply when each column is rounded in an online manner.)

The algorithm of Bansal uses a more global approach.

Page 10: Deterministic Discrepancy Minimization


Algorithm (at high level)

Cube: $\{-1,1\}^n$

[Figure: a walk inside the cube from a start point to a finish vertex]

Each dimension: a variable. Each vertex: a rounding.

Algorithm: At step t, update $x(t) = x(t-1) + \gamma \, g(t)$. Fix a variable once it reaches -1 or +1.

g(t): a random Gaussian in $\mathbb{R}^n$ (guided by an SDP, next slide).

Each coordinate of the update is distributed as a Gaussian, but the coordinates are correlated.
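A minimal sketch of this sticky walk, assuming plain i.i.d. Gaussian steps (the actual algorithm draws g(t) from an SDP solution so that the row discrepancies have low variance):

```python
import random

def sticky_gaussian_walk(n, gamma=0.05, seed=0):
    # Start at the origin of [-1,1]^n, take small Gaussian steps, and
    # freeze ("fix") each coordinate once it hits -1 or +1.  With plain
    # i.i.d. steps every coordinate reaches a cube vertex almost surely;
    # the SDP-guided version additionally keeps set discrepancies small.
    rng = random.Random(seed)
    x = [0.0] * n
    alive = set(range(n))
    while alive:
        for i in list(alive):
            x[i] += gamma * rng.gauss(0.0, 1.0)
            if abs(x[i]) >= 1.0:
                x[i] = 1.0 if x[i] > 0 else -1.0   # fix the variable
                alive.remove(i)
    return x
```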

Page 11: Deterministic Discrepancy Minimization


SDP relaxations

SDPs (an LP over vector variables / PSD matrices): “$\|\sum_{i \in S_j} v_i\|^2$ is small” for all j, and $\|v_i\|^2 = 1$ for all i. Intended solution: $v_i = (+1,0,\ldots,0)$ or $(-1,0,\ldots,0)$.

Spencer’s result (entropy method) guarantees feasibility.

Key point of Gaussian rounding: say $y_i = \langle g, v_i \rangle$ for a random Gaussian g. Then each $y_i$ is a standard Gaussian, and $\sum_{i \in S_j} y_i$ is Gaussian with variance $\|\sum_{i \in S_j} v_i\|^2$, which is small.
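This variance identity can be checked empirically; a small sketch with arbitrary illustrative vectors (`projected_sum_variance` is a hypothetical helper, not from the talk):

```python
import random

def projected_sum_variance(vectors, subset, trials=20000, seed=1):
    # Empirical check of the rounding identity: with y_i = <g, v_i> for a
    # standard Gaussian g, the sum of y_i over i in `subset` is Gaussian
    # with variance || sum_{i in subset} v_i ||^2.
    rng = random.Random(seed)
    d = len(vectors[0])
    w = [sum(vectors[i][k] for i in subset) for k in range(d)]  # sum of v_i
    target = sum(c * c for c in w)                              # ||w||^2
    acc = 0.0
    for _ in range(trials):
        g = [rng.gauss(0.0, 1.0) for _ in range(d)]
        acc += sum(g[k] * w[k] for k in range(d)) ** 2
    return acc / trials, target

emp, target = projected_sum_variance(
    [(1.0, 0.0, 0.0), (0.6, 0.8, 0.0), (0.0, 0.0, 1.0)], subset=(0, 1))
print(emp, target)  # empirical variance is close to ||v_0 + v_1||^2
```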

Page 12: Deterministic Discrepancy Minimization


Analysis (at high level)

Cube: {-1,1}n

Analysis:
Progress: few steps to reach a vertex (the walk has high variance).

Low Discrepancy: for each equation, the discrepancy random walk has low variance.

Each dimension: an element. Each vertex: a coloring.

Page 13: Deterministic Discrepancy Minimization

Making it Deterministic

Need to find an update that
i) makes progress, and
ii) adds low discrepancy to the equations.

Recall, for Chernoff: Round one variable at a time (Progress)

Whether -1 or +1 is chosen is guided by the potential. (Low Discrepancy)


Page 14: Deterministic Discrepancy Minimization

Tracking the properties
i) For low discrepancy: define a suitable potential and bound its increase (as in Chernoff, but refined).

ii) For progress: the energy $\|x(t)\|^2$ should go up sufficiently.

Conflicting goals (they hold in expectation). No reason why such an update should even exist.


Page 15: Deterministic Discrepancy Minimization

Our fix

Add extra constraints to SDP to force a good update to exist.

Orthogonality trick: say we are currently at x(t-1). Add the SDP constraint $\sum_i x_i(t-1)\, v_i = 0$.

This ensures the update is orthogonal to x(t-1), so the length (the progress potential) always increases!

An analogous constraint handles the low-discrepancy potential (it bounds the increase by the right amount).
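Why orthogonality forces progress: by Pythagoras, an orthogonal step always increases the squared length. A tiny numeric check:

```python
def squared_length(v):
    # ||v||^2
    return sum(c * c for c in v)

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

# An update delta orthogonal to x satisfies
#   ||x + delta||^2 = ||x||^2 + ||delta||^2,
# so the progress potential ||x||^2 strictly increases whenever delta != 0.
x = [1.0, 2.0, 0.0]
delta = [0.0, 0.0, 3.0]   # orthogonal to x
new = [a + b for a, b in zip(x, delta)]
print(squared_length(new), squared_length(x) + squared_length(delta))  # 14.0 14.0
```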


[Figure: the step $\delta(t)$ moves from x(t-1) to the new position x(t), orthogonally to x(t-1), so the distance from the origin grows.]

Page 16: Deterministic Discrepancy Minimization

Trouble

Why should this SDP remain feasible?

In Bansal’s (randomized) algorithm SDP was feasible due to Spencer’s existential result.

Key point: the new constraint is of a similar type (i.e. “$\|\sum_i x_i(t-1)\, v_i\|^2$ is small”). Entropy method: the new SDP is still feasible.


Finishing off: use k-wise independent vectors instead of Gaussians.

Page 17: Deterministic Discrepancy Minimization


Concluding Remarks

Idea: Add new constraints to force a deterministic choice to exist.

Works more generally for other discrepancy problems.

Can potentially have other applications.

Thank You!

Page 18: Deterministic Discrepancy Minimization

Techniques


[Diagram of techniques:
Entropy Method → Spencer's Result
Entropy Method + SDPs → Bansal's Result
New “orthogonality” idea (based on entropy) + k-wise independence, pessimistic estimators, … → This Result]

