
On approximating the Riemannian 1-center

Page 1: On approximating the Riemannian 1-center

On the Smallest Enclosing Riemannian Balls — On Approximating the Riemannian 1-Center —

http://www.sonycsl.co.jp/person/nielsen/infogeo/RiemannMinimax/

Marc Arnaudon1 Frank Nielsen2

1 Université de Bordeaux, France   2 École Polytechnique & Sony CSL
e-mail: [email protected]

Computational Geometry 46(1): 93-104 (2013), arXiv:1101.4718

© 2013-14 Frank Nielsen, École Polytechnique & Sony Computer Science Laboratories

Page 2: On approximating the Riemannian 1-center

Introduction: Euclidean Smallest Enclosing Balls

Given a d-dimensional point set P = {p1, ..., pn}, find the “smallest” (with respect to the volume ≡ radius ≡ inclusion) ball B = Ball(c, r) fully covering P:

c∗ = argmin_{c ∈ ℝ^d} max_{i=1,...,n} ‖c − pi‖

◮ unique Euclidean circumcenter c∗, SEB [19]
◮ optimization problem is non-differentiable [10]

c∗ lies on the farthest-point Voronoi diagram


Page 3: On approximating the Riemannian 1-center

Euclidean smallest enclosing balls (SEBs)

◮ 1857: d = 2, Smallest Enclosing Ball? of P = {p1, ..., pn} (Sylvester [16])

◮ Randomized expected linear-time algorithm [19, 5] in fixed dimension (but hidden constant exponential in d)

◮ Core-set [3] approximation: (1 + ε)-approximation in O(dn/ε²) time in arbitrary dimension, O(dn/ε + (1/ε^4.5) log(1/ε)) time [7]

◮ Many other algorithms and heuristics [14, 9, 17], etc.

SEB also known as Minimum Enclosing Ball (MEB), minimax center, 1-center, bounding (hyper)sphere, etc.

→ Applications in computer graphics (collision detection with ball cover proxies [15]), in machine learning (Core Vector Machines [18]), etc.


Page 4: On approximating the Riemannian 1-center

Optimization and core-sets [3]

Let c(P) denote the circumcenter of the SEB and r(P) its radius

Given ε > 0, an ε-core-set C ⊂ P is such that

P ⊆ Ball(c(C), (1 + ε) r(C))

⇔ Expanding SEB(C) by a factor (1 + ε) fully covers P

Core-sets of optimal size ⌈1/ε⌉ exist, independent of the dimension d and of n. Note that the combinatorial basis of the SEB has size between 2 and d + 1 [19].

→ Core-sets find many applications for problems in large dimensions.


Page 5: On approximating the Riemannian 1-center

Euclidean SEBs from core-sets [2]

Badoiu-Clarkson algorithm based on core-sets [2, 3]:

BCA:

◮ Initialize the center c1 ∈ P = {p1, ..., pn}, and
◮ Iteratively update the current center using the rule

c_{i+1} ← c_i + (f_i − c_i)/(i + 1)

where f_i denotes the farthest point of P from c_i:

f_i = p_s,  s = argmax_{j=1,...,n} ‖c_i − p_j‖

⇒ gradient-descent method
⇒ (1 + ε)-approximation after ⌈1/ε²⌉ iterations: O(dn/ε²) time
⇒ Core-set: f_1, ..., f_l with l = ⌈1/ε²⌉


Page 6: On approximating the Riemannian 1-center

Euclidean SEBs from core-sets: Rewriting with #_t
a #_t b: point (1 − t)a + tb = a + t(b − a) on the line segment [ab].
D(x, y) = ‖x − y‖², D(x, P) = max_{y∈P} D(x, y) (farthest-point distance)

Algorithm 1: BCA(P, l)
c1 ← choose randomly a point in P;
for i = 1 to l − 1 do
    // farthest point from c_i
    s_i ← argmax_{j=1,...,n} D(c_i, p_j);
    // update the center: walk on the segment [c_i, p_{s_i}]
    c_{i+1} ← c_i #_{1/(i+1)} p_{s_i};
end
// Return the SEB approximation
return Ball(c_l, r_l² = D(c_l, P));

⇒ (1 + ε)-approximation after l = ⌈1/ε²⌉ iterations.
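To make the update rule concrete, here is a minimal NumPy sketch of BCA (function and variable names are mine, not from the paper); P is an n × d array of points and l ≈ ⌈1/ε²⌉:

```python
import numpy as np

def bca(P, l, rng=None):
    """Badoiu-Clarkson core-set algorithm: approximate Euclidean SEB of P.

    P: (n, d) array of points, l: number of iterations (~ ceil(1/eps^2)).
    Returns (center, radius) of the approximate smallest enclosing ball.
    """
    rng = np.random.default_rng() if rng is None else rng
    c = P[rng.integers(len(P))]          # c1: random point of P
    for i in range(1, l):
        dists = np.linalg.norm(P - c, axis=1)
        f = P[np.argmax(dists)]          # farthest point from the current center
        c = c + (f - c) / (i + 1)        # walk on the segment [c, f]
    radius = np.linalg.norm(P - c, axis=1).max()
    return c, radius

# usage sketch
P = np.random.default_rng(0).normal(size=(1000, 5))
center, r = bca(P, l=400)   # l = ceil(1/eps^2) with eps = 0.05
```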


Page 7: On approximating the Riemannian 1-center

Bregman divergences (incl. squared Euclidean distance)

SEB extended to Bregman divergences BF (· : ·) [13]

B_F(c : x) = F(c) − F(x) − 〈c − x, ∇F(x)〉,   B_F(c : X) = max_{x∈X} B_F(c : x)

[Figure: the Bregman divergence B_F(p, q) = H_q − H′_q is the vertical gap at p between the graph of F and the hyperplane tangent to F at q.]

⇒ Bregman divergence = remainder of a first-order Taylor expansion.
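As a small illustration (helper names are mine), the generic definition above can be instantiated directly; with F(x) = 〈x, x〉 it recovers the squared Euclidean distance, and with the negative Shannon entropy it gives the (extended) Kullback-Leibler divergence:

```python
import numpy as np

def bregman(F, gradF, c, x):
    """Generic Bregman divergence B_F(c : x) = F(c) - F(x) - <c - x, gradF(x)>."""
    return F(c) - F(x) - np.dot(c - x, gradF(x))

# F(x) = <x, x>  ->  B_F(c : x) = ||c - x||^2 (squared Euclidean distance)
sq = lambda x: np.dot(x, x)
sq_grad = lambda x: 2.0 * x

# F(x) = sum x_i log x_i  ->  B_F(c : x) = extended KL divergence
negent = lambda x: np.sum(x * np.log(x))
negent_grad = lambda x: np.log(x) + 1.0

c, x = np.array([0.2, 0.8]), np.array([0.5, 0.5])
print(bregman(sq, sq_grad, c, x))          # ||c - x||^2 = 0.18
print(bregman(negent, negent_grad, c, x))  # = KL(c : x) here, since c and x sum to 1
```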


Page 8: On approximating the Riemannian 1-center

Smallest enclosing Bregman ball [13]

F ∗ = convex conjugate of F with (∇F )−1 = ∇F ∗

Algorithm 2: MBC(P, l)
// Create the gradient point set (η-coordinates)
P′ ← {∇F(p) : p ∈ P};
g ← BCA(P′, l);
return Ball(c_l = ∇F⁻¹(c(g)), r_l = B_F(c_l : P));

Guaranteed approximation algorithm, with an approximation factor depending on 1/min_{x∈X} ‖∇²F(x)‖, ... but poor in practice:

∀x, S_F(x; ∇F⁻¹(c(g))) ≤ (1 + ε)² r′∗ / min_{x∈X} ‖∇²F(x)‖,   with S_F(c; x) = B_F(c : x) + B_F(x : c)


Page 9: On approximating the Riemannian 1-center

Smallest enclosing Bregman ball [13]

A better approximation in practice...

Algorithm 3: BBCA(P, l)
c1 ← choose randomly a point in P;
for i = 1 to l − 1 do
    // farthest point from c_i wrt. B_F
    s_i ← argmax_{j=1,...,n} B_F(c_i : p_j);
    // update the center: walk on the η-segment [c_i, p_{s_i}]_η
    c_{i+1} ← ∇F⁻¹(∇F(c_i) #_{1/(i+1)} ∇F(p_{s_i}));
end
// Return the SEBB approximation
return Ball(c_l, r_l = B_F(c_l : P));

θ-, η-geodesic segments in dually flat geometry.
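A minimal sketch of BBCA (my own naming), assuming ∇F and ∇F⁻¹ = ∇F∗ are available in closed form and P is an iterable of points:

```python
import numpy as np

def bbca(P, l, F_div, gradF, gradF_inv, rng=None):
    """Approximate smallest enclosing Bregman ball by walking on eta-segments.

    F_div(c, x) = B_F(c : x); gradF / gradF_inv map between theta- and
    eta-coordinates (primal points and gradient points).
    """
    rng = np.random.default_rng() if rng is None else rng
    c = P[rng.integers(len(P))]
    for i in range(1, l):
        # farthest point from c with respect to B_F(c : .)
        f = max(P, key=lambda p: F_div(c, p))
        # update: interpolate in eta-coordinates (gradient space)
        eta = gradF(c) + (gradF(f) - gradF(c)) / (i + 1)
        c = gradF_inv(eta)
    radius = max(F_div(c, p) for p in P)
    return c, radius
```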


Page 10: On approximating the Riemannian 1-center

Basics of Riemannian geometry

◮ (M, g): Riemannian manifold

◮ 〈·, ·〉, Riemannian metric tensor g: positive definite bilinear form on each tangent space TxM (depending smoothly on x)

◮ ‖ · ‖x: ‖u‖x = 〈u, u〉^{1/2}, the associated norm in TxM

◮ ρ(x, y): metric distance between two points on the manifold M (length space)

ρ(x, y) = inf { ∫₀¹ ‖ϕ′(t)‖ dt : ϕ ∈ C¹([0, 1], M), ϕ(0) = x, ϕ(1) = y }

Parallel transport wrt. Levi-Civita metric connection ∇: ∇g = 0.


Page 11: On approximating the Riemannian 1-center

Basics of Riemannian geometry: Exponential map

◮ Local map from the tangent space TxM to the manifold, defined with geodesics (wrt ∇).

∀x ∈ M, D(x) ⊂ TxM: D(x) = {v ∈ TxM : γv(1) is defined}

with γv the maximal (i.e., largest-domain) geodesic with γv(0) = x and γ′v(0) = v.

◮ Exponential map:

exp_x(·) : D(x) ⊆ TxM → M
exp_x(v) = γv(1)

D(x) is star-shaped.


Page 12: On approximating the Riemannian 1-center

Basics of Riemannian geometry: Geodesics

◮ Geodesic: smooth path which locally minimizes the distance between two points. (In general such a curve does not minimize it globally.)

◮ Given a vector v ∈ TxM with base point x, there is a unique geodesic starting at x with speed v at time 0: t ↦ exp_x(tv), or t ↦ γt(v).

◮ A geodesic on [a, b] is minimal if its length is less than or equal to that of any other curve with the same endpoints. For complete M (i.e., exp_x defined on the whole of TxM), for any x, y ∈ M there exists a minimal geodesic from x to y in time 1: γ·(x, y) : [0, 1] → M, t ↦ γt(x, y), with γ0(x, y) = x and γ1(x, y) = y.

◮ U ⊆ M is convex if for any x, y ∈ U there exists a unique minimal geodesic γ·(x, y) in M from x to y. This geodesic fully lies in U and depends smoothly on x, y, t.


Page 13: On approximating the Riemannian 1-center

Basics of Riemannian geometry: Geodesics

◮ Geodesic γ(x, y): locally minimizing curve linking x to y

◮ Speed vector γ′(t) is parallel along γ:

Dγ′(t)/dt = ∇_{γ′(t)} γ′(t) = 0

◮ When the manifold M is embedded in ℝ^d, the acceleration is normal to the tangent plane: γ′′(t) ⊥ T_{γ(t)}M

◮ ‖γ′(t)‖ = c, a constant (say, unit).

⇒ Parameterization of curves with constant speed...


Page 14: On approximating the Riemannian 1-center

Basics of Riemannian geometry: Geodesics

Constant-speed geodesic γ(t) so that γ(0) = x and γ(ρ(x, y)) = y (constant speed 1, the unit of length).

x #_t y = m = γ(t × ρ(x, y)), so that ρ(x, m) = t × ρ(x, y)

For example, in the Euclidean space:

x #_t y = (1 − t)x + ty = x + t(y − x) = m

ρ_E(x, m) = ‖t(y − x)‖ = t‖y − x‖ = t × ρ(x, y), t ∈ [0, 1]

⇒ m interpreted as a mean (barycenter) between x and y .


Page 15: On approximating the Riemannian 1-center

Basics of Riemannian geometry: Injectivity radius

Diffeomorphism from the tangent space to the manifold

◮ Injectivity radius inj(M): largest r > 0 such that for all x ∈ M, the map exp_x(·) restricted to the open ball in TxM of radius r is an embedding.

◮ Global injectivity radius: infimum of the injectivity radius over all points of the manifold.


Page 16: On approximating the Riemannian 1-center

Basics of Riemannian geometry: Sectional curvature

Given x ∈ M and u, v two non-collinear vectors in TxM, the sectional curvature Sect(u, v) = K is a number which gives information on how the geodesics issued from x behave near x.
More precisely, the image by exp_x(·) of the circle centered at 0 of radius r > 0 in Span(u, v) has length

2π S_K(r) + o(r³) as r → 0

with

S_K(r) = sin(√K r)/√K    if K > 0,
S_K(r) = r               if K = 0,
S_K(r) = sinh(√−K r)/√−K if K < 0.

positive, zero or negative curvatures...
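A small helper (assumed naming) implementing the three cases of S_K(r):

```python
import math

def S_K(K, r):
    """Circumference factor: a geodesic circle of radius r has length ~ 2*pi*S_K(r)."""
    if K > 0:
        return math.sin(math.sqrt(K) * r) / math.sqrt(K)
    if K < 0:
        return math.sinh(math.sqrt(-K) * r) / math.sqrt(-K)
    return r   # K == 0: Euclidean case
```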


Page 17: On approximating the Riemannian 1-center

Basics of Riemannian geometry: Alexandrov’s theorem

Given an upper bound α² on the sectional curvatures, compare geodesic triangles via Alexandrov's theorem:
Let x1, x2, x3 ∈ M satisfy x1 ≠ x2, x1 ≠ x3 and

ρ(x1, x2) + ρ(x2, x3) + ρ(x3, x1) < 2 min{ inj(M), π/α }

where α > 0 is such that α² is an upper bound on the sectional curvatures. Let the minimizing geodesic from x1 to x2 and the minimizing geodesic from x1 to x3 make an angle θ at x1.
Denoting by S²_{α²} the 2-dimensional sphere of constant curvature α² (hence of radius 1/α) and ρ̄ the distance in S²_{α²}, consider points x̄1, x̄2, x̄3 ∈ S²_{α²} such that ρ̄(x̄1, x̄2) = ρ(x1, x2) and ρ̄(x̄1, x̄3) = ρ(x1, x3). Assume that the minimizing geodesic from x̄1 to x̄2 and the minimizing geodesic from x̄1 to x̄3 also make an angle θ at x̄1.

Then we have: ρ(x2, x3) ≥ ρ̄(x̄2, x̄3).


Page 18: On approximating the Riemannian 1-center

Basics of Riemannian geometry: Toponogov's theorem

Assume β > 0 is such that −β² is a lower bound for the sectional curvatures in M. Let x1, x2, x3 ∈ M satisfy x1 ≠ x2, x1 ≠ x3. Let the minimizing geodesic from x1 to x2 and the minimizing geodesic from x1 to x3 make an angle θ at x1. Denoting by H²_{−β²} the hyperbolic 2-dimensional space of constant curvature −β² and ρ̄ the distance in H²_{−β²}, consider points x̄1, x̄2, x̄3 ∈ H²_{−β²} such that ρ̄(x̄1, x̄2) = ρ(x1, x2) and ρ̄(x̄1, x̄3) = ρ(x1, x3). Assume that the minimizing geodesic from x̄1 to x̄2 and the minimizing geodesic from x̄1 to x̄3 also make an angle θ at x̄1.

Then we have: ρ(x2, x3) ≤ ρ̄(x̄2, x̄3).


Page 19: On approximating the Riemannian 1-center

Basics of Riemannian geometry: First law of cosines

In spherical/hyperbolic geometries:

◮ If θ1, θ2, θ3 are the angles of a triangle in S²_{α²} and l1, l2, l3 are the lengths of the opposite sides, then

cos θ3 = ( cos(α l3) − cos(α l1) cos(α l2) ) / ( sin(α l1) sin(α l2) )

◮ If θ1, θ2, θ3 are the angles of a triangle in H²_{−β²} and l1, l2, l3 are the lengths of the opposite sides, then

cos θ3 = ( cosh(β l1) cosh(β l2) − cosh(β l3) ) / ( sinh(β l1) sinh(β l2) )
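These two formulas translate directly into code; a small sketch with assumed function names, valid when the arguments stay in the admissible range (e.g., α li < π on the sphere):

```python
import math

def cos_angle_spherical(alpha, l1, l2, l3):
    """cos(theta3), the angle opposite side l3, on the sphere of curvature alpha^2."""
    return (math.cos(alpha * l3) - math.cos(alpha * l1) * math.cos(alpha * l2)) / (
        math.sin(alpha * l1) * math.sin(alpha * l2))

def cos_angle_hyperbolic(beta, l1, l2, l3):
    """cos(theta3), the angle opposite side l3, in the hyperbolic plane of curvature -beta^2."""
    return (math.cosh(beta * l1) * math.cosh(beta * l2) - math.cosh(beta * l3)) / (
        math.sinh(beta * l1) * math.sinh(beta * l2))
```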


Page 20: On approximating the Riemannian 1-center

Now ready for the “Smallest enclosing Riemannian ball”

(M, g): complete Riemannian manifold
ν: probability measure on M
ρ(x, y): Riemannian metric distance

Assume the measure support lies in a geodesic ball: supp(ν) ⊆ B(o, R).

f : M → R: measurable function

‖f‖_{L∞(ν)} = inf { a > 0 : ν({y ∈ M : |f(y)| > a}) = 0 }.

α > 0 such that α2 upper bounds the sectional curvatures in M.

Rα = (1/2) min{ inj(M), π/α }

inj(M): injectivity radius


Page 21: On approximating the Riemannian 1-center

Riemannian SEB: Existence and uniqueness [1]

Assume R < Rα.

Consider the farthest point map:

H : M → [0, ∞]
x ↦ ‖ρ(x, ·)‖_{L∞(ν)}

→ The minimizer c of H exists and is unique; c ∈ B(o, R) and c ∈ CH(supp(ν)) [1] (convex hull).

⇒ center: notion of centrality of the measure
⇒ point set: discrete measure, center → circumcenter


Page 22: On approximating the Riemannian 1-center

Example of Riemannian manifold: SPD space

Space of Symmetric Positive Definite (SPD) matrices with

◮ Riemannian distance:

ρ(P, Q) = ‖log(P⁻¹Q)‖_F = √( Σ_{i=1}^d log² λi )

where the λi are the eigenvalues of the matrix P⁻¹Q.

◮ Non-compact Riemannian symmetric space of non-positive curvature (aka Cartan-Hadamard manifold).

◮ Any measure ν with bounded support satisfies R < Rα (for a suitably small choice of α > 0).

⇒ The minimizer c of the farthest point map H exists and is unique: the 1-center or minimax center of ν.


Page 23: On approximating the Riemannian 1-center

Generalizing BCA to Riemannian manifolds

GeoA:

◮ Initialize the center with c1 ∈ P, and
◮ Iteratively update the current minimax center as

c_{i+1} = Geodesic(c_i, f_i, 1/(i + 1))

where f_i denotes the farthest point of P from c_i, and Geodesic(p, q, t) denotes the intermediate point m on the geodesic passing through p and q such that ρ(p, m) = t × ρ(p, q).


Page 24: On approximating the Riemannian 1-center

Generalizing BCA to Riemannian manifolds

a #^M_t b: point γ(t) on the geodesic line segment [ab] wrt M.

Algorithm 4: GeoA(P, l)
c1 ← choose randomly a point in P;
for i = 1 to l − 1 do
    // farthest point from c_i
    s_i ← argmax_{j=1,...,n} ρ(c_i, p_j);
    // update the center: walk on the geodesic line segment [c_i, p_{s_i}]
    c_{i+1} ← c_i #^M_{1/(i+1)} p_{s_i};
end
// Return the SEB approximation
return Ball(c_l, r_l = ρ(c_l, P));
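GeoA only needs two primitives: the geodesic distance ρ and the geodesic cut operator #. A minimal sketch (my own naming) parameterized by these two callables:

```python
import numpy as np

def geoa(P, l, rho, cut, rng=None):
    """Riemannian 1-center approximation (GeoA sketch).

    rho(p, q): geodesic distance; cut(p, q, t): point m on the geodesic
    from p to q with rho(p, m) = t * rho(p, q).
    """
    rng = np.random.default_rng() if rng is None else rng
    c = P[rng.integers(len(P))]
    for i in range(1, l):
        f = max(P, key=lambda p: rho(c, p))    # farthest point from c
        c = cut(c, f, 1.0 / (i + 1))           # walk towards it on the geodesic
    radius = max(rho(c, p) for p in P)
    return c, radius
```

In the Euclidean case, cut(p, q, t) = p + t(q − p) recovers BCA; the case studies below plug in the Klein-disk and SPD primitives.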


Page 25: On approximating the Riemannian 1-center

Proof sketch

Assume supp(ν) ⊂ B(o, R) and

R < Rα = (1/2) min{ inj(M), π/α }

with α > 0 such that α² is an upper bound for the sectional curvatures in M.

Lemma. There exists τ > 0 such that for all x ∈ B(o, R),

H(x) − H(c) ≥ τ ρ²(x, c)


Page 26: On approximating the Riemannian 1-center

Stochastic approximation for measures

For x ∈ B(o, R), let t ↦ γt(v(x, ν)) be a unit-speed geodesic from γ0(v(x, ν)) = x to a point y = γ_{H(x)}(v(x, ν)) in supp(ν) which realizes the maximum of the distance from x to supp(ν):

v(x, ν) = (1/H(x)) exp_x⁻¹(y)

RieA: Fix some δ > 0.

◮ Step 1: Choose a starting point x0 ∈ supp(ν) and let k = 0.
◮ Step 2: Choose a step size t_{k+1} ∈ (0, δ] and let x_{k+1} = γ_{t_{k+1}}(v(x_k, ν)); then do Step 2 again with k ← k + 1.
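One possible reading of RieA in code (my own naming), for a discrete measure supported on a finite set of points, assuming callables for the distance and for the exponential and logarithm maps at a point:

```python
def riea(x0, support, rho, exp_map, log_map, step_sizes):
    """Stochastic-approximation iteration towards the 1-center of a measure.

    support: points spanning supp(nu); exp_map(x, v) and log_map(x, y) are the
    Riemannian exponential/logarithm at x; step_sizes: iterable of t_k in (0, delta].
    """
    x = x0
    for t in step_sizes:
        y = max(support, key=lambda p: rho(x, p))   # farthest support point from x
        H = rho(x, y)
        if H == 0.0:                                # degenerate support: already done
            break
        v = log_map(x, y) / H                       # unit vector pointing towards y
        x = exp_map(x, t * v)                       # move t along that geodesic
    return x
```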


Page 27: On approximating the Riemannian 1-center

Convergence theorem for RieA

a ∧ b: minimum operator, a ∧ b = min(a, b).

R0 = (Rα − R)/2 ∧ R/2.

Assume α, β > 0 are such that −β² is a lower bound and α² an upper bound of the sectional curvatures in M. If the step sizes (t_k)_{k≥1} satisfy

δ ≤ R0/2 ∧ (2/β) arctanh( tanh(βR0/2) cos(αR) tan(αR0/4) ),

lim_{k→∞} t_k = 0,   Σ_{k=1}^∞ t_k = +∞   and   Σ_{k=1}^∞ t_k² < ∞,

then the sequence (x_k)_{k≥1} generated by the algorithm satisfies

lim_{k→∞} ρ(x_k, c) = 0
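For instance, a schedule of the form t_k = min(δ, c/k) satisfies the three conditions (it tends to 0, its sum diverges like the harmonic series, and its squared sum converges); a tiny helper with assumed naming:

```python
def step_sizes(delta, c, n_steps):
    """t_k = min(delta, c/k): t_k -> 0, sum t_k diverges, sum t_k^2 converges."""
    return [min(delta, c / k) for k in range(1, n_steps + 1)]
```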


Page 28: On approximating the Riemannian 1-center

Case study I: Hyperbolic planar manifold

In the Klein disk (projective model), geodesics are straight (Euclidean) lines [11].

ρ(p, q) = arccosh( (1 − p⊤q) / √( (1 − p⊤p)(1 − q⊤q) ) )

where arccosh(x) = log(x +√x2 − 1).

Here, we choose a non-constant-speed curve parameterization (not the constant-speed geodesic):

γt(p, q) = (1 − t)p + tq, t ∈ [0, 1].

⇒ Implement a dichotomy on γt(p, q) to get #t.
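A minimal sketch (my own naming) of the Klein distance and of the dichotomy that implements p #_t q on the straight chord:

```python
import numpy as np

def klein_distance(p, q):
    """Hyperbolic distance between p and q in the Klein disk (projective model)."""
    num = 1.0 - np.dot(p, q)
    den = np.sqrt((1.0 - np.dot(p, p)) * (1.0 - np.dot(q, q)))
    return np.arccosh(max(num / den, 1.0))   # clamp guards against round-off below 1

def klein_cut(p, q, t, iters=50):
    """p #_t q: point m on the straight segment [p, q] with rho(p, m) = t * rho(p, q)."""
    target = t * klein_distance(p, q)
    lo, hi = 0.0, 1.0
    for _ in range(iters):                   # dichotomy on the segment parameter
        mid = 0.5 * (lo + hi)
        m = (1.0 - mid) * p + mid * q
        if klein_distance(p, m) < target:
            lo = mid
        else:
            hi = mid
    return (1.0 - lo) * p + lo * q
```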

c© 2013-14 Frank Nielsen, Ecole Polytechnique & Sony Computer Science Laboratories 28/39

Page 29: On approximating the Riemannian 1-center

[Figure: snapshots of GeoA in the Klein disk: initialization, first, second, third, and fourth iterations, and the result after 10⁴ iterations.]

http://www.sonycsl.co.jp/person/nielsen/infogeo/RiemannMinimax/


Page 30: On approximating the Riemannian 1-center

Performance

[Figure: (a) Klein distance between the current center and the minimax center; (b) radius of the smallest enclosing Klein ball anchored at the current center; both plotted for iterations 0 to 200.]

Convergence rate of the GeoA algorithm for the hyperbolic disk for the first 200 iterations. Horizontal axis: number of iterations. Vertical axis: (a) the relative Klein distance between the current center and the optimal 1-center; (b) the radius of the smallest enclosing ball anchored at the current center.


Page 31: On approximating the Riemannian 1-center

Case study II: Space of SPD matrices

◮ A d × d matrix M is Symmetric Positive Definite (SPD) ⇔ M = M⊤ and x⊤Mx > 0 for all x ≠ 0.

◮ The set of d × d SPD matrices: a manifold of dimension d(d + 1)/2 [8]

◮ The geodesic linking (matrix) point P to point Q:

γt(P, Q) = P^{1/2} ( P^{−1/2} Q P^{−1/2} )^t P^{1/2}

where a matrix function h(M) is computed from the singular value decomposition M = UDV⊤ (with U and V unitary matrices and D = diag(λ1, ..., λd) a diagonal matrix of eigenvalues) as h(M) = U diag(h(λ1), ..., h(λd)) V⊤. For example, the matrix square root is computed as M^{1/2} = U diag(√λ1, ..., √λd) V⊤.
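A minimal NumPy sketch (my own naming) of the SPD geodesic, using the eigendecomposition of symmetric matrices in place of the SVD (the two coincide for SPD matrices):

```python
import numpy as np

def spd_fun(M, h):
    """Apply the scalar function h to a symmetric matrix M via eigendecomposition."""
    w, V = np.linalg.eigh(M)
    return V @ np.diag(h(w)) @ V.T

def spd_geodesic(P, Q, t):
    """gamma_t(P, Q) = P^(1/2) (P^(-1/2) Q P^(-1/2))^t P^(1/2)."""
    P_half = spd_fun(P, np.sqrt)
    P_inv_half = spd_fun(P, lambda w: 1.0 / np.sqrt(w))
    inner = spd_fun(P_inv_half @ Q @ P_inv_half, lambda w: w ** t)
    return P_half @ inner @ P_half

def spd_distance(P, Q):
    """rho(P, Q) = ||log(P^-1 Q)||_F = sqrt(sum_i log^2 lambda_i(P^-1 Q))."""
    lam = np.linalg.eigvals(np.linalg.solve(P, Q)).real
    return np.sqrt(np.sum(np.log(lam) ** 2))
```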


Page 32: On approximating the Riemannian 1-center

SPD space: Splitting the geodesic for operator #t

In this case, finding t such that

‖log((P⁻¹Q)^t)‖²_F = r² ‖log(P⁻¹Q)‖²_F,

where ‖·‖_F denotes the Frobenius norm, yields t = r. Indeed, let λ1, ..., λd denote the eigenvalues of P⁻¹Q; then ρ(P, Q) = ‖log(P⁻¹Q)‖_F = √( Σ_i log² λi ), and the condition amounts to

Σ_{i=1}^d log² λi^t = t² Σ_{i=1}^d log² λi = r² Σ_{i=1}^d log² λi.

That is, t = r.
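Consequently, on the SPD manifold the cut operator P #_r Q is simply the geodesic evaluated at t = r (no dichotomy is needed). A small self-contained numerical check (helper names and test matrices are mine):

```python
import numpy as np

def spd_pow(M, t):
    """M^t for a symmetric positive definite matrix M (eigendecomposition)."""
    w, V = np.linalg.eigh(M)
    return V @ np.diag(w ** t) @ V.T

def rho(P, Q):
    """rho(P, Q) = sqrt(sum_i log^2 lambda_i(P^-1 Q))."""
    lam = np.linalg.eigvals(np.linalg.solve(P, Q)).real
    return np.sqrt(np.sum(np.log(lam) ** 2))

rng = np.random.default_rng(0)
A = rng.normal(size=(5, 5)); P = A @ A.T + 5 * np.eye(5)
B = rng.normal(size=(5, 5)); Q = B @ B.T + 5 * np.eye(5)

r = 0.3
Ph, Pih = spd_pow(P, 0.5), spd_pow(P, -0.5)
M = Ph @ spd_pow(Pih @ Q @ Pih, r) @ Ph      # P #_r Q: geodesic at t = r
print(rho(P, M) / rho(P, Q))                 # ~ 0.3
```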


Page 33: On approximating the Riemannian 1-center

Case study II: Performance

[Figure: (a) Riemannian distance between the current SPD center and the minimax SPD center; (b) radius of the smallest enclosing Riemannian ball anchored at the current SPD center; both plotted for iterations 0 to 200.]

Convergence rate of the GeoA algorithm for the SPD Riemannian manifold (dimension 5) for the first 200 iterations.
Horizontal axis: number of iterations i.
Vertical axis:
◮ (a) the relative Riemannian distance between the current center ci and the optimal 1-center c∗, ρ(c∗, ci)/r∗
◮ (b) the radius ri of the smallest enclosing SPD ball anchored at the current center.


Page 34: On approximating the Riemannian 1-center

Remark on SPD spaces and hyperbolic geometry

◮ The 2D SPD(2) matrix space has dimension d = 3: a positive cone

{(a, b, c) : a > 0, ab − c² > 0}

◮ It can be peeled into sheets of dimension 2, each sheet corresponding to a constant value of the determinant of the elements [4]:

SPD(2) = SSPD(2) × ℝ⁺,

where SSPD(2) = {(a, b, c = √(ab − 1)) : a > 0, ab − c² = 1}

◮ Map to (x0 = (a + b)/2 ≥ 1, x1 = (a − b)/2, x2 = c) in the hyperboloid model [12], and to z = (a − b + 2ic)/(2 + a + b) in the Poincaré disk [12].
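A small sketch (assumed naming) of this change of coordinates for a unit-determinant SPD(2) matrix [[a, c], [c, b]]:

```python
import numpy as np

def sspd2_to_models(a, b, c):
    """Map an SSPD(2) matrix [[a, c], [c, b]] with det = ab - c^2 = 1 to the
    hyperboloid model and to the Poincare disk."""
    x0, x1, x2 = (a + b) / 2.0, (a - b) / 2.0, c      # hyperboloid: x0^2 - x1^2 - x2^2 = 1
    z = (a - b + 2j * c) / (2.0 + a + b)              # Poincare disk point, |z| < 1
    return (x0, x1, x2), z

# example: a point on the unit-determinant sheet
a, b = 2.0, 1.0
c = np.sqrt(a * b - 1.0)
print(sspd2_to_models(a, b, c))
```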


Page 35: On approximating the Riemannian 1-center

Conclusion: Smallest Riemannian Enclosing Ball

◮ Generalized the Euclidean 1-center algorithm of [2] to Riemannian geometry

◮ Proved convergence under mild assumptions (for measures/point sets)

◮ Existence of Riemannian core-sets for optimization

◮ 1-center building block for k-center clustering [6]

◮ can be extended to sets of Riemannian (geodesic) balls

Reproducible research codes with interactive demos:

http://www.sonycsl.co.jp/person/nielsen/infogeo/RiemannMinimax/


Page 36: On approximating the Riemannian 1-center

Bibliographic references I

Bijan Afsari.

Riemannian Lp center of mass : existence, uniqueness, and convexity.

Proceedings of the American Mathematical Society, 139:655–674, February 2011.

Mihai Badoiu and Kenneth L. Clarkson.

Smaller core-sets for balls.

In SODA ’03: Proceedings of the fourteenth annual ACM-SIAM symposium on Discrete algorithms, pages 801–802, Philadelphia, PA, USA, 2003. Society for Industrial and Applied Mathematics.

Mihai Badoiu and Kenneth L. Clarkson.

Optimal core-sets for balls.

Computational Geometry: Theory and Applications, 40:14–22, May 2008.

Pascal Chossat and Olivier P. Faugeras.

Hyperbolic planforms in relation to visual edges and textures perception.

PLoS Computational Biology, 5(12), 2009.

Kaspar Fischer and Bernd Gärtner.

The smallest enclosing ball of balls: combinatorial structure and algorithms.

Int. J. Comput. Geometry Appl., 14(4-5):341–378, 2004.

Teofilo F. Gonzalez.

Clustering to minimize the maximum intercluster distance.

Theoretical Computer Science, 38:293–306, 1985.


Page 37: On approximating the Riemannian 1-center

Bibliographic references II

Piyush Kumar, Joseph S. B. Mitchell, and E. Alper Yildirim.

Approximate minimum enclosing balls in high dimensions using core-sets.

ACM Journal of Experimental Algorithmics, 8, 2003.

Serge Lang.

Fundamentals of differential geometry, volume 191 of Graduate Texts in Mathematics.

Springer-Verlag, New York, 1999.

Thomas Larsson.

Fast and tight fitting bounding spheres.

In Proceedings of the Annual SIGRAD Conference, Stockholm, 2008.

Frank Nielsen and Richard Nock.

Approximating smallest enclosing balls with applications to machine learning.

Int. J. Comput. Geometry Appl., 19(5):389–414, 2009.

Frank Nielsen and Richard Nock.

Hyperbolic Voronoi diagrams made easy.

In International Conference on Computational Science and its Applications (ICCSA), volume 1, pages 74–80, Los Alamitos, CA, USA, March 2010. IEEE Computer Society.


Page 38: On approximating the Riemannian 1-center

Bibliographic references III

Frank Nielsen and Richard Nock.

Visualizing hyperbolic Voronoi diagrams.

In Proceedings of the Thirtieth Annual Symposium on Computational Geometry, SOCG’14, pages 90:90–90:91, New York, NY, USA, 2014. ACM.

Richard Nock and Frank Nielsen.

Fitting the smallest enclosing Bregman balls.

In 16th European Conference on Machine Learning (ECML), pages 649–656, October 2005.

Jack Ritter.

An efficient bounding sphere.

In Graphics gems, pages 301–303. Academic Press Professional, Inc., 1990.

Jonas Spillmann, Markus Becker, and Matthias Teschner.

Efficient updates of bounding sphere hierarchies for geometrically deformable models.

Journal of Visual Communication and Image Representation, 18(2):101–108, 2007.

J. J. Sylvester.

A question in the geometry of situation.

Quarterly Journal of Pure and Applied Mathematics, 1:79, 1857.

Bo Tian.

Bouncing Bubble: A fast algorithm for Minimal Enclosing Ball problem.

2012.


Page 39: On approximating the Riemannian 1-center

Bibliographic references IV

Ivor W. Tsang, Andras Kocsor, and James T. Kwok.

Simpler core vector machines with enclosing balls.

In Proceedings of the 24th International Conference on Machine Learning (ICML), pages 911–918, New York, NY, USA, 2007. ACM.

Emo Welzl.

Smallest enclosing disks (balls and ellipsoids).

In H. Maurer, editor, New Results and New Trends in Computer Science, LNCS. Springer, 1991.


