
1

Optimal Staffing of Systems with Skills-Based-Routing

Temporary Copy

Do not circulate

2

Optimal Staffing of Systems with Skills-Based-Routing

OR seminar, July 21, 2008

Zohar Feldman

Advisor: Prof. Avishai Mandelbaum

3

Contents

Skills-Based-Routing (SBR) Models
The Optimization Problem
Related Work
Optimization Algorithm (Stochastic Approximation)
Experimental Results
Future Work

4

Introduction to SBR Systems

I – set of customer classes
J – set of server pools
Arrivals of class i: renewal (e.g. Poisson) process with rate λ_i
Servers in pool j: N_j, statistically identical
Service of class i by pool j: distribution D^S_{i,j}
(Im)patience of class i: distribution D^P_i

Schematic representation: [diagram showing, for each class i, an arrival stream with (im)patience D^P_i, routed to server pools of sizes N_j with service-time distributions D^S_{i,j}]

5

Introduction to SBR Systems

Routing
Arrival control: upon customer arrival, which of the available servers, if any, should be assigned to serve the arriving customer?
Idleness control: upon service completion, which of the waiting customers, if any, should be admitted to service?

6

The Optimization Problem

We consider two optimization problems:
Cost Optimization
Constraints Satisfaction

7

The Optimization Problem
Cost Optimization Problem

$$\min_{N \in \mathbb{Z}_+^{J}}\; c^T N + \sum_{k=1}^{K} f_k(N) \quad \text{s.t.}\quad A N \le b$$

f_k(N) – service-level penalty functions. Examples:
f_k(N) = c'_k λ_k P_N{Ab_k} – cost of abandonments per unit of time
f_k(N) = λ_k E_N[c'_k(W_k)] – waiting costs

8

The Optimization Problem

Constraints Satisfaction Problem

f_k(N) – service-level objectives. Examples:
f_k(N) = P_N{W_k > T_k} – probability of waiting more than T_k time units
f_k(N) = E_N[W_k] – expected wait

$$\min_{N \in \mathbb{Z}_+^{J}}\; c^T N \quad \text{s.t.}\quad f_k(N) \le \alpha_k,\ \ k = 1,\dots,K, \qquad A N \le b$$

9

Related Work

Call Centers Review (Gans, Koole, Mandelbaum)
V model (Gurvich, Armony, Mandelbaum)
Inverted-V model (Armony, Mandelbaum)
FQR (Gurvich, Whitt)
Gcµ (Mandelbaum, Stolyar)
Simulation & Cutting Planes (Henderson, Epelman)
Staffing Algorithm (Whitt, Wallace)
ISA (Feldman, Mandelbaum, Massey, Whitt)
Stochastic Approximation (Juditsky, Lan, Nemirovski, Shapiro)

10

Simulation Approach

Need to evaluate

$$f(N) = \int_\Omega F(N,\omega)\, dP(\omega)$$

Generate samples ω1, ω2, …, ωm

Estimate

$$\hat f_m(N) = \frac{1}{m}\sum_{i=1}^{m} F(N,\omega_i)$$
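A minimal sketch of this estimator, assuming a black-box routine simulate(N, seed) (hypothetical; it would run one sample path of the SBR simulation under staffing vector N and return the realized performance measure F(N, ω)):

```python
import numpy as np

def estimate_f(simulate, N, m=200, base_seed=0):
    """Monte Carlo estimate of f(N) = E[F(N, omega)] with a ~95% half-width.

    `simulate(N, seed)` is an assumed black-box SBR simulator returning one
    realization F(N, omega), e.g. the fraction of delayed customers.
    """
    samples = np.array([simulate(N, seed=base_seed + i) for i in range(m)])
    estimate = samples.mean()
    half_width = 1.96 * samples.std(ddof=1) / np.sqrt(m)
    return estimate, half_width
```

The half-width is only a diagnostic; the SA algorithms below consume the raw samples F(N, ω_i) directly.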

11

Stochastic Approximation (SA)

Uses Monte Carlo sampling techniques to solve (approximately)

$$\min_{x \in X}\ \big\{ f(x) := E[F(x,\xi)] \big\}$$

X ⊂ ℝ^n – convex set
ξ – random vector with probability distribution P supported on a set Ξ
F(·, ξ): X → ℝ – convex almost surely

12

Stochastic Approximation – Basic Assumptions

f(x) is analytically intractable
There is a sampling mechanism that can be used to generate iid samples from Ξ
There is an Oracle at our disposal that returns, for any x and ξ:
the value F(x, ξ)
a stochastic subgradient G(x, ξ), i.e.

$$F(y,\xi) \ge F(x,\xi) + G(x,\xi)^T (y - x) \quad \text{for all } x, y \in X$$

13

SA for SBR

Generic SA ingredient → SBR counterpart:
f(x) = E[F(x,ξ)] analytically intractable → f(N) = ∫ F(N,ω) dP(ω) analytically intractable
Sampling mechanism generating samples ξ1, ξ2, … → SBR simulation generating sample paths ω1, ω2, …
Oracle returns F(x, ξi) → the simulation calculates the performance measures F(N, ωi)
Oracle returns G(x, ξi) → G(N, ωi) via finite differences
F convex a.s.! → F convex a.s.?

14

Stochastic Approximation - Algorithms: Basic SA

$$x_{j+1} := \Pi_X\big(x_j - \gamma_j G(x_j,\xi_j)\big), \qquad \gamma_j := \theta / j$$

θ depends on the strong convexity constant
Objective convergence rate O(j^{-1}) (error of the solution O(j^{-0.5}))

15

Stochastic Approximation - Algorithms: Robust SA

$$x_{j+1} := \Pi_X\big(x_j - \gamma_j G(x_j,\xi_j)\big), \qquad \gamma_j := \frac{\theta}{\sqrt{J}}, \qquad \hat x_J := \frac{1}{J}\sum_{j=1}^{J} x_j$$

θ does not depend on any parameter (robust)
Objective convergence rate O(J^{-0.5})

16

Stochastic Approximation - Algorithms: Mirror Descent SA

$$x_{j+1} := P_{x_j}\big(\gamma_j G(x_j,\xi_j)\big), \qquad P_x(y) := \arg\min_{z \in X}\ \big\{ y^T (z - x) + V(x,z) \big\}, \qquad \hat x_J := \frac{1}{J}\sum_{j=1}^{J} x_j$$

In order to guarantee accuracy ε with confidence level δ, one should use J of order O(ε^{-2}), depending on δ only logarithmically.
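For intuition, here is the prox-mapping in the special case where the feasible set is the probability simplex and V(x, z) is the entropy (Kullback–Leibler) distance; under those assumptions (not the deck's general definition) the update has a closed multiplicative form, which is the same step used later for the y-component of the saddle-point algorithm:

```python
import numpy as np

def prox_simplex_entropy(x, g, gamma):
    """Entropy prox-mapping P_x(gamma * g) on the probability simplex.

    Solves argmin_{z in simplex} { gamma * g^T (z - x) + KL(z || x) },
    whose closed form is the exponentiated-gradient (multiplicative) update.
    """
    w = x * np.exp(-gamma * np.asarray(g, dtype=float))
    return w / w.sum()
```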

17

Stochastic Approximation - Algorithms: Minimax Problems

$$\min_{x \in X}\ \max_{k=1,\dots,K}\ \big\{ f_k(x) := E[F_k(x,\xi)] \big\}$$

which is the same as solving the saddle-point problem

$$\min_{x \in X}\ \max_{y \in Y}\ \sum_{k=1}^{K} y_k f_k(x), \qquad Y := \Big\{ y \in \mathbb{R}_+^{K} : \sum_{k=1}^{K} y_k = 1 \Big\}$$
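The equivalence holds because a linear form in y attains its maximum over the simplex Y at a vertex (a standard observation, spelled out here):

$$\max_{y \in Y} \sum_{k=1}^{K} y_k f_k(x) \;=\; \max_{k=1,\dots,K} f_k(x), \qquad \text{hence} \qquad \min_{x \in X}\max_{k} f_k(x) \;=\; \min_{x \in X}\max_{y \in Y}\sum_{k=1}^{K} y_k f_k(x).$$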

18

Stochastic Approximation - Algorithms: Mirror SA for the Saddle-Point Problem

$$z_{j+1} := (x_{j+1}, y_{j+1}) := P_{z_j}\big(\gamma_j\, G(z_j,\xi_j)\big), \qquad G(x,y,\xi) := \Big( \sum_{k=1}^{K} y_k\, G_k(x,\xi),\ -\big(F_1(x,\xi),\dots,F_K(x,\xi)\big) \Big), \qquad \hat z_J := \frac{1}{J}\sum_{j=1}^{J} z_j$$

In order to guarantee accuracy ε with confidence level δ, one should use J of order O(ε^{-2}), depending on δ only logarithmically.

19

Optimization Algorithms

Let Ω be the probability space formed by the arrival, service and patience times.
f(N) can be represented in the form of an expectation. For instance,

$$f(N) := P_N\{W > 0\} = \int_\Omega \frac{D(N,\omega)}{A(\omega)}\, dP(\omega) = \int_\Omega F(N,\omega)\, dP(\omega),$$

where D(N,ω) is the number of delayed customers and A(ω) is the number of arrivals.
Use simulation to generate samples ω and calculate F(N,ω).
The subgradient is approximated by finite differences (see the sketch below):

$$G_i(N,\omega) := F(N + e_i,\omega) - F(N,\omega)$$
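A sketch of this finite-difference subgradient, reusing the same simulate(N, seed) black box assumed earlier; fixing the seed means both evaluations see the same sample path ω (common random numbers), which keeps the difference from being dominated by simulation noise:

```python
import numpy as np

def subgradient_fd(simulate, N, seed):
    """Finite-difference stochastic subgradient G(N, omega):
    G_i(N, omega) = F(N + e_i, omega) - F(N, omega)."""
    N = np.asarray(N, dtype=int)
    base = simulate(N, seed=seed)
    G = np.zeros(len(N))
    for i in range(len(N)):
        N_plus = N.copy()
        N_plus[i] += 1                      # add one server to pool i
        G[i] = simulate(N_plus, seed=seed) - base
    return G
```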

20

Cost Optimization Algorithm

– Initialization: i ← 0; choose x_0 from X
– Step 1: Generate F_k(x_i, ξ_i) and G_k(x_i, ξ_i) using simulation
– Step 2: x_{i+1} ← Π_X(x_i − γ G_k(x_i, ξ_i))
– Step 3: i ← i + 1
– Step 4: If i < J, go to Step 1
– Step 5: Return $\hat x := \frac{1}{J}\sum_{j=1}^{J} x_j$ (a sketch of the full loop follows)
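A sketch of the loop above for the cost problem, with the oracle and the projection left as problem-specific callables. Here grad_oracle and project_X are placeholder names (assumptions, not part of the deck): grad_oracle could be the finite-difference estimator sketched earlier applied to the sampled total cost, and project_X whatever projection the linear constraints require.

```python
import numpy as np

def cost_optimization_sa(grad_oracle, project_X, x0, gamma, J):
    """Projected SA with averaging (Steps 1-5 above).

    grad_oracle(x, i) -> stochastic subgradient of the sampled total cost
    at x using sample path i; project_X(x) -> projection onto X.
    Returns the averaged iterate rounded to an integer staffing vector.
    """
    x = np.asarray(x0, dtype=float)
    running_sum = np.zeros_like(x)
    for i in range(J):                       # Steps 1-4: J SA iterations
        G = grad_oracle(np.round(x).astype(int), i)
        x = project_X(x - gamma * G)         # Step 2: projected step
        running_sum += x
    x_bar = running_sum / J                  # Step 5: average the iterates
    return np.round(x_bar).astype(int)
```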

21

Cost Optimization Algorithm

Denote:

$$D_X := \max_{x \in X} \|x - x_0\|, \qquad M^2 := \sup_{x \in X} E\big[\|G(x,\xi)\|^2\big]$$

Theorem: using $\gamma := \dfrac{D_X}{M\sqrt{J}}$, the averaged solution $\hat x$ satisfies

$$P\big\{ f(\hat x) - f(x^*) > \varepsilon \big\} \le \frac{D_X M}{\varepsilon\,\sqrt{J}}$$

22

Constraints Satisfaction Algorithm

Basic concept: there exists a solution with cost C that satisfies the service-level constraints iff

$$\min_{x \in X_C}\ \max_{k=1,\dots,K}\ \big( f_k(x) - \alpha_k \big) \le 0, \qquad \text{where } X_C := \{ x \in X : c^T x \le C \}.$$

Look for the minimal such C in a binary-search fashion.

23

CS Algorithm – Formal Procedure

• Initialization: dC ← C_max, x ← x_max/2, x* ← x_max
• Step 1: If dC < δ, return the solution x*; otherwise dC ← dC/2
• Step 2: If Feasible(x) = true: C ← C − dC, x* ← x, x ← Π_{X_C}(x), go to Step 1
• Step 3: x ← MirrorSaddleSA(C)
• Step 4: If Feasible(x) = true: C ← C − dC, x* ← x, x ← Π_{X_C}(x), go to Step 1
• Step 5: C ← C + dC, x ← Π_{X_C}(x), go to Step 1
(A sketch of this binary-search shell follows.)
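A sketch of the binary-search shell, with MirrorSaddleSA, Feasible and the projection Π_{X_C} treated as black-box callables; the function names and the exact initialization are illustrative assumptions, not the deck's definitive procedure.

```python
def constraints_satisfaction(mirror_saddle_sa, feasible, project_to_cost,
                             x_max, C_max, delta):
    """Binary search over the cost budget C (Steps 1-5 above).

    mirror_saddle_sa(C): approximate saddle-point solution over X_C;
    feasible(x): simulation-based check of the service-level constraints;
    project_to_cost(x, C): projection onto X_C = {x in X : c^T x <= C}.
    """
    dC, C = C_max, C_max / 2.0
    x, x_star = project_to_cost(x_max, C), x_max
    while dC >= delta:                       # Step 1: stop when the bracket is narrow
        dC /= 2.0
        if not feasible(x):                  # Step 3: re-solve at the current budget
            x = mirror_saddle_sa(C)
        if feasible(x):                      # Steps 2/4: budget C is achievable
            x_star, C = x, C - dC
        else:                                # Step 5: infeasible, relax the budget
            C = C + dC
        x = project_to_cost(x, C)
    return x_star
```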

24

CS Algorithm – Sub-procedure MirrorSaddleSA

$$z_{j+1} := P_{z_j}\big(\gamma\, G(z_j,\xi_j)\big), \qquad \hat z_J := (\hat x_J, \hat y_J) := \frac{1}{J}\sum_{t=1}^{J} z_t, \qquad x^* := \hat x_J$$

25

CS Algorithm – Sub-procedure MirrorSaddleSA: Mapping Function

The prox-mapping P_z is induced by a distance-generating function that treats the two blocks of z = (x, y) separately: a scaled Euclidean term Σ_i x_i² over the staffing coordinates, and the entropy term Σ_k y_k ln(K y_k) over the simplex coordinates.
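A sketch of one pass of MirrorSaddleSA under the structure suggested above: a Euclidean (projected) prox step on the staffing block x and an entropy (multiplicative) prox step on the simplex block y. The oracle F_and_G(x, j) is an assumed simulation routine, returning the sampled constraint values F_k(x, ξ_j) (or the gaps f_k − α_k) and their finite-difference subgradients G_k(x, ξ_j).

```python
import numpy as np

def mirror_saddle_sa(F_and_G, project_X_C, x0, K, gamma, J):
    """Mirror SA for min_x max_y sum_k y_k f_k(x) over X_C x simplex (sketch).

    F_and_G(x, j) -> (F, G): F is the length-K vector of sampled constraint
    values F_k(x, xi_j); G is a (K, dim x) array of stochastic subgradients.
    """
    x = np.asarray(x0, dtype=float)
    y = np.full(K, 1.0 / K)                    # start at the centre of the simplex
    x_sum, y_sum = np.zeros_like(x), np.zeros(K)
    for j in range(J):
        F, G = F_and_G(np.round(x).astype(int), j)
        F, G = np.asarray(F, dtype=float), np.asarray(G, dtype=float)
        x = project_X_C(x - gamma * (y @ G))   # Euclidean prox: descend in x
        y = y * np.exp(gamma * F)              # entropy prox: ascend in y
        y /= y.sum()
        x_sum += x
        y_sum += y
    return x_sum / J, y_sum / J                # averaged iterates (x_hat, y_hat)
```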

26

CS Algorithm – Sub-procedure Feasible

[lb_k(n), ub_k(n)] – confidence interval for f_k based on n samples

1. If ub_k(n) ≤ α_k + δ for all k = 1,…,K, return true.
2. If lb_k(n) > α_k + δ for some k = 1,…,K, return false.
3. Generate a batch of samples, n ← n + batch, recalculate the confidence intervals [lb_k(n), ub_k(n)], and go to 1 (see the sketch below).
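A sketch of Feasible, assuming a routine simulate_sl(x, batch) (hypothetical) that returns `batch` fresh i.i.d. samples of the K service-level measures F_k(x, ω); the cap on the total number of samples is a practical safeguard added here, not part of the deck.

```python
import numpy as np

def feasible(simulate_sl, x, alpha, delta, batch=100, z=1.96, n_max=100_000):
    """Sequential feasibility test based on confidence intervals.

    Returns True once ub_k(n) <= alpha_k + delta for all k, False once
    lb_k(n) > alpha_k + delta for some k; otherwise draws another batch.
    """
    alpha = np.asarray(alpha, dtype=float)
    samples = []                             # each entry: length-K vector F(x, omega_i)
    n = 0
    while n < n_max:
        samples.extend(simulate_sl(x, batch))
        n += batch
        S = np.asarray(samples, dtype=float)
        half = z * S.std(axis=0, ddof=1) / np.sqrt(n)
        lb, ub = S.mean(axis=0) - half, S.mean(axis=0) + half
        if np.all(ub <= alpha + delta):      # every constraint clearly satisfied
            return True
        if np.any(lb > alpha + delta):       # some constraint clearly violated
            return False
    return False                             # inconclusive within the sampling budget
```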

27

CS Algorithm

Denote:

$$M_*^2 := 2 M_{*,x}^2 \ln J + 2 M_{*,y}^2 \ln K, \qquad E\big[\|G_k(x,\xi)\|_*^2\big] \le M_{*,x}^2, \quad E\big[F_k(x,\xi)^2\big] \le M_{*,y}^2 \quad \text{for all } k \text{ and } x.$$

Theorem: using $\gamma := \dfrac{2}{5 M_* \sqrt{J}}$ and $J \ge \dfrac{20 M_*^2}{\varepsilon^2}$, the returned solution $\hat x$ satisfies

$$P\Big\{ \max_{k=1,\dots,K}\big( f_k(\hat x) - \alpha_k \big) \le \varepsilon \Big\} \ge (1-\delta)^{\lceil \log C_{\max} \rceil},$$

and with probability of the same order $c^T \hat x \le c^T x^*$.

28

Experimental Results

Goals:
Examine algorithm performance
Explore the geometry of the service-level functions; validate convexity
Method:
Construct the SL functions by simulation
Compare the algorithm's solution to the optimal one

29

Cost Optimization 1: Penalizing Wait

N model (I=2,J=2)

λ1= λ2=100

µ11=1, µ21=1.5, µ22=2

Static priority: class-1 customers prefer pool 1 over pool 2; pool-2 servers prefer class-1 customers over class-2 customers.


30

Cost Optimization 1: Penalizing Wait

Problem Formulation

$$\min_{N \in \mathbb{Z}_+^{J}}\; N_1 + N_2 + 20\, E_N[W_1] + 10\, E_N[W_2] \quad \text{s.t.}\quad A N \le b$$

31

Cost Optimization 1: The Objective Function

32

Cost Optimization 1: Penalizing Wait

Algorithm solution: N = (88, 47), cost = 190
Optimal solution: N* = (80, 54), cost* = 188

33

Cost Optimization 2: Penalizing Abandonments

N model (I=2, J=2)

λ1= λ2=100

µ11=1, µ21=1.5, µ22=2

θ1= θ2=1

Static priority: class-1 customers prefer pool 1 over pool 2; pool-2 servers prefer class-1 customers over class-2 customers.


34

Cost Optimization 2: Penalizing Abandonments – Problem Formulation

$$\min_{N \in \mathbb{Z}_+^{J}}\; N_1 + N_2 + 3 \cdot 100 \cdot P_N\{Ab_1\} + 2 \cdot 100 \cdot P_N\{Ab_2\} \quad \text{s.t.}\quad A N \le b$$

35

Cost Optimization 2: The Objective Function

36

Cost Optimization 2: Algorithm Evolution

37

Cost Optimization 2: Convergence Rate

38

Cost Optimization 2: Penalizing Abandonments
Algorithm solution: N = (98, 57), cost = 219
Optimal solution: N* = (102, 56), cost* = 218

39

Constraint Satisfaction 1: Delay Threshold with FQR

N model (I=2, J=2)
λ1 = λ2 = 100
µ11 = 1, µ21 = 1.5, µ22 = 2
T1 = 0.1, α1 = 0.2; T2 = 0.2, α2 = 0.2
FQR: upon a service completion, pool 2 admits the customer from the class i that maximizes Q_i − p_i Σ_j Q_j, with p = (1/3, 2/3); upon arrival, a class-1 customer is routed to the pool j that maximizes I_j − q_j Σ_k I_k, with q = (1/2, 1/2) (see the sketch below).

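A sketch of the two FQR decisions described above, taking the queue lengths Q and pool idle counts I as given state; the array-based interface is an assumption, and ties are broken by index.

```python
import numpy as np

def fqr_admit(Q, p):
    """Pool-2 service completion: admit from the class i maximizing
    Q_i - p_i * sum_j Q_j, considering only non-empty queues."""
    Q = np.asarray(Q, dtype=float)
    if Q.sum() == 0:
        return None                          # nobody is waiting
    score = Q - np.asarray(p) * Q.sum()
    score[Q == 0] = -np.inf
    return int(np.argmax(score))

def fqr_route(I, q):
    """Class-1 arrival: route to the pool j maximizing
    I_j - q_j * sum_k I_k, considering only pools with idle servers."""
    I = np.asarray(I, dtype=float)
    if I.sum() == 0:
        return None                          # no idle server anywhere
    score = I - np.asarray(q) * I.sum()
    score[I == 0] = -np.inf
    return int(np.argmax(score))

# Example with the parameters above: p = (1/3, 2/3), q = (1/2, 1/2)
# fqr_admit([5, 3], (1/3, 2/3)) -> 0 ; fqr_route([0, 4], (1/2, 1/2)) -> 1
```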

40

Constraint Satisfaction 1: Delay Threshold with FQR – Problem Formulation

$$\min_{N \in \mathbb{Z}_+^{J}}\; N_1 + N_2 \quad \text{s.t.}\quad P_N\{W_1 > 0.1\} \le 0.2, \qquad P_N\{W_2 > 0.2\} \le 0.2$$

41

Constraint Satisfaction 1: Delay Threshold with FQR

42

Constraint Satisfaction 1: Delay Threshold with FQR
Feasible region and optimal solution

Algorithm solution: N=(91,60), cost=211

43

Constraint Satisfaction 1: Delay Threshold with FQR – Comparison of Control Schemes

[Plots comparing FQR control and SP (static priority) control]

44

Constraints Satisfaction 2: Time-Varying Model

N model
λ1(t) = λ2(t) = 1000 + 200 sin(t)
µ11 = 10, µ21 = 15, µ22 = 20
Static priority: class-1 customers prefer pool 1 over pool 2; pool-2 servers prefer class-1 customers over class-2 customers.

45

Constraints Satisfaction 2: Time-Varying Model – Problem Formulation

$$\min_{N \in \mathbb{Z}_+^{T \times J}}\; \sum_{t=1}^{10} N_{1,t} + \sum_{t=1}^{10} N_{2,t} \quad \text{s.t.}\quad P_{N_t}\{W_1(t) > 0\} \le 0.1,\ \ t = 1,\dots,10, \qquad P_{N_t}\{W_2(t) > 0\} \le 0.5,\ \ t = 1,\dots,10$$

46

Constraints Satisfaction 2: Time-Varying Model Arrivals and Staffing

47

Constraints Satisfaction 2: Time-Varying Model Performance Measures

48

Constraints Satisfaction 2: Time-Varying Model System Statistics

49

Realistic Example

Medium-size call center (US Bank; SEE lab)
2 classes of calls: Business, Quick & Reilly
2 pools of servers: Pool 1 – dedicated to Business; Pool 2 – serves both

50

Realistic Example

Arrival Rates

51

Realistic Example

Service distributions (via SEEStat):
Business: LogN(3.7, 3.4)
Quick & Reilly: LogN(3.9, 4.3)

52

Realistic Example

Patience – survival analysis shows that the exponential distribution fits both classes:
Business: Exp(mean = 7.35 min)
Quick: Exp(mean = 19.3 min)
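To drive the simulation for this example, one can sample from these fits. A sketch, under the assumption that the LogN numbers above are the mean and standard deviation of the service time in minutes (they might instead be the log-scale parameters; the conversion below covers the first reading):

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def sample_lognormal(mean, sd, size):
    """Sample LogN service times given their mean and SD (minutes),
    converting to the underlying normal parameters mu, sigma."""
    sigma2 = np.log(1.0 + (sd / mean) ** 2)
    mu = np.log(mean) - 0.5 * sigma2
    return rng.lognormal(mean=mu, sigma=np.sqrt(sigma2), size=size)

service_business  = sample_lognormal(3.7, 3.4, size=1000)    # Business calls
service_quick     = sample_lognormal(3.9, 4.3, size=1000)    # Quick & Reilly calls
patience_business = rng.exponential(scale=7.35, size=1000)   # Exp(mean = 7.35 min)
patience_quick    = rng.exponential(scale=19.3, size=1000)   # Exp(mean = 19.3 min)
```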

53

Realistic Example: Hourly SLA

Problem Formulation

$$\min_{N \in \mathbb{Z}_+^{T \times J}}\; \sum_{t=1}^{24} N_{1,t} + \sum_{t=1}^{24} N_{2,t} \quad \text{s.t.}\quad P_{N_t}\{W_1(t) > 0\} \le 0.2,\ \ t = 1,\dots,24, \qquad P_{N_t}\{W_2(t) > 0\} \le 0.5,\ \ t = 1,\dots,24$$

54

Realistic Example: Hourly SLA

Solution – total cost 575

55

Realistic Example: Hourly SLA

SLA

56

Realistic Example: Daily SLA

Problem Formulation

$$\min_{N \in \mathbb{Z}_+^{T \times J}}\; \sum_{t=1}^{24} N_{1,t} + \sum_{t=1}^{24} N_{2,t} \quad \text{s.t.}\quad P_{N}\{W_1 > 0\} \le 0.2, \qquad P_{N}\{W_2 > 0\} \le 0.5$$

57

Realistic Example: Daily SLA

Solution – total cost 510 (11% reduction)

58

Realistic Example: Daily SLA

SLA

59

Future Work

Incorporating a scheduling mechanism
More complex models
Enhanced algorithms: independent of convexity assumptions, and faster