Renewal processes
Interarrival times
• {T_1, T_2, ...} is an i.i.d. sequence with a common distribution function F
• S_i = Σ_{j=1}^i T_j
• {S_i} is a nondecreasing, positive sequence of renewal times (points)
• The distribution of S_i is F^{(i)}, with F^{(i)} = f^{(i-1)} * F = F * f^{(i-1)}
• f^{(i)} is the i-fold convolution of f (f = dF/dx)
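The definitions above are easy to simulate. A minimal sketch (the interarrival law Exp(2) is chosen purely for illustration): draw i.i.d. T_j and accumulate them into renewal points S_i.

```python
import random

def renewal_points(draw_interarrival, n, rng):
    """First n renewal points S_i = T_1 + ... + T_i for i.i.d. T_j."""
    s, points = 0.0, []
    for _ in range(n):
        s += draw_interarrival(rng)  # each T_j > 0
        points.append(s)
    return points

rng = random.Random(0)
pts = renewal_points(lambda r: r.expovariate(2.0), 1000, rng)
# {S_i} is strictly increasing since every T_j > 0
assert all(a < b for a, b in zip(pts, pts[1:]))
```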
The counting process
• N(t) = max{n : S_n ≤ t}
• N(t) counts the number of renewal points before t
• M(t) = E[N(t)] is the expected number of renewals before t
• M(t) = Σ_n n P(N(t)=n)
• P(N(t)=n) = P(S_n ≤ t and S_{n+1} > t)
            = P(S_{n+1} > t) - P(S_n > t and S_{n+1} > t)
The counting process
• P(N(t)=n) = P(S_n ≤ t and S_{n+1} > t)
            = P(S_{n+1} > t) - P(S_n > t and S_{n+1} > t)
• Now S_n > t ⇒ S_{n+1} > t, so
• P(S_n > t and S_{n+1} > t) = P(S_n > t)
• Altogether:
  P(S_n ≤ t and S_{n+1} > t)
            = P(S_{n+1} > t) - P(S_n > t)
            = P(S_n ≤ t) - P(S_{n+1} ≤ t)
            = F^{(n)}(t) - F^{(n+1)}(t)
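For exponential interarrivals this difference can be checked in closed form: F^{(n)} is then the Erlang-n distribution function, and F^{(n)}(t) - F^{(n+1)}(t) collapses to the Poisson probability e^{-λt}(λt)^n/n!. A small numeric check (λ and t chosen arbitrarily):

```python
import math

def erlang_cdf(n, lam, t):
    """F^(n)(t) = P(S_n <= t) when T_j ~ Exp(lam): the Erlang-n CDF."""
    return 1.0 - sum(math.exp(-lam * t) * (lam * t) ** k / math.factorial(k)
                     for k in range(n))

lam, t = 1.5, 2.0
for n in range(1, 8):
    diff = erlang_cdf(n, lam, t) - erlang_cdf(n + 1, lam, t)  # P(N(t) = n)
    poisson_pmf = math.exp(-lam * t) * (lam * t) ** n / math.factorial(n)
    assert abs(diff - poisson_pmf) < 1e-12
```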
The counting process
• M(t) = Σ_n n P(N(t)=n)
       = Σ_n n P(S_n ≤ t and S_{n+1} > t)
       = Σ_n n (F^{(n)}(t) - F^{(n+1)}(t))
       = F^{(1)}(t) + Σ_{n=2}^∞ (n F^{(n)}(t) - (n-1) F^{(n)}(t))
       = F^{(1)}(t) + Σ_{n=2}^∞ F^{(n)}(t)
       = F(t) + Σ_{n=2}^∞ (f^{(n-1)} * F)(t)
       = F(t) + (f * Σ_{n=1}^∞ F^{(n)})(t)
       = F(t) + (f * M)(t)
The renewal density
• m(t) = d/dt M(t) is called the renewal density
• M(t+h) - M(t) is the expected number of renewals in [t, t+h]
• When h is small, P(N(t+h) - N(t) > 1) = O(h²) and P(N(t+h) - N(t) = 1) = O(h)
• Thus M(t+h) - M(t) ≈ P(N(t+h) - N(t) = 1) ≈ h · m(t)
• h · m(t) approximates the probability of a renewal within [t, t+h]
The renewal density
• m(t) = d/dt M(t)
• M(t) = F(t) + (f * M)(t)
• m(t) = f(t) + d/dt (f * M)(t)
       = f(t) + d/dt ∫_0^t M(t-u) f(u) du
       = f(t) + ∫_0^t m(t-u) f(u) du      (since M(0) = 0)
       = f(t) + (f * m)(t)
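The equation m = f + f*m can be solved numerically by discretizing the convolution; below is a sketch using the trapezoidal rule (a standard scheme for Volterra equations of the second kind; step size and the Exp(λ) density are illustrative choices). For exponential interarrivals the output should flatten at m(t) = λ.

```python
import math

def renewal_density(f, h, n):
    """Solve m(t) = f(t) + (f*m)(t) on the grid t_i = i*h, trapezoidal rule."""
    fs = [f(i * h) for i in range(n + 1)]
    m = [fs[0]]                          # m(0) = f(0)
    for i in range(1, n + 1):
        # known part of the trapezoidal sum; the u=0 endpoint holds the
        # unknown m_i, which is solved for explicitly below
        conv = 0.5 * fs[i] * m[0] + sum(fs[j] * m[i - j] for j in range(1, i))
        m.append((fs[i] + h * conv) / (1.0 - 0.5 * h * fs[0]))
    return m

lam, h = 1.0, 0.01
m = renewal_density(lambda t: lam * math.exp(-lam * t), h, 500)
# Poisson case: the renewal density is constant, m(t) = lam
assert abs(m[-1] - lam) < 1e-2
```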
Recurrence times
• Backward recurrence time (age): A(t) = t - S_{N(t)}
• Forward recurrence time (excess): Y(t) = S_{N(t)+1} - t
• F_{A,t}(a) = P(A(t) ≤ a)
• F_{Y,t}(y) = P(Y(t) ≤ y)
Distribution of age
• F_{A,t}(a) = P(A(t) ≤ a) = P(t - S_{N(t)} ≤ a)
• We condition on the first renewal, i.e.
  P(A(t) ≤ a)
  = ∫_0^∞ P(A(t) ≤ a | S_1=s) f(s) ds
  = ∫_0^t P(A(t) ≤ a | S_1=s) f(s) ds + ∫_t^∞ P(A(t) ≤ a | S_1=s) f(s) ds
  = ∫_0^t P(A(t-s) ≤ a) f(s) ds + ∫_t^∞ P(A(t) ≤ a | S_1=s) f(s) ds
  = ∫_0^t F_{A,t-s}(a) f(s) ds + ∫_t^∞ I(t ≤ a) f(s) ds      (no renewal before t means A(t) = t)
  = ∫_0^t F_{A,t-s}(a) f(s) ds + I(t ≤ a) R(t)      (R(t) = 1 - F(t))
  = (F_{A,·}(a) * f)(t) + I(t ≤ a) R(t)
Distribution of excess
• F_{Y,t}(y) = P(Y(t) ≤ y)
• We condition on the first renewal, i.e.
  P(Y(t) ≤ y)
  = ∫_0^∞ P(Y(t) ≤ y | S_1=s) f(s) ds
  = ∫_0^t P(Y(t) ≤ y | S_1=s) f(s) ds + ∫_t^∞ P(Y(t) ≤ y | S_1=s) f(s) ds
  = ∫_0^t P(Y(t-s) ≤ y) f(s) ds + ∫_t^∞ P(Y(t) ≤ y | S_1=s) f(s) ds
  = ∫_0^t F_{Y,t-s}(y) f(s) ds + ∫_t^∞ I(s - t ≤ y) f(s) ds
  = (F_{Y,·}(y) * f)(t) + ∫_t^∞ I(s ≤ t + y) f(s) ds
  = (F_{Y,·}(y) * f)(t) + F(t+y) - F(t)
General solutions
• Generally: Z = Q + Z * f
• Laplace transform (tildes denote transforms):
  Z̃(s) = Q̃(s) + Z̃(s) f̃(s)
  Z̃(s) (1 - f̃(s)) = Q̃(s)
  Z̃(s) = Q̃(s) / (1 - f̃(s))
Alternative solution
• m(t) = f(t) + (f * m)(t)
• Laplace transform:
  m̃(s) = f̃(s) + f̃(s) m̃(s)
  m̃(s) (1 - f̃(s)) = f̃(s)
  1 - f̃(s) = f̃(s) / m̃(s)
• Z̃(s) (1 - f̃(s)) = Q̃(s)  ⇒  Z̃(s) f̃(s) = Q̃(s) m̃(s)
• Z̃(s) = Q̃(s) + Z̃(s) f̃(s) = Q̃(s) + Q̃(s) m̃(s)
• Z(t) = Q(t) + (Q * m)(t)
Example (Poisson)
• Poisson process:
  F(t) = 1 - exp(-λt)
  f(t) = λ exp(-λt)
  R(t) = exp(-λt)
• m = f + m * f  ⇒  m̃(s) = f̃(s) / (1 - f̃(s))
• f̃(s) = λ ∫_0^∞ exp(-st) exp(-λt) dt = λ/(s+λ)
• m̃(s) = (λ/(s+λ)) / (1 - λ/(s+λ)) = λ/(s+λ-λ) = λ/s
• m(t) = λ !!!
Limiting renewal density in general
• m(t) = f(t) + (f * m)(t)
• m̃(s) = f̃(s) + f̃(s) m̃(s)  ⇒  m̃(s) = f̃(s)/(1 - f̃(s))
• lim_{t→∞} m(t) = lim_{s→0} s m̃(s) = lim_{s→0} s f̃(s)/(1 - f̃(s))
• By l'Hôpital:
  lim_{s→0} d/ds (s f̃(s)) / lim_{s→0} d/ds (1 - f̃(s))
  = f̃(0) / ((-d/ds f̃(s))|_{s=0}) = 1/E(T_i) !!!
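The limit 1/E(T_i) is easy to see in simulation, since it is the long-run renewal rate N(t)/t. A Monte Carlo sketch (uniform interarrivals on (0,2) are an arbitrary choice giving E(T) = 1):

```python
import random

def count_renewals(draw, horizon, rng):
    """N(horizon): number of renewal points S_i <= horizon."""
    s, n = 0.0, 0
    while True:
        s += draw(rng)
        if s > horizon:
            return n
        n += 1

rng = random.Random(1)
horizon = 10_000.0
n = count_renewals(lambda r: r.uniform(0.0, 2.0), horizon, rng)
rate = n / horizon              # long-run renewal rate N(t)/t
# E(T) = 1, so the rate should be close to 1/E(T) = 1
assert abs(rate - 1.0) < 0.05
```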
Example (Poisson)
• m(t) = λ
• F_{A,t}(a) = (F_{A,·}(a) * f)(t) + I(t ≤ a) R(t)
• F_{Y,t}(y) = (F_{Y,·}(y) * f)(t) + F(t+y) - F(t)
• Z = Q + Z * f  ⇒  Z̃(s) = Q̃(s)/(1 - f̃(s)), or Z(t) = Q(t) + (Q * m)(t)

F_{A,t}(a) = I(t ≤ a) R(t) + λ ∫_0^t I(s ≤ a) R(s) ds
           = I(t ≤ a) R(t) + λ ∫_0^{min(t,a)} R(s) ds
           = I(t ≤ a) exp(-λt) + (1 - exp(-λ min(t,a)))

F_{Y,t}(y) = (F_{Y,·}(y) * f)(t) + F(t+y) - F(t)
           = F(t+y) - F(t) + λ ∫_0^t (F(s+y) - F(s)) ds
           = exp(-λt) - exp(-λ(t+y)) + (1 - exp(-λt))(1 - exp(-λy))
           = 1 - exp(-λy) = F(y) !!!
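That the excess of a Poisson process is again Exp(λ), independent of t, is exactly memorylessness, and a Monte Carlo check is cheap: measure Y(t) at a fixed t and compare the empirical mean with 1/λ (λ = 2 and t = 3 are illustrative choices).

```python
import random

def excess_at(t, lam, rng):
    """Y(t) = S_{N(t)+1} - t for a renewal process with Exp(lam) interarrivals."""
    s = 0.0
    while s <= t:
        s += rng.expovariate(lam)
    return s - t

rng = random.Random(2)
lam, t, reps = 2.0, 3.0, 20_000
mean_excess = sum(excess_at(t, lam, rng) for _ in range(reps)) / reps
# F_{Y,t}(y) = 1 - exp(-lam*y) regardless of t, so E[Y(t)] = 1/lam = 0.5
assert abs(mean_excess - 0.5) < 0.03
```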
Alternating renewal process
• Used to model random on/off processes:
  • Network traffic
  • Power consumption
• Each renewal period splits into an ON period followed by an OFF period:
  T_n = Z_n + Y_n
[Figure: timeline of one renewal period from S_{n-1} to S_n: Z_n (ON) followed by Y_n (OFF)]
Alternating renewal process
• I(t) = I(S_{N(t)} < t ≤ S_{N(t)} + Z_{N(t)+1})
• I(t) indicates whether t belongs to an on-period.
• P(ON at t) = P(I(t)=1) = O(t)
• We condition on the first renewal:
  O(t) = P(I(t)=1) = ∫_0^∞ P(I(t)=1 | S_1=s) f(s) ds
  = ∫_0^t P(I(t)=1 | S_1=s) f(s) ds + ∫_t^∞ P(I(t)=1 | S_1=s) f(s) ds
  = ∫_0^t P(I(t-s)=1) f(s) ds + ∫_t^∞ P(t ≤ Z_1 | S_1=s) f(s) ds
  = (O * f)(t) + ∫_t^∞ P(Z_1 ≥ t | S_1=s) f(s) ds
Alternating renewal process
O(t) = (O * f)(t) + ∫_t^∞ P(Z_1 ≥ t | S_1=s) f(s) ds
     = (O * f)(t) + ∫_0^∞ P(Z_1 ≥ t | S_1=s) f(s) ds      (for s < t: Z_1 ≤ S_1 = s < t, so the added terms vanish)
     = (O * f)(t) + P(Z_1 ≥ t)
     = (O * f)(t) + 1 - F_Z(t)
Laplace transform: Õ(s) = R̃_Z(s) + Õ(s) f̃(s), where R_Z(t) = 1 - F_Z(t)
Example (2-state Markov)
• 2 exponential distributions:
  F_Z(t) = 1 - exp(-λt)
  F_Y(t) = 1 - exp(-μt)
  f(t) = λμ ∫_0^t exp(-λ(t-s)) exp(-μs) ds      (density of T = Z + Y)
  E(T) = E(Z) + E(Y) = 1/λ + 1/μ
  lim_{t→∞} m(t) = 1/E(T) = 1/(1/λ + 1/μ)
• Õ(s) = R̃_Z(s) + Õ(s) f̃(s)  ⇒  Õ(s) = R̃_Z(s)/(1 - f̃(s)), or Õ(s) = R̃_Z(s) + R̃_Z(s) m̃(s)
• lim_{t→∞} O(t) = lim_{s→0} s Õ(s)
  = lim_{s→0} (s R̃_Z(s) + s R̃_Z(s) m̃(s))
  = lim_{s→0} R̃_Z(s) · lim_{s→0} s m̃(s) = ∫_0^∞ R_Z(t) dt / E(T)
• ∫_0^∞ R_Z(t) dt = ∫ 1 · R_Z(t) dt = [t R_Z(t)] + ∫ t f_Z(t) dt = ∫_0^∞ t f_Z(t) dt = E(Z)      (integration by parts)
• lim_{t→∞} O(t) = E(Z)/E(T) !!!
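A Monte Carlo sketch of this limit for the 2-state Markov example (λ = 1, μ = 2, so E(Z)/E(T) = (1/λ)/(1/λ + 1/μ) = 2/3; t and the replicate count are illustrative):

```python
import random

def on_at(t, lam, mu, rng):
    """Simulate alternating ON (Exp(lam)) / OFF (Exp(mu)) periods; is t ON?"""
    s = 0.0                              # start of the current cycle, s <= t
    while True:
        z = rng.expovariate(lam)         # ON period
        if s + z > t:
            return True
        y = rng.expovariate(mu)          # OFF period
        if s + z + y > t:
            return False
        s += z + y

rng = random.Random(3)
lam, mu, t, reps = 1.0, 2.0, 20.0, 10_000
o_hat = sum(on_at(t, lam, mu, rng) for _ in range(reps)) / reps
# E(Z)/E(T) = 1 / 1.5 = 2/3
assert abs(o_hat - 2 / 3) < 0.03
```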
Autocorrelation
• C_II(s) = E((I_t - E(I))(I_{t+s} - E(I)))
          = E(I_t I_{t+s}) - E²(I)
• E(I) = lim_{t→∞} O(t) = E(Z)/E(T)
• E(I_t I_{t+s}) = P(I_t and I_{t+s})      (both indicators equal 1)
• Recall: T_n = Z_n + Y_n, S_{N(t)} = S_{N(t)-1} + T_{N(t)}, A(t) = t - S_{N(t)}
Autocorrelation
• C_II(s) = E(I_t I_{t+s}) - E²(I)
• E(I) = E(Z)/E(T)
• E(I_t I_{t+s}) = P(I_t and I_{t+s})
• t lies in the 1st renewal period: either t and t+s both lie in the first ON period, or t does and t+s lies in an ON period after the first renewal at x:
  P(I_t and I_{t+s}) =
  ∫ (P(t+s ≤ Z_1 | S_1=x) + P(t ≤ Z_1 and t+s ≥ S_1 | S_1=x) O(t+s-x)) f(x) dx
Autocorrelation
P(I_t and I_{t+s})
= ∫ (P(t+s ≤ Z_1 | S_1=x) + P(t ≤ Z_1 and t+s ≥ S_1 | S_1=x) O(t+s-x)) f(x) dx
= ∫ (P(t+s ≤ Z_1 | S_1=x) + P(t ≤ Z_1 and t+s ≥ x | S_1=x) O(t+s-x)) f(x) dx
= ∫ (P(t+s ≤ Z_1 | S_1=x) + I(t+s ≥ x) P(t ≤ Z_1 | S_1=x) O(t+s-x)) f(x) dx
= ∫∫ I(t+s ≤ z) f_{Z,S}(z,x) dz dx
  + ∫∫ I(t+s ≥ x) I(t ≤ z) O(t+s-x) f_{Z,S}(z,x) dz dx
= ∫∫ I(t+s ≤ z) f_{S|Z}(x|z) f_Z(z) dz dx
  + ∫∫ I(t+s ≥ x) I(t ≤ z) O(t+s-x) f_{S|Z}(x|z) f_Z(z) dz dx
= ∫∫ I(t+s ≤ z) f_Y(x-z) f_Z(z) dz dx
  + ∫∫ I(t+s ≥ x) I(t ≤ z) O(t+s-x) f_Y(x-z) f_Z(z) dz dx
  (f_{S|Z}(x|z) = f_Y(x-z), since S_1 = Z_1 + Y_1)
Autocorrelation
P(I_t and I_{t+s})
= ∫∫ I(t+s ≤ z) f_Y(x-z) f_Z(z) dz dx
  + ∫∫ I(t+s ≥ x) I(t ≤ z) O(t+s-x) f_Y(x-z) f_Z(z) dz dx
= ∫ ∫_{t+s}^x f_Y(x-z) f_Z(z) dz dx
  + ∫_t^{t+s} ∫_t^x O(t+s-x) f_Y(x-z) f_Z(z) dz dx
= ∫_{t+s}^∞ ∫_z^∞ f_Y(x-z) dx f_Z(z) dz
  + ∫_t^{t+s} O(t+s-x) ∫_t^x f_Y(x-z) f_Z(z) dz dx
= ∫_{t+s}^∞ f_Z(z) dz + ∫_t^{t+s} O(t+s-x) ∫_t^x f_Y(x-z) f_Z(z) dz dx
= R_Z(t+s) + ∫_t^{t+s} O(t+s-x) ∫_t^x f_Y(x-z) f_Z(z) dz dx
≤ R_Z(t+s) + ∫ ∫_t^x f_Y(x-z) f_Z(z) dz dx      (bounding O ≤ 1)
= R_Z(t+s) + P(Z_1 ≥ t) = R_Z(t+s) + R_Z(t) ≤ 2 R_Z(2s)      (for t ≥ 2s)
C_II(s) = E(I_t I_{t+s}) - E²(I) ≤ E(I_t I_{t+s}) ≤ 2 R_Z(2s)
Example (Pareto distributions, power/heavy tails)
• Let f_Z(z) = K z^{-α} I(z ≥ z_0), α > 1
• F_Z(z) = K/(α-1) (z_0^{1-α} - z^{1-α}) I(z ≥ z_0)
• K = (α-1)/z_0^{1-α}
• F_Z(z) = (1 - (z/z_0)^{1-α}) I(z ≥ z_0)
• R_Z(z) = 1 - F_Z(z) = (z/z_0)^{1-α} I(z ≥ z_0) + I(z < z_0) ≤ (z/z_0)^{1-α}
• C_II(s) ≈ 2 R_Z(2s) = K (2s)^{2(H-1)}      (H: Hurst parameter)
• H > 1/2 : Long Range Dependence (LRD)
• 1-α = 2(H-1)  ⇒  H = 3/2 - α/2, i.e. α = 3 - 2H; LRD ⇔ α < 2
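Inverting F_Z gives a one-line sampler for this density: Z = z_0 (1-U)^{1/(1-α)} with U uniform on [0,1). A sketch with illustrative parameters α = 3.5, z_0 = 1 (chosen > 2 so the mean E(Z) = z_0 (α-1)/(α-2) exists):

```python
import random

def pareto_sample(alpha, z0, rng):
    """Inverse-CDF sampling from f_Z(z) = K z^(-alpha), z >= z0."""
    u = rng.random()                      # u in [0, 1), so 1-u > 0
    return z0 * (1.0 - u) ** (1.0 / (1.0 - alpha))

rng = random.Random(4)
alpha, z0, n = 3.5, 1.0, 200_000
zs = [pareto_sample(alpha, z0, rng) for _ in range(n)]
mean = sum(zs) / n
# E(Z) = z0 (alpha-1)/(alpha-2) = 2.5/1.5 = 5/3 (finite since alpha > 2)
assert abs(mean - 5 / 3) < 0.05
# power tail: P(Z > z) = (z/z0)^(1-alpha)
tail = sum(z > 10.0 for z in zs) / n
assert abs(tail - 10.0 ** (1 - alpha)) < 0.002
```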
Sample means
• E_T = 1/T ∫_0^T I(t) dt
• I(t) indicates the on-state
• Var(E_T) = E((E_T - E(I))²) = 1/T² E((∫_0^T (I(t) - E(I)) dt)²)
           = 1/T² ∫_0^T ∫_0^T E((I(t) - E(I))(I(s) - E(I))) ds dt
           = 1/T² ∫_0^T ∫_0^T C_II(t-s) ds dt
           ≈ 1/T² ∫_0^T ∫_0^T 2 R_Z(2|t-s|) ds dt
           = 4/T² ∫_0^T ∫_0^t R_Z(2(t-s)) ds dt
Sample means for 2-state Markov process
• Var(E_T) = 4/T² ∫_0^T ∫_0^t R_Z(2(t-s)) ds dt
• R_Z(t) = 1 - F_Z(t) = exp(-λt)
• Var(E_T) ≤ 4/T² ∫_0^T ∫_0^t exp(-2λ(t-s)) ds dt
           = 4/T² ∫_0^T exp(-2λt) ∫_0^t exp(2λs) ds dt
           = 4/T²/(2λ) ∫_0^T exp(-2λt) (exp(2λt) - 1) dt
           = 2/(λT²) ∫_0^T (1 - exp(-2λt)) dt
           = 2/(λT²) (T - (1 - exp(-2λT))/(2λ))
           ≈ 2/(λT)
Sample means for white noise
• w is white noise
• B(t) = ∫_0^t w(s) ds
• B(t) is Brownian motion (a Wiener process)
• Var(B(t)) = η t (by definition)
• E_T = 1/T ∫_0^T w(t) dt = B(T)/T
• Var(E_T) = Var(B(T))/T² = η T/T² = η/T
• The 2-state Markov process behaves like white noise: Var(E_T) ~ 1/T
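A discrete check of Var(E_T) = η/T: average T i.i.d. unit-variance samples (discrete white noise, η = 1) and estimate the variance of that average over many replicates (T and the replicate count are illustrative).

```python
import random

rng = random.Random(5)
T, reps = 50, 20_000
# discrete white noise: i.i.d. standard normals, eta = Var = 1 per unit step
means = [sum(rng.gauss(0.0, 1.0) for _ in range(T)) / T for _ in range(reps)]
var_hat = sum(m * m for m in means) / reps   # E_T has mean 0
# theory: Var(E_T) = eta/T = 1/50 = 0.02
assert abs(var_hat - 1 / T) < 0.003
```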
Sample means for Brownian motion
• For s ≥ t: B(s) = B(t) + ∫_t^s w(x) dx = B(t) + b
• b and B(t) are independent
• C_BB(t,s) = E(B(t)B(s)) = E(B(t)(B(t) + b)) = E(B²(t)) = η t = η min{t,s} !!!
• E_T = 1/T ∫_0^T B(t) dt
• Var(E_T) = 1/T² ∫_0^T ∫_0^T C_BB(t,s) ds dt
           = 1/T² ∫_0^T ∫_0^T η min{t,s} ds dt
           = 2/T² ∫_0^T ∫_0^t η s ds dt
           = 1/T² ∫_0^T η t² dt
           = η T³/(3T²) = (1/3) η T
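The same experiment for a discretized Brownian path: the variance of the time-average of B over [0,T] should come out near ηT/3 (η = 1, T = 1, step count and replicates illustrative).

```python
import random

rng = random.Random(6)
steps, reps, T = 200, 20_000, 1.0
h = T / steps
avgs = []
for _ in range(reps):
    b, total = 0.0, 0.0
    for _ in range(steps):
        b += rng.gauss(0.0, h ** 0.5)    # BM increment with eta = 1
        total += b
    avgs.append(total / steps)           # discretized (1/T) * integral of B
var_hat = sum(a * a for a in avgs) / reps
# theory: Var(E_T) = eta*T/3 = 1/3
assert abs(var_hat - 1 / 3) < 0.02
```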
Sample means for renewal with Pareto distributions
• Var(E_T) = 4/T² ∫_0^T ∫_0^t R_Z(t-s) ds dt
• R_Z(z) = C z^{1-α}
• Var(E_T) = 4C/T² ∫_0^T ∫_0^t (t-s)^{1-α} ds dt
           = 4C/T² ∫_0^T ∫_0^t x^{1-α} dx dt      (substituting x = t-s)
           = 4C/T²/(2-α) ∫_0^T t^{2-α} dt
           = 4C/T²/(2-α)/(3-α) T^{3-α}
           = 4C/(2-α)/(3-α) T^{1-α}
• For α ≈ 1 this is right between white noise (s⁰) and Brownian motion (s⁻¹): fractional Brownian motion (s^{-1/2}),
  B_H(t) = ∫_0^t (t-s)^{H-1/2} w(s) ds
Self similarity
• A process X is self similar with Hurst parameter H iff a^{-H} X(at) is equivalent to X(t) (up to finite joint distributions)
• C_XX(s) = E(X(0)X(s)) = (1/s)^{-2H} E(X(0)X(1)) = s^{2H} C_XX(0,1)
• C_XX(t,s) = E(X(t)X(s)) = (1/s)^{-2H} E(X(t/s)X(s/s)) = s^{2H} C_XX(t/s,1)
  → s^{2H} C_XX(0,1) for t/s → 0
• C_XX(t,t+s) = E(X(t)X(t+s)) = (t+s)^{2H} E(X(t/(t+s))X((t+s)/(t+s)))
  = (t+s)^{2H} C_XX(t/(t+s),1) → (t+s)^{2H} C_XX(1,1) for t → ∞
Self similarity
• Y(n) = X(n) - X(n-1)
• C_YY(1,m) = E((X(1) - X(0))(X(1+m) - X(m)))
  = E(X(1)X(1+m)) + E(X(0)X(m)) - E(X(1)X(m)) - E(X(0)X(1+m))
  = m^{2H} (C_XX(1/m, 1/m+1) + C_XX(0,1) - C_XX(1/m, 1) - C_XX(0, 1/m+1))
• Taylor expansion around (0,1), with D_i the first and D_ij the second partial derivatives of C_XX:
  C_XX(1/m, 1/m+1) ≈ C_XX(0,1) + D_1/m + D_2/m + (D_11 + D_12 + D_21 + D_22)/(2m²)
  C_XX(0, 1/m+1)  ≈ C_XX(0,1) + D_2/m + D_22/(2m²)
  C_XX(1/m, 1)    ≈ C_XX(0,1) + D_1/m + D_11/(2m²)
• Hence C_YY(1,m) ≈ m^{2H} (D_12 + D_21)/(2m²) = m^{2H-2} (D_12 + D_21)/2
Frequency Domain
• R_Z(z) = C z^{1-α}
• log(R_Z(z)) = log(C) + (1-α) log(z)
• C_YY(1,m) = K m^{2H-2}
• S_YY(ω) = C ω^{1-2H}
• log(S_YY(ω)) = log(C) + (1-2H) log(ω)
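These log-log relations are what makes the exponents estimable by linear regression. A sketch of the tail-exponent version: regress the log empirical CCDF of a Pareto sample on log z; the slope should be near 1-α (the helper name, quantile cutoffs, and α = 3.5 are all illustrative choices).

```python
import math
import random

def loglog_tail_slope(sample, lo_q, hi_q):
    """Least-squares slope of log(empirical CCDF) vs log(z) between quantiles."""
    xs = sorted(sample)
    n = len(xs)
    pts = [(math.log(xs[i]), math.log(1.0 - i / n))
           for i in range(int(lo_q * n), int(hi_q * n))]
    mx = sum(x for x, _ in pts) / len(pts)
    my = sum(y for _, y in pts) / len(pts)
    sxx = sum((x - mx) ** 2 for x, _ in pts)
    sxy = sum((x - mx) * (y - my) for x, y in pts)
    return sxy / sxx

rng = random.Random(7)
alpha, n = 3.5, 100_000
# inverse-CDF Pareto sample with z0 = 1
zs = [(1.0 - rng.random()) ** (1.0 / (1.0 - alpha)) for _ in range(n)]
slope = loglog_tail_slope(zs, 0.50, 0.999)
# log R_Z(z) = log C + (1 - alpha) log z, so slope should be near 1 - alpha
assert abs(slope - (1 - alpha)) < 0.3
```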
Distribution of file sizes
Time averages (aggregated)
Time averages (cont’d)
Aggregated statistics
Estimating the Hurst parameter
Miniproject
• Collect statistics on the file sizes in your file system.
• Check for power-tailed behaviour.
• Simulate an M/G/1 queue with power-tailed service times.
• Compare with results for an M/M/1 queue with the same load:
  ρ = mean service time / mean interarrival time
• Simulate an alternating renewal process with a power-tailed "ON" distribution.
• Compute an autocorrelation estimate.
• Compute estimates of the 1-step increments of sample means.
• Compute a power spectrum estimate.
Summary LRD
• Let f_Z(z) = K z^{-α} I(z ≥ z_0), α > 1
• R_Z(z) ≈ (z/z_0)^{1-α}
• C_II(s) ≈ 2 R_Z(2s) ∝ (2s)^{2(H-1)}      (H: Hurst parameter, I indicates the on-period)
• H > 1/2 : Long Range Dependence (LRD)
• LRD ⇔ α < 2
• log(R_Z(z)) = log(C) + (1-α) log(z)
Summary M/G/1
• M/M/1: Q = ρ/(1-ρ)
• M/G/1 (Pollaczek-Khinchine): Q = ρ + (ρ² + λ² Var(S))/(2(1-ρ))
• f_S(s) = K s^{-α} I(s ≥ s_0), α > 1
• E(S²) = K ∫_{s_0}^∞ s² s^{-α} ds = K ∫_{s_0}^∞ s^{2-α} ds = K [s^{3-α}]_{s_0}^∞ /(3-α), which diverges for α ≤ 3
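The M/M/1 baseline of the miniproject can be checked against Q = ρ/(1-ρ) with a short simulation: the Lindley recursion gives waiting times, and Little's law converts mean sojourn time into mean number in system (λ = 1, μ = 2, warm-up length and customer count are illustrative choices).

```python
import random

def mm1_mean_number(lam, mu, n_customers, rng):
    """Mean number in an M/M/1 system, via Lindley recursion + Little's law."""
    w, total_sojourn, counted = 0.0, 0.0, 0
    for i in range(n_customers):
        s = rng.expovariate(mu)           # service time of current customer
        if i >= 1000:                     # discard warm-up customers
            total_sojourn += w + s
            counted += 1
        a = rng.expovariate(lam)          # interarrival to next customer
        w = max(0.0, w + s - a)           # Lindley recursion for waiting time
    return lam * total_sojourn / counted  # Little: L = lam * E(sojourn)

rng = random.Random(8)
L = mm1_mean_number(1.0, 2.0, 200_000, rng)
# rho = 0.5, so Q = rho/(1-rho) = 1
assert abs(L - 1.0) < 0.1
```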
Summary SS
• A process X is self similar with Hurst parameter H iff a^{-H} X(at) is equivalent to X(t) (up to finite joint distributions)
• Y(n) = X(n) - X(n-1)
• C_YY(1,m) ≈ m^{2H-2} (D_12 + D_21)/2
• C_YY(1,m) = K m^{2H-2}
• S_YY(ω) = C ω^{1-2H}
• log(S_YY(ω)) = log(C) + (1-2H) log(ω)
Summary (Sample means)
• E_T = 1/T ∫_0^T I(t) dt
• I(t) indicates the on-state
• Var(E_T) = 4/T² ∫_0^T ∫_0^t R_Z(2(t-s)) ds dt
  ≈ 2/(λT)      (2-state Markov)
  = η/T      (white noise)
  = (1/3) η T      (Brownian motion)
  = 4C/(2-α)/(3-α) T^{1-α}      (power tail)