
Introduction Bounded Noise Extensions Future work

On the Growth of the Extreme Fluctuations of SDEs with Markovian Switching

Terry Lynch (joint work with Dr. John Appleby)

Dublin City University, Ireland.

Leverhulme International Network, University of Chester, UK.

November 7th 2008

Supported by the Irish Research Council for Science, Engineering and Technology

Outline

1. Introduction (Motivation; Regular Variation)
2. Bounded Noise (Main Results; Outline of Proofs)
3. Extensions
4. Future work


Recap - one year ago

Theorem 1

Let X be the unique adapted continuous solution satisfying

dX(t) = f(X(t), t) dt + g(X(t), t) dB(t).

If there exist ρ > 0 and real numbers K1 and K2 such that for all (x, t) ∈ R × [0, ∞)

x f(x, t) ≤ ρ,  0 < K2 ≤ g²(x, t) ≤ K1,

then X satisfies

lim sup_{t→∞} |X(t)| / √(2t log log t) ≤ √K1, a.s.

Question: Why the hell is everyone using that iterated logarithm?!
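As a numerical sanity check on this bound, here is a minimal Euler–Maruyama sketch with toy coefficients of our own choosing (not from the talk): f(x, t) = −x gives x f(x, t) = −x² ≤ ρ for any ρ > 0, and g ≡ 1 gives K1 = K2 = 1, so the normalised fluctuation |X(t)|/√(2t log log t) should settle below √K1 = 1.

```python
import math
import random

def euler_maruyama(f, g, x0, t0, t1, dt, rng):
    """Simulate dX = f(X, t) dt + g(X, t) dB by the Euler-Maruyama scheme."""
    t, x = t0, x0
    path = [(t, x)]
    for _ in range(round((t1 - t0) / dt)):
        dB = rng.gauss(0.0, math.sqrt(dt))
        x += f(x, t) * dt + g(x, t) * dB
        t += dt
        path.append((t, x))
    return path

rng = random.Random(1)
path = euler_maruyama(lambda x, t: -x, lambda x, t: 1.0,
                      0.0, 0.0, 2000.0, 0.01, rng)

# Normalised fluctuations |X(t)| / sqrt(2 t log log t), away from small t
ratios = [abs(x) / math.sqrt(2 * t * math.log(math.log(t)))
          for t, x in path if t > math.e ** 2]
print(round(max(ratios), 3))
```

Since this is a single random path over a finite horizon, the statistic only illustrates the order of magnitude of the almost-sure bound.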


Introduction

We consider the rate of growth of the partial maxima of the solution (X(t))_{t≥0} of a nonlinear finite-dimensional stochastic differential equation (SDE) with Markovian switching.

We find upper and lower estimates on this rate of growth by finding constants α and β and an increasing function ρ such that

0 < α ≤ lim sup_{t→∞} ‖X(t)‖ / ρ(t) ≤ β, a.s.

We look at processes which have mean-reverting drift terms and which have either: (a) bounded noise intensity, or (b) unbounded noise intensity.


Motivation

The use of mean-reverting drift terms facilitates the modelling of a self-regulating economic system which is subjected to persistent stochastic shocks.

The type of finite-dimensional system studied allows for the analysis of heavy-tailed returns distributions in stochastic volatility market models in which many assets are traded.

Equations with Markovian switching are motivated by econometric evidence which suggests that security prices often move from confident to nervous (or other) regimes.


Motivational case study

To demonstrate the potential impact of swift changes in investor sentiment, we look at a case study of Elan Corporation.


Regular Variation

Our analysis is aided by the use of regularly varying functions.

In its basic form, regular variation may be viewed as relations such as

lim_{x→∞} f(λx) / f(x) = λ^ζ ∈ (0, ∞) for all λ > 0.

We say that f is regularly varying at infinity with index ζ, or f ∈ RV∞(ζ).

Regularly varying functions have useful properties such as:

f ∈ RV∞(ζ) ⇒ F(x) := ∫₁ˣ f(t) dt ∈ RV∞(ζ + 1),

f ∈ RV∞(ζ) ⇒ f⁻¹(x) ∈ RV∞(1/ζ).
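A quick numerical illustration of these definitions and properties, using f(x) = x² as an example of our own with index ζ = 2:

```python
import math

# f(x) = x^2 lies in RV_inf(2); its integral F and its inverse f_inv
# should then lie in RV_inf(3) and RV_inf(1/2) respectively.
def f(x):
    return x ** 2

def F(x):                 # F(x) = integral_1^x f(t) dt = (x^3 - 1)/3
    return (x ** 3 - 1.0) / 3.0

def f_inv(x):             # inverse of f on (0, infinity)
    return math.sqrt(x)

x = 1e6
for lam in (0.5, 2.0, 10.0):
    assert abs(f(lam * x) / f(x) - lam ** 2) < 1e-6            # index 2
    assert abs(F(lam * x) / F(x) - lam ** 3) < 1e-3            # index 2 + 1
    assert abs(f_inv(lam * x) / f_inv(x) - lam ** 0.5) < 1e-6  # index 1/2
```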



SDE with Markovian switching

We study the finite-dimensional SDE with Markovian switching

dX(t) = f(X(t), Y(t)) dt + g(X(t), Y(t)) dB(t)   (1)

where

Y is a Markov chain with finite irreducible state space S,

f : R^d × S → R^d and g : R^d × S → R^{d×r} are locally Lipschitz continuous functions,

B is an r-dimensional Brownian motion, independent of Y.

Under these conditions, there is a unique continuous and adapted process which satisfies (1).
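A minimal simulation sketch for (1), with a hypothetical two-state regime model whose rates and coefficients are our own illustrative choices (they are not from the talk); the chain Y is stepped with one jump check per Euler step, independently of the Brownian increments:

```python
import math
import random

def simulate_switching_sde(f, g, q, x0, y0, T, dt, rng):
    """Euler-Maruyama for dX = f(X,Y) dt + g(X,Y) dB, where Y is a
    continuous-time Markov chain with generator matrix q, simulated
    independently of the Brownian increments, one jump check per step."""
    x, y = x0, y0
    xs, ys = [x], [y]
    for _ in range(round(T / dt)):
        u = rng.random()              # decide whether Y jumps this step
        acc = 0.0
        for j in range(len(q)):
            if j != y:
                acc += q[y][j] * dt   # jump y -> j with probability ~ rate*dt
                if u < acc:
                    y = j
                    break
        x += f(x, y) * dt + g(x, y) * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        xs.append(x)
        ys.append(y)
    return xs, ys

rng = random.Random(42)
# Hypothetical 2-state regime model: state 0 "confident", state 1 "nervous"
drift = lambda x, y: -1.0 * x if y == 0 else -0.5 * x  # mean-reverting in both
vol = lambda x, y: 0.5 if y == 0 else 1.5              # bounded noise intensity
Q = [[-1.0, 1.0], [2.0, -2.0]]                         # generator of Y
xs, ys = simulate_switching_sde(drift, vol, Q, 0.0, 0, 50.0, 0.01, rng)
```

A more careful scheme would sample the exact exponential holding times of Y, but the per-step jump check suffices for illustration at small dt.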


Bounded Noise

Consider the i-th component of our SDE:

dX_i(t) = f_i(X(t), Y(t)) dt + Σ_{j=1}^r g_ij(X(t), Y(t)) dB_j(t).

We suppose that the noise intensity g is bounded in the sense that

there exists K2 > 0 such that ‖g(x, y)‖_F ≤ K2 for all y ∈ S,

there exists K1 > 0 such that

inf_{x ∈ R^d, y ∈ S} (1/‖x‖²) Σ_{j=1}^r ( Σ_{i=1}^d x_i g_ij(x, y) )² ≥ K1².

By the Cauchy–Schwarz inequality we get the following upper and lower estimates:

K1² ≤ ‖g(x, y)‖²_F ≤ K2², for all y ∈ S.
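The lower estimate follows because, by Cauchy–Schwarz, Σ_j ⟨x, g·j⟩² ≤ ‖x‖² ‖g‖²_F, so the inf-condition forces ‖g‖²_F ≥ K1². That inequality can be checked numerically for a made-up bounded g (the matrices and scaling below are our own illustrative choices):

```python
import math
import random

rng = random.Random(7)
d, r = 3, 2

def g(x, y):
    """Hypothetical noise matrix: a fixed per-regime matrix with a bounded
    x-dependent scaling, so the Frobenius norm stays within fixed bounds."""
    base = [[1.0, 0.5], [0.0, 1.0], [0.5, 0.5]] if y == 0 else \
           [[0.5, 0.0], [1.0, 0.5], [0.0, 1.0]]
    s = 1.0 + 0.5 * math.sin(sum(x))
    return [[s * base[i][j] for j in range(r)] for i in range(d)]

def frob2(m):
    """Squared Frobenius norm."""
    return sum(v * v for row in m for v in row)

for _ in range(200):
    x = [rng.uniform(-5.0, 5.0) for _ in range(d)]
    for y in (0, 1):
        G = g(x, y)
        quad = sum(sum(x[i] * G[i][j] for i in range(d)) ** 2
                   for j in range(r))
        # Cauchy-Schwarz: sum_j <x, g_.j>^2 <= ||x||^2 * ||g||_F^2
        assert quad <= sum(v * v for v in x) * frob2(G) + 1e-9
```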


Upper bound

The strength of the restoring force is characterised by a locally Lipschitz continuous function φ : [0, ∞) → (0, ∞) with xφ(x) → ∞ as x → ∞, where

lim sup_{‖x‖→∞} sup_{y∈S} ⟨x, f(x, y)⟩ / (‖x‖ φ(‖x‖)) ≤ −c2 ∈ (−∞, 0).   (2)

Theorem 1

Let X be the unique adapted continuous solution satisfying (1). Suppose there exists a function φ satisfying (2) and that g satisfies the bounded noise conditions. Then X satisfies

lim sup_{t→∞} ‖X(t)‖ / Φ⁻¹( (K2² (1+ε) / (2 c2)) log t ) ≤ 1, a.s. on Ω_ε,

where Φ(x) = ∫₁ˣ φ(u) du and Ω_ε is an almost sure event.


Lower bound

Moreover, we ensure that the degree of non-linearity in f is also characterised by φ, by means of the assumption

lim sup_{‖x‖→∞} sup_{y∈S} |⟨x, f(x, y)⟩| / (‖x‖ φ(‖x‖)) ≤ c1 ∈ (0, ∞).   (3)

Theorem 2

Let X be the unique adapted continuous solution satisfying (1). Suppose there exists a function φ satisfying (3) and that g satisfies the bounded noise conditions. Then X satisfies

lim sup_{t→∞} ‖X(t)‖ / Φ⁻¹( (K1² (1−ε) / (2 c1)) log t ) ≥ 1, a.s. on Ω_ε,

where Φ(x) = ∫₁ˣ φ(u) du and Ω_ε is an almost sure event.


Growth rate of large fluctuations

Taking both theorems together, in the special case where φ is a regularly varying function, we get the following:

Theorem 3

Let X be the unique adapted continuous solution satisfying (1). Suppose there exists a function φ ∈ RV∞(ζ) satisfying (2) and (3) and that g satisfies the bounded noise conditions. Then X satisfies

(K1² / (2 c1))^{1/(ζ+1)} ≤ lim sup_{t→∞} ‖X(t)‖ / Φ⁻¹(log t) ≤ (K2² / (2 c2))^{1/(ζ+1)}, a.s.,

where Φ(x) = ∫₁ˣ φ(u) du ∈ RV∞(ζ + 1).
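The constants here arise because Φ⁻¹ ∈ RV∞(1/(ζ+1)), so Φ⁻¹(c log t) ≈ c^{1/(ζ+1)} Φ⁻¹(log t) for large t, letting the ε-dependent factors inside Φ⁻¹ be pulled out. A numerical check of this scaling for the illustrative choice φ(x) = x^ζ with ζ = 2 (our example, not from the talk):

```python
import math

zeta = 2.0

def Phi(x):
    """Phi(x) = integral_1^x u**zeta du."""
    return (x ** (zeta + 1) - 1.0) / (zeta + 1)

def Phi_inv(u):
    """Exact inverse of Phi on [1, infinity)."""
    return ((zeta + 1) * u + 1.0) ** (1.0 / (zeta + 1))

u = math.log(1e12)                  # a large value of log t
for c in (0.25, 4.0):               # c plays the role of K^2(1 +/- eps)/(2c_i)
    ratio = Phi_inv(c * u) / Phi_inv(u)
    # Phi_inv is in RV(1/(zeta+1)), so the ratio approaches c**(1/(zeta+1))
    assert abs(ratio - c ** (1.0 / (zeta + 1))) < 0.02
```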


Comments and Example

Note that φ small ⇒ Φ small ⇒ Φ⁻¹ large. Thus, as expected, weak mean-reversion results in large fluctuations.

Take a simple one-dimensional Ornstein–Uhlenbeck process

dX(t) = −c X(t) dt + K dB(t)

where, in our notation,

c1 = c2 = c and K1 = K2 = K,

φ(x) = x ∈ RV∞(1) ⇒ ζ = 1,

Φ(x) = x²/2 and Φ⁻¹(x) = √(2x).

Then applying Theorem 3, the bounds

(K1² / (2 c1))^{1/(ζ+1)} ≤ lim sup_{t→∞} ‖X(t)‖ / Φ⁻¹(log t) ≤ (K2² / (2 c2))^{1/(ζ+1)}

coincide, recovering the well-known result

lim sup_{t→∞} |X(t)| / √(2 log t) = (K² / (2c))^{1/2}, a.s.
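A rough simulation check of this result (parameter choices are our own), using the exact one-step OU transition rather than an Euler scheme; the running statistic should be of the same order as the predicted constant K/√(2c):

```python
import math
import random

rng = random.Random(3)
c, K = 1.0, 1.0
theory = math.sqrt(K * K / (2 * c))   # predicted lim sup constant

# Exact one-step OU transition over a step dt (no discretisation error)
dt = 0.1
a = math.exp(-c * dt)
sd = math.sqrt(K * K * (1 - a * a) / (2 * c))

x, t, stat = 0.0, 0.0, 0.0
while t < 5000.0:
    x = a * x + sd * rng.gauss(0.0, 1.0)
    t += dt
    if t > math.e ** 2:               # ignore the small-t transient
        stat = max(stat, abs(x) / math.sqrt(2 * math.log(t)))
print(round(stat, 3), round(theory, 3))
```

Convergence of the lim sup is very slow, so on any finite horizon the match is only to within a modest factor.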


Outline of proofs

Apply a time-change and an Itô transformation to reduce dimensionality and facilitate the use of one-dimensional theory.

Use the bounding conditions on f and g to construct upper and lower comparison processes whose dynamics are not determined by the switching parameter Y.

Use a stochastic comparison theorem to prove that our transformed process is bounded above and below by the comparison processes for all t ≥ 0 a.s.

The large deviations of the comparison processes are determined by means of a classical theorem of Motoo.
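The stochastic comparison step can be illustrated in one dimension: two diffusions driven by the same Brownian increments, with drifts ordered pointwise (a toy pair of our own choosing), remain ordered pathwise:

```python
import math
import random

rng = random.Random(5)
dt, n = 0.01, 20000
u, v = -0.5, 0.5                                 # u(0) <= v(0)
for _ in range(n):
    dB = math.sqrt(dt) * rng.gauss(0.0, 1.0)     # SAME increment for both
    u += (-u - 1.0) * dt + dB                    # drift f_low(x) = -x - 1
    v += (-v + 1.0) * dt + dB                    # drift f_up(x)  = -x + 1
    assert u <= v                                # ordering preserved pathwise
```

Here the two diffusion coefficients are identical, which is the situation produced by the transformation in the proof; the shared noise cancels in the difference v − u, so the ordering is in fact deterministic once the drifts are ordered.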



Extensions

We can impose a rate of nonlinear growth of g with ‖x‖ as ‖x‖ → ∞ through an increasing scalar function γ.

Then the growth rate of the deviations is of the order Ψ⁻¹(log t), where Ψ(x) = ∫₁ˣ φ(u)/γ²(u) du.

Using norm equivalence in R^d we can study the size of the largest component of the system rather than the norm, to get upper and lower bounds of the form

0 < (1/√d) α ≤ lim sup_{t→∞} max_{1≤j≤d} |X_j(t)| / ρ(t) ≤ β ≤ +∞, a.s.

We can extend the state space of the Markov chain to a countable state space.
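The 1/√d factor comes from the standard norm equivalence max_j |x_j| ≤ ‖x‖ ≤ √d · max_j |x_j| in R^d, which is easily checked numerically:

```python
import math
import random

rng = random.Random(11)
d = 5
for _ in range(100):
    x = [rng.uniform(-10.0, 10.0) for _ in range(d)]
    norm = math.sqrt(sum(v * v for v in x))      # Euclidean norm ||x||
    biggest = max(abs(v) for v in x)             # largest component
    # max_j |x_j| <= ||x|| <= sqrt(d) * max_j |x_j|
    assert biggest <= norm + 1e-12
    assert norm <= math.sqrt(d) * biggest + 1e-12
```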



Future work

To study finite-dimensional processes which are always positive, e.g. the Cox–Ingersoll–Ross (CIR) model.

To investigate the growth, explosion and simulation of our developed results. This will involve stochastic numerical analysis and Monte Carlo simulation.

This simulation will allow us to determine whether the qualitative features of our dynamical systems are preserved when analysed in discrete time.

To investigate whether our results can be extended to SDEs with delay.
