Lecture 19: Markov chains in insurance

Introduction

We will describe how certain types of Markov processes can be used to model behavior that is useful in insurance applications.

The focus in this lecture is on applications.

Continuous time Markov chains (1)

A continuous time Markov chain defined on a finite or countably infinite state space S is a stochastic process X_t, t ≥ 0, such that for any 0 ≤ s ≤ t

P(X_t = x | I_s) = P(X_t = x | X_s),

where I_s = all information generated by X_u for u ∈ [0, s].

Hence, when calculating the probability P(X_t = x | I_s), the only thing that matters is the value of X_s. This is the Markov property.

Here and onwards, all states we consider are assumed to be elements of S.

Continuous time Markov chains (2)

We only consider time-homogeneous Markov chains, which means that all Markov chains X_t we consider have the property

P(X_{s+t} = y | X_s = x) = P(X_t = y | X_0 = x).

We call the function

p_t(x, y) = P(X_t = y | X_0 = x)

the transition function.

Note that

P(X_t = x | I_s) = {Markov property} = P(X_t = x | X_s)
                 = {definition of the transition function} = p_t(X_s, x).

Continuous time Markov chains (3)

The transition intensity from state x to another state y ≠ x is defined by

λ(x, y) = lim_{t↓0} p_t(x, y) / t.

It is not unusual to define a Markov process in terms of its transition intensities.

Note that λ(x, y) is a constant for fixed states x and y; it depends neither on time nor on any randomness.
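
As an illustration (our own two-state example, not from the lecture): if y is the only state reachable from x and the waiting time in x is Exp(λ)-distributed, then p_t(x, y) = 1 − e^{−λt}, so

λ(x, y) = lim_{t↓0} (1 − e^{−λt}) / t = λ;

the transition intensity recovers the rate of the exponential waiting time.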

Continuous time Markov chains (4)

Using the transition intensities, we can define the intensity matrix:

Λ(x, y) = { λ(x, y)               if x ≠ y
          { −∑_{y ≠ x} λ(x, y)    if x = y

Let P(t) = [p_t(x, y)] be the matrix of transition probabilities.

In general it holds that

P'(t) = ΛP(t) = P(t)Λ.

These are the backward and forward equations, respectively.
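
Both equations are solved by the matrix exponential P(t) = e^{tΛ}. A minimal numerical sketch (illustrative parameter value, using the two-state alive/dead matrix that appears later in this lecture):

```python
import numpy as np
from scipy.linalg import expm

lam = 0.05  # force of mortality; illustrative value, not from the lecture

# Intensity matrix for the two-state model: state 1 = alive, state 2 = dead.
Lam = np.array([[-lam, lam],
                [0.0,  0.0]])

def P(t):
    """P(t) = expm(t * Lam) solves P'(t) = Lam @ P(t) = P(t) @ Lam, P(0) = I."""
    return expm(t * Lam)

t = 10.0
Pt = P(t)
print(Pt)              # [[e^{-lam t}, 1 - e^{-lam t}], [0, 1]]
print(Pt.sum(axis=1))  # each row sums to 1

# Numerical check of the forward equation P'(t) = P(t) @ Lam:
h = 1e-6
dP = (P(t + h) - P(t - h)) / (2 * h)
print(np.allclose(dP, Pt @ Lam))  # True
```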

The Poisson process

A Poisson process is a Markov process with intensity matrix

Λ = [ −λ    λ    0    0    0   ···
       0   −λ    λ    0    0   ···
       0    0   −λ    λ    0   ···
       ⋮    ⋮    ⋮    ⋮    ⋮    ⋱ ]

It is a counting process: the only possible transitions are from n to n + 1.

We can solve the equation for the transition probabilities to get

P(X(t) = n) = e^{−λt} (λt)^n / n!,   n = 0, 1, 2, . . . .
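
A short sketch checking this formula against scipy's Poisson distribution (λ and t are illustrative values):

```python
import math
from scipy.stats import poisson

lam, t = 2.0, 3.0  # illustrative intensity and time horizon

for n in range(6):
    direct = math.exp(-lam * t) * (lam * t) ** n / math.factorial(n)
    print(n, direct, poisson.pmf(n, lam * t))  # the two values agree
```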

A whole-life insurance model (1)

Let us consider a simple model of a whole-life insurance.

To fit into the Markovian model, we assume a constant force of mortality λ.

This means that we have the picture

        λ
Alive  →  Dead

A whole-life insurance model (2)

Let us define

State 1 = Alive
State 2 = Dead

This implies that the intensity matrix is given by

[ −λ   λ
   0   0 ]

Note that a row of zeros for a state means that that state is absorbing.
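
In this model the backward equation can be solved by hand: since the dead state is absorbing, p_t(2, 1) = 0, and the (1,1) entry of P'(t) = ΛP(t) reads

p'_t(1, 1) = −λ p_t(1, 1) + λ p_t(2, 1) = −λ p_t(1, 1),   p_0(1, 1) = 1,

so the survival probability is p_t(1, 1) = e^{−λt}. This is the fact used below when counting survivors.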

A more complex life insurance model (1)

In the previous model we only considered whether the individual was alive or dead.

In some cases we want to know if the individual is alive and healthy, or alive and an invalid.

This leads to the following model:

             µ1
   Invalid  ⇄  Healthy
             µ2
      λ2 ↘      ↙ λ1
           Dead

(Healthy → Invalid with intensity µ1, Invalid → Healthy with µ2; Healthy → Dead with λ1, Invalid → Dead with λ2.)

A more complex life insurance model (2)

With

State 1 = Healthy
State 2 = Invalid
State 3 = Dead

the intensity matrix is given by

[ −λ1 − µ1      µ1          λ1
     µ2      −λ2 − µ2       λ2
      0           0          0 ]
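
As with the two-state model, P(t) = e^{tΛ} gives all transition probabilities. A minimal sketch (the intensities are illustrative values, not from the lecture):

```python
import numpy as np
from scipy.linalg import expm

# Illustrative intensities: disability mu1, recovery mu2, mortality lam1/lam2.
lam1, lam2 = 0.01, 0.03
mu1, mu2 = 0.05, 0.02

# States: 1 = healthy, 2 = invalid, 3 = dead.
Lam = np.array([[-(lam1 + mu1),  mu1,           lam1],
                [  mu2,         -(lam2 + mu2),  lam2],
                [  0.0,           0.0,           0.0]])

t = 10.0
Pt = expm(t * Lam)
print("P(healthy at t | healthy at 0) =", Pt[0, 0])
print("P(invalid at t | healthy at 0) =", Pt[0, 1])
print("P(dead at t    | healthy at 0) =", Pt[0, 2])
```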

A model of more than one life (1)

So far we have only considered one individual.

Now assume that an insurance company has insured n individuals.

For each of the individuals, the force of mortality is the constant λ, and the individuals die independently of each other.

Let N be the number of individuals alive after 1 year.

What is the distribution of N?

A model of more than one life (2)

Let X be the Markov process connected to one individual's state:

X(t) = { 1 if the individual is alive at time t
       { 2 if the individual is dead at time t

For one individual, the probability of being alive after 1 year is

P(X(1) = 1) = e^{−λ}.

Since the individuals die independently of each other, it follows that

N ∼ Bin(n, e^{−λ}).
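
A small sketch of this distribution (n and λ are illustrative values):

```python
import math
from scipy.stats import binom

n, lam = 1000, 0.05      # illustrative portfolio size and force of mortality
p = math.exp(-lam)       # one-year survival probability e^{-lam}

N = binom(n, p)          # N ~ Bin(n, e^{-lam})
print(N.mean())          # expected number of survivors: n * e^{-lam}
print(N.pmf(950))        # probability that exactly 950 are alive after 1 year
```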

Reference

I have partly used Basic Life Insurance Mathematics by Ragnar Norberg in this lecture.
