8/9/2019 Utility of Markov Chains
Utility of Markov Chains
Presented by: Divya Mittal
Contents
Stochastic Process
Types of Stochastic Processes
Markov Process
Markov Chain
Applications of Markov Chains in Different Fields
Markov Chains Applied to a Tennis Problem
Conclusion
References
Stochastic Process: An Introduction
A stochastic process is dynamic: it is a random phenomenon that is a function of time, changing (evolving) over time.
Formally, it is a collection (family) of random variables, {X(t), t ∈ T}, indexed by a parameter t taking values in a set T of real numbers. Here T is called the index set and t is time.
The index set may be discrete or continuous. The set of all possible values of X(t) is called its state space; the state space is also a set and may be discrete or continuous.
How Can We Classify Stochastic Processes?
Counting processes (discrete-time processes)
Poisson processes
Renewal processes
Martingales (fair-game processes)
Stationary processes
Brownian motion
Markov processes
Now let us first try to understand the Markov process.
Markov processes represent the simplest generalization of independent processes: they permit the outcome at any instant to depend only on the outcome that precedes it, and on none before that. Thus in a Markov process X(t), the past has no influence on the future once the present is specified.
Markov Chain
The Markov chain, introduced by the Russian mathematician Andrei Andreevich Markov (1856-1922), is a special kind of Markov process in which the system can occupy a finite or countably infinite number of states e(1), e(2), ..., e(j), ...
Andrei Andreevich Markov
Born on June 2, 1856, in Ryazan, Russia.
Died on July 20, 1922.
Founder of the theory of Markov chains.
Markov Chain
A discrete-time process with a discrete state space S is called a Markov chain if the collection of random variables satisfies the following (Markov) property: if

t_1 < t_2 < ... < t_n,

then

P[X(t_n) = e_n | X(t_{n-1}), X(t_{n-2}), ..., X(t_1)] = P[X(t_n) = e_n | X(t_{n-1})].
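As a quick illustration of these definitions, the following Python sketch simulates a small three-state Markov chain (the transition matrix is an arbitrary example of ours, not from the slides) and checks that the long-run fraction of time spent in each state approaches the chain's stationary distribution:

```python
import random

# Illustrative 3-state Markov chain (example values, not from the slides):
# states e0, e1, e2 with transition matrix P[i][j] = P{X_{n+1} = e_j | X_n = e_i}.
P = [[0.5, 0.3, 0.2],
     [0.2, 0.6, 0.2],
     [0.3, 0.3, 0.4]]

def step(state, rng):
    """Draw the next state given only the current one: the Markov property says
    the path taken to reach the current state is irrelevant."""
    u, cum = rng.random(), 0.0
    for j, pij in enumerate(P[state]):
        cum += pij
        if u < cum:
            return j
    return len(P) - 1

rng = random.Random(1)
counts = [0, 0, 0]
state = 0
for _ in range(100_000):
    state = step(state, rng)
    counts[state] += 1

empirical = [c / 100_000 for c in counts]
print(empirical)  # close to the chain's stationary distribution (9/28, 3/7, 1/4)
```

For this P the stationary distribution solves pi = pi P, giving pi = (9/28, 12/28, 7/28), and the simulated occupation frequencies settle near those values.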
Applications of Markov Chains
Markov chains constitute important models in many applied fields. We can apply Markov chains in fields such as:
1. Physics
2. Social Sciences
3. Chemistry
4. Mathematical Biology
5. Testing
6. Gambling
7. Queuing Theory
8. Music
9. Internet Applications
10. Tennis
11. Statistics
12. Markov Text Generators
13. Economics and Finance
and in many other fields.
Can We Apply a Stochastic Process to Tennis? Yes; let us see how.
Consider a tennis problem with the scoring system 15, 30, 40, 60 (game). The game has two players: one is the server, and the other is the receiver.
Scoring
At each point there are only two possibilities: either the server wins the point or the receiver wins it. Thus the score in a game can only be one of the following (the server's score is always the first number):
{ 15:0, 0:15, 15:15, 30:0, 30:15, 30:30, 15:30, 0:30, 40:0, 40:15, 40:30, 30:40, 15:40, 0:40, deuce, advantage in, advantage out, game }
A group of games will be a set, and a group of different sets will be a match.
State Diagram for a Game in Tennis
[Fig. 1: state diagram for a game. The states are the scores 0:0, 15:0, 0:15, 30:0, 15:15, 0:30, 40:0, 30:15, 15:30, 0:40, 40:15, 30:30, 15:40, 40:30, deuce, 30:40, adv. in, adv. out, and the two absorbing outcomes, server's game and receiver's game.]
Game
We can define two probabilities: let p denote the probability of the server winning a point, and q the probability of the receiver winning a point (q = 1 - p). The state diagram for a game is Fig. 1, where the states are identified by scores. Thus the first point is scored with probability p (score 15:0) or q (score 0:15).
Game
Finally, the fifth point is scored with the probabilities

p_0 = P{server wins}   = p^4 (1 + 4q)
p_1 = P{adv. in}       = 4 p^3 q^2
p_2 = P{deuce}         = 6 p^2 q^2
p_3 = P{adv. out}      = 4 p^2 q^3
p_4 = P{receiver wins} = q^4 (1 + 4p)

The rest of the game resembles a random walk over these five states (e_0, ..., e_4): two absorbing states (server wins; receiver wins) and three transient states (adv. in, deuce, adv. out).
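These five probabilities can be checked numerically; the sketch below (Python, with an illustrative function name of ours) evaluates them and verifies that they form a probability distribution:

```python
# Initial distribution over the five game states, following the formulas above
# (p = server's point-win probability, q = 1 - p).
def game_initial_distribution(p):
    q = 1.0 - p
    p0 = p**4 * (1 + 4*q)   # server wins outright, 4:0 or 4:1
    p1 = 4 * p**3 * q**2    # adv. in (40:30 after five points)
    p2 = 6 * p**2 * q**2    # deuce (30:30 after four points)
    p3 = 4 * p**2 * q**3    # adv. out (30:40 after five points)
    p4 = q**4 * (1 + 4*p)   # receiver wins outright, 0:4 or 1:4
    return [p0, p1, p2, p3, p4]

dist = game_initial_distribution(2/3)
print(dist, sum(dist))  # the five probabilities sum to 1
```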
Game
The transition probability matrix for this random walk is

P = | 1  0  0  0  0 |
    | p  0  q  0  0 |
    | 0  p  0  q  0 |
    | 0  0  p  0  q |
    | 0  0  0  0  1 |

and the random walk starts with the initial distribution

p(0) = [p_0, p_1, p_2, p_3, p_4].
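Iterating p(n) = p(0) P^n makes the absorption visible numerically; a minimal Python sketch of this random walk (function names are ours):

```python
# Five-state game walk: states (server wins, adv. in, deuce, adv. out, receiver wins).
def game_matrix(p):
    q = 1.0 - p
    return [[1, 0, 0, 0, 0],
            [p, 0, q, 0, 0],
            [0, p, 0, q, 0],
            [0, 0, p, 0, q],
            [0, 0, 0, 0, 1]]

def step_distribution(dist, P):
    """One step of p(n+1) = p(n) P (row vector times matrix)."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

P = game_matrix(2/3)
dist = [112/243, 32/243, 72/243, 16/243, 11/243]  # p(0) for p = 2/3
for _ in range(200):                               # approximate p(0) P^n, large n
    dist = step_distribution(dist, P)
print(dist)  # ~[0.856, 0, 0, 0, 0.144]: only the two absorbing states survive
```

After 200 steps the transient mass (which decays geometrically, roughly like (2pq)^(n/2)) is negligible, and the distribution concentrates on the two absorbing states.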
Game
For the three transient states e_j, j = 1, 2, 3, we have p_ij(n) -> 0, and thus in the long run

P^n -> Q = | 1        0  0  0  0        |
           | f_{1,0}  0  0  0  f_{1,4}  |
           | f_{2,0}  0  0  0  f_{2,4}  |
           | f_{3,0}  0  0  0  f_{3,4}  |
           | 0        0  0  0  1        |
Game
Here the absorption probabilities f_{k,0} are given by

f_{k,0} = [1 - (q/p)^(N-k)] / [1 - (q/p)^N],  with N = 4,  and  f_{k,4} = 1 - f_{k,0}.

Then we obtain the long-run distribution for the game:

lim_{n->inf} p(n) = lim_{n->inf} p(0) P^n = p(0) Q = [p_g, 0, 0, 0, q_g].
Game
Here p_g is the probability that the server wins the game, and q_g = 1 - p_g is the probability that the receiver wins it. Explicitly, the server wins with probability

p_g = Σ_{k=0}^{4} p_k f_{k,0} = Σ_{k=0}^{4} p_k [1 - (q/p)^(4-k)] / [1 - (q/p)^4],

with the p_k, k = 0, 1, ..., 4, as given above.
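Putting the initial distribution and the absorption probabilities together gives a closed form that can be evaluated directly; a Python sketch (the function name is illustrative):

```python
# Closed-form game-win probability, following the slides:
# f_{k,0} = (1 - (q/p)^(4-k)) / (1 - (q/p)^4),  p_g = sum_k p_k f_{k,0}.
def game_win_probability(p):
    q = 1.0 - p
    if p == q:                 # the ratio formula degenerates at p = 1/2
        return 0.5
    r = q / p
    pk = [p**4 * (1 + 4*q), 4*p**3*q**2, 6*p**2*q**2, 4*p**2*q**3, q**4 * (1 + 4*p)]
    f = [(1 - r**(4 - k)) / (1 - r**4) for k in range(5)]
    return sum(pk[k] * f[k] for k in range(5))

print(f"{game_win_probability(2/3):.3f}")   # 0.856, as on the slides
print(f"{game_win_probability(0.52):.3f}")  # 0.550, as in the table
```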
Game
For example, if the server plays twice as well as the receiver (p = 2/3, q = 1/3), then from the above formula the probability of the server winning a game is 0.856, and of the receiver winning it, 0.144.
On the other hand, if the players are of about the same strength, the server having a slight advantage so that p = 0.52 and q = 0.48, then the probabilities of winning the game for the server and receiver are 0.55 and 0.45, respectively.
Notice that while the probabilities of winning a point differ by only 0.04, the probabilities of winning a game differ by 0.10.
Game
As we shall see, this amplification of even the slightest advantage of the stronger player is brought out in an even more pronounced manner in a set, by the underlying random walk there.
Set
A group of games will be a set. Fig. 2 shows the state diagram for a set, where the states are once again identified by scores. The probability of the server winning a game is p_g, and that of the receiver winning a game is q_g = 1 - p_g. Thus, for example,

P(6:0) = p_g^6.

From Fig. 2, at the 11th or 12th game a new random-walk phenomenon sets in, with transition probability matrix

P = | 1    0    0    0    0   |
    | p_g  0    q_g  0    0   |
    | 0    p_g  0    q_g  0   |
    | 0    0    p_g  0    q_g |
    | 0    0    0    0    1   |
State Diagram for a Set
[Fig. 2: (a) Set initialization; each circle represents a game. (b) Each set results in a random walk among five states, initialized by the distribution (Y_0, Y_1, Y_2, Y_3, Y_4): two-game margin for the server, one-game margin for the server, equal score after five-all, one-game margin for the receiver, two-game margin for the receiver.]
Set
The 11th game is scored with the probabilities

Y_0 = P{server wins} = P{(6:0) ∪ (6:1) ∪ (6:2) ∪ (6:3) ∪ (6:4)}
    = p_g^6 + 6 p_g^6 q_g + 21 p_g^6 q_g^2 + 56 p_g^6 q_g^3 + 126 p_g^6 q_g^4,
Y_1 = P{adv. server} = P{6:5} = 252 p_g^6 q_g^5,
Y_2 = P{equal score after five-all} = 0 (after 11 games the score cannot be level),
Y_3 = P{adv. receiver} = P{5:6} = 252 p_g^5 q_g^6,
Y_4 = P{receiver wins} = q_g^6 + 6 q_g^6 p_g + 21 q_g^6 p_g^2 + 56 q_g^6 p_g^3 + 126 q_g^6 p_g^4.
Set
So we obtain the long-run probability distribution for a set of games:

lim_{n->inf} p(n) = [Y_0, Y_1, Y_2, Y_3, Y_4] Q_g = [p_s, 0, 0, 0, q_s],

where Q_g represents the counterpart of Q for sets, and

p_s = Σ_{k=0}^{4} Y_k [1 - (q_g/p_g)^(4-k)] / [1 - (q_g/p_g)^4].
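The same computation carries over to the set level; a Python sketch (taking the game-win probability p_g as input; names are ours):

```python
# Set-win probability following the slides: the distribution over the five
# set states after the 11th game (Y_0..Y_4, with Y_2 = 0), fed through the
# same absorption formula with p_g, q_g in place of p, q.
def set_win_probability(pg):
    qg = 1.0 - pg
    if pg == qg:
        return 0.5
    # Binomial counts of the ways to end 6:0..6:4, reach 6:5 or 5:6, or end 0:6..4:6.
    y0 = pg**6 * (1 + 6*qg + 21*qg**2 + 56*qg**3 + 126*qg**4)
    y1 = 252 * pg**6 * qg**5    # adv. server, 6:5
    y2 = 0.0                    # equal score after five-all (impossible after 11 games)
    y3 = 252 * pg**5 * qg**6    # adv. receiver, 5:6
    y4 = qg**6 * (1 + 6*pg + 21*pg**2 + 56*pg**3 + 126*pg**4)
    r = qg / pg
    f = [(1 - r**(4 - k)) / (1 - r**4) for k in range(5)]
    Y = [y0, y1, y2, y3, y4]
    return sum(Y[k] * f[k] for k in range(5))

print(f"{set_win_probability(0.856):.4f}")  # 0.9987, as in the table
print(f"{set_win_probability(0.525):.4f}")  # 0.5734, as in the table
```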
Set
For example, an opponent with twice the skill will win each set with probability 0.9987, whereas between two equally seeded players, the one with a slight advantage (p = 0.51) will win each set only with probability 0.5734. In the latter case the odds in favor of the stronger player are not significant in any one set, and hence several sets must be played to bring out the better of the two top-seeded players.
Match
Usually three or five sets are played to complete the match. To win a three-set match, a player needs to score either (2:0) or (2:1), and hence the probability of winning a three-set match is

p_m = P{2:0} + P{2:1} = p_s^2 + 2 p_s^2 q_s,

where p_s represents the player's probability of winning a set and q_s = 1 - p_s. Similarly, the probability of winning a five-set match for the same player is

p_m = P{3:0} + P{3:1} + P{3:2} = p_s^3 + 3 p_s^3 q_s + 6 p_s^3 q_s^2.
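These match formulas are easy to evaluate; a Python sketch (illustrative names):

```python
# Match-win probability from the set-win probability p_s, per the formulas above.
def match_win_probability(ps, sets=3):
    qs = 1.0 - ps
    if sets == 3:   # win 2:0 or 2:1
        return ps**2 + 2 * ps**2 * qs
    if sets == 5:   # win 3:0, 3:1 or 3:2
        return ps**3 + 3 * ps**3 * qs + 6 * ps**3 * qs**2
    raise ValueError("a match is played over 3 or 5 sets")

print(f"{match_win_probability(0.5734, 3):.4f}")  # 0.6093, as in the table
print(f"{match_win_probability(0.5734, 5):.4f}")  # 0.6357, as in the table
```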
State Diagram for the Match
[Fig. 3: state diagram for a five-set match, with states identified by set scores: 0:0, 1:0, 0:1, 2:0, 1:1, 0:2, 2:1, 1:2, 2:2, 3:2, 2:3. The scores 3:0, 3:1, 3:2 are absorbing (the server wins the match); their mirror images 0:3, 1:3, 2:3 are absorbing (the receiver wins the match).]
Game of Tennis

Player skills    Game               Set                  3-set match         5-set match
p      q        P(g)    1-P(g)    P(s)     1-P(s)      P(m)     1-P(m)     P(m)     1-P(m)
0.75   0.25    0.949   0.051     1.000    0.000       1        0          1        0
0.66   0.34    0.856   0.144     0.9987   0.0013      1        0          1        0
0.60   0.40    0.736   0.264     0.9661   0.0339      0.9966   0.0034     0.9996   0.0004
0.55   0.45    0.623   0.377     0.8215   0.1785      0.9158   0.0842     0.9573   0.0427
0.52   0.48    0.550   0.450     0.6446   0.3554      0.7109   0.2891     0.7564   0.2436
0.51   0.49    0.525   0.475     0.5734   0.4266      0.6093   0.3907     0.6357   0.3643
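The entire table can be regenerated by chaining the three levels together; a self-contained Python sketch (all names are ours; it assumes p != 1/2, where the ratio formula degenerates) reproduces, for example, the p = 0.60 row:

```python
# End-to-end pipeline: point probability p -> game (p_g) -> set (p_s)
# -> 3-set and 5-set match probabilities, using the slides' formulas.
def absorb(dist, ratio):
    """Absorption at the 'win' end of a 5-state walk: sum_k dist[k] * f_{k,0},
    with f_{k,0} = (1 - ratio^(4-k)) / (1 - ratio^4). Assumes ratio != 1."""
    f = [(1 - ratio**(4 - k)) / (1 - ratio**4) for k in range(5)]
    return sum(d * fk for d, fk in zip(dist, f))

def table_row(p):
    q = 1 - p
    pk = [p**4*(1 + 4*q), 4*p**3*q**2, 6*p**2*q**2, 4*p**2*q**3, q**4*(1 + 4*p)]
    pg = absorb(pk, q / p)                      # game-win probability
    qg = 1 - pg
    Y = [pg**6*(1 + 6*qg + 21*qg**2 + 56*qg**3 + 126*qg**4),
         252*pg**6*qg**5, 0.0, 252*pg**5*qg**6,
         qg**6*(1 + 6*pg + 21*pg**2 + 56*pg**3 + 126*pg**4)]
    ps = absorb(Y, qg / pg)                     # set-win probability
    m3 = ps**2 * (1 + 2*(1 - ps))               # 3-set match
    m5 = ps**3 * (1 + 3*(1 - ps) + 6*(1 - ps)**2)  # 5-set match
    return pg, ps, m3, m5

pg, ps, m3, m5 = table_row(0.60)
print(f"{pg:.3f} {ps:.4f} {m3:.4f} {m5:.4f}")  # 0.736 0.9661 0.9966 0.9996
```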
Match
Referring to the table above, a top seed playing a low-ranked opponent (p = 0.66, q = 0.34) should be able to settle the match in three sets in favor of the top seed with probability one.
For closely seeded players of approximately equal skill (p = 0.51, q = 0.49), the probability of winning a three-set match is 0.609, and that of winning a five-set match is 0.636, for the player with the slight advantage. Thus, to bring out the contrast between two closely seeded players, it is necessary to play at least a five-set match, or even a seven-set match.
Match
The game of tennis thus has two random-walk models embedded in it at two levels, one at the game level and the other at the set level, and they are designed to bring out the better of two players of approximately equal skill.
Using the 5x5 matrix for the random walk in a set, it is easy to show that the total number of games in a set can grow to a considerable number (beyond 12) before absorption takes place, especially between top-seeded players; to conserve time and the players' energy, tie-breakers are introduced into sets.
Tie-breakers
At a score of 6:6 in a set, a tie-breaker is played. The player whose turn it is serves the first point; the opponent serves the next two points, and the serve then alternates after every two points, until a player scores at least seven points with a two-point lead, winning the game and the set.
This two-point-lead requirement once again introduces yet another random-walk model toward the latter part of the tie-breaker game.
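A Monte-Carlo sketch of the tie-breaker random walk (a simplification of ours: we keep a single point-win probability p for one player throughout, ignoring the serve alternation, in the same spirit as the rest of the deck):

```python
import random

# First to 7 points with a 2-point lead wins the tie-breaker.
def play_tiebreaker(p, rng):
    a = b = 0
    while not ((a >= 7 or b >= 7) and abs(a - b) >= 2):
        if rng.random() < p:
            a += 1
        else:
            b += 1
    return a > b  # True if the player with point probability p wins

rng = random.Random(42)
wins = sum(play_tiebreaker(0.52, rng) for _ in range(100_000))
print(wins / 100_000)  # close to 0.56: the 2-point lead amplifies the 0.52 edge
```

The closed-form value for p = 0.52 is about 0.563 (sum the direct 7:0..7:5 wins, then the win-by-two probability p^2 / (p^2 + q^2) from 6:6), so the tie-breaker amplifies a point-level edge much as a game does.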
Tie-breakers
The tie-breaker game is played essentially in the same spirit as an entire set: it is, in effect, a set played rapidly within a set, at an accelerated pace.
References
Athanasios Papoulis and S. Unnikrishna Pillai, Probability, Random Variables and Stochastic Processes.
A. K. Basu, Introduction to Stochastic Processes.
Suddhendu Biswas, Applied Stochastic Processes: A Biostatistical and Population Oriented Approach.