Probabilistic Reasoning (2) Daehwan Kim, Ravshan Khamidov, Sehyong Kim
Page 1

Probabilistic Reasoning (2)

Daehwan Kim, Ravshan Khamidov, Sehyong Kim

Page 2

Contents
  Basics of Bayesian Networks (BNs)
    Construction
    Inference
    Singly and multi-connected BNs
  Inference in multi-connected BNs
    Clustering algorithms
    Cutset conditioning
  Approximate inference in Bayesian networks
    Direct sampling
    Markov chain Monte Carlo
  Example applications of BNs

Page 3

Representing Knowledge under Uncertainty
The joint probability distribution
  Can answer any question about the domain
  Becomes intractably large as the number of variables grows
  Specifying probabilities for atomic events is unnatural and difficult

Page 4

Representing Knowledge under Uncertainty
A Bayesian network
  Provides a concise way to represent the conditional independence relationships in the domain
  Specifies the full joint distribution, yet is often exponentially smaller than the explicit joint distribution

Page 5

Basics of Bayesian Networks: Definition
Topology of the network + CPTs
  A set of random variables makes up the nodes of the network.
  A set of directed links (arrows) connects pairs of nodes. Example: X -> Y means X has a direct influence on Y.
  Each node has a conditional probability table (CPT) that quantifies the effects the parents have on the node. The parents of a node are all those nodes that have arrows pointing to it.
  The graph has no directed cycles (hence it is a directed acyclic graph, or DAG).

Page 6

Basics of Bayesian Networks: Construction
General procedure for incremental network construction (a minimal code sketch of this loop follows):
  1. Choose the set of relevant variables Xi that describe the domain.
  2. Choose an ordering for the variables.
  3. While there are variables left:
     a. Pick a variable Xi and add a node to the network for it.
     b. Set Parents(Xi) by testing its conditional independence in the net.
     c. Define the conditional probability table for Xi.
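As an illustration only, here is a minimal Python sketch of the incremental construction loop above. The helpers `is_independent` and `elicit_cpt` are hypothetical placeholders (an independence test and a CPT-elicitation step), not anything defined in these slides.

```python
def build_network(ordering, is_independent, elicit_cpt):
    """Incrementally build a Bayesian network structure.

    ordering       -- variable names in the chosen order
    is_independent -- hypothetical test: is_independent(X, Y, added) -> bool,
                      True if X is conditionally independent of Y given the
                      other nodes already in the net
    elicit_cpt     -- hypothetical helper returning a CPT for X given its parents
    """
    parents, cpt, added = {}, {}, []
    for X in ordering:
        # Keep only those earlier nodes on which X actually depends.
        parents[X] = [Y for Y in added if not is_independent(X, Y, added)]
        cpt[X] = elicit_cpt(X, parents[X])
        added.append(X)
    return parents, cpt

# Toy demo: a hard-coded independence "oracle" saying Earthquake is independent of Burglary.
parents, _ = build_network(
    ["Burglary", "Earthquake", "Alarm"],
    is_independent=lambda X, Y, added: (X, Y) in {("Earthquake", "Burglary")},
    elicit_cpt=lambda X, ps: {},  # placeholder CPT
)
print(parents)  # {'Burglary': [], 'Earthquake': [], 'Alarm': ['Burglary', 'Earthquake']}
```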

Page 7

Basics of Bayesian Networks: Topology of the network
Suppose we choose the ordering B, E, A, J, M:
  P(E | B) = P(E)?              Yes
  P(A | B) = P(A)?  P(A | E) = P(A)?   No
  P(J | A, B, E) = P(J | A)?    Yes
  P(J | A) = P(J)?              No
  P(M | A) = P(M)?              No
  P(M | A, J) = P(M | A)?       Yes

(Network: Burglary, Earthquake -> Alarm; Alarm -> JohnCalls, MaryCalls)

Page 8

Basics of Bayesian Networks: Conditional Probability Tables (CPTs)

Once we have the topology of the network, a conditional probability table (CPT) must be specified for each node.

Example: CPT for the variable WetGrass (network: C -> S, C -> R; S, R -> W):

  S R | P(W=F) P(W=T)
  ----+---------------
  F F |  1.0    0.0
  T F |  0.1    0.9
  F T |  0.1    0.9
  T T |  0.01   0.99

Page 9

Basics of Bayesian Networks: Conditional Probability Tables (CPTs)

Each row in the table contains the conditional probability of each node value for a conditioning case. Each row must sum to 1, because the entries represent an exhaustive set of cases for the variable.

A conditioning case is a possible combination of values for the parent nodes.
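To make the bookkeeping concrete, here is a small Python sketch (not from the slides) that stores the WetGrass CPT above as a dictionary keyed by conditioning case and checks that every row sums to 1.

```python
# CPT for WetGrass, keyed by the conditioning case (S, R).
# Each row is a distribution over the values of W and must sum to 1.
wet_grass_cpt = {
    (False, False): {False: 1.0,  True: 0.0},
    (True,  False): {False: 0.1,  True: 0.9},
    (False, True):  {False: 0.1,  True: 0.9},
    (True,  True):  {False: 0.01, True: 0.99},
}

for case, row in wet_grass_cpt.items():
    assert abs(sum(row.values()) - 1.0) < 1e-9, f"row {case} does not sum to 1"
```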

Page 10

Basics of Bayesian Networks: Representing Knowledge Example
Network: Cloudy -> Sprinkler, Cloudy -> Rain; Sprinkler, Rain -> WetGrass.

  P(C=F) P(C=T)
   0.5    0.5

  C | P(S=F) P(S=T)      C | P(R=F) P(R=T)
  --+---------------     --+---------------
  F |  0.5    0.5        F |  0.8    0.2
  T |  0.9    0.1        T |  0.2    0.8

  S R | P(W=F) P(W=T)
  ----+---------------
  F F |  1.0    0.0
  T F |  0.1    0.9
  F T |  0.1    0.9
  T T |  0.01   0.99

Page 11

Basics of Bayesian Networks: Example
(Network: C -> S, C -> R; S, R -> W, with the CPTs from the previous slide.)

Pr(W=1) = Σ_{c,s,r} Pr(C=c, S=s, R=r, W=1) = 0.6471

Pr(S=1 | W=1) = Pr(S=1, W=1) / Pr(W=1) = 0.2781 / 0.6471 = 0.4298

Pr(R=1 | W=1) = Pr(R=1, W=1) / Pr(W=1) = 0.4581 / 0.6471 = 0.7079

where Pr(S=1, W=1) = Σ_{c,r} Pr(C=c, S=1, R=r, W=1) and Pr(R=1, W=1) = Σ_{c,s} Pr(C=c, S=s, R=1, W=1).
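A small, self-contained Python check (not part of the slides) that enumerates the full joint for this network and reproduces the numbers above; the CPT values are the ones given on the previous slide.

```python
import itertools

# CPTs from the example (True = 1, False = 0).
P_C = {True: 0.5, False: 0.5}
P_S = {False: {True: 0.5, False: 0.5},            # P(S | C)
       True:  {True: 0.1, False: 0.9}}
P_R = {False: {True: 0.2, False: 0.8},            # P(R | C)
       True:  {True: 0.8, False: 0.2}}
P_W = {(False, False): {True: 0.0,  False: 1.0},  # P(W | S, R)
       (True,  False): {True: 0.9,  False: 0.1},
       (False, True):  {True: 0.9,  False: 0.1},
       (True,  True):  {True: 0.99, False: 0.01}}

def joint(c, s, r, w):
    """Pr(C=c, S=s, R=r, W=w) via the chain rule for this network."""
    return P_C[c] * P_S[c][s] * P_R[c][r] * P_W[(s, r)][w]

def prob(event):
    """Sum the joint over all assignments consistent with `event`."""
    total = 0.0
    for c, s, r, w in itertools.product([False, True], repeat=4):
        assignment = {"C": c, "S": s, "R": r, "W": w}
        if all(assignment[var] == val for var, val in event.items()):
            total += joint(c, s, r, w)
    return total

print(prob({"W": True}))                                   # 0.6471
print(prob({"S": True, "W": True}) / prob({"W": True}))    # ~0.4298
print(prob({"R": True, "W": True}) / prob({"W": True}))    # ~0.7079
```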

Page 12

Basics of Bayesian Networks: Example

In this example, notice that the two causes "compete" to "explain" the observed data. Hence S and R become conditionally dependent given that their common child, W, is observed, even though they are marginally independent.

For example, suppose the grass is wet and we also know that it is raining. Then the posterior probability that the sprinkler is on goes down:

Pr(S=1|W=1,R=1) = 0.1945

Page 13

Basics of Bayesian Networks: Inference
Inference in a Bayesian network means computing the posterior probability distribution of a set of query variables, given observed values for a set of evidence variables.

Page 14

Basics of Bayesian Networks: Exact inference
Inference by enumeration (with the alarm example; network: B, E -> A; A -> J, M)

P(B | j, m) = P(B, j, m) / P(j, m) = [ Σ_e Σ_a P(B, e, a, j, m) ] / P(j, m)

Σ_e Σ_a P(B, e, a, j, m) = Σ_e Σ_a P(B) P(e) P(a | B, e) P(j | a) P(m | a)
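The summation above can be implemented directly. The sketch below (not from the slides) is a generic enumeration routine over any network described by a parent list and CPTs; the alarm CPT values are not given in these slides, so the demo at the bottom reuses the sprinkler network from the earlier example instead.

```python
import itertools

def enumeration_ask(query_var, evidence, variables, parents, cpt):
    """Posterior P(query_var | evidence) by summing the full joint.

    variables -- all variable names in topological order
    parents   -- dict: variable -> tuple of parent names
    cpt       -- dict: variable -> {tuple of parent values: P(variable=True | parents)}
    """
    def joint(assign):
        p = 1.0
        for v in variables:
            pv = cpt[v][tuple(assign[u] for u in parents[v])]
            p *= pv if assign[v] else (1.0 - pv)
        return p

    hidden = [v for v in variables if v != query_var and v not in evidence]
    dist = {}
    for qval in (True, False):
        total = 0.0
        for values in itertools.product((True, False), repeat=len(hidden)):
            assign = dict(evidence)
            assign[query_var] = qval
            assign.update(zip(hidden, values))
            total += joint(assign)
        dist[qval] = total
    norm = sum(dist.values())
    return {k: v / norm for k, v in dist.items()}

# Demo on the sprinkler network from the earlier slides.
variables = ["C", "S", "R", "W"]
parents = {"C": (), "S": ("C",), "R": ("C",), "W": ("S", "R")}
cpt = {
    "C": {(): 0.5},
    "S": {(False,): 0.5, (True,): 0.1},
    "R": {(False,): 0.2, (True,): 0.8},
    "W": {(False, False): 0.0, (True, False): 0.9,
          (False, True): 0.9, (True, True): 0.99},
}
print(enumeration_ask("R", {"W": True}, variables, parents, cpt))  # ~{True: 0.708, False: 0.292}
```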

Page 15

Basics of Bayesian Networks: Exact inference
Variable elimination by the distributive law

For the alarm example, factors that do not mention a summed-out variable can be moved outside its sum:

Σ_e Σ_a P(B) P(e) P(a | B, e) P(j | a) P(m | a) = P(B) Σ_e P(e) Σ_a P(a | B, e) P(j | a) P(m | a)

In general, with query variables Q, hidden variables H, and evidence e:

P(Q | e) = P(Q, e) / P(e) = [ Σ_h P(Q, h, e) ] / P(e)

where the joint P(Q, h, e) factors according to the network, and the distributive law lets each repeated subexpression be computed once rather than once per term of the sum.

Page 16

Basics of Bayesian Networks: Exact inference
Another example of variable elimination (network: C -> S, C -> R; S, R -> W):

Pr(W=w) = Σ_{c,s,r} Pr(C=c, S=s, R=r, W=w)
        = Σ_{c,s,r} Pr(C=c) Pr(S=s | C=c) Pr(R=r | C=c) Pr(W=w | S=s, R=r)
        = Σ_{c,s} Pr(C=c) Pr(S=s | C=c) T1(c, w, s)
        = Σ_c Pr(C=c) T2(c, w)

where T1(c, w, s) = Σ_r Pr(R=r | C=c) Pr(W=w | S=s, R=r)
and   T2(c, w)    = Σ_s Pr(S=s | C=c) T1(c, w, s)
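As a sanity check (not from the slides), the two intermediate factors can be computed explicitly in Python; with the CPTs from the example this reproduces Pr(W=1) = 0.6471.

```python
# CPTs from the example: P(C=True), P(S=True|C), P(R=True|C), P(W=True|S,R).
p_c = 0.5
p_s = {False: 0.5, True: 0.1}
p_r = {False: 0.2, True: 0.8}
p_w = {(False, False): 0.0, (True, False): 0.9,
       (False, True): 0.9, (True, True): 0.99}

def pr(p_true, value):
    return p_true if value else 1.0 - p_true

def T1(c, w, s):
    # Sum out R: T1(c, w, s) = sum_r P(R=r | C=c) P(W=w | S=s, R=r)
    return sum(pr(p_r[c], r) * pr(p_w[(s, r)], w) for r in (False, True))

def T2(c, w):
    # Sum out S: T2(c, w) = sum_s P(S=s | C=c) T1(c, w, s)
    return sum(pr(p_s[c], s) * T1(c, w, s) for s in (False, True))

p_w_true = sum(pr(p_c, c) * T2(c, True) for c in (False, True))
print(p_w_true)  # 0.6471
```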

Page 17

Basics of Bayesian Networks: Complexity of exact inference
  O(n) for polytrees (singly connected networks), i.e. networks with at most one undirected path between any two nodes (e.g. the alarm example).
  Multiply connected networks: exponential time in the worst case (e.g. the wet-grass example).

Page 18

Inference in Multi-connected BNs: Clustering algorithm (aka join-tree algorithm)
Basic idea
  Transform the network into a probabilistically equivalent singly connected BN (a polytree) by merging (clustering) the offending nodes.
  This is the most effective approach for exact evaluation of multiply connected BNs.
  The "new" clustered node has only one parent, and inference time drops back to O(n).

Page 19

Inference in Multi-connected BNs: Clustering algorithm (aka join-tree algorithm)

Before clustering: Cloudy -> Sprinkler, Rain;  Sprinkler, Rain -> WetGrass
After clustering:  Cloudy -> Spr+Rain -> WetGrass

  P(C) = .5

  C | P(S)       C | P(R)
  --+------      --+------
  t |  .10       t |  .80
  f |  .50       f |  .20

  S R | P(W)          (unchanged by the clustering)
  ----+------
  t t |  .99
  t f |  .90
  f t |  .90
  f f |  .00

  CPT of the clustered node Spr+Rain:

  C | P(S+R = tt)  P(S+R = tf)  P(S+R = ft)  P(S+R = ff)
  --+----------------------------------------------------
  t |     .08          .02          .72          .18
  f |     .10          .40          .10          .40
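A quick sketch (not from the slides) of how the clustered node's CPT follows from the original ones: since Sprinkler and Rain are conditionally independent given Cloudy, P(S+R = (s, r) | C) = P(S=s | C) * P(R=r | C).

```python
p_s = {True: 0.10, False: 0.50}   # P(S=true | C)
p_r = {True: 0.80, False: 0.20}   # P(R=true | C)

def pr(p_true, value):
    return p_true if value else 1.0 - p_true

# P(S+R = (s, r) | C = c) = P(S=s | C=c) * P(R=r | C=c)
cluster_cpt = {
    c: {(s, r): pr(p_s[c], s) * pr(p_r[c], r)
        for s in (True, False) for r in (True, False)}
    for c in (True, False)
}
print(cluster_cpt[True])   # {(T,T): 0.08, (T,F): 0.02, (F,T): 0.72, (F,F): 0.18}
print(cluster_cpt[False])  # {(T,T): 0.10, (T,F): 0.40, (F,T): 0.10, (F,F): 0.40}
```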

Page 20

Inference in Multi-connected BNs: Clustering algorithm (aka join-tree algorithm)
  The Jensen join-tree version (Jensen, 1996) is currently the most efficient algorithm in this class (it is used, for example, in Hugin and Netica).
  Network evaluation is done in two stages:
    Compile the network into a join tree
      May be slow
      May require too much memory if the original network is highly connected
    Do belief updating in the join tree (usually fast)
  Note: clustered nodes have increased complexity, so updates on them may be computationally expensive.

Page 21

Inference in Multi-connected BNs: Cutset conditioning
Basic idea
  Find a minimal set of nodes whose instantiation makes the remainder of the network singly connected and therefore safe for propagation.
  Historical note: this technique for dealing with the propagation problem was suggested by Pearl.

Page 22

Inference in Multi-connected BNs: Cutset conditioning methods
  Once a variable is instantiated it can be duplicated, and thus "break" a cycle.
  A cutset is a set of variables whose instantiation makes the graph a polytree.
  Each polytree's likelihood is used as a weight when combining the results.
  Evaluating the most likely polytrees first is called bounded cutset conditioning.

Page 23

Inference in Multi-connected BNs: Cutset conditioning - Examples

Instantiating Cloudy yields two polytrees over Sprinkler, Rain, and WetGrass:

  Cloudy+ (Cloudy = true):   P(S) = 0.1,  P(R) = 0.8
  Cloudy- (Cloudy = false):  P(S) = 0.5,  P(R) = 0.2

  (from the original CPTs:  C | P(S)  P(R)
                            T | 0.10  0.80
                            F | 0.50  0.20)

Eliminate Cloudy from the BN; combine the conditioned results: Sum(Cloudy+, Cloudy-).
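An illustrative sketch (not from the slides) of the weighting step: evaluate the query in each conditioned polytree and combine the results weighted by P(Cloudy). Here the "polytree evaluation" is just a direct sum over S and R, which is enough for this small example.

```python
p_c = 0.5                                   # P(Cloudy = true)
p_s = {True: 0.1, False: 0.5}               # P(S=true | C)
p_r = {True: 0.8, False: 0.2}               # P(R=true | C)
p_w = {(False, False): 0.0, (True, False): 0.9,
       (False, True): 0.9, (True, True): 0.99}   # P(W=true | S, R)

def pr(p_true, value):
    return p_true if value else 1.0 - p_true

def polytree_p_w(c):
    """P(W=true) in the polytree obtained by fixing Cloudy = c."""
    return sum(pr(p_s[c], s) * pr(p_r[c], r) * p_w[(s, r)]
               for s in (True, False) for r in (True, False))

# Combine the conditioned results, weighting each polytree by P(Cloudy = c).
p_w_true = sum(pr(p_c, c) * polytree_p_w(c) for c in (True, False))
print(p_w_true)  # 0.6471, matching the earlier exact computation
```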

Page 24

Approximate Inference in Bayesian Networks
A solution for intractably large, multiply connected networks: Monte Carlo algorithms
  Widely used to estimate quantities that are difficult to calculate exactly
  Randomized sampling algorithms
  Accuracy depends on the number of samples
  Two families: direct sampling and Markov chain sampling

Page 25

Direct Sampling Method
Procedure
  Sample from the known probability distributions.
  Estimate a probability as (# of matching samples) / (# of total samples).
Sampling order
  Sample each variable in turn, in topological order.

Page 26

Example in a simple case (the sprinkler network: Cloudy -> Sprinkler, Rain; Sprinkler, Rain -> WetGrass)

  P(C) = .5

  C | P(S)       C | P(R)
  --+------      --+------
  t |  .10       t |  .80
  f |  .50       f |  .20

  S R | P(W)
  ----+------
  t t |  .99
  t f |  .90
  f t |  .90
  f f |  .00

Sampling one event, in the order [Cloudy, Sprinkler, Rain, WetGrass]:
  [true, _, _, _]
  [true, false, _, _]
  [true, false, true, _]
  [true, false, true, true]

Estimating:
  N = 1000
  N(Rain = true) = N([_, _, true, _]) = 511
  P(Rain = true) ≈ 511/1000 = 0.511
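A compact Python sketch of direct (prior) sampling for this network; it is not from the slides, and the exact count will vary with the random seed, but with N = 1000 it lands near the 0.511 estimate quoted above (the exact value of P(Rain=true) is 0.5).

```python
import random

def prior_sample(rng):
    """Sample one event [C, S, R, W] in topological order."""
    c = rng.random() < 0.5
    s = rng.random() < (0.1 if c else 0.5)
    r = rng.random() < (0.8 if c else 0.2)
    w_probs = {(False, False): 0.0, (True, False): 0.9,
               (False, True): 0.9, (True, True): 0.99}
    w = rng.random() < w_probs[(s, r)]
    return c, s, r, w

rng = random.Random(0)
N = 1000
samples = [prior_sample(rng) for _ in range(N)]
n_rain = sum(1 for (_, _, r, _) in samples if r)
print(n_rain / N)   # estimate of P(Rain=true); close to 0.5
```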

Page 27

Rejection Sampling
Used to compute conditional probabilities.
Procedure
  Generate samples from the prior distribution specified by the Bayesian network.
  Reject all samples that do not match the evidence.
  Estimate the probability from the remaining samples.

Page 28

Rejection Sampling: Example
Assume we want to estimate P(Rain | Sprinkler = true) with 100 samples.

  100 samples
    73 samples => Sprinkler = false (rejected)
    27 samples => Sprinkler = true
       8 samples => Rain = true
      19 samples => Rain = false

  P(Rain | Sprinkler = true) = NORMALIZE({8, 19}) = {0.296, 0.704}

Problem: it rejects too many samples.
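A short sketch of rejection sampling for this query (not part of the slides); it reuses the prior-sampling idea above and simply discards samples with Sprinkler = false.

```python
import random

def prior_sample(rng):
    c = rng.random() < 0.5
    s = rng.random() < (0.1 if c else 0.5)
    r = rng.random() < (0.8 if c else 0.2)
    w_probs = {(False, False): 0.0, (True, False): 0.9,
               (False, True): 0.9, (True, True): 0.99}
    w = rng.random() < w_probs[(s, r)]
    return c, s, r, w

def rejection_sample_rain_given_sprinkler(n, rng):
    counts = {True: 0, False: 0}
    for _ in range(n):
        c, s, r, w = prior_sample(rng)
        if not s:                # reject samples inconsistent with the evidence
            continue
        counts[r] += 1
    total = counts[True] + counts[False]
    return {k: v / total for k, v in counts.items()}

print(rejection_sample_rain_given_sprinkler(10000, random.Random(0)))
# roughly {True: 0.3, False: 0.7}, consistent with the 0.296/0.704 estimate above
```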

Page 29

Likelihood Weighting
Advantage
  Avoids the inefficiency of rejection sampling.
Idea
  Generate only events that are consistent with the evidence.
  Weight each event by the likelihood that the event accords with the evidence.

Page 30

Likelihood Weighting: Example
Query: P(Rain | Sprinkler = true, WetGrass = true)?
Sampling (the weight w starts at 1.0):
  1. Sample from P(Cloudy) = {0.5, 0.5}  =>  true
  2. Sprinkler is an evidence variable with value true:
       w <- w * P(Sprinkler = true | Cloudy = true) = 0.1
  3. Sample from P(Rain | Cloudy = true) = {0.8, 0.2}  =>  true
  4. WetGrass is an evidence variable with value true:
       w <- w * P(WetGrass = true | Sprinkler = true, Rain = true) = 0.099
  Result: [true, true, true, true] with weight 0.099
Estimating
  Accumulate the weights for Rain = true and Rain = false separately.
  Normalize.
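A sketch of likelihood weighting for this query (not from the slides): evidence variables are clamped and contribute to the weight, non-evidence variables are sampled from their CPTs.

```python
import random

P_W = {(False, False): 0.0, (True, False): 0.9,
       (False, True): 0.9, (True, True): 0.99}   # P(W=true | S, R)

def weighted_sample(rng):
    """One sample consistent with Sprinkler=true, WetGrass=true, plus its weight."""
    w = 1.0
    c = rng.random() < 0.5                     # sample Cloudy
    w *= 0.1 if c else 0.5                     # clamp Sprinkler=true, multiply by P(S=t|C)
    r = rng.random() < (0.8 if c else 0.2)     # sample Rain
    w *= P_W[(True, r)]                        # clamp WetGrass=true, multiply by P(W=t|S=t,R=r)
    return r, w

def likelihood_weighting(n, rng):
    totals = {True: 0.0, False: 0.0}
    for _ in range(n):
        r, w = weighted_sample(rng)
        totals[r] += w
    norm = totals[True] + totals[False]
    return {k: v / norm for k, v in totals.items()}

print(likelihood_weighting(100000, random.Random(0)))
# estimate of P(Rain | Sprinkler=true, WetGrass=true); roughly {True: 0.32, False: 0.68}
```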

Page 31

Markov Chain Monte Carlo

Let’s think of the network as being in a particular current state specifying a value for every variable

MCMC generates each event by making a random change to the preceding event

The next state is generated by randomly sampling a value for one of the non-evidence variables Xi, conditioned on the current values of the variables in the Markov blanket of Xi.

Page 32

Markov Blanket
  Markov blanket = parents + children + children's parents.
  A node is conditionally independent of all other nodes in the network, given its Markov blanket.
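A tiny helper (not from the slides) that reads the blanket off a parent map; for the sprinkler network it gives MB(Cloudy) = {Sprinkler, Rain} and MB(Rain) = {Cloudy, WetGrass, Sprinkler}, which are exactly the conditioning sets used in the MCMC example on the next slide.

```python
def markov_blanket(node, parents):
    """parents: dict mapping each node to the set of its parents."""
    children = {v for v, ps in parents.items() if node in ps}
    blanket = set(parents[node]) | children
    for child in children:
        blanket |= set(parents[child])      # children's other parents
    blanket.discard(node)
    return blanket

parents = {"Cloudy": set(), "Sprinkler": {"Cloudy"},
           "Rain": {"Cloudy"}, "WetGrass": {"Sprinkler", "Rain"}}
print(markov_blanket("Cloudy", parents))   # {'Sprinkler', 'Rain'}
print(markov_blanket("Rain", parents))     # {'Cloudy', 'WetGrass', 'Sprinkler'}
```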

Page 33

Markov Chain Monte Carlo: Example
Query: P(Rain | Sprinkler = true, WetGrass = true)
Initial state is [true, true, false, true] (ordering [Cloudy, Sprinkler, Rain, WetGrass]).

The following steps are executed repeatedly:
  Cloudy is sampled, given the current values of its Markov blanket variables,
  i.e. from P(Cloudy | Sprinkler = true, Rain = false).
  Suppose the result is Cloudy = false; the current state becomes [false, true, false, true].

  Rain is sampled, given the current values of its Markov blanket variables,
  i.e. from P(Rain | Cloudy = false, Sprinkler = true, WetGrass = true).
  Suppose the result is Rain = true; the current state becomes [false, true, true, true].

After all the iterations, say the process visited 20 states with Rain = true and 60 states with Rain = false; then the answer to the query is NORMALIZE({20, 60}) = {0.25, 0.75}.
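A compact Gibbs-sampling sketch for this query (not from the slides). The conditional for each sampled variable given everything else is computed by brute force from the joint, which is equivalent to conditioning on its Markov blanket and is fine at this scale; the burn-in length and sample count are arbitrary choices.

```python
import random

def joint(c, s, r, w):
    p_w = {(False, False): 0.0, (True, False): 0.9,
           (False, True): 0.9, (True, True): 0.99}
    p = 0.5                                                    # P(C=c)
    p *= (0.1 if c else 0.5) if s else (0.9 if c else 0.5)     # P(S=s | C=c)
    p *= (0.8 if c else 0.2) if r else (0.2 if c else 0.8)     # P(R=r | C=c)
    p *= p_w[(s, r)] if w else 1.0 - p_w[(s, r)]               # P(W=w | S=s, R=r)
    return p

def gibbs_rain_given_evidence(n, rng, burn_in=1000):
    state = {"C": True, "S": True, "R": False, "W": True}      # evidence: S=true, W=true
    counts = {True: 0, False: 0}
    for i in range(burn_in + n):
        for var in ("C", "R"):                                 # the non-evidence variables
            s0, s1 = dict(state), dict(state)
            s0[var], s1[var] = False, True
            p0 = joint(s0["C"], s0["S"], s0["R"], s0["W"])
            p1 = joint(s1["C"], s1["S"], s1["R"], s1["W"])
            state[var] = rng.random() < p1 / (p0 + p1)
        if i >= burn_in:
            counts[state["R"]] += 1
    total = counts[True] + counts[False]
    return {k: v / total for k, v in counts.items()}

print(gibbs_rain_given_evidence(20000, random.Random(0)))
# estimate of P(Rain | Sprinkler=true, WetGrass=true); roughly {True: 0.32, False: 0.68}
```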

Page 34

Example applications of BNs
Microsoft Belief Networks
  Advantages
    Easy to learn how to use
    We can specify full and causally independent probability distributions
    And finally, it is free
  http://www.research.microsoft.com/adapt/MSBNx/
Netica (from Norsys Software Corp.)
  Disadvantages
    Not free; a commercial product
  http://www.norsys.com/

Page 35

Example applications of BNs: Microsoft Belief Networks

Page 36

Example applications of BNs: Netica

Page 37

Thank you!

