Probabilistic Inference

Page 1: Probabilistic Inference

Probabilistic Inference

Reading: Chapter 13

Next time: How should we define artificial intelligence?
Reading for next time (see Links, Reading for Retrospective Class):
- Turing paper
- Mind, Brain and Behavior, John Searle
Prepare discussion points by midnight Wednesday night (see end of slides)

Page 2: Probabilistic Inference


Transition to empirical AI

Add in:
- Ability to infer new facts from old
- Ability to generalize
- Ability to learn based on past observation

Key:
- Observation of the world
- Best decision given what is known

Page 3: Probabilistic Inference


Overview of Probabilistic Inference

Some terminology

Inference by enumeration

Bayesian Networks


Page 9: Probabilistic Inference


Probability Basics

Sample space Ω: the set of all possible atomic outcomes

Atomic event (sample point) ω: a complete specification of the state of the world; atomic events are mutually exclusive and exhaustive

Probability model: an assignment P(ω) to every ω ∈ Ω such that 0 ≤ P(ω) ≤ 1 and Σω P(ω) = 1

An event A: any subset of Ω, with P(A) = Σ{ω ∈ A} P(ω)
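These definitions can be made concrete with a small sketch; the die example below is illustrative and not from the slides:

```python
# A minimal probability model (illustrative die example, not from the
# slides): a sample space, a probability for each atomic event, and an
# event as a subset of the sample space.
omega = {1, 2, 3, 4, 5, 6}            # sample space: all atomic events
P = {w: 1 / 6 for w in omega}         # probability model: P(w) for each w
assert abs(sum(P.values()) - 1.0) < 1e-9   # a model must sum to 1

A = {2, 4, 6}                         # an event: "the roll is even"
p_A = sum(P[w] for w in A)            # P(A) sums over the points in A
```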


Page 11: Probabilistic Inference


Random Variables

Random variable: a function from sample points to some range (e.g., the Booleans or the reals)

Probability for a random variable: P induces a distribution over the values of the variable: P(X = xi) = Σ{ω: X(ω) = xi} P(ω)


Page 17: Probabilistic Inference


Logical Propositions and Probability

Proposition = event (set of sample points). Given Boolean random variables A and B:
- Event a = set of sample points where A(ω) = true
- Event ¬a = set of sample points where A(ω) = false
- Event a ∧ b = points where A(ω) = true and B(ω) = true

Often the sample space is the Cartesian product of the ranges of the variables

A proposition is the disjunction of the atomic events in which it is true:
(a ∨ b) = (¬a ∧ b) ∨ (a ∧ ¬b) ∨ (a ∧ b)

P(a ∨ b) = P(¬a ∧ b) + P(a ∧ ¬b) + P(a ∧ b)
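This identity can be checked by enumeration; the joint values below are hypothetical:

```python
# Verify (a or b) = (not-a and b) or (a and not-b) or (a and b) by summing
# sample points of a hypothetical joint over Boolean variables A and B.
joint = {(True, True): 0.3, (True, False): 0.2,
         (False, True): 0.4, (False, False): 0.1}

p_a_or_b = sum(p for (a, b), p in joint.items() if a or b)
p_atoms = joint[(False, True)] + joint[(True, False)] + joint[(True, True)]
assert abs(p_a_or_b - p_atoms) < 1e-12   # both sums agree
```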


Page 25: Probabilistic Inference


Axioms of Probability

All probabilities are between 0 and 1: 0 ≤ P(a) ≤ 1

Necessarily true propositions have probability 1; necessarily false propositions have probability 0

The probability of a disjunction is P(a ∨ b) = P(a) + P(b) − P(a ∧ b)

P(¬a) = 1 − P(a)

Page 26: Probabilistic Inference


The definitions imply that certain logically related events must have related probabilities:
P(a ∨ b) = P(a) + P(b) − P(a ∧ b)

Page 27: Probabilistic Inference


Prior Probability

Prior or unconditional probabilities of propositions, e.g., P(female = true) = .5, correspond to belief prior to the arrival of any new evidence

A probability distribution gives values for all possible assignments:
P(Color) over (green, blue, purple): P(Color) = <.6, .3, .1> (normalized: sums to 1)

A joint probability distribution for a set of random variables gives the probability of every atomic event on those variables (i.e., every sample point):
P(Color, Gender) = a 3×2 matrix of values
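As a sketch, the prior and a 3×2 joint can be written as tables; the P(Color) values come from the slide, while the joint entries are hypothetical, chosen to be consistent with that prior:

```python
# The prior over Color from the slide, and a hypothetical 3x2 joint
# P(Color, Gender) whose Color marginal matches that prior.
p_color = {"green": 0.6, "blue": 0.3, "purple": 0.1}   # sums to 1

joint = {  # hypothetical values
    ("green", "female"): 0.36,  ("green", "male"): 0.24,
    ("blue", "female"): 0.18,   ("blue", "male"): 0.12,
    ("purple", "female"): 0.06, ("purple", "male"): 0.04,
}

# Marginalizing the joint over Gender recovers the prior on Color.
for color in p_color:
    marginal = sum(p for (c, g), p in joint.items() if c == color)
    assert abs(marginal - p_color[color]) < 1e-9
```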


Page 34: Probabilistic Inference


Inference by enumeration

Start with the joint distribution

Page 35: Probabilistic Inference


Inference by enumeration

P(HasTeeth)=.06+.12+.02=.2

Page 36: Probabilistic Inference


Inference by enumeration

P(HasTeeth ∨ Color = green) = .06 + .12 + .02 + .24 = .44
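These two queries can be sketched in code. The HasTeeth = true entries and the (false, green) entry come from the slides; the remaining two entries are hypothetical fill-ins chosen so the table sums to 1 (the queries do not depend on them):

```python
# Inference by enumeration over a joint on (HasTeeth, Color).
joint = {
    (True, "green"): 0.06, (True, "blue"): 0.12, (True, "purple"): 0.02,
    (False, "green"): 0.24,
    (False, "blue"): 0.42, (False, "purple"): 0.14,  # hypothetical
}

def prob(event):
    """P(event): sum the joint over sample points where the event holds."""
    return sum(p for point, p in joint.items() if event(point))

p_teeth = prob(lambda pt: pt[0])                               # marginal
p_teeth_or_green = prob(lambda pt: pt[0] or pt[1] == "green")  # disjunction
assert abs(p_teeth - 0.2) < 1e-9
assert abs(p_teeth_or_green - 0.44) < 1e-9
```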

Page 37: Probabilistic Inference


Conditional Probability

Conditional or posterior probabilities, e.g.,
P(PlayerWins | HostOpensDoor1 ∧ PlayerPicksDoor2 ∧ Door1 = goat) = .5

If we know more (e.g., HostOpensDoor3 and Door3 = goat): P(PlayerWins | …) = 1
Note: the less specific belief remains valid after more evidence arrives, but is not always useful

New evidence may be irrelevant, allowing simplification:
P(PlayerWins | CaliforniaEarthquake) = P(PlayerWins) = .3

Page 38: Probabilistic Inference


Conditional Probability

By definition, P(a | b) = P(a ∧ b) / P(b) when P(b) > 0; equivalently, the product rule: P(a ∧ b) = P(a | b) P(b)

A general version holds for whole joint distributions:
P(PlayerWins, HostOpensDoor1) = P(PlayerWins | HostOpensDoor1) · P(HostOpensDoor1)
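The product rule can be checked numerically; the joint below over (PlayerWins, HostOpensDoor1) is hypothetical:

```python
# Product rule: P(a, b) = P(a | b) * P(b), checked on hypothetical numbers.
joint = {(True, True): 0.15, (True, False): 0.18,
         (False, True): 0.35, (False, False): 0.32}

p_b = joint[(True, True)] + joint[(False, True)]   # P(HostOpensDoor1)
p_a_given_b = joint[(True, True)] / p_b            # P(Wins | HostOpensDoor1)
assert abs(p_a_given_b * p_b - joint[(True, True)]) < 1e-12
```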

Page 39: Probabilistic Inference


Inference by enumeration

Compute conditional probabilities:
P(¬HasTeeth | color = green) = P(¬HasTeeth ∧ color = green) / P(color = green) = 0.24 / (0.06 + 0.24) = 0.8

Page 40: Probabilistic Inference


Normalization

The denominator can be viewed as a normalization constant α:
P(HasTeeth | color = green) = α P(HasTeeth, color = green)
= α [P(HasTeeth, color = green, female) + P(HasTeeth, color = green, ¬female)]
= α [<0.03, 0.12> + <0.03, 0.12>]
= α <0.06, 0.24> = <0.2, 0.8>

Compute the distribution on the query variable by fixing the evidence variables and summing over the hidden variables
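The normalization step can be sketched directly; only the green entries are needed, and those come from the slide:

```python
# P(HasTeeth | Color = green) by normalization, summing out the hidden
# variable Gender. Keys are (has_teeth, color, female); values from slide.
joint = {
    (True, "green", True): 0.03,  (True, "green", False): 0.03,
    (False, "green", True): 0.12, (False, "green", False): 0.12,
    # entries for other colors are not needed for this query
}

unnorm = {teeth: sum(p for (t, c, f), p in joint.items()
                     if t == teeth and c == "green")
          for teeth in (True, False)}              # <0.06, 0.24>
alpha = 1 / sum(unnorm.values())                   # normalization constant
dist = {t: alpha * p for t, p in unnorm.items()}   # normalized distribution
assert abs(dist[True] - 0.2) < 1e-9 and abs(dist[False] - 0.8) < 1e-9
```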

Page 41: Probabilistic Inference


Inference by enumeration

Page 42: Probabilistic Inference


Independence

A and B are independent iff
P(A|B) = P(A), or P(B|A) = P(B), or P(A, B) = P(A) P(B)

32 entries reduced to 12; for n independent biased coins, 2^n → n

Absolute independence is powerful but rare; real domains may have hundreds of variables, none of which are independent
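A quick numerical check of the factorization; the marginals below are hypothetical and the joint is built to be independent by construction:

```python
# Independence: P(A, B) = P(A) * P(B) for every value pair.
pA = {True: 0.2, False: 0.8}   # hypothetical marginal for A
pB = {True: 0.7, False: 0.3}   # hypothetical marginal for B
joint = {(a, b): pA[a] * pB[b] for a in pA for b in pB}

def independent(joint, pA, pB, tol=1e-9):
    """True iff the joint factors into the two marginals everywhere."""
    return all(abs(joint[(a, b)] - pA[a] * pB[b]) <= tol
               for a in pA for b in pB)

assert independent(joint, pA, pB)      # holds by construction

# Perturb one entry and the factorization fails.
joint[(True, True)] = 0.5
assert not independent(joint, pA, pB)
```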


Page 44: Probabilistic Inference


Conditional Independence

If I have length ≤ .2, the probability that I am female doesn't depend on whether or not I have teeth:
P(female | length ≤ .2, hasteeth) = P(female | length ≤ .2)

The same independence holds if length > .2:
P(male | length > .2, hasteeth) = P(male | length > .2)

Gender is conditionally independent of HasTeeth given Length

Page 45: Probabilistic Inference


In most cases, the use of conditional independence reduces the size of the representation of the joint distribution from exponential in n to linear in n

Conditional independence is our most basic and robust form of knowledge about uncertain environments
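The exponential-to-linear claim can be illustrated by counting parameters; the chain structure below is a hypothetical example of conditional independence, not from the slides:

```python
# Parameter counts for n Boolean variables: a full joint needs 2^n - 1
# independent numbers, while a chain X1 -> X2 -> ... -> Xn (each variable
# conditionally independent of all earlier ones given its predecessor)
# needs 1 + 2*(n - 1): one number for P(X1), two for each P(Xi | Xi-1).
def full_joint_entries(n):
    return 2 ** n - 1

def chain_entries(n):
    return 1 + 2 * (n - 1)

assert full_joint_entries(10) == 1023   # exponential in n
assert chain_entries(10) == 19          # linear in n
```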

Page 46: Probabilistic Inference


Next Class: Turing Paper

A discussion class

Graduate students and non-degree students (anyone beyond a bachelor's):

Prepare a short statement on the paper. It can be your reaction, your position, a place where you disagree, or an explication of a point.

Undergraduates: Be prepared with questions for the graduate students

All: Submit your statement or your question by midnight Wed night.

All statements and questions will be printed and distributed in class on Wednesday.

