
Anatole Gershman, Eugene Fink, Bin Fu, and Jaime G. Carbonell

Transcript
Page 1

Analysis of uncertain data: Evaluation of Given Hypotheses

Selection of probes for information gathering

Anatole Gershman, Eugene Fink, Bin Fu, and Jaime G. Carbonell

Page 2

Analysis of uncertain data: Evaluation of Given Hypotheses

Selection of probes for information gathering

Anatole Gershman, Eugene Fink, Bin Fu, and Jaime G. Carbonell

Page 4

Example

Observations:

According to many rumors, quarterback Brett Favre has closed on the purchase of a home in Eden Prairie, MN, where the Minnesota Vikings' team facility is located.

Without the tearful public ceremony that accompanied his retirement announcement from the Green Bay Packers just 11 months ago, quarterback Brett Favre has told the New York Jets he is retiring.

Minnesota coach Brad Childress was jilted at the altar Tuesday afternoon when Brett Favre told him he wasn’t going to play for the Vikings in 2009.

Page 5

Example

Observation distributions:

Without the tearful public ceremony that accompanied his retirement announcement from the Green Bay Packers just 11 months ago, quarterback Brett Favre has told the New York Jets he is retiring.

P(says retire | Retires) = 0.9

P(says retire | Joins Vikings) = 0.6

Bayesian induction:

P(Retire | says retire) = P(Retire) ∙ P(says retire | Retire) / (P(Retire) ∙ P(says retire | Retire) + P(Joins Vikings) ∙ P(says retire | Joins Vikings)) = 0.5
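For concreteness, here is a minimal Python sketch of this computation. The slide does not state the priors; the values below (P(Retires) = 0.4, P(Joins Vikings) = 0.6) are hypothetical, chosen only so that the arithmetic reproduces the 0.5 posterior above.

```python
# Bayesian induction for the "says retire" observation (illustrative sketch).
# Conditional probabilities are from the slide; the priors are assumed values
# chosen only to reproduce the 0.5 posterior shown above.

def posterior(prior_h, cond_h, priors, conds):
    """P(H | obs) = P(H) * P(obs | H) / sum_i P(H_i) * P(obs | H_i)."""
    evidence = sum(p * c for p, c in zip(priors, conds))
    return prior_h * cond_h / evidence

priors = {"Retires": 0.4, "Joins Vikings": 0.6}          # assumed priors
p_says_retire = {"Retires": 0.9, "Joins Vikings": 0.6}   # from the slides

print(posterior(priors["Retires"], p_says_retire["Retires"],
                list(priors.values()), list(p_says_retire.values())))  # 0.5
```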

Page 7

General problem

We base the analysis on m observable features, denoted OBS1, OBS2, …, OBSm. Each observation is a variable that takes one of several discrete values.

For every observation, OBSa, we know the number of its possible values, num[a]. Thus, we have num[1..m] with the number of values for each observation.

For example, OBS1 has num[1] = 2 possible values: “I will RETIRE!” and “I won’t RETIRE!”

For every hypothesis, we know the related probability distribution of each observation. P(oa,j | Hi) represents the probabilities of possible values of OBSa.

For example, P(OBS1 | Retires) = (0.9, 0.1) and P(OBS1 | Joins Vikings) = (0.6, 0.4).

We know a specific value of each observation, val[1..m].
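As a rough illustration of these inputs, the sketch below encodes them in Python. The hypotheses and numbers (two hypotheses, one binary observation) are hypothetical, carried over from the Favre example; only the structure of num[1..m], P(oa,j | Hi), and val[1..m] follows the slide.

```python
# Illustrative encoding of the inputs to the general problem.
# Hypotheses, priors, and numbers are hypothetical (Favre-style example);
# only the structure (num[1..m], P(o_{a,j} | H_i), val[1..m]) follows the slide.

m = 1                        # number of observable features
num = [2]                    # num[a]: number of possible values of OBS_a
                             # OBS_1: value 0 = "I will RETIRE!", value 1 = "I won't RETIRE!"

hypotheses = ["Retires", "Joins Vikings"]
priors = [0.4, 0.6]          # assumed priors P(H_i)

# dist[i][a][j] = P(OBS_a = j | H_i): one distribution per hypothesis and observation
dist = [
    [[0.9, 0.1]],            # P(OBS_1 | Retires)
    [[0.6, 0.4]],            # P(OBS_1 | Joins Vikings)
]

val = [0]                    # observed values val[1..m]; here OBS_1 = "I will RETIRE!"
```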

Page 10

Extension #1

After discovering val, the posterior probability of H0 is:

Post(H0) = P(H0) ∙ P(val | H0) / P(val) = P(H0) ∙ P(val | H0) / (P(H0) ∙ P(val | H0) + likelihood(val)).

Bad news: We do not know P(val | H0).

Good news: Post(H0) monotonically depends on P(val | H0); thus, if we obtain lower and upper bounds for P(val | H0), we also get bounds for Post(H0).

Page 11

Plausibility principle

Unlikely events normally do not happen; thus, if we have observed val, then its likelihood must not be too small.

Plausibility threshold: We use a global constant plaus, which must be between 0.0 and 1.0. If we have observed val, we assume that P(val) ≥ plaus / num.

We use it to obtain bounds for P(val | H0): Lower: (plaus / num − likelihood(val)) / P(H0). Upper: 1.0.

Page 12

Plausibility principle

We use it to obtain bounds for P(val | H0): Lower: (plaus / num − likelihood(val)) / P(H0). Upper: 1.0.

We substitute these bounds into the dependency of Post(H0) on P(val | H0), thus obtaining the bounds for Post(H0): Lower: 1.0 − likelihood(val) ∙ num / plaus. Upper: P(H0) / (P(H0) + likelihood(val)).

We have derived bounds for the probability that none of the given hypotheses is correct.
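The following sketch plugs these formulas into Python. Here likelihood(val) is computed as the total probability of val under the known hypotheses, and the numeric inputs in the example call are made up for illustration.

```python
# Bounds on Post(H0), the probability that none of the given hypotheses is correct,
# following the formulas above. All numeric inputs in the example call are illustrative.

def post_h0_bounds(prior_h0, known_priors, known_conds, num, plaus):
    """Return (lower, upper) bounds on Post(H0) after observing val.

    known_priors[i] = P(Hi) and known_conds[i] = P(val | Hi) for the given hypotheses.
    """
    # likelihood(val): total probability of val under the known hypotheses
    likelihood = sum(p * c for p, c in zip(known_priors, known_conds))
    lower = max(0.0, 1.0 - likelihood * num / plaus)   # clamped at 0.0, since it is a probability
    upper = prior_h0 / (prior_h0 + likelihood)
    return lower, upper

# Example: two known hypotheses, P(H0) = 0.1, a binary observation (num = 2),
# and a plausibility threshold plaus = 0.2.
print(post_h0_bounds(0.1, [0.4, 0.5], [0.9, 0.6], num=2, plaus=0.2))
```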

Page 13

Extension #2

Multiple observations: which one(s) to use?

Independence assumption: usually does not work.

Bayesian analysis: Use their joint distribution? Difficult to get.

We identify the highest-utility observation and do not use other observations to corroborate it.

Page 18

Analysis of uncertain data: Evaluation of Given Hypotheses

Selection of probes for information gathering

Anatole Gershman, Eugene Fink, Bin Fu, and Jaime G. Carbonell

Page 21

Example: Probe

Probe cost

Observation probability

Gain (utility function)

Page 22

Probe Selection

single-obs-gain(probej, OBSa) = visible[i, a, j] ∙ (likelihood(1) ∙ probe-gain(1) + … + likelihood(num[a]) ∙ probe-gain(num[a])) + (1.0 − visible[i, a, j]) ∙ cost[j]

gain(probej) = max(single-obs-gain(probej, OBS1), …, single-obs-gain(probej, OBSm))

Probe cost

Observation probability

Utility function
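Below is a small Python sketch that mirrors the two formulas above. The per-value likelihoods, the per-value probe gains, the visibility terms visible[i, a, j], and the probe cost cost[j] are treated as given inputs (their computation is covered elsewhere in the talk), and the example numbers are made up.

```python
# Probe-selection sketch mirroring the formulas above; all example numbers are made up.

def single_obs_gain(visible_iaj, likelihoods, value_gains, cost_j):
    """visible[i,a,j] * sum_v likelihood(v) * probe-gain(v) + (1 - visible[i,a,j]) * cost[j]."""
    expected = sum(l * g for l, g in zip(likelihoods, value_gains))
    return visible_iaj * expected + (1.0 - visible_iaj) * cost_j

def gain(per_observation_inputs):
    """Overall gain of a probe: the maximum single-observation gain, as on the slide."""
    return max(single_obs_gain(*inputs) for inputs in per_observation_inputs)

# One probe that may affect two observations:
# (visible[i,a,j], likelihoods over the observation's values, probe-gains, cost[j])
print(gain([
    (0.8, [0.5, 0.5], [2.0, -1.0], -0.3),   # through OBS1
    (0.4, [0.7, 0.3], [1.0,  0.0], -0.3),   # through OBS2
]))
```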

Page 23

Experiment

Task: Evaluating hypotheses (H1, H2, H3, H4).

No probes: accuracy of distinguishing between H1 and the other hypotheses

Page 24

Experiment

Probe selection to distinguish H1 from the other hypotheses

Task: Evaluating hypotheses (H1, H2, H3, H4).

Page 25

Experiment

Probe selection to distinguish the four hypotheses

Task: Evaluating hypotheses (H1, H2, H3, H4).

Page 26

Summary

Use Bayesian inference to distinguish among mutually exclusive hypotheses.

H0 hypothesis

Multiple observations

Use probes to gather more information for better analysis

Cost, utility function, observation probability, …

Page 27

Thank you

