
Inductive Amnesia

Page 1: Inductive Amnesia

Inductive Amnesia

The Reliability of Iterated Belief Revision

Page 2: Inductive Amnesia

A Table of Opposites

Even                 | Odd
Straight             | Crooked
Reliability          | "Confirmation"
Performance          | "Primitive norms"
Correctness          | "Coherence"
Classical statistics | Bayesianism
Learning theory      | Belief Revision Theory

Page 3: Inductive Amnesia

The Idea

Belief revision is inductive reasoning
A restrictive norm prevents us from finding truths we could have found by other means
Some proposed belief revision methods are restrictive
The restrictiveness is expressed as inductive amnesia

Page 4: Inductive Amnesia

Inductive Amnesia

No restriction on memory...
No restriction on predictive power...
But prediction causes memory loss...
And perfect memory precludes prediction!
Fundamental dilemma

Page 5: Inductive Amnesia

Outline

I. Seven belief revision methods
II. Belief revision as learning
III. Properties of the methods
IV. The Goodman hierarchy
V. Negative results
VI. Positive results
VII. Discussion

Page 6: Inductive Amnesia

Points of Interest

Strong negative and positive results
Short run advice from limiting analysis
2 is magic for reliable belief revision
Learning as cube rotation
Grue

Page 7: Inductive Amnesia

Part I

Iterated Belief Revision

Page 8: Inductive Amnesia

Bayesian (Vanilla) Updating

Propositions are sets of "possible worlds"

[Diagram: the belief state B]

Page 9: Inductive Amnesia

Bayesian (Vanilla) Updating

[Diagram: B and new evidence E]

Page 10: Inductive Amnesia

Bayesian (Vanilla) Updating

Perfect memory
No inductive leaps

B' = B * E = B ∩ E
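A minimal Python sketch of vanilla updating as set intersection, with propositions modeled as frozensets of possible-world labels (the world names are illustrative, not from the talk):

```python
# Vanilla (Bayesian) updating: revise belief B by evidence E via B' = B ∩ E.
def update(belief, evidence):
    """Return the revised belief B' = B * E = B ∩ E."""
    return belief & evidence

B = frozenset({"w1", "w2", "w3"})
E = frozenset({"w2", "w3", "w4"})
assert update(B, E) == frozenset({"w2", "w3"})
```

Note that the update is undefined as a belief state when B and E are disjoint, which is exactly the "epistemic hell" problem of the next slides.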

Page 11: Inductive Amnesia

Epistemic Hell

[Diagram: the belief state B]

Page 12: Inductive Amnesia

Epistemic Hell

[Diagram: evidence E disjoint from B. Surprise! Epistemic hell.]

Page 13: Inductive Amnesia

Epistemic Hell

Scientific revolutions
Suppositional reasoning
Conditional pragmatics
Decision theory
Game theory
Databases

Page 14: Inductive Amnesia

Ordinal Entrenchment (Spohn 88)

Epistemic state S maps worlds to ordinals
Belief state of S = b(S) = S^-1(0)
Determines "centrality" of beliefs
Model: orders of infinitesimal probability

[Diagram: worlds stacked at ranks 0, 1, 2, ...; B = b(S) is the bottom level]
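A minimal sketch of a finite epistemic state as a ranking function, here a dict from world labels to natural-number ranks, with b(S) = S^-1(0) the worlds of rank zero (world names are illustrative):

```python
# An epistemic state S assigns each world a rank; the belief state b(S)
# is the set of most plausible worlds, i.e. those of rank 0.
def belief_state(S):
    """Return b(S) = S^-1(0)."""
    return {w for w, r in S.items() if r == 0}

S = {"w1": 0, "w2": 0, "w3": 1, "w4": 2}
assert belief_state(S) == {"w1", "w2"}
```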

Page 15: Inductive Amnesia

Belief Revision Methods

* takes an epistemic state and a proposition to an epistemic state

[Diagram: * carries S and evidence E to S', with belief states b(S) and b(S * E)]

Page 16: Inductive Amnesia

Spohn Conditioning *C (Spohn 88)

[Diagram: initial state S with belief state b(S)]

Page 17: Inductive Amnesia

Spohn Conditioning *C (Spohn 88)

[Diagram: new evidence E contradicting b(S)]

Page 18: Inductive Amnesia

Spohn Conditioning *C (Spohn 88)

[Diagram: *C carries S and E to S *C E with new belief state B']

Page 19: Inductive Amnesia

Spohn Conditioning *C (Spohn 88)

Conditions an entire entrenchment ordering
Perfect memory
Inductive leaps
No epistemic hell on consistent sequences
Epistemic hell on inconsistent sequences
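One common finite rendering of Spohn conditioning, sketched in Python under the assumption that refuted worlds are simply discarded (the slide's "epistemic hell on inconsistent sequences" is the case where nothing survives):

```python
# Spohn conditioning *C on a finite ranking state (dict world -> rank):
# keep only the worlds in E and renormalize their ranks to start at 0.
def spohn_condition(S, E):
    live = {w: r for w, r in S.items() if w in E}
    if not live:
        raise ValueError("epistemic hell: E is inconsistent with S")
    m = min(live.values())
    return {w: r - m for w, r in live.items()}

S = {"w1": 0, "w2": 1, "w3": 2}
assert spohn_condition(S, {"w2", "w3"}) == {"w2": 0, "w3": 1}
```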

Page 20: Inductive Amnesia

Lexicographic Updating *L (Spohn 88, Nayak 94)

[Diagram: initial state S]

Page 21: Inductive Amnesia

Lexicographic Updating *L (Spohn 88, Nayak 94)

[Diagram: evidence E arrives]

Page 22: Inductive Amnesia

Lexicographic Updating *L (Spohn 88, Nayak 94)

Lift refuted possibilities above non-refuted possibilities, preserving order
Perfect memory on consistent sequences
Inductive leaps
No epistemic hell
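A sketch of lexicographic updating on a finite ranking state, under the reading above: worlds consistent with E keep their relative order at the bottom, and refuted worlds are lifted above all of them, also preserving their relative order:

```python
# Lexicographic updating *L on a dict world -> rank.
def dense_ranks(values):
    """Map each distinct rank to its position in sorted order (0, 1, 2, ...)."""
    return {r: i for i, r in enumerate(sorted(set(values)))}

def lex_update(S, E):
    live = {w: r for w, r in S.items() if w in E}
    dead = {w: r for w, r in S.items() if w not in E}
    lo = dense_ranks(live.values())
    hi = dense_ranks(dead.values())
    out = {w: lo[r] for w, r in live.items()}
    # Refuted worlds go above every surviving world, order preserved.
    out.update({w: len(lo) + hi[r] for w, r in dead.items()})
    return out

S = {"a": 0, "b": 1, "c": 2}
assert lex_update(S, {"b", "c"}) == {"b": 0, "c": 1, "a": 2}
```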

Page 23: Inductive Amnesia

Minimal or "Natural" Updating *M (Spohn 88, Boutilier 93)

[Diagram: initial state S with belief state B]

Page 24: Inductive Amnesia

Minimal or "Natural" Updating *M (Spohn 88, Boutilier 93)

[Diagram: evidence E arrives]

Page 25: Inductive Amnesia

Minimal or “Natural” Updating *Minimal or “Natural” Updating *MMSpohn 88, Boutilier 93Spohn 88, Boutilier 93

Drop the lowest Drop the lowest possibilities consistent possibilities consistent with the data to the bottom with the data to the bottom and raise everything else and raise everything else up one notchup one notch

inductive leapsinductive leaps No epistemic hellNo epistemic hell But...But...

E

S S *M E

*M
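A sketch of minimal ("natural") updating, reading the slide's description literally on a finite ranking state:

```python
# Minimal updating *M: the lowest E-worlds drop to the bottom (rank 0);
# every other world is raised one notch.
def natural_update(S, E):
    m = min(r for w, r in S.items() if w in E)
    return {w: 0 if (w in E and r == m) else r + 1 for w, r in S.items()}

S = {"a": 0, "b": 1, "c": 2}
assert natural_update(S, {"b", "c"}) == {"b": 0, "a": 1, "c": 3}
```

Because only the minimal E-worlds move down, a possibility raised earlier can later fall back, which is exactly the amnesia illustrated on the next slides.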

Page 26: Inductive Amnesia

Amnesia

What goes up can come down
Belief no longer entails past data

Page 27: Inductive Amnesia

Amnesia

[Diagram: *M applied to E, then E']

Page 28: Inductive Amnesia

Amnesia

[Diagram: after applying *M to E and then E', the world refuted by E comes back down]

Page 29: Inductive Amnesia

The Flush-to-α Method *F,α (Goldszmidt and Pearl 94)

[Diagram: initial state S with belief state B]

Page 30: Inductive Amnesia

The Flush-to-α Method *F,α (Goldszmidt and Pearl 94)

[Diagram: evidence E arrives]

Page 31: Inductive Amnesia

The Flush-to-α Method *F,α (Goldszmidt and Pearl 94)

Send non-E worlds to a fixed level α and drop E-worlds rigidly to the bottom
Perfect memory on sequentially consistent data if α is high enough
Inductive leaps
No epistemic hell
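A sketch of the flush method on a finite ranking state, with the parameter written `alpha`:

```python
# Flush-to-alpha *F,alpha: non-E worlds are sent to the fixed level alpha;
# E-worlds drop rigidly (all by the same amount) to the bottom.
def flush_update(S, E, alpha):
    m = min(r for w, r in S.items() if w in E)
    return {w: r - m if w in E else alpha for w, r in S.items()}

S = {"a": 0, "b": 1, "c": 3}
assert flush_update(S, {"b", "c"}, 2) == {"b": 0, "c": 2, "a": 2}
```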

Page 32: Inductive Amnesia

Ordinal Jeffrey Conditioning *J,α (Spohn 88)

[Diagram: initial state S]

Page 33: Inductive Amnesia

Ordinal Jeffrey Conditioning *J,α (Spohn 88)

[Diagram: evidence E arrives]

Page 34: Inductive Amnesia

Ordinal Jeffrey Conditioning *J,α (Spohn 88)

[Diagram: E-worlds and non-E worlds rearranged]

Page 35: Inductive Amnesia

Ordinal Jeffrey Conditioning *J,α (Spohn 88)

Drop E-worlds to the bottom; drop non-E worlds to the bottom and then jack them up to level α
Perfect memory on consistent sequences if α is large enough
No epistemic hell
Reversible
But...
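A sketch of ordinal Jeffrey conditioning on a finite ranking state, reading the slide's description literally (both cells are dropped to their own bottom, then the refuted cell is jacked up to `alpha`):

```python
# Ordinal Jeffrey conditioning *J,alpha on a dict world -> rank.
def jeffrey_update(S, E, alpha):
    in_E = [r for w, r in S.items() if w in E]
    out_E = [r for w, r in S.items() if w not in E]
    mE, mN = min(in_E), min(out_E) if out_E else 0
    return {w: r - mE if w in E else r - mN + alpha
            for w, r in S.items()}

S = {"a": 0, "b": 1, "c": 3}
assert jeffrey_update(S, {"b", "c"}, 2) == {"b": 0, "c": 2, "a": 2}
```

Dropping the refuted cell to its own bottom before raising it is what makes the method reversible, and also what allows the "empirical backsliding" of the next slides.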

Page 36: Inductive Amnesia

Empirical Backsliding

Page 37: Inductive Amnesia

Empirical Backsliding

[Diagram: evidence E]

Page 38: Inductive Amnesia

Empirical BackslidingEmpirical Backsliding

Ordinal Jeffrey Ordinal Jeffrey conditioning can conditioning can increase the increase the plausibility of a plausibility of a refuted possibilityrefuted possibility

E

Page 39: Inductive Amnesia

The Ratchet Method *R,α (Darwiche and Pearl 97)

[Diagram: initial state S]

Page 40: Inductive Amnesia

The Ratchet Method *R,α (Darwiche and Pearl 97)

[Diagram: evidence E arrives]

Page 41: Inductive Amnesia

The Ratchet Method *The Ratchet Method *R,R,Darwiche and Pearl 97Darwiche and Pearl 97

Like ordinal Jeffrey Like ordinal Jeffrey conditioning except conditioning except refuted possibilities move refuted possibilities move up by up by from their current from their current positionspositions

Perfect memory if Perfect memory if is is large enoughlarge enough

Inductive leapsInductive leaps No epistemic hellNo epistemic hell

S

S *R, E

E

B

B’

*R,
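A sketch of the ratchet method on a finite ranking state, under the slide's reading that refuted worlds rise by `alpha` from their current positions (so plausibility of refuted worlds never backslides):

```python
# Ratchet *R,alpha: E-worlds drop rigidly to the bottom; non-E worlds
# move up by alpha from where they already are.
def ratchet_update(S, E, alpha):
    m = min(r for w, r in S.items() if w in E)
    return {w: r - m if w in E else r + alpha for w, r in S.items()}

S = {"a": 0, "b": 1, "c": 3}
assert ratchet_update(S, {"b", "c"}, 1) == {"b": 0, "c": 2, "a": 1}
```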

Page 42: Inductive Amnesia

Part II

Belief Revision as Learning

Page 43: Inductive Amnesia

Iterated Belief Revision

(S0 * ()) = S0
(S0 * (E0, ..., En, En+1)) = (S0 * (E0, ..., En)) * En+1

[Diagram: * carries S0 to S1 to S2 on evidence E0, E1, with belief states b(S0), b(S1), b(S2)]
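The recursion above is a left fold of the one-step method over the evidence sequence. A sketch, illustrated with vanilla intersection standing in for *:

```python
from functools import reduce

# Iterated revision: S0 * (E0, ..., En) by repeated one-step revision.
def iterate(star, S0, evidence):
    return reduce(star, evidence, S0)

B0 = frozenset({"w1", "w2", "w3"})
Es = [frozenset({"w1", "w2"}), frozenset({"w2", "w3"})]
assert iterate(lambda B, E: B & E, B0, Es) == frozenset({"w2"})
```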

Page 44: Inductive Amnesia

A Very Simple Learning Paradigm

[Diagram: a mysterious system emits an outcome sequence, e.g. 0 0 1 0 0 ...; e is one of the possible infinite trajectories, and e|n is its initial segment up to stage n]

Page 45: Inductive Amnesia

Empirical Propositions

Empirical propositions are sets of possible trajectories
Some special cases:
[s] = the proposition that the finite data sequence s has occurred (the "fan" of trajectories extending s)
[k, n] = the proposition that k occurs at stage n
{e} = the proposition that the future trajectory is exactly e

Page 46: Inductive Amnesia

Trajectory Identification

(*, S0) identifies e ⇔ for all but finitely many n,
b(S0 * ([0, e(0)], ..., [n, e(n)])) = {e}

Page 47: Inductive Amnesia

Trajectory Identification

[Diagram: the possible trajectories, with e among them]

Page 48: Inductive Amnesia

Trajectory Identification

[Diagram: initial belief state b(S0)]

Page 49: Inductive Amnesia

Trajectory Identification

[Diagram: belief state b(S1)]

Page 50: Inductive Amnesia

Trajectory Identification

[Diagram: belief state b(S2)]

Page 51: Inductive Amnesia

Trajectory Identification

[Diagram: belief state b(S3)]

Page 52: Inductive Amnesia

Trajectory Identification

[Diagram: belief state b(S4): convergence to {e}]

Page 53: Inductive Amnesia

Trajectory Identification

[Diagram: belief state b(S5), etc.]

Page 54: Inductive Amnesia

ReliabilityReliability

Let Let K K be a set of possible outcome be a set of possible outcome trajectoriestrajectories

((*, S*, S00) ) identifies Kidentifies K ((*, S*, S00) identifies each ) identifies each

ee in in KK

Page 55: Inductive Amnesia

Identifiability Characterized

Proposition: K is identifiable just in case K is countable

Page 56: Inductive Amnesia

Completeness and Restrictiveness

* is complete ⇔ each identifiable K is identifiable by (*, S0), for some choice of S0.
Else * is restrictive.

Page 57: Inductive Amnesia

Part III

Properties of the Methods

Page 58: Inductive Amnesia

Timidity and StubbornnessTimidity and Stubbornness

timidity: no inductive leaps timidity: no inductive leaps without refutationwithout refutation

stubbornness: no retractions stubbornness: no retractions without refutationwithout refutation

BB’


Page 60: Inductive Amnesia

Timidity and StubbornnessTimidity and Stubbornness

““Belief is Bayesian in the non-Belief is Bayesian in the non-problematic case”problematic case”

All the proposed methods are All the proposed methods are timid and stubborntimid and stubborn

Vestige of the dogma that Vestige of the dogma that probability rules inductionprobability rules induction

BB’

Page 61: Inductive Amnesia

Local Consistency

Local consistency: the updated belief must always be consistent with the current datum
All the methods under consideration are designed to be locally consistent


Page 63: Inductive Amnesia

Positive Order-invariance

Positive order-invariance: the ranking among worlds satisfying all the data so far is preserved
All the methods considered are positively order-invariant

Page 64: Inductive Amnesia

Data-Retentiveness

Data-retentiveness: each world satisfying all the data is placed above each world failing to satisfy some datum
Data-retentiveness is sufficient but not necessary for perfect memory
*C, *L are data-retentive
*R,α, *J,α are data-retentive if α is above the top of S

Page 65: Inductive Amnesia

Enumerate and TestEnumerate and Test

A method enumerates and A method enumerates and tests just in case it is: tests just in case it is:

locally consistent,locally consistent, positively order-invariant,positively order-invariant, data-retentivedata-retentive Enumerate and test Enumerate and test

methods: *methods: *CC, *, *LL

The methods with The methods with parameter parameter if a is above if a is above the top of the top of SS 00. .

preserved entrenchmentordering on live possibilities

epistemicdump forrefutedpossibilities
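A behavioral sketch of enumerate-and-test over a countable class K, with trajectories modeled as Python functions from stage to outcome (the class and names are illustrative): after each datum, candidates it refutes are dropped to the "dump", and the learner believes the first survivor in the enumeration.

```python
# Enumerate-and-test: filter an enumeration of K against the data and
# believe the earliest surviving candidate at each stage.
def learn(K, data):
    live = list(K)
    beliefs = []
    for n, outcome in enumerate(data):
        live = [e for e in live if e(n) == outcome]
        beliefs.append(live[0] if live else None)
    return beliefs

K = [lambda n: 0, lambda n: 1, lambda n: 1 if n == 2 else 0]
target = K[2]
data = [target(n) for n in range(5)]          # 0 0 1 0 0
assert learn(K, data)[-1] is target
```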

Page 66: Inductive Amnesia

Completeness

Proposition: If * enumerates and tests, then * is complete
Proof: Let S0 be an enumeration of K

Page 67: Inductive Amnesia

Completeness

Let e be in K

[Diagram: e in the enumeration]

Page 68: Inductive Amnesia

CompletenessCompleteness

Feed successive data along Feed successive data along ee:: [0, [0, ee(0)], [1, (0)], [1, ee(1)], ..., [(1)], ..., [nn, , ee((nn)], ...)], ...

[0, e (0)]

e

Page 69: Inductive Amnesia

Completeness

[Diagram: local consistency, positive invariance, and data retentiveness after [0, e(0)]]

Page 70: Inductive Amnesia

Completeness

[Diagram: next datum [1, e(1)]]

Page 71: Inductive Amnesia

Completeness

[Diagram: local consistency, positive invariance, and data retentiveness after [1, e(1)]]

Page 72: Inductive Amnesia

Completeness

[Diagram: next datum [2, e(2)]]

Page 73: Inductive Amnesia

Completeness

[Diagram: the argument repeats for [2, e(2)] and beyond: convergence]

Page 74: Inductive Amnesia

Question

What about the methods that aren't data-retentive?
Are they complete?
If not, can they be objectively compared?

Page 75: Inductive Amnesia

Part IV:

The Goodman Hierarchy

Page 76: Inductive Amnesia

The Grue Operation (Nelson Goodman)

A way to generate inductive problems of ever higher difficulty
e ‡ n = (e|n)¬(n|e)
(that is: e ‡ n agrees with e up to stage n and complements e thereafter)

[Diagram: e and e ‡ n diverging at n]
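A sketch of the grue operation for 0/1 trajectories modeled as functions from stage to outcome:

```python
# Grue: e ‡ n agrees with e before stage n and complements e from n onward.
def grue(e, n):
    return lambda m: e(m) if m < n else 1 - e(m)

e = lambda m: 0                      # the all-zeros trajectory
g = grue(e, 3)
assert [g(m) for m in range(6)] == [0, 0, 0, 1, 1, 1]
```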

Page 77: Inductive Amnesia

The Grue Operation (Nelson Goodman)

[Diagram: e ‡ n compared with e, with a further grue point k marked]

Page 78: Inductive Amnesia

The Grue Operation (Nelson Goodman)

[Diagram: iterating the operation: (e ‡ n) ‡ m]

Page 79: Inductive Amnesia

The Grue Operation (Nelson Goodman)

[Diagram: e ‡ n, (e ‡ n) ‡ m, and ((e ‡ n) ‡ m) ‡ k]

Page 80: Inductive Amnesia

The Goodman Hierarchy

Gn(e) = the set of all trajectories you can get by gruing e at up to n positions
Gn^even(e) = the set of all trajectories you can get by gruing e at an even number of distinct positions up to 2n

[Diagram: nested classes G0(e) ⊆ G1(e) ⊆ G2(e) ⊆ G3(e), and G0^even(e) ⊆ G1^even(e)]
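A finite-horizon sketch of the hierarchy: generating the length-L prefixes of Gn(e) by gruing a prefix of e at up to n positions (a stand-in for the full class of infinite trajectories):

```python
from itertools import combinations

def grue_prefix(seq, n):
    """Complement every outcome of seq from position n onward."""
    return tuple(b if i < n else 1 - b for i, b in enumerate(seq))

def Gn_prefixes(e_prefix, n):
    """All prefixes obtainable by gruing e_prefix at up to n positions."""
    out = {e_prefix}
    for k in range(1, n + 1):
        for positions in combinations(range(len(e_prefix)), k):
            t = e_prefix
            for p in positions:
                t = grue_prefix(t, p)
            out.add(t)
    return out

e = (0, 0, 0, 0)
assert Gn_prefixes(e, 0) == {e}
assert (0, 1, 1, 1) in Gn_prefixes(e, 1)      # grued once, at position 1
assert len(Gn_prefixes(e, 1)) == 5            # e plus one grue at each position
```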

Page 81: Inductive Amnesia

The Goodman Limit

Gω(e) = ∪n Gn(e)
Gω^even(e) = ∪n Gn^even(e)

Proposition: Gω^even(e) = the set of all finite variants of e

Page 82: Inductive Amnesia

The Goodman Spectrum

[Table: rows G0(e), G1(e), G2(e), G3(e), Gω(e); columns Min, Flush, Jeffrey, Ratchet, Lex, Cond. Entries record whether each method identifies each class ("yes"/"no") together with critical parameter values (α = 0, 1, 2, 3, n+1 appear among them).]

Page 83: Inductive Amnesia

The Even Goodman Spectrum

[Table: rows G0^even(e), G1^even(e), G2^even(e), Gn^even(e), Gω^even(e); columns Min, Flush, Jeffrey, Ratchet, Lex, Cond. Entries record success ("yes"/"no") and critical parameter values (mostly α = 1).]

Page 84: Inductive Amnesia

Part V:

Negative Results

Page 85: Inductive Amnesia

Epistemic Duality

[Diagram: a spectrum of epistemic states from the "tabula rasa" Bayesian to the "conjectures and refutations" Popperian]

Page 86: Inductive Amnesia

Epistemic Extremes

[Diagram: the same spectrum: perfect memory / no projections at one extreme; projects the future / may forget at the other; *J,2 lies between]

Page 87: Inductive Amnesia

Opposing Epistemic Pressures

Page 88: Inductive Amnesia

Opposing Epistemic Pressures

[Diagram: rarefaction for inductive leaps]

Page 89: Inductive Amnesia

Opposing Epistemic PressuresOpposing Epistemic Pressures

Identification requires Identification requires bothboth Is there a critical value of Is there a critical value of for which for which

they can be balanced for a given they can be balanced for a given problem problem KK??

compression for memory

rarefaction for inductive leaps

Page 90: Inductive Amnesia

Methods *S,1; *M Fail on G1(e)

[Table: the Goodman spectrum repeated, with the "no" entries for G1(e) highlighted]

Page 91: Inductive Amnesia

Methods *S,1; *M Fail on G1(e)

Proof: Suppose otherwise
Feed e until e is uniquely at the bottom

Page 92: Inductive Amnesia

Methods *S,1; *M Fail on G1(e)

[Diagram: e uniquely at the bottom, given the data so far; "?" above]

Page 93: Inductive Amnesia

Methods *S,1; *M Fail on G1(e)

By the well-ordering condition, ...

[Diagram: e at the bottom given the data so far; "?" above; "else..."]

Page 94: Inductive Amnesia

Methods *S,1; *M Fail on G1(e)

Now feed e' forever
By stage n, the picture is the same

[Diagram: e, e', e'' and e'|n; positive order invariance; timidity and stubbornness]

Page 95: Inductive Amnesia

Methods *Methods *S,1S,1; *; *MM Fail on Fail on GG11((ee))

At stage At stage nn +1, +1, ee stays at the stays at the bottom (timid and stubborn).bottom (timid and stubborn).

So So e’ e’ can’t travel down can’t travel down (definitions of the rules)(definitions of the rules)

e’’ e’’ doesn’t rise (definitions doesn’t rise (definitions of the rules)of the rules)

Now Now e’’ e’’ makes it to the makes it to the bottom at least as soon as bottom at least as soon as e’e’

e

?

e’

e’’

e’ n

Page 96: Inductive Amnesia

Method *R,1 Fails on G2(e)

[Table: the Goodman spectrum repeated, with the *R,1 entry for G2(e) highlighted]

Page 97: Inductive Amnesia

Method *R,1 Fails on G2(e) (with Oliver Schulte)

Proof: Suppose otherwise
Bring e uniquely to the bottom, say at stage k

Page 98: Inductive Amnesia

Method *R,1 Fails on G2(e) (with Oliver Schulte)

Start feeding a = e ‡ k

Page 99: Inductive Amnesia

Method *R,1 Fails on G2(e) (with Oliver Schulte)

By some stage k', a is uniquely down
So between k + 1 and k', there is a first stage j when no finite variant of e is at the bottom

Page 100: Inductive Amnesia

Method *R,1 Fails on G2(e) (with Oliver Schulte)

Let c in G2(e) be a finite variant of e that rises to level 1 at j


Page 102: Inductive Amnesia

Method *R,1 Fails on G2(e) (with Oliver Schulte)

So c(j - 1) ≠ a(j - 1)

Page 103: Inductive Amnesia

Method *R,1 Fails on G2(e) (with Oliver Schulte)

Let d be a up to j and e thereafter
So d is in G2(e)
Since d differs from e, d is at least as high as level 1 at j

Page 104: Inductive Amnesia

Method *R,1 Fails on G2(e) (with Oliver Schulte)

Show: c agrees with e after j.

Page 105: Inductive Amnesia

Method *R,1 Fails on G2(e) (with Oliver Schulte)

Case: j = k+1
Then c could have been chosen as e, since e is uniquely at the bottom at k

Page 106: Inductive Amnesia

Method *R,1 Fails on G2(e) (with Oliver Schulte)

Case: j > k+1
Then c wouldn't have been at the bottom if it hadn't agreed with a (disagreed with e)

Page 107: Inductive Amnesia

Method *R,1 Fails on G2(e) (with Oliver Schulte)

Case: j > k+1
So c has already used up its two grues against e

Page 108: Inductive Amnesia

Method *R,1 Fails on G2(e) (with Oliver Schulte)

Feed c forever after
By positive invariance, the method either never projects or forgets the refutation of c at j - 1

Page 109: Inductive Amnesia

The Internal Problem of Induction

Necessary condition for success by positively order-invariant methods:
no data stream is a k-limit point of data streams as low as it after it has been presented for k steps

[Diagram: a bad configuration]

Page 110: Inductive Amnesia

The Internal Problem of Induction

[Diagram: a bad configuration vs. a good one]

Page 111: Inductive Amnesia

Corollary: Stacking Lemma

Necessary condition for identification of Gn+1(e) by positively order-invariant methods:
If e is at the bottom level after being presented up to stage k, then some data stream e' in Gn+1(e) - Gn(e) agreeing with the data so far is at least at level n+1

Page 112: Inductive Amnesia

Corollary: Stacking Lemma

Why?

Page 113: Inductive Amnesia

Corollary: Stacking Lemma

Else!

Page 114: Inductive Amnesia

Even Stacking Lemma

Similarly for Gn+1^even(e)

Page 115: Inductive Amnesia

Even Stacking Lemma

Why?

Page 116: Inductive Amnesia

Even Stacking Lemma

Else!

Page 117: Inductive Amnesia

Method *F,n Fails on Gn(e)

[Table: the Goodman spectrum repeated, with the flush entries highlighted]

Page 118: Inductive Amnesia

Method *F,n Fails on Gn(e)

Proof for α = 4: Suppose otherwise
Bring e uniquely to the bottom

Page 119: Inductive Amnesia

Method *F,n Fails on Gn(e)

[Diagram: e uniquely at the bottom]

Page 120: Inductive Amnesia

Method *F,n Fails on Gn(e)

Apply the stacking lemma
Let e' be in G4(e) - G3(e) at or above level 4
Let e'' be the same except at the first place k where e' differs from e
Feed e' forever after

[Diagram: levels 0 through 4, with e, e', e'' and "?"]

Page 121: Inductive Amnesia

Method *F,n Fails on Gn(e)

Timidity, stubbornness and positive invariance hold the picture fixed up to k

Page 122: Inductive Amnesia

Method *F,n Fails on Gn(e)

Ouch! Positive invariance, timidity, stubbornness, and n = 4.

[Figure: the ranking before and after stage k, with levels 0 and 4 marked and e and e' shown in both; the jump of 4 yields the contradiction.]

Page 123: Inductive Amnesia

Method *F,n Fails on Gn^even(e)

Same, using the even stacking lemma.

Page 124: Inductive Amnesia

Part VI:

Positive Results

Page 125: Inductive Amnesia

How to Program Epistemic States

Hamming distance: Δ(e, e') = {n : e(n) ≠ e'(n)}; ρ(e, e') = |Δ(e, e')|

[Figure: two data streams e and e' with ρ(e, e') = 9.]
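The distance ρ is straightforward to compute on finite prefixes of data streams; the following Python sketch (illustrative, not from the slides) implements Δ and ρ directly from the definitions above:

```python
def hamming_delta(e, f):
    """Delta(e, f): the set of positions where two (finite prefixes of)
    data streams disagree."""
    return {n for n, (a, b) in enumerate(zip(e, f)) if a != b}

def hamming_distance(e, f):
    """rho(e, f) = |Delta(e, f)|: the number of disagreements."""
    return len(hamming_delta(e, f))

# e and e' differ at positions 1 and 3:
print(hamming_distance("0000", "0101"))  # 2
```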

Page 126: Inductive Amnesia

Hamming Algebra

a ≤H b mod e iff Δ(e, a) ⊆ Δ(e, b)

Hamming (mod e = 0 0 0):

        1 1 1
  1 1 0   1 0 1   0 1 1
  1 0 0   0 1 0   0 0 1
        0 0 0
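The ordering ≤H mod e can be checked directly as containment of difference sets; a minimal sketch (the example strings are vertices of the 3-cube pictured above):

```python
def delta(e, f):
    """Positions where the streams e and f disagree."""
    return {n for n, (a, b) in enumerate(zip(e, f)) if a != b}

def hamming_leq(a, b, e):
    """a <=_H b mod e  iff  Delta(e, a) is a subset of Delta(e, b)."""
    return delta(e, a) <= delta(e, b)

# Mod e = 000: 100 sits below 110, but 100 and 010 are incomparable.
print(hamming_leq("100", "110", "000"))  # True
print(hamming_leq("100", "010", "000"))  # False
```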

Page 127: Inductive Amnesia

Epistemic States as Boolean Ranks

[Figure: the Hamming algebra rooted at e, ranking G^even(e).]

Page 128: Inductive Amnesia

Advantage of Hamming Rank

No violations of the limit point condition over finite levels:

[Figure: a possibility at a finite level; nobody lower can match it.]

Page 129: Inductive Amnesia

*R,1, *J,1 can identify G^even(e)

[Table: Min, Flush, Jeffrey, Ratchet, Lex, and Cond against G0^even(e), G1^even(e), G2^even(e), ..., Gn^even(e), G^even(e); cells record success ("yes"/"no") or the least parameter value required (entries such as = 1, = n+1).]

Page 130: Inductive Amnesia

*R,1, *J,1 can identify G^even(e)

Proof: Let S be generated by the finite ranks of the Hamming algebra rooted at e.

[Figure: the ranking S over G^even(e), rooted at e.]

Page 131: Inductive Amnesia

*R,1, *J,1 can identify G^even(e)

Let a be an arbitrary element of G^even(e). So a is at a finite level of S.

[Figure: a at a finite level of S, above e.]

Page 132: Inductive Amnesia

*R,1, *J,1 can identify G^even(e)

Consider the principal ideal of a. These are the possibilities that differ from e only where a does.

[Figure: the principal ideal of a inside S, above e.]

Page 133: Inductive Amnesia

*R,1, *J,1 can identify G^even(e)

So these are the possibilities that have just one difference from the truth for each level below the truth.

[Figure: a at level 3, with one possibility per level below it, each one difference closer to e.]

Page 134: Inductive Amnesia

*R,1, *J,1 can identify G^even(e)

Induction = hypercube rotation

Page 135: Inductive Amnesia

*R,1, *J,1 can identify G^even(e)

Example

[Figure: a and e on the hypercube.]

Page 136: Inductive Amnesia

*R,1, *J,1 can identify G^even(e)

Example

[Figure: the hypercube rotating; a and e.]

Page 137: Inductive Amnesia

*R,1, *J,1 can identify G^even(e)

[Figure: the hypercube rotating; a and e.]

Page 138: Inductive Amnesia

*R,1, *J,1 can identify G^even(e)

[Figure: the hypercube rotating; a and e.]

Page 139: Inductive Amnesia

*R,1, *J,1 can identify G^even(e)

Convergence

[Figure: the rotation completed, with a at the bottom.]

Page 140: Inductive Amnesia

*R,1, *J,1 can identify G^even(e)

Other possibilities have more than enough differences to climb above the truth.

*J,1 doesn't backslide, since the rotation keeps refuted possibilities rising.

Page 141: Inductive Amnesia

*R,2 is Complete

[Table: Min, Flush, Jeffrey, Ratchet, Lex, and Cond against G0(e), G1(e), G2(e), G3(e), G∞(e); cells record success ("yes"/"no") or the least parameter value required (entries such as = 2, = n+1).]

Page 142: Inductive Amnesia

*R,2 is Complete

Proposition: *R,2 is a complete function identifier.

Proof: Let K be countable. Partition K into finite variant classes C0, C1, ..., Cn, ...

Page 143: Inductive Amnesia

*R,2 is Complete

Impose the Hamming distance ranking on each equivalence class. Now raise the nth Hamming ranking by n.

[Figure: the columns C0, C1, C2, C3, C4 stacked at increasing heights in the ranking S.]
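The stacked ranking just described can be sketched numerically on finite prefixes: rank a stream within its column by Hamming distance from that column's representative, then lift the nth column by n. The representatives below are hypothetical placeholders, not from the slides:

```python
def hamming(e, f):
    # Number of positions where the finite prefixes disagree.
    return sum(a != b for a, b in zip(e, f))

def stacked_rank(n, rep, x):
    """Rank of x in column C_n: its Hamming level above the
    representative rep of C_n, raised by n."""
    return n + hamming(rep, x)

# Column C_2 with (hypothetical) representative "0000":
print(stacked_rank(2, "0000", "0000"))  # 2  (the representative itself)
print(stacked_rank(2, "0000", "0100"))  # 3  (one difference)
```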

Page 144: Inductive Amnesia

*R,2 is Complete

Else we might generate horizontal limit points:

[Figure: the columns C0, C1, C2, C3, C4 all starting at the bottom of S.]

Page 145: Inductive Amnesia

*R,2 is Complete

Data streams in different columns differ infinitely often from the truth.

[Figure: the truth zooming to the bottom past the other columns C0–C4.]

Page 146: Inductive Amnesia

*R,2 is Complete

Data streams in the same column just barely make it, because they jump by 2 for each difference from the truth.

[Figure: the columns C0–C4 in S.]

Page 147: Inductive Amnesia

*R,2 is Complete

Data streams in the same column just barely make it, because they jump by 2 for each difference from the truth.

[Figure: the columns C0–C4 in S.]

Page 148: Inductive Amnesia

*R,2 is Complete

Convergence at least by the stage when 2m differences from e have been observed for each ei below e that is not in the column of e.

[Figure: e0, e3, e4, ..., em below e across the columns C0–C4 of S.]

Page 149: Inductive Amnesia

How about *J,2?

The same thing works, right?

[Figure: the stacked columns C0–C4 in S.]

Page 150: Inductive Amnesia

The Wrench in the Works

Proposition: Even *J,2 can't succeed if we add ¬e to G^even(e) when S extends the Hamming ranking.

Proof: Suppose otherwise.

Page 151: Inductive Amnesia

The Wrench in the Works

Feed ¬e until it is uniquely at the bottom.

[Figure: ¬e alone at the bottom after stage k.]

Page 152: Inductive Amnesia

The Wrench in the Works

Let n exceed k and the original height of ¬e. Let a = ¬e ‡ n and b = ¬e ‡ n+1.

[Figure: ¬e at the bottom, with a and b branching off after stages n and n+1 > k.]

Page 153: Inductive Amnesia

The Wrench in the Works

By positive invariance, timidity, and stubbornness...

[Figure: the picture fixed through stage n: ¬e at the bottom, a and b above.]

Page 154: Inductive Amnesia

The Wrench in the Works

By positive invariance, timidity, stubbornness, and the fact that ¬e was alone in the basement... Ouch!!!

[Figure: a and b forced to the same position above ¬e.]

Page 155: Inductive Amnesia

Solution: A Different Initial State

Goodman distance: Δg(e, e') = {n : e' grues e at n}; ρg(e, e') = |Δg(e, e')|

[Figure: two data streams e and e' with ρg(e, e') = 6.]
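The slides do not spell out the grue relation, but reading "e' grues e at n" as "the agreement between e' and e flips at position n" (with a disagreement at position 0 counting as a flip) matches the Goodman diagram on the next slide. Under that assumption, ρg can be computed as follows:

```python
def goodman_distance(e, f):
    """rho_g(e, f): the number of positions where f switches between
    agreeing and disagreeing with e (a disagreement at position 0
    counts as a switch)."""
    count, agreeing = 0, True
    for a, b in zip(e, f):
        if (a == b) != agreeing:
            agreeing = not agreeing
            count += 1
    return count

# One tail reversal: f agrees with e, then disagrees forever after.
print(goodman_distance("000000", "000111"))  # 1
# Hamming distance 1, but two grues: f dips away from e and comes back.
print(goodman_distance("000000", "010000"))  # 2
```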

Page 156: Inductive Amnesia

Hamming vs. Goodman Algebras

a ≤H b mod e iff Δ(e, a) ⊆ Δ(e, b)
a ≤G b mod e iff Δg(e, a) ⊆ Δg(e, b)

Goodman (mod e = 0 0 0):

        1 0 1
  1 1 0   0 1 0   1 0 0
  1 1 1   0 1 1   0 0 1
        0 0 0

Hamming (mod e = 0 0 0):

        1 1 1
  1 1 0   1 0 1   0 1 1
  1 0 0   0 1 0   0 0 1
        0 0 0
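Assuming the grue reading "agreement with e flips at n" (a disagreement at position 0 counts as a flip), both diagrams can be regenerated by grouping the 3-bit strings by their distance from e = 000 under each metric:

```python
from itertools import product

def hamming(e, f):
    # Number of positions where the strings disagree.
    return sum(a != b for a, b in zip(e, f))

def goodman(e, f):
    # Count switches between agreeing and disagreeing with e.
    count, agreeing = 0, True
    for a, b in zip(e, f):
        if (a == b) != agreeing:
            agreeing = not agreeing
            count += 1
    return count

def levels(dist, e):
    """Group all bit strings of len(e) by their distance from e."""
    out = {}
    for bits in product("01", repeat=len(e)):
        s = "".join(bits)
        out.setdefault(dist(e, s), []).append(s)
    return out

print(levels(hamming, "000"))
# {0: ['000'], 1: ['001', '010', '100'], 2: ['011', '101', '110'], 3: ['111']}
print(levels(goodman, "000"))
# {0: ['000'], 1: ['001', '011', '111'], 2: ['010', '100', '110'], 3: ['101']}
```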

Page 157: Inductive Amnesia

Epistemic States as Boolean Ranks

[Figure: side by side, the Goodman algebra rooted at e, ranking G∞(e), and the Hamming algebra rooted at e, ranking G^even(e) and G^odd(e).]

Page 158: Inductive Amnesia

Epistemic States as Boolean Ranks

[Figure: the Goodman algebra rooted at e, with finite ranks G0(e), G1(e), G2(e), G3(e), ...]

Page 159: Inductive Amnesia

*J,2 can identify G∞(e)

[Table: Min, Flush, Jeffrey, Ratchet, Lex, and Cond against G0(e), G1(e), G2(e), G3(e), G∞(e); cells record success ("yes"/"no") or the least parameter value required (entries such as = 2, = n+1).]

Page 160: Inductive Amnesia

*J,2 can identify G∞(e)

Proof: Use the Goodman ranking as the initial state. Show by induction that the method projects e until a grue occurs at n, then projects e ‡ n until another grue occurs at n', etc.

Page 161: Inductive Amnesia

Part VII:

Discussion

Page 162: Inductive Amnesia

Summary

Belief revision as inductive inquiry
Reliability vs. intuitive symmetries
Intuitive symmetries imply reliability for large
Intuitive symmetries restrict reliability for small
Sharp discriminations among proposed methods
Isolation of a fundamental epistemic dilemma
= 2 as a fundamental epistemic invariant
Learning as cube rotation
Surprising relevance of tail reversals

