Page 1: Reasons for (prior) belief in Bayesian epistemology

Reasons for (prior) belief in Bayesian epistemology

Christian List(joint work with Franz Dietrich)

http://personal.lse.ac.uk/LIST/

Paper forthcoming in Synthese

LSE, November 2012

Page 2: Reasons for (prior) belief in Bayesian epistemology

Introduction

Bayesian epistemology

tells us how to move from prior to posterior beliefs in light of new evidence or information,

but says little about where our prior beliefs come from.

It offers few resources to describe some prior beliefs as rational or well-justified, and others as irrational or unreasonable.

Page 3: Reasons for (prior) belief in Bayesian epistemology

Rational choice theory

A different strand of epistemology takes the central epistemological question to be

not how to change one's beliefs in light of new evidence (though this obviously remains important),

but what reasons justify a given set of beliefs in the first place.

We offer an account of rational belief formation that closes some of the gap between Bayesianism and its reason-based alternative.

We formalize the idea that an agent can have reasons for his or her (prior) beliefs, as distinct from evidence/information in the Bayesian sense.

This is part of a larger programme of research on the role of reasons in rational agency (Dietrich and List 2012a,b).

Page 4: Reasons for (prior) belief in Bayesian epistemology

[Diagram: prior beliefs (fixed/given) combine with evidence/information (changeable) to yield posterior beliefs.]

Page 5: Reasons for (prior) belief in Bayesian epistemology

[Diagram: a credibility relation (fixed/given) combines with doxastic reasons (changeable) to yield the (prior) beliefs; as before, these combine with evidence/information (also changeable) to yield posterior beliefs.]

Page 6: Reasons for (prior) belief in Bayesian epistemology

Overview of this talk

Beliefs

Reasons for belief

An example

An axiomatic characterization result

The cardinal case

Page 7: Reasons for (prior) belief in Bayesian epistemology

Overview of this talk

Beliefs

Reasons for belief

An example

An axiomatic characterization result

The cardinal case

Page 8: Reasons for (prior) belief in Bayesian epistemology

The objects of belief

We want to model how an agent forms his or her prior beliefs over some set X of basic objects of belief.

The elements of X could be, e.g., possible worlds, states of the world, or rival hypotheses. We call them epistemic possibilities.

We only assume that the alternatives in X are mutually exclusive and jointly exhaustive of the relevant space of possibilities.

Page 9: Reasons for (prior) belief in Bayesian epistemology

An agent’s beliefs

In Bayesian epistemology, the agent's beliefs are usually represented by a credence function (subjective probability function) on X, which assigns to each possibility in X a real number between 0 and 1, with a sum-total of 1.

However, we here begin by representing the agent's beliefs by a credence order ≿ on X (a complete and transitive binary relation).

x ≿ y means that the agent believes x at least as strongly as y.

(≻ and ∼ denote the induced strict and indifference relations.)
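
For concreteness, a minimal sketch (my own illustration, not from the talk; the possibility names and numbers are hypothetical) of how such a credence order on a finite X can be encoded: assigning each possibility a numeric strength automatically yields a complete and transitive order, with x ≿ y exactly when strength(x) ≥ strength(y).

    # Illustrative sketch (hypothetical names and values): a credence order on
    # a finite X, encoded by a numeric strength per possibility. Any such
    # encoding is automatically complete and transitive.
    strength = {"x1": 2, "x2": 2, "x3": 1}

    def weakly_more_credible(x, y):    # x is believed at least as strongly as y
        return strength[x] >= strength[y]

    def strictly_more_credible(x, y):  # x is believed strictly more strongly than y
        return strength[x] > strength[y]

    def indifferent(x, y):             # x and y are believed equally strongly
        return strength[x] == strength[y]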

Page 10: Reasons for (prior) belief in Bayesian epistemology

Beliefs and belief formation

Bayesian epistemology gives an account of how an agent's beliefs should rationally change in response to evidence or information.

If the agent receives evidence that rules out some possibilities in X, he or she must change the credence order so as to rank any possibilities ruled out below (or weakly below) any possibilities not ruled out, while not changing other rankings (Bayesian updating).

Here, however, we focus on the problem of belief formation: how does the agent arrive at his or her beliefs over X in the first place, before receiving any evidence?
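
As an illustration of the updating rule just described (my own sketch, with hypothetical possibility names), ruled-out possibilities are demoted below all surviving ones while the order within each group is left untouched:

    # Illustrative sketch of the updating rule described above. The ranking is
    # a list from most to least credible; ruled-out possibilities are moved
    # below all survivors, and the relative order within each group is kept.
    def update(ranking, ruled_out):
        survivors = [x for x in ranking if x not in ruled_out]
        excluded = [x for x in ranking if x in ruled_out]
        return survivors + excluded

    prior_order = ["x1", "x2", "x3", "x4"]
    print(update(prior_order, {"x1", "x3"}))  # ['x2', 'x4', 'x1', 'x3']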

Page 11: Reasons for (prior) belief in Bayesian epistemology

Beliefs and belief formation

We can look at this problem from both positive and normative perspectives, i.e., we can ask

either how an agent actually forms his or her beliefs,

or how he or she ought rationally to do so.

We develop a formal framework that can be used to investigate both questions. This is where reasons come into play.

Page 12: Reasons for (prior) belief in Bayesian epistemology

Overview of this talk

Beliefs

Reasons for belief

An example

An axiomatic characterization result

The cardinal case

Page 13: Reasons for (prior) belief in Bayesian epistemology

Reasons in general

Reasons can be conceptualized in a number of ways.

Scanlon, e.g., defines a reason as "a consideration that counts in favor of some judgment-sensitive attitude [e.g., belief or desire]" (What We Owe to Each Other, p. 67).

We adopt a more general definition, preserving the "counting" part but not the "in favor" part of Scanlon's definition.

We think of reasons as propositions that play a special role (that somehow "count" or "matter") in the agent's attitude formation – in the present context, in his/her belief formation.

Page 14: Reasons for (prior) belief in Bayesian epistemology

Doxastic reasons

A proposition (in general) is a subset of X. It is true of the possibilities contained in it, and false of all others.

More generally, propositions could be represented by sentences from a suitable language.

We can also think of each proposition as capturing a particular property of the epistemic possibilities.

Now suppose that there is some set of propositions, D, that the agent focuses on in his/her belief formation process; we call these the agent's doxastic reasons.

Page 15: Reasons for (prior) belief in Bayesian epistemology

Doxastic reasons

When a proposition is in D, this does not mean that the agent believes it; it only means that, in forming his/her belief about each epistemic possibility, the agent considers whether or not the proposition is true of that possibility.

So, the propositions in D stand for questions that the agent asks him/herself in the process of belief formation.

We further define 𝒟 to be the set of all possible such sets D. 𝒟 could simply be the set of all possible sets of propositions (or smaller – I set technicalities aside).
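
A small sketch of the set-theoretic reading used here (illustrative names only, not from the talk): propositions are subsets of X, D is a set of such propositions, and "R is true of x" is simply membership.

    # Illustrative sketch: propositions as subsets of X; D is the set of
    # doxastic reasons the agent focuses on; "R is true of x" means x in R.
    X = {"x1", "x2", "x3"}

    R1 = frozenset({"x1"})        # true of x1 only
    R2 = frozenset({"x1", "x2"})  # true of x1 and x2

    D = {R1, R2}

    def reasons_true_of(x, D):
        """The set {R in D : R is true of x} used throughout the talk."""
        return frozenset(R for R in D if x in R)

    print(reasons_true_of("x1", D) == {R1, R2})  # True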

Page 16: Reasons for (prior) belief in Bayesian epistemology

The focal doxastic reasons

To indicate that the agent's credence order ≿ depends on his or her set D, we append the subscript D to the symbol ≿.

≿D = the agent's credence order when D is focal

A full model of an agent's beliefs requires the ascription of a family (≿D)D∈𝒟 of credence orders to the agent, one ≿D for each D ∈ 𝒟.

So, how exactly does the credence order ≿D depend on D?

Page 17: Reasons for (prior) belief in Bayesian epistemology

Overview of this talk

Beliefs

Reasons for belief

An example

An axiomatic characterization result

The cardinal case

Page 18: Reasons for (prior) belief in Bayesian epistemology

An example: meeting in DC

Suppose I have agreed to meet Alexandru somewhere in Washington DC at 12 noon tomorrow.

We have not agreed on a place, and we have no way to communicate.

I have no evidence in the standard sense as to where Alexandru is likely to expect me.

Here are some possibilities:

Union Station

Lincoln Memorial

White House

Hilary Clinton's apartment

How do I form my prior beliefs over where Alexandru might expect me?

Page 19: Reasons for (prior) belief in Bayesian epistemology

An example: meeting in DC

The following are some possible considerations that might be relevant:

A : The place in question is where one arrives in Washington ({Union}).

F : The place in question is world-famous ({Lincoln, WH}).

R : The place in question has restricted access ({WH, Hilary C.’s apt.}).

My credence orderings across variations in D might look like this:

D={A,F,R} Union ≻D Lincoln ≻D WH ≻D Hilary Clinton's apt.
D={A,F} Union ≻D Lincoln ∼D WH ≻D Hilary Clinton's apt.
D={A,R} Union ≻D Lincoln ≻D WH ∼D Hilary Clinton's apt.
D={F,R} Lincoln ≻D WH ≻D Union ≻D Hilary Clinton's apt.
D={A} Union ≻D Lincoln ∼D WH ∼D Hilary Clinton's apt.
D={F} WH ∼D Lincoln ≻D Union ∼D Hilary Clinton's apt.
D={R} Union ∼D Lincoln ≻D WH ∼D Hilary Clinton's apt.
D=∅ Union ∼D Lincoln ∼D WH ∼D Hilary Clinton's apt.

Page 20: Reasons for (prior) belief in Bayesian epistemology

An example: meeting in DC

Can we say something systematic about these beliefs?

They are what we call reason-based.

Page 21: Reasons for (prior) belief in Bayesian epistemology

Reason-based beliefs

The agent's family of credence orders is reason-based if there exists a binary relation ⊵ over sets of reasons (a credibility relation) such that, for any D ∈ 𝒟 and any x, y ∈ X:

the agent believes x at least as strongly as y when focusing on the reasons in D

if and only if

the set of reasons in D that are true of x is ranked at least as highly as the set of reasons in D that are true of y;

formally, x ≿D y ⟺ {R ∈ D : x ∈ R} ⊵ {R ∈ D : y ∈ R}.

Page 22: Reasons for (prior) belief in Bayesian epistemology

The example again… The following are a few possible considerations that might be relevant:

A : The place in question is where one arrives in Washington ({Union}).

F : The place in question is world-famous ({Lincoln, WH}).

R : The place in question has restricted access ({WH, Hilary C.’s apt.}).

My credence orderings across variations in D might look like this:

D={A,F,R} Union ≻D Lincoln ≻D WH ≻D Hilary Clinton's apt.
D={A,F} Union ≻D Lincoln ∼D WH ≻D Hilary Clinton's apt.
D={A,R} Union ≻D Lincoln ≻D WH ∼D Hilary Clinton's apt.
D={F,R} Lincoln ≻D WH ≻D Union ≻D Hilary Clinton's apt.
D={A} Union ≻D Lincoln ∼D WH ∼D Hilary Clinton's apt.
D={F} WH ∼D Lincoln ≻D Union ∼D Hilary Clinton's apt.
D={R} Union ∼D Lincoln ≻D WH ∼D Hilary Clinton's apt.
D=∅ Union ∼D Lincoln ∼D WH ∼D Hilary Clinton's apt.

These beliefs are reason-based, w.r.t. the following credibility relation:

{A} > {F} > {F,R} > ∅ > {R}

Page 23: Reasons for (prior) belief in Bayesian epistemology

Overview of this talk

Beliefs

Reasons for belief

An example

An axiomatic characterization result

The cardinal case

Page 24: Reasons for (prior) belief in Bayesian epistemology

Two axioms on the relationship between reasons and beliefs

Axiom 1. ‘Principle of insufficient reason.’

For any x, y ∈ X and any D ∈ 𝒟,
if {R ∈ D : x ∈ R} = {R ∈ D : y ∈ R},
then x ∼D y.

Axiom 2. ‘Invariance of relative likelihoods under the addition of irrelevant reasons.’

For any x, y ∈ X and any D, D' ∈ 𝒟 with D ⊆ D',
if for all R ∈ D'\D, x ∉ R and y ∉ R,
then x ≿D y ⟺ x ≿D' y.

Page 25: Reasons for (prior) belief in Bayesian epistemology

The basic representation theorem

(1) The agent's family of credence orders satisfies Axioms 1 and 2

if and only if

(2) it is reason-based.

That is, there exists a credibility relation ⊵ over sets of reasons such that, for any D ∈ 𝒟 and x, y ∈ X,

x ≿D y ⟺ {R ∈ D : x ∈ R} ⊵ {R ∈ D : y ∈ R}.

Page 26: Reasons for (prior) belief in Bayesian epistemology

Overview of this talk

Beliefs

Reasons for belief

An example

An axiomatic characterization result

The cardinal case

Page 27: Reasons for (prior) belief in Bayesian epistemology

Beliefs: the cardinal case

We now go beyond considering credence orders and represent an agent's beliefs in the more standard way by credence functions (subjective probability functions).

A credence function is a function Pr : X → [0,1] such that ∑x∈X Pr(x) = 1.

(Probabilities of propositions are defined in the usual way.)

We write PrD for the agent's credence function when D is his or her set of doxastic reasons in relation to the possibilities in X.

We are interested in how PrD depends on D.

Page 28: Reasons for (prior) belief in Bayesian epistemology

Two axioms on the relationship between reasons and beliefs

(the cardinal case)

Axiom 1. ‘Principle of insufficient reason’

For any x, y ∈ X and any D ∈ 𝒟,
if {R ∈ D : R is true of x} = {R ∈ D : R is true of y},
then PrD(x) = PrD(y).

Axiom 2. ‘Invariance of likelihood ratios under the addition of irrelevant reasons’

For any x, y ∈ X and any D, D' ∈ 𝒟 with D ⊆ D',
if no R in D'\D is true of x or y,
then PrD(x)/PrD(y) = PrD'(x)/PrD'(y).

Page 29: Reasons for (prior) belief in Bayesian epistemology

Theorem 2

(1) The agent's family of credence functions PrD across all D ∈ 𝒟 satisfies Axioms 1 and 2

if and only if

(2) there is a credibility function Cr from possible combinations of doxastic reasons into the real numbers such that, for all D ∈ 𝒟 and all x ∈ X,

PrD(x) = Cr({R ∈ D : R is true of x}) / ∑x'∈X Cr({R ∈ D : R is true of x'}).
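
As a numerical illustration of this formula (the Cr values below are hypothetical and only their ratios matter; the slides do not give any numbers), the prior over the meeting-in-DC possibilities can be computed directly:

    # Illustrative sketch with hypothetical credibility values: compute Pr_D(x)
    # as Cr({R in D : R is true of x}) divided by the normalizing sum over X.
    places = ["Union", "Lincoln", "WH", "Hilary Clinton's apt."]
    true_of = {
        "Union": {"A"},
        "Lincoln": {"F"},
        "WH": {"F", "R"},
        "Hilary Clinton's apt.": {"R"},
    }

    Cr = {  # hypothetical positive credibility values for the reason sets that arise
        frozenset({"A"}): 4.0,
        frozenset({"F"}): 3.0,
        frozenset({"F", "R"}): 2.0,
        frozenset(): 1.0,
        frozenset({"R"}): 0.5,
    }

    def prior(D):
        raw = {p: Cr[frozenset(true_of[p] & D)] for p in places}
        total = sum(raw.values())
        return {p: v / total for p, v in raw.items()}

    print(prior({"A", "F", "R"}))
    # Union ≈ 0.42, Lincoln ≈ 0.32, WH ≈ 0.21, Hilary Clinton's apt. ≈ 0.05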

Page 30: Reasons for (prior) belief in Bayesian epistemology

Remarks

The (prior) probability of a given possibility is proportional to the credibility of the set of doxastic reasons that are true of that possibility.

The factor of proportionality,

1 / ∑x'∈X Cr({R ∈ D : R is true of x'}),

depends on D and ensures that probabilities add up to 1.

Interpretationally, just as practical reasons are good-making features of actions, so doxastic reasons are plausible-making features of epistemic possibilities.

Page 31: Reasons for (prior) belief in Bayesian epistemology

[Diagram: a credibility relation (fixed/given) combines with doxastic reasons (changeable) to yield the (prior) beliefs; as before, these combine with evidence/information (also changeable) to yield posterior beliefs.]

