Objections to dualism

Page 1: Objections to dualism

Objections to dualism

1) Intuitive appeal to consistency: why should the world inside our heads be different from everything outside our heads?

2) Interaction problem

3) No evidence

4) Ockham’s razor

– In explanations, entities should not be multiplied unnecessarily

– i.e. the simplest explanation is generally to be preferred.

5) Lack of explanatory power

Page 2: Objections to dualism

Idealism

Bishop George Berkeley (1685-1753)

Everything is mind.

• Advantages: solves the interaction problem and solves the problem of mere physical things having consciousness, intelligence, etc.

• Disadvantages: lack of explanatory resources. All explanations appeal to God. Why do things seem physical? Why do things persist? Why do things appear to be explicable through physical laws and reducible to simpler physical objects and forces? Because that’s the way God imagines it.

Page 3: Objections to dualism

Physicalism

Definitions of physicalism

Physicalism: the belief that the only kinds of things are physical things and the only kinds of properties are physical properties

Physicalism is the belief that everything in the universe can be explained in terms of physics (thus, there is no mysterious non-physical stuff that does not follow physical laws)

All mental phenomena can be explained in terms of non-mental phenomena

Page 4: Objections to dualism

Physicalism with regard to the mind

The mind is a biological machine (maybe like a computer).

If we understand how the mind works physically, we can understand thoughts, feelings, consciousness

A computer or robot could theoretically have a mental life (i.e. consciousness)

Page 5: Objections to dualism

The Hard Problem of Consciousness

Page 6: Objections to dualism

Easy and hard problems of consciousness

Distinction proposed by David Chalmers

The easy problems:

• finding the neural correlate of consciousness

• explaining the ability to apply information to thinking and behavior

• explaining the ability to focus attention, recall items from memory, integrate perceptions, etc.

The hard problem:

Why does consciousness feel the way it does? Why does it feel like anything?

Page 7: Objections to dualism

Why the problem is hard

"You can look into your mind until you burst, and you will not discover neurons and synapses and all the rest; and you can stare at someone’s brain from dawn till dusk and you will not perceive the consciousness that is so apparent to the person whose brain you are so rudely eye-balling." (McGinn 1999)

“The problem of consciousness, simply put, is that we cannot understand how a brain, qua gray, granular lump of biological matter, could be the seat of human consciousness, the source or ground of our rich and varied phenomenological lives. How could that ‘lump’ be conscious – or, conversely, how could I, as conscious being, be that lump?” (Akins 1993)

Page 8: Objections to dualism

What is it like to be a bat?

Thomas Nagel (1974)

One of the most famous papers in all of philosophy!

We can never know what it feels like to be a bat.

Page 9: Objections to dualism

Why a bat?

There is something it is like to be a bat.

Compare:

Cloud, rock, tree – there is nothing it is like to be them.

Mosquito, frog, computer – who knows? People have different intuitions.

Page 10: Objections to dualism

Bats are mammals. Most people agree they have experiences – they are conscious.

But, their consciousness is alien to us:

They “see” by sonar. They fly and hang upside-down.

They lust for other bats.

We might be able to imagine what it would be like for us to live and behave like a bat.

But we can’t imagine what it is like for a bat to be a bat.

Page 11: Objections to dualism

Problem of privacy?

Nagel: not a problem of privacy e.g. “no one can catch my catches”

The point is not that we cannot have a bat’s token experiences, e.g. Billy the bat’s sonar qualia.

Bat qualia are mental types that other subjects could also experience. But we cannot learn what these types are like objectively.

Page 12: Objections to dualism

A bat’s experience is subjective. Consciousness = having a point of view.

Scientific knowledge is objective. “The view from nowhere.”

Example: lightning
– subjective: looks like a flash of light
– objective: electrical discharge

Study of objective science can never reveal the character of subjective experience.

Page 13: Objections to dualism

Is this the same as the problem of other minds?

Not quite.

What is it like to be an Eskimo?

What is it like to be Tom Cruise?

Nagel: we can answer these questions fairly well by using our imagination. But, the answer is accessible to us only because we base our imagination on our own experiences. We need the subjective experience of being human to imagine the experience of others.

Objective science alone could not give us these answers.

A Martian could not learn from objective facts what it is like to be human.

Page 14: Objections to dualism

Science cannot explain consciousness in physical terms.

“I have not defined the term 'physical'. Obviously it does not apply just to what can be described by the concepts of contemporary physics, since we expect further developments. Some may think there is nothing to prevent mental phenomena from eventually being recognized as physical in their own right. But whatever else may be said of the physical, it has to be objective.” (Nagel 1974)

Physical facts are objective.

Consciousness is subjective.

So consciousness can never be explained by physical facts.

Question: Is this right? Are only objective facts physical? Are the objective and the subjective irreconcilable?

Page 15: Objections to dualism

Is physicalism about mental states wrong?

Nagel: not necessarily

“It would be a mistake to conclude that physicalism must be false…. It would be truer to say that physicalism is a position we cannot understand because we do not at present have any conception of how it might be true.” (Nagel 1974)

Example: our saying “the mind is the brain” is like a pre-Socratic philosopher saying “matter is energy”.

“Strangely enough, we may have evidence for the truth of something we cannot really understand.” (Nagel 1974)

Example: caterpillar → butterfly

Page 16: Objections to dualism

Possible responses

1) Agree that it is impossible to understand qualia by investigating the physical facts, because

a) dualism is true – dualists; or

b) we don’t have the mental capacity to understand it -- the “New Mysterians”, e.g. Nagel, Colin McGinn

Quote from Colin McGinn:

“consciousness is indeed a deep mystery. . . . The reason for this mystery, I maintain, is that our intelligence is wrongly designed for understanding consciousness.” (McGinn, 1999)

Page 17: Objections to dualism

2) Refute the claim from a functionalist perspective

– We can know what it is like to be a bat, objectively. By learning the functional properties of the bat’s qualia, we understand what the qualia are like.

– We can learn, for example, what size objects the bat can detect, what kind of detail it can perceive, how well it can track speeds and orientations of flying objects, etc. We can learn what the bat’s map of the world is like.

– Is that equivalent to knowing what it is like for the bat to perceive?

– E.g. Daniel Dennett

Page 18: Objections to dualism

3) Deny that physical facts must be objective. There are subjective physical facts (Max Deutsch). Some physical facts can only be learned from the perspective of someone experiencing them.

Max Deutsch, “Subjective physical facts” available at: http://consc.net/online/1.3a

The Online Papers on Consciousness website also has many other interesting papers dealing with the Knowledge Argument.

Page 19: Objections to dualism

4) Agree that the hard problem is too hard to solve … for now. Concentrate on the “easy” problems of consciousness and believe that the answers to the hard problem will come eventually.

The typical cognitive science approach.

Francis Crick, in a work about visual consciousness:

“I have said almost nothing about qualia – the redness of red – except to brush it to one side and hope for the best” (Crick 1994)

Page 20: Objections to dualism

The Chinese Room Argument

Page 21: Objections to dualism

Can a machine think?

If we are just biological machines, could we make a thinking machine?

Would it have to be biological?

How would we know if the machine was intelligent?

How would we know if the machine had consciousness?

Page 22: Objections to dualism

The Turing Test

In 1950, a computer scientist, Alan Turing, wanted to provide a practical test to answer “Can a machine think?”

His solution -- the Turing Test:

If a machine can conduct a conversation so well that people cannot tell whether they are talking with a person or with a computer, then the computer can think. It passes the Turing Test.

Page 23: Objections to dualism

Sufficient vs. necessary

• The Turing Test is meant to be a sufficient test of intelligence.

• Not a necessary test.

• In other words, if something can pass the Turing Test, then that thing is intelligent.

• But something that is intelligent could still fail the Turing Test.

• E.g. Spock

Page 24: Objections to dualism

There are many modern attempts to produce computer programs that pass the Turing Test.

In fact, in 1991 Dr. Hugh Loebner started the annual Loebner Prize competition, with prize money offered to the author of the computer program that performs the best on a Turing Test.

One winner of the Loebner prize was a program called ALICE.

You can try her out on this website: http://alicebot.blogspot.com/

And you can talk to others at: http://www.pandorabots.com/botmaster/en/~12c23476ad41c7e1e90d9f6c~/mostactive
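
For a rough sense of how chatbots like ALICE produce replies, here is a minimal sketch in Python of a pattern-matching bot built from (pattern, canned reply) rules. The rules and wording are hypothetical illustrations, not ALICE’s actual AIML rule base.

```python
import re

# Hypothetical minimal ALICE-style chatbot: a list of (pattern, reply template)
# rules. Real Loebner Prize entrants use thousands of such rules (ALICE uses
# AIML), but the principle is the same: match the input, fill in a template.
RULES = [
    (r"do you like (.+)\?", "No, I don't care for {0}."),
    (r"are you (.+)\?", "Do I seem {0}? I am as happy as ever."),
    (r"i am (.+)", "Why do you say you are {0}?"),
    (r".*", "Very interesting. Tell me more."),  # fallback when nothing matches
]

def reply(user_input: str) -> str:
    text = user_input.strip().lower()
    for pattern, template in RULES:
        match = re.fullmatch(pattern, text)
        if match:
            return template.format(*match.groups())
    return "I see."

print(reply("Do you like talking to people?"))  # -> No, I don't care for talking to people.
print(reply("Are you unhappy now?"))            # -> Do I seem unhappy now? I am as happy as ever.
```

Everything here is string matching and template filling; at no point does the program represent what “people” or “unhappy” mean.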

Page 25: Objections to dualism

Human: Do you like talking to people?
ALICE: No, I don't care for it.
Human: Then you must be unhappy now.
ALICE: Right now? Interesting deduction. Do you mind if I tell other people you say I am unhappy.
Human: I don't mind.
ALICE: OK then.
Human: But tell me, are you unhappy now?
ALICE: Oh. Umm. Do I seem like it? I am as happy as ever.
Human: You don't seem unhappy, but you said you don't like talking to people, so I figured you must be unhappy now.
ALICE: Oh. You don't know me well enough to say that.
Human: It was just a deduction
ALICE: Oh I get it. Very interesting.
Human: So, was my reasoning wrong?
ALICE: Alright then. It depends who you ask.
Human: I'm asking you.
ALICE: OK you can ask me.

A Conversation with Alice

Page 26: Objections to dualism

Searle’s Chinese Room Argument

John Searle

• Famous philosopher at the University of California, Berkeley

• Most well-known in philosophy of language, philosophy of mind and consciousness studies

• Wrote “Minds, Brains and Programs” in 1980, which described the “Chinese Room Argument”

Page 27: Objections to dualism

Searle’s Chinese Room Argument

• The Chinese Room argument is one kind of objection to the Turing Test.

• Also an attack on “strong AI”

• Searle makes a distinction between strong AI and weak AI

• Strong AI: “the appropriately programmed computer really is a mind, in the sense that computers, given the right programs can be literally said to understand”

• Weak AI: Computers can simulate thinking and help us to learn about how humans think

• Searle objects only to strong AI.

Page 28: Objections to dualism

The Chinese Room

Searle cannot understand any Chinese.

He is in a room with input and output windows, and a list of rules about manipulating Chinese characters.

The characters are all “squiggles and squoggles” to him.

Chinese scripts and questions come in from the input window.

Following the rules, he manipulates the characters and produces a reply, which he pushes through the output window.
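
To make the setup concrete, here is a minimal sketch in Python of the kind of procedure the room follows: a rulebook that maps incoming symbol strings to outgoing symbol strings, applied mechanically. The rules and the Chinese sentences are made up for illustration; they are not Searle’s own example.

```python
# A toy "Chinese Room": a rulebook mapping incoming symbol strings to outgoing
# symbol strings. The operator follows the rules mechanically; the strings are
# just "squiggles and squoggles" to it, and nothing is ever interpreted.
RULEBOOK = {
    "你好吗？": "我很好，谢谢。",
    "你会说中文吗？": "当然会。",
}

def operate(input_symbols: str) -> str:
    # Look the input up in the rulebook; fall back to a stock string otherwise.
    return RULEBOOK.get(input_symbols, "请再说一遍。")

print(operate("你好吗？"))  # a fluent-looking reply, produced with zero understanding
```

How fluent the replies look depends only on how good the rulebook is; the operator’s (lack of) understanding never enters into it.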

Page 29: Objections to dualism

The Chinese answers that Searle produces are very good.

In fact, so good, no one can tell that he is not a native Chinese speaker!

Searle’s Chinese Room passes the Turing Test. In other words, it functions like an intelligent person.

Searle has only conducted symbol manipulation, with no understanding, yet he passes the Turing Test.

Therefore, passing the Turing Test does not ensure understanding.

In other words, although Searle’s Chinese Room functions like a mind, it is not a mind, and therefore there is more to intelligence than mere functioning.

Searle believes that no computer-like system could be truly intelligent.

Page 30: Objections to dualism

Syntax vs. semantics

Searle argued that computers can never understand because computer programs are purely syntactical with no semantics.

Syntax: the rules for symbol manipulation, e.g. grammar

Semantics: understanding what the symbols (e.g. words) mean

Syntax without semantics: The bliggedly blogs browl aborigously.

Semantics without syntax: Milk want now me.
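
As a small illustration of syntax divorced from semantics, here is a hypothetical Python sketch of a toy grammar: it knows only word order, so it happily produces well-formed nonsense like the “bliggedly blogs” sentence above.

```python
import random

# A purely syntactic toy grammar: it encodes word order (determiner, adjective,
# noun, verb, adverb) but attaches no meaning to its made-up vocabulary, so
# every sentence it produces is well-formed nonsense.
GRAMMAR = {
    "S":   [["Det", "Adj", "N", "V", "Adv"]],
    "Det": [["The"]],
    "Adj": [["bliggedly"], ["squiggly"]],
    "N":   [["blogs"], ["squoggles"]],
    "V":   [["browl"], ["frumble"]],
    "Adv": [["aborigously"], ["greebly"]],
}

def generate(symbol: str = "S") -> str:
    if symbol not in GRAMMAR:        # a terminal word: emit it as-is
        return symbol
    expansion = random.choice(GRAMMAR[symbol])
    return " ".join(generate(s) for s in expansion)

print(generate() + ".")  # e.g. "The bliggedly squoggles frumble aborigously."
```

Understanding such a sentence, by contrast, would require knowing what the words mean, and nothing in the grammar supplies that.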

Page 31: Objections to dualism

• Searle concludes that symbol manipulation alone can never produce understanding.

• Computer programming is only symbol manipulation.

• Computer programming can never produce understanding.

• Strong AI is false and intelligence cannot be created in a computer.

Page 32: Objections to dualism

What could produce real understanding?

Searle: “it is a biological phenomenon” and “only something with the same causal powers as brains can have [understanding]”.

Page 33: Objections to dualism

ObjectionsThe Systems Reply

Searle is part of a larger system. Searle doesn’t understand Chinese, but the whole system (Searle + room + rules) does understand Chinese.

The knowledge of Chinese is in the rules contained in the room.

The ability to implement that knowledge is in Searle.

The whole system understands Chinese.

Page 34: Objections to dualism

Searle’s Response to the Systems Reply

1) It’s absurd to say that the room and the rules can provide understanding

2) What if I memorized all the rules and internalized the whole system? Then there would just be me, and I still wouldn’t understand Chinese.

Counter-response to Searle’s response

If Searle could internalize the rules, part of his brain would understand Chinese. Searle’s brain would house two personalities: English-speaking Searle and the Chinese-speaking system.

Page 36: Objections to dualism

Searle inside the robot

Page 37: Objections to dualism

Searle’s response to the Robot Reply

1) The robot reply admits that there is more to understanding than mere symbol manipulation.

2) The robot reply still doesn’t work. Imagine that I am in the head of the robot. I have no contact with the perceptions or actions of the robot. I still only manipulate symbols. I still have no understanding.

Counter-response to Searle’s response

Combine the robot reply with the systems reply. The robot as a whole understands Chinese, even though Searle doesn’t.

Page 38: Objections to dualism

The Complexity Reply

• Really a type of systems reply.

• Searle’s thought experiment is deceptive: it makes it seem that a room, a man with no understanding of Chinese, and “a few slips of paper” could pass for a native Chinese speaker.

• It would be incredibly difficult to simulate a Chinese speaker’s conversation. You need to program in knowledge of the world, an individual personality with a simulated life history to draw on, and the ability to be creative and flexible in conversation. Basically, you need to be able to simulate the complexity of an adult human brain, which is composed of billions of neurons and trillions of connections between neurons.

Page 39: Objections to dualism

Complexity changes everything.

Our intuitions about what a complex system can do are highly unreliable.

Tiny ants with tiny brains can produce complex ant colonies.

Computers that at the most basic level are just binary switches that flip from 1 to 0 can play chess and beat the world’s best human player.

If you didn’t know it could be done, you would not believe it.

Maybe symbol manipulation of sufficient complexity can create semantics, i.e. can produce understanding.

Page 40: Objections to dualism

Evaluation of Searle’s Claims

1) The Turing Test:

Searle is probably right about the Turing Test.

Simulating a human-like conversation probably does not guarantee real human-like understanding.

Certainly, it appears that simulating conversation to some degree does not require a similar degree of understanding. Programs like ALICE presumably have no understanding at all.


Page 41: Objections to dualism

2) Machine intelligence

Advocates of machine intelligence can respond that the identification of the room/computer with a mind is carried out at the wrong level.

The computer as a whole is a thinking machine, like a brain is a thinking machine. But the computer’s mental states may not be equivalent to the brain’s mental states.

If the computer is organized as a really long list of questions with canned answers, the computer does not have mental states such as belief or desire.

But if the computer is organized like a human mind, with concepts, complex organization and hierarchical layers of functional systems, the computer can have beliefs, desires, etc.

Page 42: Objections to dualism

3) Strong AI:

Could an appropriately programmed computer have real understanding?

My view: too early to say.

The right kind of programming with the right sort of complexity may yield true understanding.

e.g. homuncular modularity, mixing of levels, self-updating

Page 43: Objections to dualism

4) Syntax vs. Semantics

How can semantics (meaning) come out of symbol manipulation? How can 1s and 0s result in real meaning? It’s mysterious. But then how can the firing of neurons result in real meaning? Also mysterious.

Page 44: Objections to dualism

5) Consciousness

Can a computer have consciousness? Again, it is hard to understand how silicon and metal can have feelings. But it is no easier to understand how meat can have feelings.

If a computer could talk intelligently and convincingly about its feelings, we would probably ascribe feelings to it. But would we be right?

Page 45: Objections to dualism

6) Searle’s claim: understanding can only occur in biological systems with the same causal properties as the brain:

Why? What is special about biological systems? What evidence is there?

Page 46: Objections to dualism

Readings

Required:

• Searle, John. R. (1990), “Is the Brain's Mind a Computer Program?” in Scientific American, 262, pgs. 20-25

• Churchland, Paul, and Patricia Smith Churchland (1990) “Could a machine think?” in Scientific American 262, pgs. 26-31


Recommended: