
The Neural Basis of Thought and Language Week 11 Metaphor and Bayes Nets.

Date post: 21-Dec-2015
Page 1

The Neural Basis of Thought and Language

Week 11

Metaphor and Bayes Nets

Page 2

Schedule

• Assignment 7 extension, due Wednesday night

• Last Week

– Aspect and Tense

– Event Structure Metaphor

• This Week

– Frames & how they map to X-schemas

– Inference, KARMA: Knowledge-based Action Representations for Metaphor and Aspect

• Next Week

– Grammar

Page 3

Announcement

• Panel: "Cruise Control": Careers in Artificial Intelligence.

• Friday, April 16th from 3-4:30 in Bechtel Hall 120A/B.

• The panel is an informal session where professionals in the field of AI will answer general questions about their entry into the field, trends, etc.

• Panelists will be:

– Peter Norvig from Google

– Charlie Ortiz of Teambotics at SRI

– Nancy Chang from ICSI.

• Moderator will be:

– Barbara Hightower, CS advisor.

Page 4

Quiz

1. What are metaphors? Give two examples of Primary Metaphors and sentences using them.

2. What are Event Structure Metaphors? Give an example.

3. How do Bayes Nets fit into the simulation story? What are the benefits of that model?

4. What are Dynamic Bayesian Networks?

Page 5

Going from motor control to abstract reasoning

• The sensory-motor system is directly engaged in abstract reasoning

• Both the physical domain and abstract domain are structured by schemas and frames, i.e. there are

– semantic roles, and

– relation between semantic roles

• schemas generally refer to embodied, “universal” knowledge, whereas frames are generally culturally specific

Page 6

The Commercial-Transaction schema

schema Commercial-Transaction
   subcase of Exchange
   roles
      customer        participant1
      vendor          participant2
      money           entity1 : Money
      goods           entity2
      goods-transfer  transfer1
      money-transfer  transfer2
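As an illustration (not part of the slides), the role bindings above can be sketched as a small Python data structure. The class and field names mirror the schema; the sample values ("Harry", "cafe", etc.) are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class Exchange:
    """Generic Exchange schema: two participants, two entities, two transfers."""
    participant1: str
    participant2: str
    entity1: str
    entity2: str
    transfer1: str
    transfer2: str

class CommercialTransaction(Exchange):
    """Subcase of Exchange: the Commercial-Transaction roles are bindings
    to the inherited Exchange roles, not new slots."""
    @property
    def customer(self): return self.participant1
    @property
    def vendor(self): return self.participant2
    @property
    def money(self): return self.entity1
    @property
    def goods(self): return self.entity2

# Hypothetical filler values for the roles:
ct = CommercialTransaction("Harry", "cafe", "$5", "coffee",
                           "goods-transfer", "money-transfer")
```

Binding `customer` to `participant1` (rather than declaring a new field) mirrors how a subcase schema inherits and specializes its parent's roles.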

Page 7

Quiz

1. What are metaphors? Give two examples of Primary Metaphors and sentences using them.

2. What are Event Structure Metaphors? Give an example.

3. How do Bayes Nets fit into the simulation story? What are the benefits of that model?

4. What are Dynamic Bayesian Networks?

Page 8

Metaphors

• metaphors are mappings from a source domain to a target domain

• metaphor maps specify the correlation between source-domain entities/relations and target-domain entities/relations

• they also allow inference to transfer from source domain to target domain (possibly, but less frequently, vice versa)

<TARGET> is <SOURCE>

Page 9

Primary Metaphors

• The key thing to remember about primary metaphors is that they have an experiential basis

• Affection Is Warmth

• Important is Big

• Happy is Up

• Intimacy is Closeness

• Bad is Stinky

• Difficulties are Burdens

• More is Up

• Categories are Containers

• Similarity is Closeness

• Linear Scales are Paths

• Organization is Physical Structure

• Help is Support

• Time Is Motion

• Relationships are Enclosures

• Control is Up

• Knowing is Seeing

• Understanding is Grasping

• Seeing is Touching

Page 10

Affection is Warmth

• Subjective Judgment: Affection

• Sensory-Motor Domain: Temperature

• Example: They greeted me warmly.

• Primary Experience: Feeling warm while being held affectionately.

• more examples:

– She gave me the cold shoulder

– Now that I've known such-and-such for a while, he's finally warming up to me.

Page 11

Important is Big

• Subjective Judgment: Importance

• Sensory-Motor Domain: Size

• Example: Tomorrow is a big day.

• Primary experience: As a child, important things in your environment are often big, e.g., parents, but also large things that exert a force on you

• more examples:

– Don't sweat the small stuff.

– I'll have a meeting with the big boss today.

Page 12

How are these metaphors developed?

• Conflation Hypothesis: Children hypothesize an early meaning for a source-domain word that conflates its literal and metaphorical senses

– experiencing warmth and affection when being held as a child

– observing a higher water level when there's more water in a cup

Page 13

Page 14

Page 15

The Dual Metaphors for Time

• Time is stationary and we move through it

– It takes a long time to write a book

– We are behind schedule

schedules are landmarks on this landscape that we have to be at by a certain time

• Time is a moving object

– The deadline is approaching

– He is forever chasing his past

the past is an object that has come by and moved past him

Page 16

A different experiment by Boroditsky & Ramscar, 2002

• “Next Wednesday's meeting has been moved forward two days. What day is the meeting now that it has been rescheduled?”

• Is the meeting Monday? or Friday?

Page 17

Results of the experiment

• two spatial primes:

A. participant sitting in an office chair moving through space. (ego-moving prime)

B. participant pulling an office chair towards himself with a rope. (time-moving prime)

• results:

A. more likely to say Friday

B. more likely to say Monday

Page 18

Quiz

1. What are metaphors? Give two examples of Primary Metaphors and sentences using them.

2. What are Event Structure Metaphors? Give an example.

3. How do Bayes Nets fit into the simulation story? What are the benefits of that model?

4. What are Dynamic Bayesian Networks?

Page 19

Event Structure Metaphor

• Target Domain: event structure

• Source Domain: physical space

• States are Locations

• Changes are Movements

• Causes are Forces

• Causation is Forced Movement

• Actions are Self-propelled Movements

• Purposes are Destinations

• Means are Paths

• Difficulties are Impediments to Motion

• External Events are Large, Moving Objects

• Long-term Purposeful Activities are Journeys

Page 20

The Dual of the ESM

• Attributes are possessions

• Changes are Movements of Possessions (acquisitions or losses)

• Causes are forces

• Causation is Transfer of Possessions (giving or taking)

• Purposes are Desired Objects

• Achieving a Purpose Is Acquiring a Desired Object

Page 21

Examples of the Dual

• I have a headache.

• I got a headache.

• My headache went away.

• The noise gave me a headache.

• The aspirin took away my headache.

• I'm in trouble. (Location ESM)

• The programming assignment gave me much trouble. (Object ESM)

Page 22

Quiz

1. What are metaphors? Give two examples of Primary Metaphors and sentences using them.

2. What are Event Structure Metaphors? Give an example.

3. How do Bayes Nets fit into the simulation story? What are the benefits of that model?

4. What are Dynamic Bayesian Networks?

Page 23

Simulation-based Understanding

[Diagram: Utterance ("Harry walked into the cafe.") → Analysis Process (using Constructions and General Knowledge) → Semantic Specification → Simulation → Belief State]

Page 24

Semantic Analysis

• Takes in constructions

– pairing of form and meaning

– Form pole = syntax

– Meaning pole = frames and other schemas

• Spits out semantic specification

– schemas with bound roles

Page 25

What exactly is simulation?

• Belief update plus X-schema execution

[Diagram: belief-state nodes (hungry, meeting, cafe, time of day) coupled to the WALK x-schema: ready → start → ongoing → finish → done, with iterate and at-goal links]

Page 26

Bayes Nets: Take away points

• Computational technique to capture best fit

– Probabilistic

– Approximation to neural spreading activation

• Easy to write down (intuitive)

– Nodes in terms of explicit causal relations

• Efficient

– Much smaller than full joint...

• Known mechanisms to do inference
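To make "much smaller than the full joint" concrete, here is a quick count (my arithmetic, not the slides') for the five-variable burglary network that appears later in the deck:

```python
# A full joint over n boolean variables needs 2^n - 1 independent numbers.
n_vars = 5
full_joint_size = 2 ** n_vars - 1

# The burglary network needs one number per CPT row instead:
# P(B): 1, P(E): 1, P(A|B,E): 4, P(J|A): 2, P(M|A): 2
cpt_sizes = [1, 1, 4, 2, 2]
bayes_net_size = sum(cpt_sizes)
```

The gap (10 numbers vs. 31) widens exponentially as variables are added, which is the efficiency claim above.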

Page 27

Review: Probability

• Random Variables

– Boolean/Discrete

• True/false

• Cloudy/rainy/sunny

– Continuous

• [0,1] (i.e. 0.0 <= x <= 1.0)

Page 28

Priors/Unconditional Probability

• Probability Distribution

– In absence of any other info

– Sums to 1

– E.g. P(Sunny=T) = .8 (thus, P(Sunny=F) = .2)

• This is a simple probability distribution

• Joint Probability

– P(Sunny, Umbrella, Bike)

• Table of size 2³ = 8

– Full Joint is a joint of all variables in model

• Probability Density Function

– Continuous variables

• E.g. Uniform, Gaussian, Exponential…

Page 29

Conditional Probability

• P(Y | X) is probability of Y given that all we know is the value of X

– E.g. P(cavity=T | toothache=T) = .8

• thus P(cavity=F | toothache=T) = .2

• Product Rule

– P(Y | X) = P(X ∧ Y) / P(X) (division by P(X) normalizes so the values sum to 1)


Page 30

Inference

Toothache  Cavity  Catch  Prob
False      False   False  .576
False      False   True   .144
False      True    False  .008
False      True    True   .072
True       False   False  .064
True       False   True   .016
True       True    False  .012
True       True    True   .108

P(Toothache=T)? P(Toothache=T, Cavity=T)? P(Toothache=T | Cavity=T)?
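The three queries can be answered mechanically by summing rows of the full joint table; a minimal sketch in Python (variable names are mine):

```python
# Full joint over (Toothache, Cavity, Catch), copied from the table above.
joint = {
    (False, False, False): .576, (False, False, True): .144,
    (False, True,  False): .008, (False, True,  True): .072,
    (True,  False, False): .064, (True,  False, True): .016,
    (True,  True,  False): .012, (True,  True,  True): .108,
}

def p(pred):
    """Marginalize: sum the joint over every row matching the predicate."""
    return sum(pr for row, pr in joint.items() if pred(row))

p_tooth = p(lambda r: r[0])                # P(Toothache=T)
p_tooth_cav = p(lambda r: r[0] and r[1])   # P(Toothache=T, Cavity=T)
p_cav = p(lambda r: r[1])                  # P(Cavity=T)
p_tooth_given_cav = p_tooth_cav / p_cav    # P(Toothache=T | Cavity=T), product rule
```

The conditional query is just the product rule from the previous page applied to two marginal sums.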

Page 31

Bayes Nets

[Network: Burglary and Earthquake are parents of Alarm; Alarm is parent of JohnCalls and MaryCalls]

P(B) = 0.001        P(E) = 0.002

B  E  | P(A|B,E)
T  T  | 0.95
T  F  | 0.94
F  T  | 0.29
F  F  | 0.001

A | P(J|A)
T | 0.90
F | 0.05

A | P(M|A)
T | 0.70
F | 0.01
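To make these CPTs concrete, here is a sketch (mine, not from the slides) that multiplies them into the joint and answers one query by brute-force enumeration:

```python
from itertools import product

# CPTs from the burglary network above.
P_B, P_E = 0.001, 0.002
P_A = {(True, True): 0.95, (True, False): 0.94,
       (False, True): 0.29, (False, False): 0.001}   # P(Alarm=T | B, E)
P_J = {True: 0.90, False: 0.05}                      # P(JohnCalls=T | Alarm)
P_M = {True: 0.70, False: 0.01}                      # P(MaryCalls=T | Alarm)

def pt(p, val):
    """Probability of a boolean value, given P(var=True)."""
    return p if val else 1 - p

def joint(b, e, a, j, m):
    """P(B,E,A,J,M) as the product of the CPTs (the net's factorization)."""
    return (pt(P_B, b) * pt(P_E, e) * pt(P_A[(b, e)], a)
            * pt(P_J[a], j) * pt(P_M[a], m))

# Query by enumeration: P(JohnCalls=T | Burglary=T)
num = sum(joint(True, e, a, True, m) for e, a, m in product([True, False], repeat=3))
den = sum(joint(True, e, a, j, m) for e, a, j, m in product([True, False], repeat=4))
p_john_given_burglary = num / den
```

Summing the denominator over everything but B recovers P(B=T) = 0.001, a quick sanity check that the factorization defines a proper joint.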

Page 32

Independence

[Diagrams: three-node network structures over X, Y, and Z]

X independent of Z? No / No / No

X conditionally independent of Z given Y? Yes / Yes / Yes

Page 33

Markov Blanket

X

X is independent of everything else given:

Parents, Children, Parents of Children

Page 34

Reference: Joints

• Representation of entire network

• P(X1=x1 ∧ X2=x2 ∧ ... ∧ Xn=xn) = P(x1, ..., xn) = ∏i=1..n P(xi | parents(Xi))

• How? Chain Rule

– P(x1, ..., xn) = P(x1 | x2, ..., xn) P(x2, ..., xn) = ... = ∏i=1..n P(xi | xi-1, ..., x1)

– Now use conditional independences to simplify

Page 35

Reference: Joint, cont.

P(x1, ..., x6) = P(x1) · P(x2|x1) · P(x3|x2, x1) · P(x4|x3, x2, x1) · P(x5|x4, x3, x2, x1) · P(x6|x5, x4, x3, x2, x1)

[Network diagram over X1–X6]


Page 37

Reference: Inference

• General case

– Variable Elimination

– P(Q | E) when you have P(R, Q, E)

– P(Q | E) = ∑R P(R, Q, E) / ∑R,Q P(R, Q, E)

• ∑R P(R, Q, E) = P(Q, E)

• ∑Q P(Q, E) = P(E)

• P(Q, E) / P(E) = P(Q | E)

Page 38

Reference: Inference, cont.

Q = {X1}, E = {X6}

R = X \ (Q ∪ E)

P(x1, ..., x6) = P(x1) · P(x2|x1) · P(x3|x1) · P(x4|x2) · P(x5|x3) · P(x6|x5, x2)

[Network diagram over X1–X6: X1 → X2, X3; X2 → X4, X6; X3 → X5; X5 → X6]

P(x1, x6) = ∑x2 ∑x3 ∑x4 ∑x5 P(x1) P(x2|x1) P(x3|x1) P(x4|x2) P(x5|x3) P(x6|x5, x2)

= P(x1) ∑x2 P(x2|x1) ∑x3 P(x3|x1) ∑x4 P(x4|x2) ∑x5 P(x5|x3) P(x6|x5, x2)

= P(x1) ∑x2 P(x2|x1) ∑x3 P(x3|x1) ∑x4 P(x4|x2) m5(x2, x3)

= P(x1) ∑x2 P(x2|x1) ∑x3 P(x3|x1) m5(x2, x3) ∑x4 P(x4|x2) = ...
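The algebra above can be checked numerically. The sketch below (my code, with made-up CPT values) computes P(x1, x6) for this network both by brute-force summation over x2..x5 and with the pushed-in sums, and confirms the two agree:

```python
from itertools import product
import random

random.seed(0)

def rand_cpt(n_parents):
    """P(child=1 | parents) for every parent assignment (made-up numbers)."""
    return {pa: random.random() for pa in product([0, 1], repeat=n_parents)}

# Network from this page: x2|x1, x3|x1, x4|x2, x5|x3, x6|x5,x2.
p1 = random.random()                 # P(x1=1)
c2, c3 = rand_cpt(1), rand_cpt(1)    # P(x2|x1), P(x3|x1)
c4, c5 = rand_cpt(1), rand_cpt(1)    # P(x4|x2), P(x5|x3)
c6 = rand_cpt(2)                     # P(x6|x5,x2)

def bern(p, v):
    return p if v else 1 - p

def term(x1, x2, x3, x4, x5, x6):
    """One term of the factored joint."""
    return (bern(p1, x1) * bern(c2[(x1,)], x2) * bern(c3[(x1,)], x3)
            * bern(c4[(x2,)], x4) * bern(c5[(x3,)], x5)
            * bern(c6[(x5, x2)], x6))

x1, x6 = 1, 1   # query and evidence values

# Brute force: sum the full joint over x2..x5 (16 terms).
brute = sum(term(x1, x2, x3, x4, x5, x6)
            for x2, x3, x4, x5 in product([0, 1], repeat=4))

# Pushed-in sums, mirroring the derivation: eliminate x5 into m5, then x4.
m5 = {(x2, x3): sum(bern(c5[(x3,)], x5) * bern(c6[(x5, x2)], x6) for x5 in [0, 1])
      for x2, x3 in product([0, 1], repeat=2)}
m4 = {x2: sum(bern(c4[(x2,)], x4) for x4 in [0, 1]) for x2 in [0, 1]}  # sums to 1
ve = bern(p1, x1) * sum(bern(c2[(x1,)], x2)
                        * sum(bern(c3[(x1,)], x3) * m5[(x2, x3)] for x3 in [0, 1])
                        * m4[x2]
                        for x2 in [0, 1])
```

Note that m4 sums a CPT over all values of x4 and so equals 1: eliminating a variable with no downstream use contributes nothing, which is why variable elimination can prune such factors outright.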

Page 39

Approximation Methods

• Simple
– no evidence

• Rejection
– just forget about the invalid samples

• Likelihood Weighting
– only valid samples, but not necessarily useful ones

• MCMC
– best: valid, useful, and in proportion

Page 40

Stochastic Simulation

[Network: Cloudy → Sprinkler, Cloudy → Rain; Sprinkler and Rain → WetGrass]

P(WetGrass | Cloudy)?

1. Repeat N times:
   1.1. Guess Cloudy at random
   1.2. For each guess of Cloudy, guess Sprinkler and Rain, then WetGrass

2. Compute the ratio of the # of runs where WetGrass and Cloudy are True over the # of runs where Cloudy is True

P(WetGrass | Cloudy) = P(WetGrass ∧ Cloudy) / P(Cloudy)
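A sketch of this sampling procedure in Python; the CPT numbers below are made up, since the slide gives none:

```python
import random

random.seed(1)

# Hypothetical CPTs for the sprinkler network.
P_C = 0.5
P_S = {True: 0.10, False: 0.50}             # P(Sprinkler=T | Cloudy)
P_R = {True: 0.80, False: 0.20}             # P(Rain=T | Cloudy)
P_W = {(True, True): 0.99, (True, False): 0.90,
       (False, True): 0.90, (False, False): 0.00}   # P(WetGrass=T | S, R)

def sample():
    """Forward-sample one run in topological order, as in steps 1.1-1.2."""
    c = random.random() < P_C
    s = random.random() < P_S[c]
    r = random.random() < P_R[c]
    w = random.random() < P_W[(s, r)]
    return c, w

N = 100_000
runs = [sample() for _ in range(N)]
n_cloudy = sum(1 for c, w in runs if c)
n_both = sum(1 for c, w in runs if c and w)
estimate = n_both / n_cloudy    # ≈ P(WetGrass=T | Cloudy=T), step 2's ratio
```

Counting only the Cloudy=True runs is exactly the rejection idea from the previous slide: samples inconsistent with the condition are thrown away.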

Page 41

Quiz

1. What are metaphors? Give two examples of Primary Metaphors and sentences using them.

2. What are Event Structure Metaphors? Give an example.

3. How do Bayes Nets fit into the simulation story? What are the benefits of that model?

4. What are Dynamic Bayesian Networks?

Page 42

Dynamic Bayes Nets

