Modelling Language Evolution
Lecture 5: Iterated Learning

Simon Kirby

University of Edinburgh

Language Evolution & Computation Research Unit

Models so far…

Models of learning language
Models of evolving the ability to learn language
Models of differing abilities to learn differing languages

What do these have in common? The language comes from “outside”

[Diagram: the LANGUAGE (training sentences) comes from outside the LINGUISTIC AGENT (here, a neural network), which learns by adjusting its weight settings.]

Two kinds of models

Language Acquisition Device

Primary Linguistic Data

Grammatical Competence

What can be learned?

What can evolve?

[Diagram: repeated PLD → LAD → GC pathways, illustrating the two questions above: what a single learner can learn, and what can evolve across a population of learners.]

A new kind of model: Iterated Learning

[Diagram: iterated learning. Each generation's Grammatical Competence produces the Primary Linguistic Data from which the next generation's LAD induces its own Grammatical Competence, down a chain of agents.]

What kind of language evolves?

What can Iterated Learning explain?

My hypothesis: some functional linguistic structure emerges inevitably from the process of iterated learning without the need for natural selection or explicit functional pressure.

First target structure:

Recursive Compositionality: the meaning of an utterance is some function of the meanings of the parts of that utterance and the way they are put together.

Compositional                    Holistic
walked                           went
I greet you                      Hi
I thought I saw a pussy cat      chutter
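To make the contrast concrete, here is a minimal Python sketch (the vocabulary is hypothetical, not from the lecture): a compositional mapping builds the signal for a whole meaning out of signals for its parts, while a holistic mapping pairs each whole meaning with a single unanalysable signal.

    # Compositional: signals for parts, combined by a rule (here, concatenation).
    part_signals = {"I": "i", "greet": "greet", "you": "you"}

    def compositional_signal(meaning_parts):
        # The signal for the whole is a function of the signals for the parts.
        return " ".join(part_signals[p] for p in meaning_parts)

    # Holistic: one arbitrary signal per whole meaning; no internal structure.
    holistic_signals = {("I", "greet", "you"): "hi"}

    print(compositional_signal(["I", "greet", "you"]))   # i greet you
    print(holistic_signals[("I", "greet", "you")])       # hi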

The agent

[Diagram: an agent (simulated individual). Meaning-signal pairs come in as utterances from the parent; a learning algorithm builds an internal linguistic representation; meanings generated by the environment are passed to a production algorithm, which sends meaning-signal pairs out to the next generation.]

What will the agents talk about?

Need some simple but structured “world”. Simple predicate logic:

loves(mary, john)
admires(gavin, heather)
thinks(mary, likes(john, heather))
knows(heather, thinks(mary, likes(john, heather)))

Agents can string random characters together to form utterances.
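In code, meanings like these can be represented as nested tuples; this encoding is an assumption of the sketches below, not something the lecture prescribes:

    # (predicate, argument, argument), nesting for embedded propositions.
    m1 = ("loves", "mary", "john")
    m2 = ("knows", "heather", ("thinks", "mary", ("likes", "john", "heather")))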

How do agents learn?

Not using neural networks. In this model we are interested in more traditional, symbolic grammars.

Learners try to form a grammar that is consistent with the primary linguistic data they hear.

Fundamental principle: learning is compression. Learners try to fit the data they hear, but also to generalise. Learning is a trade-off between the two.

Two steps to learning

INCORPORATION (for each sentence heard):

S/loves(john, mary) → johnlovesmary
S/loves(peter, mary) → peterlovesmary

GENERALISATION (whenever possible):

S/loves(x, mary) → C/x lovesmary
C/john → john
C/peter → peter
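The following Python sketch runs the two steps on exactly this pair of sentences. It is a deliberately simplified illustration (it handles only the case where two rules differ in one argument and their signals share a suffix); the learner in the actual model is more general.

    from itertools import combinations

    # Grammar: "S" holds (meaning, signal) rules; "C" holds (atom, substring) rules.
    grammar = {"S": [], "C": []}

    def incorporate(meaning, signal):
        """INCORPORATION: store each heard sentence as a whole-meaning rule."""
        grammar["S"].append((meaning, signal))

    def generalise():
        """GENERALISATION (one simple case): two S rules whose meanings differ
        only in the first argument, and whose signals share a suffix, collapse
        into one variable rule plus C rules for the differing parts."""
        for (m1, s1), (m2, s2) in combinations(list(grammar["S"]), 2):
            if m1[0] == m2[0] and m1[2] == m2[2] and m1[1] != m2[1]:
                n = 0                                  # length of shared suffix
                while n < min(len(s1), len(s2)) and s1[-1 - n] == s2[-1 - n]:
                    n += 1
                p1, p2 = s1[:len(s1) - n], s2[:len(s2) - n]
                if n and p1 and p2:
                    grammar["S"].remove((m1, s1))
                    grammar["S"].remove((m2, s2))
                    grammar["S"].append(((m1[0], "x", m1[2]), "C:x " + s1[-n:]))
                    grammar["C"] += [(m1[1], p1), (m2[1], p2)]
                    return

    incorporate(("loves", "john", "mary"), "johnlovesmary")
    incorporate(("loves", "peter", "mary"), "peterlovesmary")
    generalise()
    print(grammar["S"])   # [(('loves', 'x', 'mary'), 'C:x lovesmary')]
    print(grammar["C"])   # [('john', 'john'), ('peter', 'peter')]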

A simulation run

1. Start with one learner and one adult speaker, neither of which has a grammar.

2. Choose a meaning at random.

3. Get speaker to produce signal for that meaning (may need to “invent” random string).

4. Give meaning-signal pair to learner.

5. Repeat steps 2-4 150 times.

6. Delete speaker.

7. Make learner be the new speaker.

8. Introduce a new learner (with no initial grammar).

9. Repeat steps 2-8 thousands of times (a code sketch of the whole schedule follows).
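A minimal Python sketch of this schedule, with assumptions flagged: the learner here just memorises pairs rather than inducing a grammar, and produce() invents a random string for novel meanings, as in step 3.

    import random
    import string

    MEANINGS = [("loves", a, b)
                for a in ("mary", "john", "gavin", "heather")
                for b in ("mary", "john", "gavin", "heather") if a != b]

    def produce(grammar, meaning):
        """Step 3: produce a signal, inventing a random string if needed."""
        if meaning not in grammar:
            grammar[meaning] = "".join(
                random.choices(string.ascii_lowercase, k=random.randint(2, 5)))
        return grammar[meaning]

    speaker, learner = {}, {}            # step 1: no grammars yet
    for generation in range(1000):       # step 9: thousands of iterations
        for _ in range(150):             # step 5: 150 utterances
            meaning = random.choice(MEANINGS)    # step 2
            signal = produce(speaker, meaning)   # step 3
            learner[meaning] = signal            # step 4: rote learning here
        speaker, learner = learner, {}   # steps 6-8: learner becomes speaker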

Results 1a: initial stages

Initially, speakers have no language, so “invent” random strings of characters.

A protolanguage emerges for some meanings, but no structure. These are holistic expressions:

1. ldg “Mary admires John”

2. xkq “Mary loves John”

3. gj “Mary admires Gavin”

4. axk “John admires Gavin”

5. gb “John knows that Mary knows that John admires Gavin”

Results 1b: many generations later…

6. gj h f tej m  [gloss: John Mary admires]  “Mary admires John”

7. gj h f tej wp  [gloss: John Mary loves]  “Mary loves John”

8. gj qp f tej m  [gloss: Gavin Mary admires]  “Mary admires Gavin”

9. gj qp f h m  [gloss: Gavin John admires]  “John admires Gavin”

10. i h u i tej u gj qp f h m  [gloss: John knows Mary knows Gavin John admires]  “John knows that Mary knows that John admires Gavin”

Quantitative results: languages evolve

What’s going on?

There is no biological evolution in the ILM (Iterated Learning Model). There isn’t even any communication; there is no notion of function in the model at all.

So why are structured languages evolving?

Hypothesis: languages themselves are adapting to the conditions of the ILM so as to be learnable.

The agents never see all the meanings… Only rules that are generalisable from limited exposure are stable.

Language has to fit through a narrow bottleneck

This has profound implications for the structure of language

[Diagram: the bottleneck. Linguistic competence → (production) → linguistic performance → (learning) → linguistic competence in the next learner.]

A nice and simple model…

Language: meanings are 8-bit binary numbers; signals are 8-bit binary numbers.

Agents: an 8x8x8 neural network (not an SRN) that learns to associate signals with meanings.

[Diagram: the network maps between MEANINGS and SIGNALS.]
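A runnable sketch of such an agent, under stated assumptions: the lecture does not specify the training rule, so this uses a plain 8-8-8 feedforward network trained by backpropagation, and the target mapping used to seed the training pairs (XOR with a fixed mask) is purely illustrative.

    import numpy as np

    rng = np.random.default_rng(0)
    W1 = rng.normal(0.0, 0.5, (8, 8))    # meaning layer -> hidden layer
    W2 = rng.normal(0.0, 0.5, (8, 8))    # hidden layer  -> signal layer

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def forward(meaning):
        hidden = sigmoid(meaning @ W1)
        return hidden, sigmoid(hidden @ W2)

    def train(pairs, epochs=2000, lr=0.5):
        global W1, W2
        for _ in range(epochs):
            for meaning, signal in pairs:
                hidden, out = forward(meaning)
                d_out = (out - signal) * out * (1.0 - out)        # output delta
                d_hid = (d_out @ W2.T) * hidden * (1.0 - hidden)  # hidden delta
                W2 -= lr * np.outer(hidden, d_out)
                W1 -= lr * np.outer(meaning, d_hid)

    def bits(n):
        """An 8-bit number as a vector of 0s and 1s."""
        return np.array([(n >> i) & 1 for i in range(8)], dtype=float)

    # Train on a subset of the 256 meanings (the bottleneck).
    pairs = [(bits(m), bits(m ^ 0b10101010)) for m in range(0, 256, 8)]
    train(pairs)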

Bottleneck

Only one parameter in this model, the bottleneck: the number of meaning-signal pairs (randomly chosen) given to the next generation…

In each simulation, we can measure two things:

Expressivity: the proportion of the meaning space to which an adult agent can give a unique signal.

Instability: how different each generation’s language is from that of the previous generation.

[Diagram: only a subset of meaning-signal pairs passes from each generation to the next.]
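Sketches of the two measures, assuming a language is represented as a dict from meanings to signals (a representation chosen here for illustration):

    def expressivity(language, meanings):
        """Proportion of the meaning space given a unique signal."""
        signals = [language[m] for m in meanings if m in language]
        unique = [s for s in signals if signals.count(s) == 1]
        return len(unique) / len(meanings)

    def instability(parent, child, meanings):
        """Proportion of meanings on which the child's signal differs
        from the parent's."""
        differ = sum(1 for m in meanings if parent.get(m) != child.get(m))
        return differ / len(meanings)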

Results

Bottleneck too tight: unstable and inexpressive language

Results

Bottleneck too wide: eventually fairly stable and expressive

Results

Medium bottleneck: maximal stability and expressivity

Adaptation

Language is evolving to be learnable

Structure in the mapping emerges: meanings and signals are related by simple rules of bit flipping and re-ordering.

These rules can be learned from a subset of the meaning-signal pairs.

Despite the hugely different model, this is a very similar result to the earlier simulation
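A sketch of what such a compositional mapping looks like in code; the particular mask and permutation here are illustrative assumptions, not values from the simulation:

    FLIP_MASK = [1, 0, 0, 1, 0, 1, 1, 0]   # which meaning bits get inverted
    PERMUTE   = [3, 0, 7, 5, 1, 6, 2, 4]   # how bit positions are re-ordered

    def signal_for(meaning_bits):
        flipped = [b ^ f for b, f in zip(meaning_bits, FLIP_MASK)]
        return [flipped[i] for i in PERMUTE]

Because each signal bit depends on exactly one meaning bit, the whole mapping can be recovered from a small sample of pairs, which is why languages like this survive the bottleneck.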

Summary

Language is learned by individuals with innate learning biases

The language data an individual hears is itself the result of learning

Languages adapt through iterated learning in response to our innate biases

There’s more! Our learning biases adapt through biological evolution in response to the language we use.

Tomorrow… use a simulation package to look at “grounding” models in an environment

[Diagram: three interacting adaptive systems: individual learning, cultural evolution, and biological evolution.]

