
1

On Bubbles and Drifts:Continuous attractor networks and their relation to working memory, path integration, population decoding, attention, and motor functions

Thomas Trappenberg

Dalhousie University, Canada

2

CANNs can implement motor functions

Stringer, Rolls, Trappenberg, de Araujo, Self-organizing continuous attractor networks and motor functions, Neural Networks 16 (2003).

[Figure: network architecture with state nodes, motor nodes, and movement selector nodes]

3

My plans for this talk

Basic CANN model

Idiothetic CANN updates (path integration)

CANN & motor functions

Limits on NMDA stabilization

4

Once upon a time ... (my CANN shortlist)

Wilson & Cowan (1973) Grossberg (1973) Amari (1977) … Sompolinsky & Hansel (1996) Zhang (1997) … Stringer et al (2002)

Willshaw & von der Malsburg (1976)

Droulez & Berthoz (1988)

Redish, Touretzky, Skaggs, etc

5

Basic CANN: It's just a 'Hopfield' net …

[Figure: recurrent architecture and synaptic weights; external input I^ext drives nodes x through recurrent weights w, producing output rates r^out]

Nodes can be scrambled!

6

In mathematical terms …

Updating network states (network dynamics)

Gain function

Weight kernel
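The equations themselves did not survive the conversion; in the notation of the final equations slide they take the standard form (a reconstruction, with the Gaussian-minus-constant kernel as the usual choice in this literature):

\tau \frac{dh_i(t)}{dt} = -h_i(t) + \frac{1}{C} \sum_j w_{ij}\, r_j(t) + I_i^{ext} \qquad \text{(network dynamics)}

r_i = \frac{1}{1 + e^{-2\beta (h_i - \alpha)}} \qquad \text{(gain function)}

w_{ij} = A\, e^{-(x_i - x_j)^2 / 2\sigma^2} - B \qquad \text{(weight kernel: local excitation, global inhibition)}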

7

Network can form bubbles of persistent activity (in Oxford English: activity packets)

[Figure: simulation of bubble formation; node index vs. time t, with the external stimulus marked and the end states shown]
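To make this concrete, here is a minimal Python sketch of bubble formation in a 1D ring CANN. All parameter values, the Gaussian-minus-constant kernel, and the stimulus protocol are illustrative assumptions, not the settings used in the talk.

import numpy as np

# Minimal 1D ring CANN (illustrative parameters)
N = 100             # nodes on the ring
tau, dt = 1.0, 0.1  # time constant and Euler step
sigma = 5.0         # width of local excitation
A, B = 10.0, 5.0    # excitation strength, global inhibition

# Mexican-hat-style kernel: local excitation minus uniform inhibition
idx = np.arange(N)
d = np.abs(idx[:, None] - idx[None, :])
d = np.minimum(d, N - d)                    # periodic (ring) distance
w = A * np.exp(-d**2 / (2 * sigma**2)) - B

def gain(h, beta=2.0, alpha=0.0):
    # sigmoidal gain function r = 1 / (1 + exp(-2*beta*(h - alpha)))
    return 1.0 / (1.0 + np.exp(-2.0 * beta * (h - alpha)))

h = np.zeros(N)
for step in range(500):
    # brief external stimulus centred on node 50, removed after 100 steps
    I_ext = 5.0 * np.exp(-d[50]**2 / (2 * sigma**2)) if step < 100 else 0.0
    h += dt / tau * (-h + (w @ gain(h)) / N + I_ext)

# with suitable parameters, a localized activity packet persists
# after the stimulus is removed
print("nodes with r > 0.5:", np.where(gain(h) > 0.5)[0])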

8

Space is represented with activity packets in the hippocampal system

From Samsonovich & McNaughton, Path integration and cognitive mapping in a continuous attractor neural network model, J. Neurosci. 17 (1997)

9

Various gain functions are used

[Figure: end states for the various gain functions]

10

Superior colliculus integrates exogenous and endogenous inputs

[Figure: saccade circuit with CN, SNpr, Thal, SEF, FEF, LIP, SC, RF, and the cerebellum]

11

Superior Colliculus is a CANN

Trappenberg, Dorris, Klein & Munoz, A model of saccade initiation based on the competitive integration of exogenous and endogenous inputs, J. Cog. Neuro. 13 (2001)

12

Weights describe the effective interaction in the Superior Colliculus

Trappenberg, Dorris, Klein & Munoz, A model of saccade initiation based on the competitive integration of exogenous and endogenous inputs, J. Cog. Neuro. 13 (2001)

13

There are phase transitions in the weight-parameter space

14

CANNs can be trained with Hebb

Hebb:

Training pattern:
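Written out in the notation of the final equations slide (the Gaussian shape of the training pattern is an assumption, chosen to match the activity packets shown earlier):

\delta w_{ij} = k\, r_i\, r_j \qquad \text{(Hebb)}

r_i = e^{-(x_i - x_0)^2 / 2\sigma^2} \qquad \text{(training pattern: an activity packet centred at } x_0\text{)}

The weights are accumulated while the packet centre x_0 is moved across all locations of the network.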

15

Normalization is important for a convergent method

• Random initial states
• Weight normalization

[Figure: weight profile w(x,50) over training time, and the learned weight matrix w(x,y)]
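The slide does not spell out the normalization; a common choice consistent with it (an assumption here) is to rescale each weight vector after every Hebbian update:

w_{ij} \leftarrow \frac{w_{ij}}{\sqrt{\sum_k w_{ik}^2}}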

16

Gradient-descent learning is also possible (Kechen Zhang)

Gradient descent with regularization = Hebb + weight decay
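One way to see this equivalence (a sketch, with \eta and \lambda as assumed symbols for the learning rate and regularization strength): adding an L2 penalty \frac{\lambda}{2}\sum_{ij} w_{ij}^2 to the objective turns the gradient step into

\delta w_{ij} = \eta\, (r_i\, r_j - \lambda\, w_{ij}),

that is, a Hebbian term plus weight decay.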

17

CANNs have a continuum of point attractors

Point attractors and basin of attraction

Line of point attractors

Can be mixed: Rolls, Stringer & Trappenberg, A unified model of spatial and episodic memory, Proceedings of the Royal Society B 269:1087-1093 (2002)

18

CANNs work with spiking neurons

Xiao-Jing Wang, Trends in Neurosci. 24 (2001)

19

Shutting off also works in the rate model

[Figure: rate-model simulation; node index vs. time]

20

CANNs (integrators) are stiff

21

… and can drift and jump

Trappenberg, Dynamic cooperation and competition in a network of spiking neurons, ICONIP'98

22

Neuroscience applications of CANNs

Persistent activity (memory) and winner-takes-all (competition)

• Cortical network (e.g. Wilson & Cowan, Sompolinsky, Grossberg)

• Working memory (e.g. Compte, Wang, Brunel, Amit (?), etc)

• Oculomotor programming (e.g. Kopecz & Schoener, Trappenberg et al.)

• Attention (e.g. Sompolinsky, Olshausen, Salinas & Abbott (?), etc)

• Population decoding (e.g. Wu et al, Pouget, Zhang, Deneve, etc )

• SOM (e.g. Willshaw & von der Malsburg)

• Place and head direction cells (e.g. Zhang, Redish, Touretzky, Samsonovich, McNaughton, Skaggs, Stringer et al.)

• Motor control (Stringer et al.)

[Diagram: roadmap from the basic CANN to its path-integration (PI) extension]

23

A modified CANN solves path integration
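The modification adds idiothetic (self-motion) terms to the dynamics; in the notation of the final equations slide these are

\frac{\phi_1}{C^{HD \times C}} \sum_j w_{ij}^{C}\, r_j^{HD}\, r^{C} + \frac{\phi_1}{C^{HD \times AC}} \sum_j w_{ij}^{AC}\, r_j^{HD}\, r^{AC}

where r^{C} and r^{AC} are the firing rates of clockwise and anticlockwise rotation cells. The asymmetric weights w^{C} and w^{AC} shift the activity packet at a velocity set by the rotation-cell rates.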

24

CANNs can implement motor functions

Stringer, Rolls, Trappenberg, de Araujo, Self-organizing continuous attractor networks and motor functions, Neural Networks 16 (2003).

[Figure: network architecture with state nodes, motor nodes, and movement selector nodes]

25

... learning motor sequences (e.g. speaking a word)

[Figure: activity of movement selector cells, motor cells, and state cells]

Experiment 1

26

… from noisy examples …

State cells: learning from noisy examples

Experiment 2

27

… and reaching from different initial states

Stringer, Rolls, Trappenberg, de Araujo, Self-organizing continuous attractor networks and motor function, Neural Networks 16 (2003).

Experiment 3

28

Drift is caused by asymmetries

NMDA stabilization

29

CANNs can support multiple packets

Stringer, Rolls & Trappenberg, Self-organising continuous attractor networks with multiple activity packets, and the representation of space, Neural Networks 17 (2004)

30

How many activity packets can be stable?

Trappenberg, Why is our working memory capacity so large? Neural Information Processing-Letters and Reviews, Vol. 1 (2003)

31

Stabilization can be too strong

Trappenberg & Standage, Multi-packet regions in stabilized continuous attractor networks, submitted to CNS'04

32

Conclusion

CANNs are widespread in neuroscience models (of the brain)

Short-term memory, feature selectivity (WTA)

'Path integration' is an elegant mechanism to generate dynamic sequences (self-organized)

33

With thanks to

Cognitive Neuroscience, Oxford Univ.: Edmund Rolls, Simon Stringer, Ivan de Araujo

Psychology, Dalhousie Univ.: Ray Klein

Physiology, Queen's Univ.: Doug Munoz, Mike Dorris

Computer Science, Dalhousie Univ.: Dominic Standage

34

CANNs can discover dimensionality

35

A CANN with adaptive input strength explains express saccades

36

CANNs are great for population decoding (a fast implementation of pattern matching)

37

John Lisman’s hippocampus

38

The model equations:

Continuous dynamics (leaky integrator):

\tau \frac{dh_i^{HD}(t)}{dt} = -h_i^{HD}(t) + \frac{\phi_0}{C^{HD}} \sum_j (w_{ij} - w^{inh})\, r_j^{HD}(t) + I_i^{V} + \frac{\phi_1}{C^{HD \times C}} \sum_j w_{ij}^{C}\, r_j^{HD}\, r^{C} + \frac{\phi_1}{C^{HD \times AC}} \sum_j w_{ij}^{AC}\, r_j^{HD}\, r^{AC}

Gain function:

r_i = \frac{1}{1 + e^{-2\beta (h_i - \alpha)}}

NMDA-style stabilization:

r_i \leftarrow \begin{cases} 1 & \text{if } r_i > 0.5 \\ r_i & \text{elsewhere} \end{cases}

Hebbian learning:

\delta w_{ij} = k\, r_i\, r_j \qquad \delta w_{ij}^{C} = k\, r_i^{HD}\, r_j^{HD}\, r^{C} \qquad \delta w_{ij}^{AC} = k\, r_i^{HD}\, r_j^{HD}\, r^{AC}

with

h_i : activity of node i
r_i : firing rate
w_{ij} : synaptic efficacy matrix
w^{inh} : global inhibition
I_i^V : visual input
\tau : time constant
\phi_0 : scaling factor
C : number of connections per node
\beta : slope
\alpha : threshold
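For concreteness, here is a minimal Python sketch of one Euler step of these dynamics. The function name, the parameter values, the assumed form of the NMDA clamp, and the use of n in place of the connection counts C are all assumptions for illustration.

import numpy as np

def euler_step(h, w, w_c, w_ac, r_c, r_ac, I_v,
               tau=1.0, dt=0.1, phi0=1.0, phi1=1.0,
               w_inh=0.5, beta=1.0, alpha=0.0, nmda=True):
    # h   : activities h_i^HD;  w : recurrent weights w_ij
    # w_c, w_ac : clockwise/anticlockwise rotation weights w_ij^C, w_ij^AC
    # r_c, r_ac : scalar firing rates of the rotation (velocity) cells
    # I_v : visual input I_i^V
    n = len(h)
    r = 1.0 / (1.0 + np.exp(-2.0 * beta * (h - alpha)))  # gain function
    if nmda:
        r = np.where(r > 0.5, 1.0, r)                    # NMDA-style clamp (assumed form)
    rec = phi0 / n * ((w - w_inh) @ r)                   # recurrence minus global inhibition
    rot = phi1 / n * (w_c @ r * r_c + w_ac @ r * r_ac)   # idiothetic (rotation-cell) drive
    return h + dt / tau * (-h + rec + I_v + rot)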

