
Fundamentals of Computational Neuroscience 2e

Thomas Trappenberg

March 28, 2009

Chapter 10: The cognitive brain

Hierarchical maps and attentive vision

[Figure: A. Ventral visual pathway: LGN (thalamus) → V1 → V2 → V4 (occipital cortex) → posterior and inferior temporal cortex. B. Layered cortical maps: receptive field size (deg) grows with eccentricity (deg) across layers 1–4.]

Attention in visual search and object recognition

Visual search

Given: particular features (target object)

Function: scanning (the attentional window scans the entire scene)

→ WHERE

Object recognition

Given: particular spatial location (target position)

Function: binding (the attentional window binds features for identification)

→ WHAT

Model (Gustavo Deco)

[Figure: Hierarchical attention model. The visual field feeds LGN, then V1–V4 feature extraction (Gabor jets), which projects to IT for object recognition ("what", with an object-specific top-down bias) and to PP for spatial location ("where", with a location-specific top-down bias). Each module contains competing pools coupled through an inhibitory pool, and the preferred attentional locus emerges from this competition.]
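A minimal rate-model sketch of the biased-competition idea behind this architecture: pools receiving equal bottom-up input compete through a shared inhibitory pool, and a small top-down bias decides which pool wins. The simulate helper and all parameter values are illustrative assumptions, not values from Deco's published model.

```python
import numpy as np

def simulate(bias, n_pools=3, steps=500, dt=0.1, tau=1.0):
    r = np.full(n_pools, 0.1)          # firing rates of the excitatory pools
    stimulus = np.ones(n_pools)        # equal bottom-up drive to every pool
    w_inh = 0.8                        # coupling through the shared inhibitory pool
    for _ in range(steps):
        inhibition = w_inh * r.sum()   # inhibitory pool tracks total activity
        drive = stimulus + bias - inhibition
        r += dt / tau * (-r + np.maximum(drive, 0.0))
    return r

# A top-down bias toward pool 0 (the attended object/location) wins the competition.
print(simulate(bias=np.array([0.3, 0.0, 0.0])))
```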

Example results

[Figure: PP activity as a function of time for search displays with 1–3 items. A. "Parallel search" (target E among X distractors). B. "Serial search" (target E among F distractors).]

The interconnecting workspace hypothesis

[Figure: The global workspace interconnects the evaluative system (VALUE), long-term memory (PAST), the attentional system (focusing), the perceptual system (PRESENT), and the motor system (FUTURE).]

Stroop task modelling

A. Stroop task B. Workspace model for the Stroop task

[Figure: A. Stroop task: the word "grey" printed in black ink; with the task of colour naming, the correct response is "black". B. Workspace model: word input (WORD: grey) and colour input (COLOUR: black) feed specialized processors for word naming and colour naming; workspace neurons, driven by vigilance and a reward (error) signal, implement attentional suppression of the word and attentional amplification of the colour, producing the naming response "black".]
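A toy sketch of the amplification/suppression idea in panel B, assuming two simple linear pathways with made-up weights and gains; it is not the workspace model itself, only an illustration of why top-down gain control is needed.

```python
import numpy as np

responses = ["grey", "black"]
word_input = np.array([1.0, 0.0])     # the word reads "grey"
colour_input = np.array([0.0, 1.0])   # the ink colour is black
w_word, w_colour = 2.0, 1.0           # word reading is the more automatic pathway

def respond(gain_word, gain_colour):
    # Each response unit sums gain-modulated drive from both pathways.
    drive = gain_word * w_word * word_input + gain_colour * w_colour * colour_input
    return responses[int(np.argmax(drive))]

print(respond(1.0, 1.0))   # no attentional control -> "grey" (Stroop error)
print(respond(0.2, 1.5))   # suppress word, amplify colour -> "black" (correct)
```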

The anticipating brain

1. The brain can develop a model of the world, which can be used to anticipate or predict the environment.

2. The inverse of the model can be used to recognize causes by evoking internal concepts (a minimal sketch follows this list).

3. Hierarchical representations are essential to capture the richness of the world.

4. Internal concepts are learned through matching the brain's hypotheses with input from the world.

5. An agent can learn actively by testing hypotheses through actions.

6. The temporal domain is an important degree of freedom.
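A minimal illustration of points 1 and 2, with a hypothetical two-cause world: the forward (generative) model p(s|c) predicts sensations from causes, and Bayes' rule inverts it to recognize the cause of an observation. The priors and likelihoods are made up for illustration.

```python
import numpy as np

p_c = np.array([0.5, 0.5])             # prior over internal causes c0, c1
p_s_given_c = np.array([[0.9, 0.1],    # forward model p(s | c0)
                        [0.2, 0.8]])   # forward model p(s | c1)

s = 1                                   # observed sensory state s1
posterior = p_s_given_c[:, s] * p_c     # invert the model: p(s|c) * p(c)
posterior /= posterior.sum()            # normalize to get p(c | s)
print(posterior)                        # cause c1 best explains the observation
```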

[Figure: Agent–environment loop. External states of the environment evolve under transition probabilities conditioned on states and actions; sensations and actions pass through the PNS, while internal states (concepts) in the CNS are updated from sensations and generate actions, each step described by a corresponding conditional probability.]

Recurrent networks with hidden nodes

The Boltzmann machine:

[Figure: A Boltzmann machine with visible and hidden nodes.]

Energy: $H_{nm} = -\frac{1}{2} \sum_{ij} w_{ij} s_i^n s_j^m$

Probabilistic update: $p(s_i^n = +1) = \frac{1}{1 + \exp(-\beta \sum_j w_{ij} s_j^n)}$

Boltzmann–Gibbs distribution: $p(\mathbf{s}^v; w) = \frac{1}{Z} \sum_{m \in h} \exp(-\beta H_{vm})$
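A small sketch of the probabilistic update above, assuming ±1 states and a random symmetric weight matrix chosen only for illustration; repeated sweeps draw a sample from the Boltzmann–Gibbs distribution.

```python
import numpy as np

rng = np.random.default_rng(0)

n, beta = 6, 1.0
w = rng.normal(0.0, 0.5, (n, n))
w = (w + w.T) / 2                 # symmetric weights
np.fill_diagonal(w, 0.0)          # no self-connections
s = rng.choice([-1, 1], n)        # random initial +/-1 state

for _ in range(100):              # repeated sequential Gibbs sweeps
    for i in range(n):
        h = w[i] @ s              # local field sum_j w_ij s_j
        p_on = 1.0 / (1.0 + np.exp(-beta * h))   # update rule from the slide
        s[i] = 1 if rng.random() < p_on else -1
print(s)                          # approximate sample from the model distribution
```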

Training the Boltzmann machine

Kullback–Leibler divergence:

$\mathrm{KL}(p(\mathbf{s}^v), p(\mathbf{s}^v; w)) = \sum_{\mathbf{s}^v} p(\mathbf{s}^v) \log \frac{p(\mathbf{s}^v)}{p(\mathbf{s}^v; w)} = \sum_{\mathbf{s}^v} p(\mathbf{s}^v) \log p(\mathbf{s}^v) - \sum_{\mathbf{s}^v} p(\mathbf{s}^v) \log p(\mathbf{s}^v; w)$

Minimizing KL is equivalent to maximizing the average log-likelihood function

$l(w) = \sum_{\mathbf{s}^v} p(\mathbf{s}^v) \log p(\mathbf{s}^v; w) = \langle \log p(\mathbf{s}^v; w) \rangle.$

Gradient descent → Boltzmann learning: $\Delta w_{ij} = \eta \frac{\partial l}{\partial w_{ij}} = \eta \frac{\beta}{2} \left( \langle s_i s_j \rangle_{\text{clamped}} - \langle s_i s_j \rangle_{\text{free}} \right).$
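A sketch of the resulting learning rule, assuming the clamped and free correlations ⟨s_i s_j⟩ have already been estimated from Gibbs samples in the two phases; the sampled states here are placeholders standing in for those averages.

```python
import numpy as np

rng = np.random.default_rng(0)

def boltzmann_update(w, corr_clamped, corr_free, eta=0.01, beta=1.0):
    # Delta w_ij = eta * beta/2 * (<s_i s_j>_clamped - <s_i s_j>_free)
    return w + eta * beta / 2 * (corr_clamped - corr_free)

# Placeholder samples standing in for the clamped and free-running phases.
s_clamped = rng.choice([-1, 1], (100, 4))
s_free = rng.choice([-1, 1], (100, 4))
corr_c = s_clamped.T @ s_clamped / 100   # <s_i s_j> over clamped samples
corr_f = s_free.T @ s_free / 100         # <s_i s_j> over free samples
w = boltzmann_update(np.zeros((4, 4)), corr_c, corr_f)
```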

The restricted Boltzmann machine

[Figure: Hidden and visible nodes in a Boltzmann machine (fully connected, left) vs. a restricted Boltzmann machine (connections only between the visible and hidden layers, right).]

Contrastive Hebbian learning: Alternating Gibbs sampling

t = 1, t = 2, t = 3, ..., t = ∞
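A sketch of one alternation of Gibbs sampling in an RBM with binary 0/1 units: because there are no within-layer connections, the whole hidden layer can be sampled given the visibles and vice versa. Truncating the chain after a single up–down–up pass gives the contrastive-divergence (CD-1) approximation to the gradient; biases are omitted, and all sizes and data are placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

n_v, n_h = 8, 4
w = rng.normal(0.0, 0.1, (n_v, n_h))         # visible-to-hidden weights
v0 = rng.integers(0, 2, n_v).astype(float)   # t=1: data on the visible nodes

ph0 = sigmoid(v0 @ w)                        # p(h=1 | v0), whole layer at once
h0 = (rng.random(n_h) < ph0).astype(float)   # sample the hidden layer
pv1 = sigmoid(h0 @ w.T)                      # t=2: reconstruct the visibles
v1 = (rng.random(n_v) < pv1).astype(float)
ph1 = sigmoid(v1 @ w)                        # t=3: hiddens given reconstruction

eta = 0.1
w += eta * (np.outer(v0, ph0) - np.outer(v1, ph1))   # CD-1 weight update
```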

[Figure: A model retina feeding stacked RBM layers, with a recognition readout and top-down stimulation; image input at the bottom, concept input at the top.]

