Page 1: Inference and Computation with Population Codes

13 November 2012

Alexandre Pouget, Peter Dayan, and Richard S. Zemel

Annual Review of Neuroscience, 2003

Presenters : Sangwook Hahn, Jisu Kim

Page 2: Outline

1. Introduction

2. The Standard Model (First Part)

   1. Coding and Decoding

   2. Computation with Population Codes

   3. Discussion of Standard Model

3. Encoding Probability Distributions (Second Part)

   1. Motivation

   2. Psychophysical Evidence

   3. Encoding and Decoding Probability Distributions

   4. Examples in Neurophysiology

   5. Computations Using Probabilistic Population Codes

Page 3: Introduction

Single aspects of the world induce activity in multiple neurons.

For example:

– 1. An air current is produced by a predator of the cricket.

– 2. The cricket determines the direction of the air current.

– 3. It evades in a direction away from the predator's predicted move.


Page 4: Introduction

Analyzing the same example in terms of neural activity:

– 1. An air current is produced by a predator of the cricket.

– 2. The cricket determines the direction of the air current.

( i. a population of neurons encodes information about a single variable;

ii. that information is decoded from the population activity )

– 3. It evades in a direction away from the predator's predicted move.


Page 5: Guiding Questions (First Part)

Q1:

How do populations of neurons encode information about single variables?

How can this information be decoded from the population activity?

How do neural populations realize function approximation?

Q2:

How do population codes support nonlinear computations over the information they represent?

Page 6: The Standard Model – Coding

The cricket cercal system has hair cells (a) as its primary sensory neurons.

The figure shows the normalized mean firing rates of four low-velocity interneurons as a function of s.

s is the direction of an air current (induced by a predator).

Page 7: The Standard Model – Encoding Model

The mean activity of cell a depends on s:

<r_a> = f_a(s) = r_max [cos(s - s_a)]_+

– r_max : maximum firing rate

– s_a : preferred direction of cell a

A natural way of describing the tuning curves: the mean rate is proportional to the thresholded projection of the wind-direction unit vector v onto the preferred-direction unit vector c_a, i.e. f_a(s) = r_max [v · c_a]_+ .
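For concreteness, a minimal numerical sketch of this encoding model (the rectified-cosine tuning curve follows the description above; the maximum rate, the four preferred directions, and all other numbers are illustrative assumptions):

import numpy as np

r_max = 40.0                                  # assumed maximum firing rate (Hz)
s_pref = np.deg2rad([45, 135, 225, 315])      # assumed preferred directions of the 4 interneurons

def mean_rates(s):
    """Rectified-cosine tuning: f_a(s) = r_max * [cos(s - s_a)]_+ ."""
    return r_max * np.maximum(np.cos(s - s_pref), 0.0)

s = np.deg2rad(60.0)                          # example wind direction
print(mean_rates(s))                          # mean activity of each of the 4 cells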

Page 8: The Standard Model – Decoding

3 methods to decode homogeneous population codes:

– 1. Population vector approach

– 2. Maximum likelihood decoding

– 3. Bayesian estimation

Population vector approach (a sum over cells):

v_pop = sum_a r_a c_a

– v_pop : population vector

– c_a : preferred direction of cell a

– r_a : actual (noisy) rates, which fluctuate around the mean rates f_a(s)

– the direction of v_pop gives the approximation of the wind direction (r is the vector of noisy rates)
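A sketch of the population vector estimate under the same illustrative assumptions (here the noisy rates are simulated as the mean rates plus Gaussian noise; the noise model and its parameters are assumptions of the example):

import numpy as np

r_max = 40.0
s_pref = np.deg2rad([45, 135, 225, 315])

def mean_rates(s):
    return r_max * np.maximum(np.cos(s - s_pref), 0.0)

rng = np.random.default_rng(0)
s_true = np.deg2rad(60.0)
r = mean_rates(s_true) + rng.normal(0.0, 3.0, size=4)   # noisy observed rates

# population vector: sum of preferred-direction unit vectors weighted by activity
c = np.stack([np.cos(s_pref), np.sin(s_pref)], axis=1)  # preferred-direction vectors c_a
v_pop = (r[:, None] * c).sum(axis=0)
s_hat = np.arctan2(v_pop[1], v_pop[0])
print(np.rad2deg(s_hat))                                 # estimated wind direction (degrees)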

Page 9: The Standard Model – Decoding

Main problem of the population vector method:

– It is not sensitive to the noise process that generates r.

– However, it works quite well in practice.

– The wind direction can be estimated to within a few degrees with only 4 noisy neurons.

Page 10: The Standard Model – Decoding

Maximum likelihood decoding

– This estimator starts from the full probabilistic encoding model P[r|s], taking into account the noise corrupting the neurons' activities.

– The estimate is the value of s that maximizes the likelihood P[r|s] of the observed activities.

– If P[r|s] is high, those s values are likely given the observed activities.

– If P[r|s] is low, those s values are unlikely given the observed activities.

(In the accompanying comparison figure, rms = root-mean-square deviation.)
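A sketch of maximum likelihood decoding on a grid of candidate directions, reusing the illustrative four-neuron model above and assuming independent Gaussian noise of known standard deviation (the noise model and all parameters are assumptions of the example):

import numpy as np

r_max, sigma = 40.0, 3.0
s_pref = np.deg2rad([45, 135, 225, 315])
f = lambda s: r_max * np.maximum(np.cos(s - s_pref), 0.0)

rng = np.random.default_rng(1)
s_true = np.deg2rad(60.0)
r = f(s_true) + rng.normal(0.0, sigma, size=4)           # observed noisy rates

s_grid = np.deg2rad(np.arange(0.0, 360.0, 0.5))          # candidate directions
# log P[r|s] for independent Gaussian noise, up to an additive constant
log_lik = np.array([-np.sum((r - f(s))**2) / (2 * sigma**2) for s in s_grid])
s_ml = s_grid[np.argmax(log_lik)]
print(np.rad2deg(s_ml))                                   # maximum likelihood estimate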

Page 11: The Standard Model – Decoding

Bayesian estimators

– Combine the likelihood P[r|s] with any prior information about the stimulus s to produce a posterior distribution P[s|r]:

P[s|r] = P[r|s] P[s] / P[r]

– If the prior distribution P[s] is flat, there is no specific prior information about s, and the posterior is just a renormalized version of the likelihood.

– The Bayesian estimator does a little better than maximum likelihood and the population vector.
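A sketch of the corresponding Bayesian estimate, written as a function that can take the grid and log-likelihood from the maximum likelihood sketch above (using the circular posterior mean as the point estimate is an assumption of this example):

import numpy as np

def bayes_estimate(s_grid, log_lik, log_prior=None):
    """Posterior P[s|r] proportional to P[r|s] P[s]; returns the circular posterior mean."""
    if log_prior is None:                          # flat prior: the posterior is just
        log_prior = np.zeros_like(s_grid)          # the renormalized likelihood
    log_post = log_lik + log_prior
    post = np.exp(log_post - log_post.max())
    post /= post.sum()
    return np.arctan2((post * np.sin(s_grid)).sum(), (post * np.cos(s_grid)).sum())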

Page 12: The Standard Model – Decoding

In a homogeneous population:

– Bayesian and maximum likelihood decoding clearly outperform the population vector.

– The greater the number of cells, the greater the accuracy, since more cells provide more information about the stimulus.

Page 13: Computation with Population Code

Discrimination

– To discriminate two nearby stimuli s + δs/2 and s − δs/2, where δs is a small angle, we can use the Bayesian posterior P[s|r].

– It is also possible to perform discrimination based directly on the activities by computing a linear function of them:

y = sum_a w_a r_a + θ

– θ : threshold, usually 0 for a homogeneous population code

– w_a : relative weight of cell a
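A sketch of such a linear discrimination with the same illustrative four-neuron code (the weights here are the difference between the mean responses to the two alternatives, one simple choice; the optimal weights discussed in the paper depend on the noise model):

import numpy as np

r_max, sigma = 40.0, 3.0
s_pref = np.deg2rad([45, 135, 225, 315])
f = lambda s: r_max * np.maximum(np.cos(s - s_pref), 0.0)

s0, ds = np.deg2rad(60.0), np.deg2rad(4.0)
w = f(s0 + ds/2) - f(s0 - ds/2)                 # discriminant weights (illustrative choice)

rng = np.random.default_rng(2)
trials = 10000
correct = 0
for sign in (+1, -1):
    r = f(s0 + sign * ds/2) + rng.normal(0.0, sigma, size=(trials // 2, 4))
    y = r @ w                                    # linear readout with threshold theta = 0
    correct += np.sum((y > 0) == (sign > 0))
print(correct / trials)                          # fraction of correct discriminations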

Page 14: Computation with Population Code

Noise Removal

– The neurobiological relevance of the maximum likelihood estimator is unclear:

• 1. Reducing the code to a single scalar value seems unreasonable, because population codes seem to be used throughout the brain.

• 2. Finding the maximum likelihood value is difficult in general.

– Solution : use recurrent connections within the population to make it behave like an autoassociative memory.

• Autoassociative memories use nonlinear recurrent interactions to find the stored pattern that most closely matches a noisy input (see the sketch below).
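A very simplified sketch of this idea (a ring of units with cosine recurrent weights, rectification, and normalization; the network size, weights, and update rule are illustrative assumptions, not the specific model in the paper). Iterating the dynamics pulls a noisy input pattern onto a smooth hill of activity whose peak estimates the encoded direction:

import numpy as np

n = 64
theta = np.linspace(0.0, 2 * np.pi, n, endpoint=False)   # preferred directions on a ring
W = np.cos(theta[:, None] - theta[None, :]) / n          # cosine recurrent weights

def clean_up(r, n_iter=20):
    """Recurrent rectified feedback; the pattern settles onto a smooth hill."""
    for _ in range(n_iter):
        r = np.maximum(W @ r, 0.0)       # nonlinear recurrent interaction
        r = r / np.linalg.norm(r)        # keep the overall activity bounded
    return r

rng = np.random.default_rng(0)
s_true = np.deg2rad(120.0)
r_noisy = np.maximum(np.cos(theta - s_true), 0.0) + rng.normal(0.0, 0.3, n)
r_clean = clean_up(r_noisy)
print(np.rad2deg(theta[np.argmax(r_clean)]))              # peak lies near 120 degrees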

Page 15: Computation with Population Code

Basis Function Computations

– Function approximation: compute the output of functions of the encoded variables, for the case of multiple stimulus dimensions.

– For example, sh = sr + se, where

– sh : head-centered direction to a target

– sr : eye-centered direction to the target

– se : position of the eyes in the head
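A sketch of a basis-function computation of this kind, assuming Gaussian tuning for sr and se (all preferred values, widths, and the readout scheme are illustrative assumptions). The intermediate layer forms products of the two input population codes, and a fixed linear readout then yields an output population tuned to sh = sr + se:

import numpy as np

def gauss_tuning(x, centers, width):
    return np.exp(-(x - centers)**2 / (2 * width**2))

sr_pref = np.linspace(-40, 40, 21)            # preferred eye-centered directions (deg)
se_pref = np.linspace(-20, 20, 11)            # preferred eye positions (deg)
sh_pref = np.linspace(-60, 60, 31)            # preferred head-centered directions (deg)

sr, se = 10.0, 5.0                            # example stimulus: sh should be 15 deg

a = gauss_tuning(sr, sr_pref, 8.0)            # input population code for sr
b = gauss_tuning(se, se_pref, 8.0)            # input population code for se

B = np.outer(a, b)                            # basis-function layer: B[i, j] ~ f_i(sr) g_j(se)

# fixed linear readout: weight each basis unit by a Gaussian in (sr_i + se_j) - sh_k
W = gauss_tuning(sr_pref[:, None, None] + se_pref[None, :, None],
                 sh_pref[None, None, :], 8.0)
h = np.tensordot(B, W, axes=([0, 1], [0, 1]))  # output population code over sh

print(sh_pref[np.argmax(h)])                   # the peak lies near sr + se = 15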

Page 16: Computation with Population Code

Basis Function Computations

Page 17: Computation with Population Code

Basis Function Computations

– A linear solution exists for homogeneous population codes (mapping from one population code to another, ignoring noise).

Page 18: Guiding Questions (First Part)

Q1:

How do populations of neurons encode information about single variables? -> p.6~7

How can this information be decoded from the population activity? -> p.8~12

How do neural populations realize function approximation? -> p.13~14

Q2:

How do population codes support nonlinear computations over the information they represent? -> p.15~17

Page 19: Encoding Probability Distributions

Page 20: Motivation

The standard model has two main restrictions:

1. It only considers uncertainty coming from noisy neural activities (internal noise).

: But uncertainty about the world is inherent in the stimuli themselves, independent of internal noise.

2. It does not consider anything other than estimating a single value.

: But utilizing the full information contained in the posterior distribution is crucial.

Page 21: Motivation

“Ill-posed problems” : images do not contain enough information.

The aperture problem : an image sequence does not unambiguously specify the motion of the object.

Solution - a probabilistic approach : perception is conceived as statistical inference, giving rise to probability distributions over the values of interest.

Page 22: Motivation

Page 23: Psychophysical Evidence

The perceived speed of a grating increases with contrast.

The nervous system seeks the posterior distribution over velocity v given the image sequence I, obtained through Bayes rule:

P[v|I] ∝ P[I|v] P[v]

High contrast -> the likelihood function becomes narrow -> the likelihood dominates the product (and the prior for slow speeds matters less).
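A toy numerical illustration of this argument (the Gaussian prior favoring slow speeds and the two likelihood widths are invented for the example): with a broad likelihood, as at low contrast, the prior pulls the posterior mean toward slow speeds, so the perceived speed is lower than with a narrow likelihood at high contrast.

import numpy as np

v = np.linspace(0.0, 10.0, 1001)                 # candidate speeds
prior = np.exp(-v**2 / (2 * 2.0**2))             # prior favoring slow speeds

def perceived_speed(v_true, sigma_lik):
    lik = np.exp(-(v - v_true)**2 / (2 * sigma_lik**2))
    post = lik * prior                           # P[v|I] proportional to P[I|v] P[v]
    post /= post.sum()
    return np.sum(post * v)                      # posterior mean as the percept

print(perceived_speed(5.0, 3.0))    # broad likelihood (low contrast): slower percept
print(perceived_speed(5.0, 0.5))    # narrow likelihood (high contrast): close to 5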

Page 24: Psychophysical Evidence

Page 25: Encoding and Decoding Probability Distributions

Log-likelihood method :

The activity of a neuron tuned to prefer velocity v is viewed as reporting the log-likelihood of the image given that motion.

This provides a statistical interpretation, and decoding only involves the simple operation of exponentiating to find the full likelihood.

Some schemes for computing with this code require that the likelihood have only one peak.
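A minimal sketch of this scheme (the grid of preferred velocities and the example log-likelihood are illustrative assumptions): each neuron's activity is the log-likelihood at its preferred velocity, and decoding exponentiates the activities.

import numpy as np

v_pref = np.linspace(-10.0, 10.0, 41)            # preferred velocities of the population

def encode_log_likelihood(log_lik):
    """Each neuron reports the log-likelihood of the image at its preferred velocity."""
    return log_lik(v_pref)

def decode_likelihood(r):
    """Decoding exponentiates (and here normalizes) the activities."""
    L = np.exp(r - r.max())
    return L / L.sum()

# illustrative single-peaked log-likelihood centered on v = 3
r = encode_log_likelihood(lambda v: -(v - 3.0)**2 / (2 * 1.5**2))
print(v_pref[np.argmax(decode_likelihood(r))])   # most likely velocity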

Page 26: Encoding and Decoding Probability Distributions

Gain encoding for Gaussian distributions :

Use a Bayesian approach to decode a population pattern, assuming independent noise in the responses of the neurons -> the posterior distribution converges to a Gaussian.

The gain of the population activity controls the standard deviation of the posterior distribution.

Strong limitation : it only viably works for simple Gaussians.
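A sketch of the gain-encoding idea under common assumptions (independent Poisson spike counts and Gaussian tuning curves; the population size, tuning width, and gains are illustrative): scaling the gain of the population sharpens the posterior, with its standard deviation shrinking roughly as one over the square root of the gain.

import numpy as np

s_grid = np.linspace(-40.0, 40.0, 401)
s_pref = np.linspace(-40.0, 40.0, 41)            # preferred stimuli of the population
tuning = lambda s: np.exp(-(s - s_pref)**2 / (2 * 10.0**2))

def posterior_sd(gain, s_true=0.0, seed=0):
    rng = np.random.default_rng(seed)
    r = rng.poisson(gain * tuning(s_true))       # independent Poisson spike counts
    # log P[s|r] up to a constant, assuming a flat prior
    log_post = np.array([np.sum(r * np.log(gain * tuning(s) + 1e-12) - gain * tuning(s))
                         for s in s_grid])
    post = np.exp(log_post - log_post.max())
    post /= post.sum()
    m = np.sum(post * s_grid)
    return np.sqrt(np.sum(post * (s_grid - m)**2))

print(posterior_sd(gain=5.0))    # low gain: broad posterior
print(posterior_sd(gain=50.0))   # high gain: much narrower posterior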

Page 27: Encoding and Decoding Probability Distributions

Page 28: Encoding and Decoding Probability Distributions

Convolution encoding :

Can deal with non-Gaussian distributions that cannot be characterized by a few parameters, such as their means and variances.

Represent the distribution using a convolution code, obtained by convolving the distribution with a particular set of kernel functions.

Page 29: Encoding and Decoding Probability Distributions

Motivation : the Fourier transform.

A 2π-periodic, odd function f(s) can be represented by the coefficients of its Fourier (sine) series.

Encoding : compute the coefficients, r_a ∝ ∫ f(s) sin(a s) ds.

Decoding : recover f(s) as the weighted sum of the sine basis functions, Σ_a r_a sin(a s), up to normalization.

Page 30: Encoding and Decoding Probability Distributions

Use a large population of neurons to encode any function by devoting each neuron to the encoding of one particular coefficient.

The activity of neuron a is computed by taking the inner product between a kernel function assigned to that neuron and the function being encoded.
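A sketch of such a convolution code with Gaussian kernels (the kernel centers and widths, and the bimodal example distribution, are illustrative assumptions): each neuron's activity is the inner product of its kernel with the encoded distribution.

import numpy as np

s = np.linspace(-180.0, 180.0, 721)
ds = s[1] - s[0]
s_pref = np.linspace(-180.0, 180.0, 36, endpoint=False)    # kernel centers

def gaussian(x, mu, sigma):
    return np.exp(-(x - mu)**2 / (2 * sigma**2))

# example distribution to encode: bimodal, so not summarized by a mean and variance
p = 0.7 * gaussian(s, -30.0, 15.0) + 0.3 * gaussian(s, 60.0, 25.0)
p /= p.sum() * ds

# convolution code: r_a = inner product of kernel K_a with p
K = gaussian(s[None, :], s_pref[:, None], 20.0)
r = (K * p[None, :]).sum(axis=1) * ds
print(r.round(3))                                           # one activity per kernel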

Page 31: Encoding and Decoding Probability Distributions

Encoding schemes (choices of kernel function):

Kernel – sine function

Kernel – Gaussian

Kernel – Gaussian (a second Gaussian-kernel variant)

Page 32: Encoding and Decoding Probability Distributions

Decoding scheme - Anderson’s approach

The activity of neuron a is considered to be a vote for a particular decoding basis function φ_a(s).

Overall distribution decoded : the decoded distribution is proportional to Σ_a r_a φ_a(s), i.e. the votes are summed and normalized.
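A sketch of this style of decoding applied to the convolution code above (taking the decoding basis functions to be proportional to the encoding kernels is an assumption of this example): each activity votes for its basis function, and the votes are summed and normalized.

import numpy as np

def decode_by_voting(r, s, s_pref, width=20.0):
    """Sum each neuron's vote for its basis function and normalize to a density."""
    phi = np.exp(-(s[None, :] - s_pref[:, None])**2 / (2 * width**2))  # decoding basis
    q = (r[:, None] * phi).sum(axis=0)          # weighted sum of the votes
    ds = s[1] - s[0]
    return q / (q.sum() * ds)                   # normalized decoded distribution

# usage with the arrays from the convolution-encoding sketch: decode_by_voting(r, s, s_pref)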

Page 33: Encoding and Decoding Probability Distributions

Decoding scheme - Zemel’s approach

Probabilistic approach : recover the most likely distribution over s given the observed activities.

This can be achieved using a nonlinear regression method such as the Expectation-Maximization algorithm.

Page 34: Examples in Neurophysiology

Uncertainty in 2-AFC (2-alternative forced choice)

: these examples offer preliminary evidence that neurons represent probability distributions, or related quantities, such as log likelihood ratios.

There are also experiments supporting gain encoding, convolution codes, and DDPC, respectively.

Page 35: Computations Using Probabilistic Population Codes

Experiment by Ernst & Banks (2002) : judge the width of a bar from vision and touch.

The optimal strategy : recover the posterior distribution over the width w, given the visual input V and the haptic input H.

Using Bayes rule (with the two cues treated as conditionally independent given w) :

P[w|V,H] ∝ P[V|w] P[H|w] P[w]

Page 36: Computations Using Probabilistic Population Codes

If we use a convolution code for all the distributions:

– multiply all the population codes together, term by term (see the sketch below)

– this requires neurons that can multiply or sum : an achievable neural operation

If the probability distributions are encoded using the position and gain of population codes:

– Solution : Deneve et al. (2001)

– It has some limitations

– It performs Bayesian inference using noisy population codes
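A sketch of the first option, assuming the visual and haptic likelihoods over the bar width w are each represented on a common grid (a simple discretized stand-in for a convolution code; the Gaussian likelihoods and their parameters are invented for the example). Combining the cues amounts to multiplying the two codes term by term and renormalizing, which reproduces the usual reliability-weighted cue combination:

import numpy as np

w = np.linspace(0.0, 10.0, 1001)                 # candidate bar widths
gauss = lambda mu, sigma: np.exp(-(w - mu)**2 / (2 * sigma**2))

lik_V = gauss(5.2, 0.5)                          # visual likelihood P[V|w] (narrow, reliable)
lik_H = gauss(6.0, 1.5)                          # haptic likelihood P[H|w] (broad, unreliable)

post = lik_V * lik_H                             # term-by-term product, flat prior over w
post /= post.sum()
print(np.sum(post * w))                          # posterior mean, dominated by vision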

Page 37: Computations Using Probabilistic Population Codes

Page 38: Guiding Questions (Second Part)

Q3: How may neural populations offer a rich representation of such things as uncertainty in the aspects of the stimuli they represent?

# 21 ~ # 24

Probabilistic approach : perception is conceived as statistical inference, giving rise to probability distributions over the values.

Hence the activity of neural populations represents probability distributions over the stimuli, which carry information about uncertainty.

Page 39: Guiding Questions (Second Part)

Q4: How can populations of neurons represent probability distributions? How can they perform Bayesian probabilistic inference?

#25 ~ #31 (for the first), #37 ~ #39 (for the second)

Several schemes have been proposed for encoding probability distributions in populations of neurons : the log-likelihood method, gain encoding for Gaussian distributions, and convolution encoding.

Bayesian probabilistic inference can be done by multiplying all the population codes together (convolution encoding), or by using noisy population codes (gain encoding).

Page 40: Guiding Questions (Second Part)

Q5: How are multiple aspects of the world represented in single populations? What computational advantages (or disadvantages) do such schemes have?

# 25 ~ # 28 (first)

What each scheme represents:

Log-likelihood : the likelihood function

Gain encoding : a mean and a standard deviation

Convolution encoding : a full probability distribution

Page 41: Guiding Questions (Second Part)

Q5: How are multiple aspects of the world represented in single populations? What computational advantages (or disadvantages) do such schemes have?

# 25 ~ # 28 (second)

Advantages and disadvantages:

Log-likelihood : decoding is simple, but the distributions it can represent are limited.

Gain encoding : strongly limited in the distributions it can represent.

Convolution encoding : can work for complicated distributions.

