Page 1

GUM*02 tutorial session, UTSA, San Antonio, Texas

Parameter searching in neural models

Mike Vanier, Caltech

Page 2

The problem:
  you want to build a model of a neuron
  you have a body of data
  you know a lot about the neuron's
    morphology
    physiology
    ion channel kinetics
  but you don't know everything!

Page 3

Typical preliminary data set

anatomy
  rough idea of morphology
  detailed reconstruction

Page 4

Typical preliminary data set

physiology
  current clamp
  synaptic potentials
  potentiation
  modulators

Page 5

Typical preliminary data set

ion channels
  identities of the major types
  kinetics
  modulation

Page 6

Missing data?

ion channels
  identities of ALL channels
  densities (µS/µm²)
  detailed kinetics
anatomy
  detailed reconstructions? variability?
physiology
  voltage clamp, neuromodulators, etc.
???

Page 7

Harsh reality

most experiments not done with models in mind

>half of model parameters loosely constrained or unconstrained

experiments to collect model params are not very sexy

Page 8

A different approach

collect data set model should match
collect plausible parameters
  those known to be correct
  educated guesses
build model
test model performance
modify parameters until you get a match

Page 9

How to modify parameters?

manually
  10 parameters @ 5 values each:
    5^10 = 9,765,625 possible simulations
    at 1 sim/minute, that's about 19 years!
  use previous results to guide searching
    non-linear interactions?
  tedious!!!
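A quick check of that arithmetic, using nothing beyond the numbers on the slide:

```python
n_sims = 5 ** 10                    # 5 values for each of 10 parameters
print(n_sims)                       # 9765625 possible simulations
print(n_sims / (60 * 24 * 365))     # ~18.6 years at 1 simulation/minute
```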

Page 10

How to modify parameters?

automatically
  set ranges for each parameter
  define update algorithm
  start parameter search
  go home!
  check results in a day, a week, ...
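As a concrete picture of that workflow, here is a minimal sketch of an automated search driver in Python. This is generic scaffolding, not GENESIS code: the parameter names, ranges, and the run_and_match stub are hypothetical stand-ins, and the update rule shown is plain random search (the "stochastic search" method listed later).

```python
import random

# Hypothetical parameter ranges (illustrative stand-ins, not the tutorial's values).
ranges = {
    "gmax_Na":  (100.0, 2000.0),
    "gmax_Kdr": (50.0, 1000.0),
    "gmax_KM":  (1.0, 100.0),
}

def run_and_match(params):
    """Stand-in for: run the neuron model, compare output to data, return one
    number (0 = perfect match). A dummy quadratic here so the loop runs."""
    return sum((params[k] - (lo + hi) / 2.0) ** 2 for k, (lo, hi) in ranges.items())

best_params, best_score = None, float("inf")
for _ in range(10_000):                      # runs unattended: "go home!"
    params = {k: random.uniform(lo, hi) for k, (lo, hi) in ranges.items()}
    score = run_and_match(params)
    if score < best_score:
        best_params, best_score = params, score
print(best_score, best_params)
```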

Page 11

match function

need to quantify goodness of fit
reduce entire model to one number
  0 = perfect match
match on:
  spike rates
  spike times
  voltage waveform

Page 12

simple match function

inputs: different current levels, e.g. 0.05, 0.1, 0.15, 0.2, 0.25, 0.3 nA
outputs: spike times

$$\text{match} = \frac{1}{n_\text{currs}} \sum_{i=1}^{n_\text{currs}} \frac{1}{n_\text{spikes}} \sum_{j=1}^{n_\text{spikes}} \left( \text{sim}_{i,j} - \text{data}_{i,j} \right)^2$$

where sim_{i,j} and data_{i,j} are the times of the j-th simulated and recorded spikes at current level i.
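A minimal sketch of that match function in Python (the names and the equal-spike-count assumption are mine, not the tutorial's):

```python
def spike_match(sim_spikes, data_spikes):
    """Mean squared spike-time difference, averaged over current levels.
    sim_spikes, data_spikes: one list of spike times per current level,
    assumed here to contain equal numbers of spikes. 0 = perfect match."""
    total = 0.0
    for sim_i, data_i in zip(sim_spikes, data_spikes):
        total += sum((s - d) ** 2 for s, d in zip(sim_i, data_i)) / len(sim_i)
    return total / len(sim_spikes)

# e.g. two current levels, three spikes each (times in seconds):
print(spike_match([[0.01, 0.05, 0.12], [0.01, 0.03, 0.07]],
                  [[0.012, 0.055, 0.11], [0.01, 0.035, 0.08]]))
```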

Page 13

waveform match function

inputs: hyperpolarized current levels, e.g. -0.05, -0.1 nA
outputs: Vm(t)

$$\text{match} = \frac{1}{n_\text{currs}} \sum_{i=1}^{n_\text{currs}} \frac{1}{t_\text{max}} \int_0^{t_\text{max}} \left( \text{sim}_i(t) - \text{data}_i(t) \right)^2 \, dt$$
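A sketch of the waveform match with the integral replaced by a discrete-time average over sampled Vm traces (that discretization, and the names, are my assumptions):

```python
import numpy as np

def waveform_match(sim_vm, data_vm):
    """Mean squared Vm difference, averaged over time and current levels.
    sim_vm, data_vm: arrays of shape (n_currs, n_timesteps) sampled on the
    same time grid; the time average stands in for (1/tmax) * integral dt."""
    sim_vm, data_vm = np.asarray(sim_vm), np.asarray(data_vm)
    return float(np.mean((sim_vm - data_vm) ** 2))
```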

Page 14

other match functions

some data might be more important to match than the rest
  adaptation
  bursting behavior

incorporate into more complex match functions

Page 15

weight early spikes more

$$\text{match} = \frac{1}{n_\text{currs}} \sum_{i=1}^{n_\text{currs}} \frac{1}{n_\text{spikes}} \sum_{j=1}^{n_\text{spikes}} w_{ij} \left( \text{sim}_{i,j} - \text{data}_{i,j} \right)^2$$

w_{ij}: weighting params; set w_{i0} > w_{i1} > w_{i2} > ... so that earlier spikes contribute more
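The same spike match with weights, as a sketch (the geometric decay is my example choice for satisfying w0 > w1 > w2 > ...):

```python
def weighted_spike_match(sim_spikes, data_spikes, decay=0.8):
    """Spike-time match with geometrically decaying weights, so the earliest
    spikes at each current level count most. decay < 1 gives w0 > w1 > ..."""
    total = 0.0
    for sim_i, data_i in zip(sim_spikes, data_spikes):
        total += sum(decay ** j * (s - d) ** 2
                     for j, (s, d) in enumerate(zip(sim_i, data_i))) / len(sim_i)
    return total / len(sim_spikes)
```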

Page 16

harder match functions

bursting
  Purkinje cell, pyramidal cell
transitions between complex behaviors
  regular spiking <-> bursting

Page 17

the data set

need exceptionally clean data set

noise in data set: model will try to replicate it!

need wide range of inputs

Page 18

typical data set for neuron model

current clamp over a wide range
  hyperpolarized (passive)
  depolarized (spiking)

Page 19

the process (1)

build model
  anatomy
  channel params from the literature
match passive data
  hyperpolarized inputs

Page 20

the process (2)

create match function
  waveform match for hyperpolarized
  spike match for depolarized
run a couple of simulations
  check that results aren't ridiculous
  get into the ballpark of the right params

Page 21

the process (3)

choose params to vary
  channel densities
  channel kinetics
    minf(V), tau(V) curves
  passive params
choose parameter ranges
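One common way to pick ranges is a fixed multiplicative window around each starting value; a sketch (the starting values and the factor of 3 are illustrative assumptions, not from the tutorial):

```python
# Hypothetical starting values from the literature or educated guesses.
lit_values = {"gmax_Na": 800.0, "gmax_Kdr": 300.0, "gmax_KM": 10.0}

# Let each parameter vary within a factor of 3 of its starting value.
ranges = {name: (v / 3.0, v * 3.0) for name, v in lit_values.items()}
```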

Page 22

the process (4)

select a param search method
  conjugate gradient
  genetic algorithm
  simulated annealing

set meta-params for method

Page 23

the process (5)

run parameter search
periodically check best results
  marvel at your own ingenuity
  curse at your stupid computer

figure out why it did/didn’t work

Page 24

results (motivation)

Page 25

parameter search methods

different methods have different attributes
  local or global optima?
  efficiency?
depends on the nature of the parameter space
  smooth or ragged?

Page 26

the shapes of space

smooth

ragged
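A toy 1-D picture of the two cases (my own illustration, not from the talk): a smooth landscape has one broad basin, while a ragged one hides the global minimum among many local ones.

```python
import numpy as np

x = np.linspace(-5, 5, 1001)
smooth = x ** 2                          # one basin: local search suffices
ragged = x ** 2 + 5 * np.sin(10 * x)     # many local minima: need global search
print(x[np.argmin(smooth)], x[np.argmin(ragged)])
```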

Page 27

genesis param search methods

Conjugate gradient descent (CG)
Genetic algorithm (GA)
Simulated annealing (SA)
Brute force (BF)
Stochastic search (SS)

Page 28

conjugate gradient (CG)

"The conjugate gradient method is based on the idea that the convergence to the solution could be accelerated if we minimize Q over the hyperplane that contains all previous search directions, instead of minimizing Q over just the line that points down gradient. To determine x_{i+1} we minimize Q over x_0 + span(p_0, p_1, p_2, ..., p_i), where the p_k represent previous search directions."

Page 29

no, really...
  take a point in parameter space
  find the line of steepest descent (gradient)
  minimize along that line
  repeat, sort of
    along conjugate directions only,
    i.e. ignore the subspace of previous lines
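In practice you rarely hand-roll this; as a sketch, SciPy's off-the-shelf nonlinear CG will do (this is generic SciPy, not GENESIS, and the quadratic objective is a stand-in for a real match function):

```python
import numpy as np
from scipy.optimize import minimize

def match(p):
    """Stand-in for a smooth match function of parameter vector p."""
    return np.sum((p - np.array([1.0, -2.0, 0.5])) ** 2)

# Nonlinear conjugate gradient; the gradient is estimated numerically here.
result = minimize(match, x0=np.zeros(3), method="CG")
print(result.x)   # the (local) minimum: fine for smooth landscapes
```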

Page 30

CG method: good and bad

for smooth parameter spaces:
  guaranteed to find local minimum
for ragged parameter spaces:
  guaranteed to find local minimum ;-)
  not what we want...

Page 31

genetic algorithm

pick a bunch of random parameter sets
  a "generation"
evaluate each parameter set
create new generation
  copy the most fit sets
  mutate randomly, cross over
repeat until you get acceptable results
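A minimal GA sketch in Python; the fitness function, population size, and mutation meta-params are illustrative choices, not the tutorial's:

```python
import random

N_PARAMS, POP, KEEP = 3, 50, 10        # genome length and population sizes
MUT_RATE, MUT_SIZE = 0.2, 0.1          # mutation meta-params (illustrative)

def fitness(genome):
    """Stand-in match function: lower is better (0 = perfect)."""
    return sum((g - 0.5) ** 2 for g in genome)

pop = [[random.random() for _ in range(N_PARAMS)] for _ in range(POP)]
for generation in range(100):
    pop.sort(key=fitness)                        # evaluate each parameter set
    parents = pop[:KEEP]                         # copy the most fit sets
    children = []
    while len(children) < POP - KEEP:
        a, b = random.sample(parents, 2)
        cut = random.randrange(1, N_PARAMS)      # single-point crossover
        child = [g + random.gauss(0, MUT_SIZE) if random.random() < MUT_RATE
                 else g for g in a[:cut] + b[cut:]]   # random mutation
        children.append(child)
    pop = parents + children
best = min(pop, key=fitness)
print(fitness(best), best)
```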

Page 32

genetic algorithm (2)

amazingly, this often works
global optimization method
many variations
many meta-params
  mutation rate
  crossover type (single, double) and rate
no guarantees

Page 33

simulated annealing

make noise work for you!
noisy version of "simplex algorithm"
  evaluate points on the simplex
  add noise to the result based on "temperature"
  move the simplex through space accordingly
gradually decrease the temperature to zero
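For illustration, here is annealing in its simplest Metropolis form, not the noisy-simplex variant the slide describes; the ragged objective, step size, and exponential cooling schedule are my choices:

```python
import math, random

def match(p):
    """Stand-in ragged match function: lower is better."""
    return p ** 2 + 5 * math.sin(10 * p)

p = random.uniform(-5.0, 5.0)
current, temp = match(p), 5.0
while temp > 1e-3:
    cand = p + random.gauss(0.0, 0.5)            # propose a random step
    delta = match(cand) - current
    # Always accept improvements; accept uphill moves with probability
    # exp(-delta/temp), so early on the search can escape local minima.
    if delta < 0 or random.random() < math.exp(-delta / temp):
        p, current = cand, current + delta
    temp *= 0.999                                # cooling: speed is a meta-param
print(p, current)
```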

Page 34

simulated annealing (2)

some nice properties:
  guaranteed to find global optimum
    but may take forever ;-)
  when temp = 0, finds local minimum
how fast to decrease the temperature?

Page 35

comparing methods (1)

Page 36

comparing methods (2)

Page 37

comparing methods (3)

Page 38

recommendations

Passive models: SA, CG
Small active models: SA
Large active models: SA, GA
Network models: usually SOL

Page 39

genesis tutorial (1)

objects:
  paramtableGA
  paramtableSA
  paramtableCG
task: parameterize a simple one-compartment neuron
  Na, Kdr, KM channels

Page 40

genesis tutorial (2)

parameters:
  gmax of Na, Kdr, KM
  KM tau(V) scaling
  KM minf(V) midpoint

Page 41

Conclusions

param search algorithms are useful
  but: pitfalls, judgment
  modeler must help the computer
  failure is not always bad!
will continue to be an active research area

