Fundamentals of Computational Neuroscience, T. P. Trappenberg,
2002.
4. Neurons in a network
Lecture Notes on Brain and Computation
Byoung-Tak Zhang
Biointelligence Laboratory, School of Computer Science and Engineering
Graduate Programs in Cognitive Science, Brain Science and Bioinformatics
Brain-Mind-Behavior Concentration Program
Seoul National University
This material is available online at

Outline
4.1 Organizations of neural networks
4.2 Information transmission in networks
4.3 Population dynamics: modeling the average behavior of neurons
4.4 The sigma node
4.5 Networks with non-classical synapses: the sigma-pi node

4.1 Organizations of neural networks
High-order mental abilities are an emergent property of specialized neural networks.
The central nervous system contains on the order of 10^12 neurons.
We aim to understand the principal organization of neuron-like elements and how such structures can support and enable particular mental processes.
The anatomy of the brain areas:
Neocortex (cerebral cortex, cortex)
Cerebellum
Subcortical areas
4.1.1 Neocortical organization
Brodmann's cortical map: 52 cortical areas
Functional correlates of the different cortical areas
Fig. 4.1 Outline of the lateral view of the human brain, including the cortex, cerebellum, and brainstem. The neocortex is divided into four lobes. The numbers correspond to Brodmann's classification of cortical areas.

4.1.2 Staining techniques
Fig. 4.2 Examples of stained neocortical slices showing the layered structure of the neocortex. (B) Illustration of different staining techniques.

4.1.3 Common neuronal types in the neocortex
Pyramidal cells
Stellate cells

4.1.4 The layered structure of neocortex
The neocortex has a generally layered structure.
Laminar-specific structure (Fig. 4.2A, Fig. 4.2C).
Fig. 4.2 Examples of stained neocortical slices showing the layered structure of the neocortex. (A) Nissl-stained visual cortex showing cell bodies. (C) Different sizes of cortical layers in different areas.

4.1.5 Columnar organization and cortical modules
Fig. 4.3 Columnar organization and topographic maps in neocortex. (A) Ocular dominance columns. (B) Schematic illustration of the relation between orientation and ocular dominance columns. (C) Topographic representation of the visual field in the primary visual cortex. (D) Topographic representation of touch-sensitive areas of the body in the somatosensory cortex.

4.1.6 Connectivity between neocortical layers
Fig. 4.4 Schematic connectivity patterns between neurons in a cortical layer. Open cell bodies represent (spiny) excitatory neurons such as the pyramidal neuron and the spiny stellate neuron. Their axons are plotted with solid lines that end at open triangles representing the axon terminals. The dendritic boutons are indicated by open circles. Inhibitory (smooth) stellate neurons have solid cell bodies and synaptic terminals, and their axons are represented by dashed lines.

4.1.7 Cortical parameters

4.2 Information transmission in networks

4.2.1 The simple chain
A simple chain is biologically not reasonable:
A single presynaptic spike is not sufficient to elicit a postsynaptic spike.
Synaptic transmission is lossy.
The death of a single neuron would disrupt the transmission.
Fig. 4.5 (A) A sequential transmission line of four nodes. Parallel chains are made of many such sequential transmission lines without connections between them.

4.2.2 Diverging-converging chains
The number of neurons, N
Divergence rate, m
Convergence rate, C
Fully connected network: N = m = C
Synaptic efficiency (weight), w
Feedback loops
Fig. 4.5 (B) Diverging/converging chains where each node can contact several other nodes in neighboring transmission chains.

4.2.3 Immunity of random networks to spontaneous background activity (1)
Cortical neurons typically fire with some background activity (mean 5 Hz, variance 3 Hz).
Consider a neuron with 10^4 excitatory dendritic synapses.
Spikes arriving in each time interval (1 ms): mean μ = 10^4 × 5 Hz × 1 ms = 50, with a per-synapse variance of σ² = 3 Hz × 1 ms = 0.003.
A spike arriving at a synapse with weight w = 1 would elicit a postsynaptic spike.
For the neuron to be immune against the background firing: w < 1/50 = 0.02.

4.2.3 Immunity of random networks to spontaneous background activity (2)
To compare these values with experimental data, we need to measure the average synaptic efficiency: stimulate a presynaptic neuron while recording from the postsynaptic neuron.
Asynchronous gain: the average number of extra spikes added to the spikes of a postsynaptic neuron by each presynaptic spike.
If 100 presynaptic spikes (100 Hz) lead to 2 extra postsynaptic spikes (2 Hz), the synaptic efficiency is 2/100 = 0.02.
Fig. 4.6 Schematic illustration of the influence of a single presynaptic spike on the average firing rate of the postsynaptic neuron. The step in the synaptic transmission curve occurs after some delay, after which, on average, more postsynaptic spikes are generated within a short time window compared to the spontaneous activity of the neuron.

4.2.4 Noisy background
Background firing has large variability.
Example: mean μ = 50, variance σ² = 50.
We require the probability that a postsynaptic spike is generated by the background firing alone to be less than a certain value p_bg.
The probability of having more than x simultaneous presynaptic spikes is estimated from a Gaussian distribution (eqns 4.1-4.3).
If p_bg = 0.1, then x ≈ 59 and w < 1/59 ≈ 0.017.
(4.1) (4.2) (4.3)

4.2.5 Information transmission in large random networks
Under the previous condition, at least 59 − 50 = 9 additional presynaptic spikes are needed to elicit a meaningful postsynaptic spike.
Large random networks: 10^10 neurons, each connecting to 10^4 other neurons.
Stimulate 1000 neurons: 1000 × 10^4 = 10^7 spikes are transmitted.
The probability that a neuron receives one spike is 10^7 / 10^10 = 10^-3; two spikes, (10^-3)^2 = 10^-6.
This is not sufficient to trigger secondary spikes, a consequence of the small number of connections per neuron relative to the large number of neurons in the network.

4.2.6 The spread of activity in small random networks
Netlets: small networks with only a small number of highly efficient synapses.
Only very few active presynaptic neurons can elicit a postsynaptic spike in functionally correlated neurons.
The absolute refractory period controls the number of active neurons.
Two asymptotic states:
With small initial activity we get an inactive netlet.
With large initial activity we get a nearly maximally active netlet.
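The two asymptotic states can be demonstrated with a simplified version of the netlet update map: the fraction of active nodes at t + 1 is taken as the probability that a node receives at least θ spikes from its C presynaptic partners. The values C = 100 and θ = 10 are illustrative assumptions, not parameters from the text.

```python
from math import comb

def p_fire(a, C, theta):
    """Probability that a node receives at least `theta` spikes when a
    fraction `a` of its C presynaptic partners is active (binomial
    tail; a simplified stand-in for the netlet update map)."""
    return sum(comb(C, k) * a**k * (1 - a) ** (C - k)
               for k in range(theta, C + 1))

def iterate(a0, C=100, theta=10, steps=30):
    """Iterate the update map from initial activity fraction a0."""
    a = a0
    for _ in range(steps):
        a = p_fire(a, C, theta)
    return a

# Small initial activity dies out; large initial activity saturates:
print(iterate(0.02))   # near 0 (inactive netlet)
print(iterate(0.30))   # near 1 (nearly maximally active netlet)
```

The two fixpoints near 0 and near 1 correspond to the two branches visible in Fig. 4.7.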
4.2.7 The expected number of active neurons in netlets
Fig. 4.7 The fraction of active nodes of netlets in the time interval t + 1 as a function of the fraction of active nodes in the previous time interval. The different curves correspond to different numbers of presynaptic spikes that are necessary to elicit a postsynaptic spike. (A) Netlets with only excitatory neurons. (B) Netlets with the same amount of excitatory and inhibitory connections.
The expected fraction of active nodes f depends on the average number of synapses per neuron C and the firing threshold (eqn 4.4); the attractive fixpoint corresponds to a firing rate of about 500 Hz. (4.4)

4.2.8 Netlets with inhibition
Cortical neurons fire in the range 10-100 Hz.
Even with inhibitory neurons the fixpoint firing rate still exceeds this range.

4.3 Population dynamics: modeling the average behavior of neurons
Simulating networks of spiking neurons raises a computing-power problem, so we use the average firing rate instead.
What is the relationship of such models to populations of spiking neurons, and under what conditions are these common approximations useful and faithful descriptions of neuronal characteristics?
Rate models cannot incorporate all aspects of networks of spiking neurons, yet many investigations in computational neuroscience have used such models.

4.3.1 Firing rate
The rectangular time window: the average temporal spike rate of a neuron with a window size Δt. (4.5)
The Gaussian window. (4.6)
Fig. 4.8 (A) Temporal average of a single spike train with a time window T that has to be large compared to the average interspike interval.

4.3.2 Population averages
The average population activity A(t) of a pool of neurons, computed over very small time windows. (4.7) (4.8)
Fig. 4.8 (B) Pool or local population of neurons with similar response characteristics. The pool average is defined as the average firing rate over the neurons in the pool within a relatively small time window.

4.3.3 Population dynamics in response to slowly varying inputs
The average behavior of a pool of neurons is described by a membrane time constant τ, an input current I, and an activation function g:
τ dh/dt = −h(t) + R I^ext(t)  (4.9)
Stationary state: h = R I^ext, with rate A = g(h).  (4.10)

4.3.4 Rapid response of a population
At 100 ms the external input current is increased on top of a noise current.
Population dynamics (eqn 4.9) versus the average population spike rate in response to a rapidly varying input.
Adding noise and fluctuations makes the model more realistic.
Fig. 4.9 Simulation of a population of 1000 independent integrate-and-fire neurons with a membrane time constant τ_m = 10 ms and threshold ϑ = 10. Each neuron receives an input with white noise around a mean RI^ext. This mean is switched from RI^ext = 11 to RI^ext = 16 at t = 100 ms. The spike count of the population almost instantaneously follows this jump in the input, whereas the average population rate, calculated from eqn 4.9 with a linear activation function, follows this change in input only slowly when the time constant is set to τ = τ_m.

4.3.5 Advanced descriptions of population dynamics
Spike response model (4.11) (4.12) (4.13) (4.14):
The term η_i for the postsynaptic neurons is ignored; w is the average synaptic efficiency.
No spike-time adaptation in the neuron.
The mean influence of the postsynaptic potential.
Noise.
For slowly varying input, eqn 4.9 holds with a gain function. (4.15)
Fig. (A) The gain function of eqn 4.15 that can be used to approximate the dynamics of a population response to slowly varying inputs (adiabatic limit). (B) Examples of physiological gain functions from a hippocampal pyramidal cell. The discharge frequency is based on the inverse of the first interspike interval after the cell started to respond to rectangular current pulses of different strengths.

4.4 The sigma node

4.4.1 Minimal neuron model: McCulloch-Pitts node
Rate model: the timing of spikes is irrelevant. But spike times play a critical role for fast responses.

4.4.1 Minimal neuron model: sigma node
Rate values of neuronal groups, r. (4.16)
Fig. Schematic summary of the functionality of a sigma node most commonly used in networks of artificial neurons. Such a node weights the input value of each channel with the corresponding weight value of that channel and sums up all these weighted inputs. The output of the node is then a function of this internal activation. (4.17) (4.18) (4.19)

4.4.2 Common activation functions
The activation function g generalizes the sigmoid function. (4.20)

4.4.3 The dynamic sigma node

4.4.4 Leaky integrator characteristics
Taking the discrete sigma node to continuous dynamics: the continuum limit Δt → 0. (4.21)
Fig. Time course of the activation of a leaky integrator node with initial value h = 0. In the lower curve no input I^in was applied, leading to an exponential decay. The upper curve corresponds to a constant input current I^in = 0.5. The resting activation of the node was set to h^rest = 0.1.
The leaky integrator dynamics (4.22) (4.23): without external input, solving the equation gives exponential decay behavior. (4.24)

4.4.5 Discrete formulation of continuous dynamics
The exponential response to short inputs suggests taking time steps on a logarithmic scale.
Integration methods: the Euler method, higher-order Runge-Kutta methods, adaptive time-step algorithms. (4.25)

4.5 Networks with non-classical synapses: the sigma-pi node
The sigma node is a very rough abstraction of a real neuron:
Interactions of different ion channels.
Information processing through nonlinear interactions between input channels, not captured by the average firing rate alone.

4.5.1 Logical AND and sigma-pi nodes
Nonlinear interaction between two ion channels: reaching the firing threshold requires at least two spikes in some temporal proximity, which corresponds to a logical AND function.
Generalizing this idea leads to the sigma-pi node model. (4.26)

4.5.2 Divisive inhibition
Interaction between an excitatory synapse and an inhibitory synapse: divisive (shunting) inhibition. (4.27)

4.5.3 Further sources of modulatory effects between synaptic inputs
Fig. Some sources of nonlinear (modulatory) effects between synapses as modeled by sigma-pi nodes. (A) Shunting (divisive) inhibition, which is often recorded as the effect of inhibitory synapses on the cell body. (B) The effect of simultaneously activated voltage-gated excitatory synapses that are in close physical proximity to each other (synaptic clusters) can be larger than the sum of the effects of the individual synapses. Examples are clusters of AMPA and NMDA type synapses. (C) Some cortical synaptic terminals have nicotinic acetylcholine (ACh) receptors. ACh release from cholinergic afferents can thus produce a larger efflux of neurotransmitter and thereby increase the EPSPs in the postsynaptic neuron of this synaptic terminal. (D) Metabotropic receptors can trigger intracellular messengers that can influence the gain of ion channels. (E) Ion channels can be linked to the underlying cytoskeleton with adapter proteins and can thus influence other ion channels through this link.

Conclusion
The brain displays characteristic neuronal organizations.
Properties of networks of spiking neurons: the spread of neuronal activities through random networks; transmission of information; sensible activity in random recurrent networks; the self-organization of synaptic efficiencies.
The sigma node: modeling the average firing rate of populations of neurons.
The sigma-pi node: a rate model for non-classical synapses.
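The two node models summarized above can be contrasted in a short sketch. This assumes a sigmoidal output function and illustrative example weights; the grouping of inputs for the sigma-pi node is likewise an assumption made for the example.

```python
import math

def sigma_node(weights, rates):
    """Classical sigma node: weighted sum of input rates passed
    through a sigmoidal activation function."""
    h = sum(w * r for w, r in zip(weights, rates))
    return 1.0 / (1.0 + math.exp(-h))

def sigma_pi_node(weights, input_groups):
    """Sigma-pi node: each weight multiplies the *product* of the
    rates in its input group, capturing multiplicative interactions
    such as shunting inhibition or synaptic clusters."""
    h = sum(w * math.prod(group) for w, group in zip(weights, input_groups))
    return 1.0 / (1.0 + math.exp(-h))

# A product term acts like a logical AND: it contributes only when
# all inputs in the group are active together.
print(sigma_pi_node([4.0], [(1.0, 1.0)]))  # both inputs active
print(sigma_pi_node([4.0], [(1.0, 0.0)]))  # one input silent: activation 0
```

With a single one-element group per weight, the sigma-pi node reduces exactly to the sigma node, which is why it is described as a generalization.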