
Supratim Ray [email protected]/~sray/E9282/Lectures/Lecture3.pdf · 2014. 1....

Transcript
  • Supratim Ray

    [email protected]

  • Biophysics of Action Potentials

    • Passive Properties – neuron as an electrical circuit

    • Passive Signaling – cable theory

    • Active properties – generation of action potential

  • Techniques

    • Random Variables and Poisson Distribution

    • Correlations – various techniques

    • Journal Session – Kohn and Smith, 2005, JNS



  • Kandel, Schwartz and Jessell, Principles of Neural Science, Chapter 27

    Sheldon M. Ross, Stochastic Processes, Chapters 1-2 or

    Papoulis, Probability, Random Variables, and Stochastic Processes, Chapters 1-4


  • Start reading the Journal session paper. Most techniques used in the paper will be covered here.

Spike data and several MATLAB scripts will be emailed to you. Many of the scripts relate to the topics covered here and are part of the homework assignment.


  • Data acquisition and Questions


  • 10x10 grid of electrodes

    • 400 microns tip-to-tip spacing

    • V1 cortex

  • Receptive field

    Attended Side

  • Figure 27-11: Kandel, Schwartz and Jessell

  • http://www.youtube.com/watch?v=8VdFf3egwfg

  • Figure 27-14: Kandel, Schwartz and Jessell

  • >> load SpikeData_022309SRC_001_elecs49_51.mat

    >> rasterplot(SpikeData{1}); xlim([-0.4 0.8]);

  • >> load SpikeData_022309SRC_001_elecs49_51.mat

    >> [H,timeVals]=psthplot(SpikeDataList{1},1,[-0.4 0.8],4);

    >> plot(timeVals,H); xlim([-0.4 0.8]);
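rasterplot and psthplot are course-provided MATLAB functions that will be emailed with the spike data. As a rough sketch of what the PSTH computation involves, here is a minimal Python/NumPy equivalent; the spike times, the 4 ms bin width (presumably what the 4 in the psthplot call means), and the helper name psth are all assumptions for illustration:

```python
import numpy as np

def psth(spike_times_per_trial, t_range=(-0.4, 0.8), bin_width=0.004):
    """Peri-stimulus time histogram: trial-averaged firing rate
    (spikes/s) in each time bin."""
    edges = np.arange(t_range[0], t_range[1] + bin_width, bin_width)
    counts = np.zeros(len(edges) - 1)
    for spikes in spike_times_per_trial:
        c, _ = np.histogram(spikes, bins=edges)
        counts += c
    rate = counts / (len(spike_times_per_trial) * bin_width)
    time_vals = edges[:-1] + bin_width / 2
    return rate, time_vals

# Made-up spike times (seconds relative to stimulus onset), two trials
trials = [np.array([-0.2, 0.1, 0.15, 0.3]), np.array([0.05, 0.12, 0.4])]
H, timeVals = psth(trials)
```

Dividing the counts by (number of trials × bin width) converts them to firing rates in spikes/s, which is what a PSTH usually displays.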

  • In a rate code, the precise timing of each spike does not matter; the only metric of interest is the total number of spikes in a given time interval.

    In a temporal code, the precise timing of spikes also matters. In particular, if a sufficient number of neurons fire synchronously, they have a larger impact on the downstream neuron.

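The distinction can be made concrete with a small sketch (Python/NumPy; the two spike trains and the 2 ms coincidence window are made up for illustration): a rate-code metric looks only at the counts, while a temporal-code metric such as coincidence counting is sensitive to timing.

```python
import numpy as np

# Two hypothetical spike trains (spike times in seconds)
train_a = np.array([0.010, 0.052, 0.101, 0.150, 0.203])
train_b = np.array([0.011, 0.080, 0.102, 0.180, 0.204])

# Rate code: only the spike counts in the interval matter
counts = (len(train_a), len(train_b))

# Temporal code: how many spikes in train_a have a near-coincident
# partner in train_b (within a 2 ms window)?
window = 0.002
coincidences = int(sum(np.any(np.abs(train_b - t) <= window) for t in train_a))
```

Both trains have the same rate-code description, but only some of the spikes are synchronous, which is exactly what the coincidence count captures.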

  • Introduction, Basic Concepts


  • A random experiment is characterized by the property that its observation under a given set of circumstances does not always lead to the same observed outcome, but rather to different outcomes in such a way that there is statistical regularity.

    Sample Space (S): The set of all possible outcomes of an experiment.

    Events: An event is a subset of the sample space, and is said to occur if the outcome of the experiment is an element of that subset. Individual members of the sample space are called sample points or elementary events.

    Due to certain consistency problems, it is sometimes not possible to assign probabilities to all possible subsets of the sample space. Thus, we consider a certain class F of subsets of S satisfying certain axioms, declare the members of F to be events, and then define the probability of an event.


  • Let F be a class of subsets of a sample space S. F is said to be a sigma field of events if

    S is in F

    If A is in F, then so is the complement of A.

If A1, A2, … is a sequence of elements of F, then the union of the An's is in F.

    Examples

    F = {phi, S} is a trivial sigma field.

    F = {phi, A, Ac, S} is a sigma-field, where A is any subset of S and Ac denotes the complement of A.

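The closure axioms can be checked mechanically for a finite example. A minimal Python sketch, assuming a small sample space (for a finite class, closure under pairwise unions is sufficient):

```python
def is_sigma_field(F, S):
    """Check the sigma-field axioms for a finite class F of subsets of S."""
    if S not in F:
        return False
    if any(S - e not in F for e in F):             # closure under complement
        return False
    if any(a | b not in F for a in F for b in F):  # closure under union
        return False
    return True

S = frozenset({1, 2, 3, 4})
A = frozenset({1, 2})
F_trivial = {frozenset(), S}         # {phi, S}
F_four = {frozenset(), A, S - A, S}  # {phi, A, Ac, S}
```

Both example classes from the slide pass the check, while a class missing a complement (e.g. {phi, A, S}) fails.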

  • A set function P defined on a sigma field F of subsets of the sample space S of a random experiment is called a probability measure if it satisfies the following axioms:

    0 ≤ P(A) ≤ 1 for all A in F.

    P(S) = 1.

    If A1, A2, … are mutually exclusive (disjoint) events in F, i.e., the intersection of Ai and Aj is empty for i ≠ j, then

    P(A1 ∪ A2 ∪ …) = P(A1) + P(A2) + …

  • Experiment: A coin is tossed once and we report the side of the coin.

    Sample Space S = {H,T}

    Sigma Field 1 = {Φ, S}

    Sigma Field 2 = {Φ, {H}, {T}, S}

    Probability Measure on Sigma Field 2

    P({H}) = p

    P({T}) = q = 1-p.

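A small simulation sketch (Python/NumPy; the value p = 0.3 and the sample size are arbitrary choices) illustrates the statistical regularity mentioned earlier: the relative frequencies of heads and tails approach the assigned probabilities p and q = 1 − p.

```python
import numpy as np

rng = np.random.default_rng(0)
p = 0.3                              # assigned probability P({H})
tosses = rng.random(100_000) < p     # True means heads
p_hat = tosses.mean()                # relative frequency of heads
q_hat = 1 - p_hat                    # relative frequency of tails
```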

  • Let A and B be two events on a sample space S, on whose sigma field F of subsets is defined a probability P. The conditional probability of the event B given the event A, denoted by P(B|A), is defined by

    P(B|A) = P(A ∩ B) / P(A)

    if P(A) > 0, and is undefined if P(A) = 0.

    Independence of events: Two events A and B are said to be independent if

    P(A ∩ B) = P(A) P(B)

  • Is a function that maps each element of the sample space S to a number on the real line.

    We can ask questions of the following type: What is the probability that a random variable X is less than a number x? Because probabilities are assigned to events, we need to reformulate the question in terms of the events in the sigma field F.

    Formal definition: Let S be the sample space of a random experiment and F be the sigma field of events. A finite, single-valued function X which maps S into R is called a random variable if the inverse images under X of all intervals (-∞, x] are events, i.e., if

    X⁻¹((-∞, x]) = [X ≤ x] = {ω : X(ω) ≤ x} ∈ F

  • Coin Tossing Experiment.

    S = {T,H}.

    F = {Φ, {H}, {T}, S}

    Define a function X on S as follows: X(T)=0, X(H)=1.

    X⁻¹((-∞, x]) = Φ if x < 0

    = {T} if 0 ≤ x < 1

    = S if x ≥ 1

  • Let X be a random variable. Define a function F on R by

    F(x) = P([X ≤ x]) = P({ω : X(ω) ≤ x})

    The function F is called the distribution function (DF) or the cumulative distribution function (CDF) of the random variable.

    The probability density function (pdf) is the derivative of F with respect to x. If the random variable is discrete, the density is concentrated at certain points; it is then called the probability mass function (pmf).

    Exercise: cdf and pdf of a single coin toss described before.

  • Experiment: Suppose you take a coin and flip it N times. Suppose also that the probability of getting a heads is p.

    Sample Space has 2^N possible sequences.

    Define a random variable X which is the number of times heads appear in a sequence.

    The random variable X takes a value k with probability

    P(X = k) = C(N, k) p^k (1-p)^(N-k), k = 0, 1, …, N
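The binomial pmf is easy to evaluate directly. A short Python sketch (the choices N = 10 and p = 0.5 are arbitrary) confirms that the probabilities sum to 1:

```python
from math import comb

def binomial_pmf(k, N, p):
    """P(X = k) = C(N, k) p^k (1-p)^(N-k)"""
    return comb(N, k) * p**k * (1 - p)**(N - k)

N, p = 10, 0.5
pmf = [binomial_pmf(k, N, p) for k in range(N + 1)]
total = sum(pmf)                     # probabilities sum to 1
```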

  • 25

    Taken from http://en.wikipedia.org/wiki/Binomial_distribution

  • 26

    Taken from http://en.wikipedia.org/wiki/Normal_distribution

  • Expected Value (Mean)

    E(X) = Σ x P(X = x) (discrete case)

    Variance

    Var(X) = E[(X − E(X))²] = E(X²) − (E(X))²

    Binomial Distribution: Mean: np, Var: np(1-p)

    Normal Distribution: Mean: μ, Var: σ²

    http://en.wikipedia.org/wiki/Expected_value

    http://en.wikipedia.org/wiki/Variance
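The stated binomial mean and variance can be checked against the definitions by summing over the pmf. A Python sketch (n = 12 and p = 0.25 are arbitrary):

```python
from math import comb

n, p = 12, 0.25
pmf = {k: comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)}

mean = sum(k * q for k, q in pmf.items())               # should equal n*p
var = sum((k - mean) ** 2 * q for k, q in pmf.items())  # should equal n*p*(1-p)
```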

  • Step 1

    Find time intervals in which firing rate is about the same.

    Baseline Period: [-300 -44] ms (256 ms duration)

    Stimulus Period: [150 406] ms (256 ms duration)


  • Step 2

    Define random variables XBL and XST, which give the total number of spikes in a given time interval (baseline and stimulus, respectively).


  • Step 2

    >> XBL = getSpikeCounts(SpikeDataList{1},[-0.3 -0.044]);

    >> XST = getSpikeCounts(SpikeDataList{1},[0.15 0.406]);


  • Step 2

    >> plot(XBL,'g'); hold on; plot(XST,'r'); plot(XBL,'go'); plot(XST,'ro');


  • Step 3

    Compute the probability mass function (pmf)

    >> [pBL,cBL] = getPMF(XBL); plot(cBL,pBL,'g')


  • Suppose events (spikes) are occurring at a constant rate (λ)

    Independent increments: if we count the number of spikes in two non-overlapping intervals, they should be independent: knowledge of the number of spikes in one interval should not reveal any information about the number of spikes in another interval (which does not overlap with the first)

    Stationary increments: the number of spikes in a given interval should only depend on the length of the interval, but not on its position.

    Probability of having a single spike in a small interval h is λh.

    Probability of having more than 1 spike in a small interval h is extremely small.

    It can be shown that in this case, the number of spikes in an interval of length t follows a Poisson distribution with parameter λt.
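This claim can be checked by simulation: generate spikes with i.i.d. exponential inter-spike intervals (one way to realize a homogeneous Poisson process) and count them in a window. A Python/NumPy sketch; the rate, window length, and trial count are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(1)
rate = 20.0        # lambda, spikes per second
t = 0.256          # window length in seconds
n_trials = 20_000

# Exponential ISIs -> cumulative sum gives spike times;
# count the spikes falling in [0, t) on each trial.
counts = np.empty(n_trials, dtype=int)
for i in range(n_trials):
    spike_times = np.cumsum(rng.exponential(1 / rate, size=200))
    counts[i] = np.searchsorted(spike_times, t)

mean_count = counts.mean()              # close to rate * t
fano = counts.var() / counts.mean()     # close to 1 for a Poisson process
```

The empirical mean count matches λt and the Fano factor is near 1, both signatures of the Poisson distribution discussed next.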

  • A random variable X is said to have Poisson distribution with parameter μ if the pmf of X is given by

    P(X = k) = e^(−μ) μ^k / k!, k = 0, 1, 2, 3, …

    E(X) = μ

    Var(X) = μ

    Fano Factor = Var/Mean=1
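The identities E(X) = Var(X) = μ (and hence a Fano factor of 1) can be verified numerically from the pmf. A Python sketch with an arbitrary μ = 4:

```python
from math import exp, factorial

def poisson_pmf(k, mu):
    """P(X = k) = e^(-mu) mu^k / k!"""
    return exp(-mu) * mu**k / factorial(k)

mu = 4.0
ks = range(100)                         # tail mass beyond this is negligible
pmf = [poisson_pmf(k, mu) for k in ks]
mean = sum(k * q for k, q in zip(ks, pmf))
var = sum((k - mean) ** 2 * q for k, q in zip(ks, pmf))
fano = var / mean
```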

  • A Poisson distribution is applicable in many situations in which some kind of “event” or “change of state” or “flaw” or “failure” occurs in a manner thought of intuitively as “at random.”

    Law of rare events (or Poisson Limit Theorem) gives a Poisson approximation to a Binomial Distribution, under certain circumstances.

    Proof: See http://en.wikipedia.org/wiki/Poisson_limit_theorem
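The Poisson limit theorem says Binomial(n, μ/n) approaches Poisson(μ) as n grows with μ fixed. A Python sketch (μ = 2 and the values of n are arbitrary) measuring the largest pointwise gap between the two pmfs:

```python
from math import comb, exp, factorial

mu = 2.0

def max_pmf_gap(n):
    """Largest pointwise gap between Binomial(n, mu/n) and Poisson(mu)."""
    p = mu / n
    return max(
        abs(comb(n, k) * p**k * (1 - p)**(n - k)
            - exp(-mu) * mu**k / factorial(k))
        for k in range(min(n, 40) + 1)  # tail mass beyond k = 40 is negligible
    )

gap_small_n = max_pmf_gap(10)      # crude approximation
gap_large_n = max_pmf_gap(1000)    # much closer
```

The gap shrinks roughly like μ²/n, which is why the Poisson approximation works well for rare events (small p, large n).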

  • >> mBL = mean(XBL); vBL = var(XBL);

    >> mST = mean(XST); vST = var(XST);

    mBL = 2.0, vBL = 1.95, Fano Factor = 0.97

    mST = 6.05, vST = 8.56, Fano Factor = 1.41

    If XBL was indeed Poisson Distributed with a mean of mBL, the theoretical pmf would have been:

    >> gBL = exp(-mBL) * ((mBL).^cBL) ./ factorial(cBL)

    Similarly,

    >> gST = exp(-mST) * ((mST).^cST) ./ factorial(cST)

    Now plot gBL and gST along with pBL and pST

    >> plot(cBL,pBL,'go'); hold on; plot(cBL,pBL,'g');

    >> plot(cBL,gBL,'g*'); hold on; plot(cBL,gBL,'g--');



  • A continuous random variable X is said to have an exponential distribution with parameter λ, λ>0, if its pdf is given by

    f(x) = λe^(−λx), x ≥ 0

    = 0, x < 0
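As a quick numerical sanity check (Python/NumPy sketch; λ = 5 and the integration grid are arbitrary), the pdf integrates to 1 and has mean 1/λ:

```python
import numpy as np

lam = 5.0
x = np.linspace(0, 10, 200_001)    # grid on [0, 10]; tail beyond is ~e^-50
f = lam * np.exp(-lam * x)         # exponential pdf for x >= 0 (0 for x < 0)
dx = x[1] - x[0]

total = f.sum() * dx               # Riemann sum of the pdf, close to 1
mean = (x * f).sum() * dx          # close to 1/lam
```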

  • Exponential distribution is memoryless, where the memoryless property is defined as:

    P{X > s+t | X > t} = P{X > s} for s, t ≥ 0

    Exponential distribution is the unique distribution possessing this property.

    The hazard or failure rate function λ(t) is defined as

    λ(t) = f(t) / (1 − F(t))

    For an exponential distribution, the hazard function is constant.
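The memoryless property can be seen directly in a simulation (Python/NumPy sketch; λ, s, and t are arbitrary): the conditional survival probability matches the unconditional one.

```python
import numpy as np

rng = np.random.default_rng(3)
lam, s, t = 2.0, 0.5, 1.0
x = rng.exponential(1 / lam, size=1_000_000)

p_s = np.mean(x > s)                         # P{X > s}
p_cond = np.sum(x > s + t) / np.sum(x > t)   # P{X > s+t | X > t}
```

Both probabilities come out close to e^(−λs), as the memoryless property demands.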

  • >> isiBL = getISIs(SpikeDataList{1},[-0.3 -0.044]);

    Homework: use the hist command to plot the ISI pdf, and also plot the theoretical exponential distribution with the same mean as isiBL.

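The comparison technique for the homework can be sketched with synthetic data (Python/NumPy; the real isiBL comes from the course spike data, and the 20 spikes/s rate here is an arbitrary assumption): histogram the ISIs as an empirical pdf, then overlay the theoretical exponential pdf with the same mean.

```python
import numpy as np

rng = np.random.default_rng(4)
rate = 20.0                                    # assumed firing rate (spikes/s)
isis = rng.exponential(1 / rate, size=50_000)  # synthetic ISIs

# Empirical pdf: a normalized histogram of the ISIs
counts, edges = np.histogram(isis, bins=50, density=True)
centers = (edges[:-1] + edges[1:]) / 2

# Theoretical exponential pdf with the same mean as the data
lam_hat = 1 / isis.mean()
theory = lam_hat * np.exp(-lam_hat * centers)
```

For Poisson-like spiking, the empirical histogram should track λe^(−λx); for the real data, systematic deviations (e.g. at short ISIs, due to refractoriness) are exactly what the exercise is meant to expose.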

