7/27/2019 02 Signal Detection Theory.pdf
J. Elder PSYC 6256 Principles of Neural Coding
2. SIGNAL DETECTION THEORY
Probability & Bayesian Inference
Signal Detection Theory
Provides a method for characterizing human performance in detecting, discriminating, and estimating signals.
For noisy signals, provides a method for identifying the optimal detector (the ideal observer) and for expressing human performance relative to it.
Origins in radar detection theory. Developed through the 1950s and onward by Peterson, Birdsall, Fox, Tanner, Green & Swets.
Example 1
The observer sits in a dark room. On every trial, a dim light is flashed with 50% probability.
The observer indicates whether she believes the light was flashed or not.
This is a yes-no detection task.
Noise
In this example, the information useful for the task is the light energy of the stimulus.
By the time the stimulus information is received by decision centres in the brain, it will be corrupted by many sources of noise:
  photon noise
  isomerization noise
  neural noise
Many of these noise sources are Poisson in nature: the dispersion increases with the mean.
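The Poisson property (variance equal to the mean, so dispersion grows with signal strength) can be checked with a quick simulation; this is a sketch only, with arbitrary mean rates standing in for photon counts:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate photon counts at two mean light intensities (arbitrary units).
# For a Poisson process the variance equals the mean, so the spread of
# the counts grows with the intensity of the light.
for mean_rate in (10, 100):
    counts = rng.poisson(mean_rate, size=100_000)
    print(mean_rate, counts.mean(), counts.var())
```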
Equal-Variance Gaussian Case
It is often possible to approximate this noise as Gaussian-distributed, with the same variance for both stimulus conditions.
Then the noise is independent of the signal state.
Discriminability d′

p(x | S = s_H) = (1 / (√(2π) σ)) exp( −(x − μ_H)² / (2σ²) )
p(x | S = s_L) = (1 / (√(2π) σ)) exp( −(x − μ_L)² / (2σ²) )

d′ = signal separation / signal dispersion = (μ_H − μ_L) / σ
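As a sketch of how d′ can be estimated from samples of the internal response (the values μ_L = 0, μ_H = 1, σ = 0.5 are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
mu_L, mu_H, sigma = 0.0, 1.0, 0.5   # made-up signal levels and noise SD

x_L = rng.normal(mu_L, sigma, 200_000)   # internal responses, low-signal trials
x_H = rng.normal(mu_H, sigma, 200_000)   # internal responses, high-signal trials

# d' = signal separation / signal dispersion = (mu_H - mu_L) / sigma
d_prime = (x_H.mean() - x_L.mean()) / np.sqrt(0.5 * (x_H.var() + x_L.var()))
print(d_prime)   # should come out near (1.0 - 0.0) / 0.5 = 2.0
```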
Criterion Threshold
The internal response is often approximated as a continuous variable, called the decision variable.
But to yield an actual decision, this has to be converted to a binary variable (yes/no).
A reasonable way to do this is to define a criterion threshold z:
  x ≥ z → 'yes'
  x < z → 'no'
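The threshold rule itself is a one-liner; a minimal sketch:

```python
import numpy as np

def decide(x, z):
    """Convert the continuous decision variable x into a binary yes/no response."""
    return np.where(x >= z, "yes", "no")

print(decide(np.array([0.2, 0.7, 1.3]), z=0.5))   # ['no' 'yes' 'yes']
```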
Effect of Shifting the Criterion
How did we calculate these numbers?
p(x | S = s_H) = (1 / (√(2π) σ)) exp( −(x − μ_H)² / (2σ²) )
p(x | S = s_L) = (1 / (√(2π) σ)) exp( −(x − μ_L)² / (2σ²) )

d′ = z_FA − z_HIT

where z_FA = (z − μ_L)/σ and z_HIT = (z − μ_H)/σ express the criterion z in standard-deviation units relative to each signal distribution; equivalently, z_FA = Φ⁻¹(1 − p(FA)) and z_HIT = Φ⁻¹(1 − p(HIT)).
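Equivalently, d′ can be recovered from measured hit and false-alarm rates by z-scoring each with the inverse normal CDF. A sketch using scipy:

```python
from scipy.stats import norm

def d_prime(p_hit, p_fa):
    # z-score each rate with the inverse standard-normal CDF (probit);
    # the difference norm.ppf(p_hit) - norm.ppf(p_fa) equals z_FA - z_HIT
    # in the criterion-distance notation above.
    return norm.ppf(p_hit) - norm.ppf(p_fa)

print(d_prime(0.84, 0.16))   # roughly 2
```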
What is the right criterion to use?
Suppose the observer wants to maximize the expected number of times they are right.
Then the optimal decision rule is to always select the state s with the higher probability given the observed internal response x:

  p(x | s_H) / p(x | s_L) ≥ 1 → 'yes'

This is the maximum likelihood detector. For the equal-variance case, this means that the criterion is the average of the two signal levels:

  z = (μ_H + μ_L) / 2
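A sketch confirming that the likelihood-ratio rule reduces to this midpoint criterion in the equal-variance case (parameter values are made up):

```python
import numpy as np
from scipy.stats import norm

mu_L, mu_H, sigma = 0.0, 1.0, 0.5   # made-up equal-variance parameters

def ml_decision(x):
    # 'yes' wherever the likelihood ratio p(x|s_H)/p(x|s_L) >= 1
    return norm.pdf(x, mu_H, sigma) >= norm.pdf(x, mu_L, sigma)

# For equal variances this matches the midpoint criterion z = (mu_H + mu_L)/2:
x = np.linspace(-1.0, 2.0, 301)
print(np.array_equal(ml_decision(x), x >= 0.5 * (mu_H + mu_L)))   # True
```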
Optimal Performance
The performance of the maximum likelihood observer for this yes/no task is given by

  p(correct) = p(HIT) = p(CORRECT REJECT) = Φ(d′/2) = 1 − ½ erfc( d′ / (2√2) )
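The optimal p(correct) is Φ(d′/2); writing the same quantity with the complementary error function, as the slide does, gives 1 − ½ erfc(d′/(2√2)). A quick numerical check that the two forms agree:

```python
import numpy as np
from scipy.stats import norm
from scipy.special import erfc

for d in (0.5, 1.0, 2.0):
    p_phi = norm.cdf(d / 2)                        # Phi(d'/2)
    p_erfc = 1 - 0.5 * erfc(d / (2 * np.sqrt(2)))  # same quantity via erfc
    assert np.isclose(p_phi, p_erfc)
print("Phi and erfc forms agree")
```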
Bias
For this optimal decision rule, the different types of errors are balanced: p(FA) = p(MISS).
For observers that use a different criterion, the different types of errors will be unbalanced.
Such observers have lower p(correct) and are said to be biased.
ROC Curves
Suppose the experiment is repeated many times under different instructions.
The first time, the observer is instructed to be extremely stringent in their criterion, only reporting yes when they are 100% sure the light was flashed.
On subsequent repetitions, the observer is instructed to gradually relax their criterion.
ROC Curves
As the criterion threshold is swept from right to left, p(HIT) increases, but p(FA) also increases.
The resulting plot of p(HIT) vs p(FA) is called a receiver operating characteristic (ROC).
[Figure: family of ROC curves, from the diagonal at d′ = 0 bowing toward the upper left with increasing d′]
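The full ROC can be traced by sweeping the criterion; a sketch with made-up equal-variance parameters:

```python
import numpy as np
from scipy.stats import norm

mu_L, mu_H, sigma = 0.0, 1.0, 1.0       # made-up parameters; d' = 1
criteria = np.linspace(4.0, -3.0, 50)   # sweep z from right to left

p_fa = 1 - norm.cdf(criteria, mu_L, sigma)    # p('yes' | s_L)
p_hit = 1 - norm.cdf(criteria, mu_H, sigma)   # p('yes' | s_H)

# Both rates rise as z moves left, and the hit rate stays at or above the
# false-alarm rate whenever d' > 0 (the curve bows above the diagonal).
print(np.all(np.diff(p_fa) >= 0), np.all(p_hit >= p_fa))   # True True
```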
ROC Curves
Note that d′ remains fixed as the criterion is varied! Thus d′ is criterion-invariant: a pure reflection of the signal-to-noise ratio.
Example 2: Motion Direction Discrimination
Random dot kinematogram:
  Signal dots are either all moving up or all moving down.
  Noise dots are moving in random directions.
Britten et al (1992)
100% Coherence
30% Coherence
5% Coherence
0% Coherence
The Middle Temporal Area (MT/V5)
www.thebrain.mcgill.ca
Experimental Details
Signal direction was always in the preferred or anti-preferred direction for the cell.
What kind of task is this?
Note that now there is external noise as well as internal noise.
To calculate neural discrimination performance, it was assumed that the neuron was paired with an identical neuron tuned to the opposite direction of motion.
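The neuron/anti-neuron idea can be sketched with Poisson spike counts; the firing rates below are invented for illustration, not Britten et al.'s data:

```python
import numpy as np

rng = np.random.default_rng(2)
n_trials = 100_000

# Invented mean spike counts on preferred-direction trials: the recorded
# neuron itself vs. the assumed identical anti-neuron tuned to the
# opposite direction of motion.
rate_pref, rate_anti = 30, 20

r_neuron = rng.poisson(rate_pref, n_trials)
r_anti = rng.poisson(rate_anti, n_trials)

# Decide motion direction from whichever neuron fired more (ties split at random).
correct = (r_neuron > r_anti) | ((r_neuron == r_anti) & (rng.random(n_trials) < 0.5))
print(correct.mean())   # simulated neural discrimination performance
```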
[Figure: behavioural vs. neuronal performance, for preferred and anti-preferred motion directions]
[Figure: ROC curve, hit rate vs false alarm rate]
Priors
Note that if the probabilities of the two signal states are not equal, the maximum likelihood observer will be suboptimal.
In this case we must make use of the posterior ratio:

  p(s_H | x) / p(s_L | x) ≥ 1 → 'yes'
MAP Inference
Using Bayes' rule, we obtain:

  p(s_H | x) / p(s_L | x) = p(x | s_H) p(s_H) / ( p(x | s_L) p(s_L) )

Thus we simply scale the likelihoods by the priors.
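A sketch of the resulting MAP rule, with invented priors and signal parameters, showing the criterion shifting above the midpoint when the flash is a priori unlikely:

```python
from scipy.stats import norm

mu_L, mu_H, sigma = 0.0, 1.0, 0.5   # invented equal-variance parameters
p_H, p_L = 0.2, 0.8                 # invented unequal priors

def map_decision(x):
    # 'yes' iff the posterior ratio >= 1, i.e. likelihoods scaled by priors.
    return norm.pdf(x, mu_H, sigma) * p_H >= norm.pdf(x, mu_L, sigma) * p_L

# The midpoint 0.5 now yields 'no'; the criterion has moved upward.
print(map_decision(0.5), map_decision(1.0))   # False True
```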
Loss and Risk
Maximizing p(correct) is not always the best thing to do.
How would you adjust your criterion if you were:
  A venture capitalist trying to detect the next Google?
  A pilot looking for obstacles on a runway?
Loss Function
In general, different types of correct decision or action will yield different payoffs, and different types of errors will yield different costs.
These differences can be accounted for through a loss function:
Let a(x) represent the action of the observer, given internal response x.
Then L(s, a(x)) represents the cost of taking action a, given world state s.
The Ideal Observer
The Ideal Observer uses the decision rule that minimizes the Expected Loss, aka the Risk R(a|x):

  R(a|x) = Σ_s L(s, a(x)) p(s|x) ∝ Σ_s L(s, a(x)) p(x|s) p(s)
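A sketch of this rule with an invented loss table in which a miss costs ten times a false alarm (a pilot-style asymmetry), which pulls the criterion far below the unbiased midpoint:

```python
from scipy.stats import norm

mu_L, mu_H, sigma = 0.0, 1.0, 0.5   # invented signal parameters
prior = {"L": 0.5, "H": 0.5}
# Invented loss table L(s, a): misses are 10x as costly as false alarms.
loss = {("H", "no"): 10.0, ("L", "yes"): 1.0, ("H", "yes"): 0.0, ("L", "no"): 0.0}

def risk(a, x):
    # Expected loss of action a given x, up to the constant factor 1/p(x).
    return sum(loss[(s, a)] * norm.pdf(x, mu, sigma) * prior[s]
               for s, mu in (("L", mu_L), ("H", mu_H)))

def ideal_observer(x):
    # Pick the action with the smaller risk.
    return min(("yes", "no"), key=lambda a: risk(a, x))

# A response just below the unbiased midpoint 0.5 still triggers 'yes':
print(ideal_observer(0.2), ideal_observer(-1.0))   # yes no
```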
Example 3: Slant Estimation