Multi-Sensor Adaptive Signal Processing and Sensor
Management for Landmines
Multi-Sensor Adaptive Signal Processing and Sensor
Management for Landmines
Mark P. Kolba, Peter A. Torrione, Jeremiah J. Remus and Leslie M. Collins
Electrical and Computer Engineering, Duke University
Work supported by DARPA/ARO MURI
Progress as of 2004 (Minneapolis meeting)
• Adaptive Feature Selection (JCFO) successfully applied to landmine problem
• Uncertainties incorporated into MIAP processor improve robustness
• Adaptive tracking technique improves GPR detection performance
• Encouraging sensor management results – simple problem, only simulated data considered
• Multi-sensor data from Georgia Tech processed using adaptive fusion (sensor management not yet considered)
Overview of Progress
• Sensor Management
  – New simulations
  – Comparison to theoretical predictions
  – Extensions and new developments
  – Processing Georgia Tech data
  – Application to Automated Mine Detection System (AMDS) data
• Ant system/swarm algorithm application to mine detection (not briefed due to time limitations; maybe next year?)
Problem Statement
• Suite of available sensors
• Performance and cost are a function of sensor modality
• Grid to be searched. Adaptively determine:
  – where to go?
  – what sensor to deploy?
  – what sensor parameters to employ?
[Figure: Sensor path through cell grid for unconstrained sensor]
Problem Formulation
• Search for N targets using M sensors
  – Sensors may be unimodal or multimodal
• Operate on a cell grid
• Each cell has a state:
  – Target
  – No target
• Possible observations:
  – Target present
  – No target present
• As observations are made, the state probabilities of each cell are calculated and updated
[Figure: Cell grid containing five targets]
Previously: Adaptive Discrimination-Based Sensor Management
• Initial algorithm based on Kastella1
• Select a cell which maximizes the expected discrimination gain ∆D
• Can be calculated recursively
• Progress: focus on assumptions that are inconsistent with the MURI application areas – most recent work on handling the uncertainty in the sensor performance parameters needed in the formulation
  D(P, Q) = Σ_{s=1}^{S} P(s) ln( P(s) / Q(s) )
1 K. Kastella, “Discrimination gain to optimize detection and classification,” IEEE Trans. Syst. Man, Cybern., vol. 27, no. 1, pp. 112-116, Jan. 1997
Progress
• Extend to consider other priors on target locations
• Extend to multiple sensors, each with different performance and cost
• Sequential detection in each cell once selected
• Extend to consider unknown number of targets
• Consider unknown Pd, Pf for each sensor, 2 ways:
  – Extend Kastella approach to handle uncertainty, OR
  – Need to estimate ROCs (performance degrades if unknown)
• Alternative (pure Bayesian) formulation
• Comparison to Theory of Optimal Experiments
• Application to Georgia Tech data
• Application to AMDS data
• Swarm algorithm - alternative search algorithm
Problem formulation
• Kullback-Leibler information, denoted D (for discrimination):

  D(P, Q) = Σ_{s=0}^{1} P(S_c = s) ln( P(S_c = s) / Q(S_c = s) )

  where P is the current state probability distribution, Q is the prior state probability distribution, S_c is the state of cell c, and s is one of the possible cell states
• State probabilities are calculated sequentially
  P(S_c = s | X_{c,k}) = P(x_{c,k} | S_c = s) P(S_c = s | X_{c,k-1}) / Σ_{j=0}^{1} P(x_{c,k} | S_c = j) P(S_c = j | X_{c,k-1})

  where x_{c,k} is observation k in cell c and X_{c,k} denotes observations 1, 2, ..., k in cell c
• Sensors scheduled to make next observation to maximize expected discrimination gain
  ∆D(X_{c,k}) = E[ D(P_{c,k+1}, Q_c) | X_{c,k} ] - D(P_{c,k}, Q_c)
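As an illustration, the recursive update and expected discrimination gain above can be sketched in Python. The grid size, prior, Pd/Pfa values, and the greedy one-step selection loop below are illustrative assumptions, not the authors' implementation.

```python
# Sketch of discrimination-directed search: binary cell states, one sensor
# with known Pd/Pfa, recursive Bayesian updates, and greedy selection of the
# cell with the largest expected discrimination gain.
import math
import random

PD, PFA = 0.9, 0.1   # assumed sensor operating point
PRIOR = 0.1          # assumed prior probability that a cell contains a target

def likelihood(obs, state):
    """P(x = obs | S = state) for a binary sensor."""
    p = PD if state == 1 else PFA     # P(x = 1 | state)
    return p if obs == 1 else 1.0 - p

def update(p_target, obs):
    """Recursive Bayesian update of P(S = 1) after one observation."""
    num = likelihood(obs, 1) * p_target
    den = num + likelihood(obs, 0) * (1.0 - p_target)
    return num / den

def discrimination(p, q):
    """D(P, Q) = sum_s P(s) ln(P(s)/Q(s)) over the two cell states."""
    d = 0.0
    for ps, qs in ((p, q), (1.0 - p, 1.0 - q)):
        if ps > 0.0:
            d += ps * math.log(ps / qs)
    return d

def expected_gain(p):
    """Expected discrimination gain for one more observation in a cell."""
    d_now = discrimination(p, PRIOR)
    gain = 0.0
    for obs in (0, 1):
        # P(x = obs), marginalized over the cell state
        p_obs = likelihood(obs, 1) * p + likelihood(obs, 0) * (1.0 - p)
        gain += p_obs * (discrimination(update(p, obs), PRIOR) - d_now)
    return gain

def run(n_cells=25, n_targets=3, n_obs=300, seed=0):
    """Simulate a search over a small cell grid; returns truth and posteriors."""
    rng = random.Random(seed)
    truth = [1] * n_targets + [0] * (n_cells - n_targets)
    rng.shuffle(truth)
    probs = [PRIOR] * n_cells
    for _ in range(n_obs):
        c = max(range(n_cells), key=lambda i: expected_gain(probs[i]))
        obs = 1 if rng.random() < (PD if truth[c] else PFA) else 0
        probs[c] = update(probs[c], obs)
    return truth, probs
```

In a quick run with these assumed settings, target cells end with much higher posterior probabilities than clutter cells, and the sensor naturally moves on once a cell's gain falls below that of unexplored cells.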
Performance metric
• Probability of error, PE
  – The N cells with highest probability of being a target should be the N cells containing targets
• PE = 1 if any of the N cells with highest P(Sc = 1) are not among the N target cells
• PE = 0 if the N cells with highest P(Sc = 1) are the N target cells
• PE = a if the N cells with highest P(Sc = 1) are the N target cells and there is a tie for highest P(Sc = 1); a = r / t, where t is the total number of cells tied and r is the number of those containing targets
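A sketch of this metric in Python; the handling of ties at the top-N boundary is one reading of the tie rule above, and the function and argument names are assumptions.

```python
# Sketch of the PE metric: 1 if the top-N cells miss a target, 0 if they
# match the target set, and a = r / t under the tie rule from the slides.
def prob_error(probs, target_cells):
    """probs: P(S_c = 1) per cell; target_cells: indices of the N targets."""
    targets = set(target_cells)
    n = len(targets)
    ranked = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    top_n = set(ranked[:n])
    if top_n != targets:
        return 1.0                    # some top-N cell is not a target
    cutoff = probs[ranked[n - 1]]     # lowest probability admitted to top N
    tied = [i for i in range(len(probs)) if probs[i] == cutoff]
    t = len(tied)
    if t > 1:                         # tie rule: a = r / t
        r = sum(1 for i in tied if i in targets)
        return r / t
    return 0.0
```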
Example: Multiple Sensors
[Figure: Pe vs. time, performance of different sensor combinations; curves: Discrimination Search with S1, S2, S3 / S1, S2 / S1, and Direct Search with S1, S2, S3 / S1, S2 / S1]

S1: Pd = 0.9, Pfa = 0.4, Cost = 1
S2: Pd = 0.9, Pfa = 0.1, Cost = 1
S3: Pd = 0.99, Pfa = 0.02, Cost = 10
Uncertain PD and PF
• Consider a real-world scenario
  – Unknown ground
  – Unfamiliar obstacles
  – Unknown target and clutter types
• Assess prior distributions on PD and PF
  – Beta distribution (natural conjugate prior)

Progress when PD and PF are uncertain:
                   Calculation certain   Calculation uncertain
  Data certain         finished                next
  Data uncertain          -                     -
New problem formulation
• State probabilities may no longer be calculated sequentially:

  P(S_c = s | X_{c,k}) = P(X_{c,k} | S_c = s) P(S_c = s) / Σ_{t=0}^{1} P(X_{c,k} | S_c = t) P(S_c = t)

  P(X_{c,k} | S_c = s) = ∫∫ P(X_{c,k} | pd, pf, S_c = s) f_β(pd) f_β(pf) dpd dpf

• P(X_{c,k} | S_c = s) depends only on pd if c is a target cell and only on pf if c is a clutter cell. Thus,

  P(X_{c,k} | S_c = s) = P_bin(r | r'_pd, n'_pd)   if c is a target cell
  P(X_{c,k} | S_c = s) = P_bin(r | r'_pf, n'_pf)   if c is a clutter cell

  where

  P_bin(r | r', n') = [Γ(n') Γ(n+1) Γ(r'+r) Γ(n'+n-r'-r)] / [Γ(r') Γ(n'-r') Γ(r+1) Γ(n-r+1) Γ(n'+n)]

  r' and n' are the parameters of the beta prior, n is the number of observations, and r is the number of those observations that were "target present"
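The P_bin expression can be evaluated stably with log-gamma functions. This sketch assumes the (r', n') parameterization above, i.e., a Beta(r', n' - r') prior on pd (or pf).

```python
# Beta-binomial marginal likelihood P_bin(r | r', n'): probability of r
# "target present" results in n observations, with pd (or pf) integrated
# against its beta prior. Computed in log space for numerical stability.
from math import lgamma, exp

def p_bin(r, n, r_prime, n_prime):
    log_p = (lgamma(n_prime) + lgamma(n + 1)
             + lgamma(r_prime + r) + lgamma(n_prime + n - r_prime - r)
             - lgamma(r_prime) - lgamma(n_prime - r_prime)
             - lgamma(r + 1) - lgamma(n - r + 1) - lgamma(n_prime + n))
    return exp(log_p)
```

As a sanity check, a uniform prior (r' = 1, n' = 2) makes every count r in 0..n equally likely, i.e., P_bin = 1/(n+1).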
New problem formulation
• We still use discrimination and expected discrimination gain:

  E[ D(P_{c,k+1}, Q_c) | X_{c,k} ] = Σ_{j=0}^{1} D(P_{c,k+1}, Q_c) P(x_{c,k+1} = j | X_{c,k})

• Two methods were considered for calculating P(x_{c,k+1} = j | X_{c,k}):

  P(x_{c,k+1} = j | X_{c,k}) = Σ_{s=0}^{1} P(x_{c,k+1} = j | S_c = s, X_{c,k}) P(S_c = s | X_{c,k})

Method 1
• Assume that the determination of P(x_{c,k+1}) is independent of X_{c,k}. Then

  P(x_{c,k+1} = j | S_c = s, X_{c,k}) = P(x_{c,k+1} = j | S_c = s) = ∫ P(x_{c,k+1} = j | p, S_c = s) f_β(p) dp

  where p represents pd for a target cell (s = 1) or pf for a clutter cell (s = 0), and f_β(p) is the original prior density for pd or pf, as appropriate
New problem formulation
Method 2
• Assume that the determination of P(x_{c,k+1}) should take into account X_{c,k}. Then

  P(x_{c,k+1} = j | X_{c,k}) = Σ_{s=0}^{1} P(x_{c,k+1} = j | S_c = s, X_{c,k}) P(S_c = s | X_{c,k})

  P(x_{c,k+1} = j | S_c = s, X_{c,k}) = ∫ P(x_{c,k+1} = j | p, S_c = s) f''_β(p) dp

  f''_β(p | X_{c,k}) = P(X_{c,k} | p) f_β(p) / ∫ P(X_{c,k} | p) f_β(p) dp

  where p represents pd for a target cell (s = 1) or pf for a clutter cell (s = 0)
• f''_β(p) is the posterior density for pd or pf, as appropriate, given the data observed
• f''_β(p) is a beta density with parameters r'' = r' + r and n'' = n' + n, where r' and n' are the parameters of the beta prior and n is the number of observations, r of which were "target present"
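A small numeric illustration (an assumption added for clarity, not from the slides) of how the two methods differ for the next "target present" probability in a target cell: method 1 integrates p against the prior, giving the prior mean r'/n', while method 2 integrates against the beta posterior, giving (r' + r)/(n' + n).

```python
# Next-observation probability P(x_{c,k+1} = 1 | S_c = 1) under each method,
# for a Beta(r', n'-r') prior on pd and data with r "yes" results in n tries.
def p_next_method1(r_prime, n_prime, r, n):
    return r_prime / n_prime               # prior mean; ignores the data

def p_next_method2(r_prime, n_prime, r, n):
    return (r_prime + r) / (n_prime + n)   # beta posterior mean
```

With r' = 8, n' = 10 and data r = 5, n = 5 (five straight detections), method 1 still predicts 0.8 while method 2 moves up toward 1, which mirrors the "same vs. different" ∆D behavior discussed later.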
Effects of uncertainty
• State probabilities
  – Will be less differentiated from the prior
• Expected discrimination gain
  – Method 1
    • ∆D based on prior information only
    • Same as the calculation under certainty
  – Method 2
    • ∆D based on prior information and all data taken in the cell
    • Different from the calculation under certainty

[Table: example with prior P(Sc = target) = 0.01, comparing P(Sc = target) under certain and uncertain calculations for a target cell observed 1 1 1 1 1 1 1 1 and a clutter cell observed 1 0 0 0; values 0.9903 and 0.7289 for the target cell, 0.0018 and 0.0010 for the clutter cell]
An alternative performance metric
• In addition to PE, consider "separation"
• Use as a measure of confidence
• May be calculated using averages:

  separation = (1/n) Σ_{i=1}^{n} P(S_{c_t,i} = target) - (1/(N-n)) Σ_{i=1}^{N-n} P(S_{c,i} = target)

  where the c_t,i are the n target cells and the c,i are the remaining N - n cells
• Or by using minima and maxima:

  separation = min_i P(S_{c_t,i} = target) - max_i P(S_{c,i} = target)

• Or by combining the two methods
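The two separation measures can be sketched as follows; the function and argument names are assumptions.

```python
# Average-based and min/max-based separation between target and clutter cells.
def separation_avg(probs, target_cells):
    """Mean P(S_c = target) over target cells minus mean over the rest."""
    targets = [probs[i] for i in target_cells]
    clutter = [p for i, p in enumerate(probs) if i not in set(target_cells)]
    return sum(targets) / len(targets) - sum(clutter) / len(clutter)

def separation_minmax(probs, target_cells):
    """Lowest target-cell probability minus highest clutter-cell probability."""
    targets = [probs[i] for i in target_cells]
    clutter = [p for i, p in enumerate(probs) if i not in set(target_cells)]
    return min(targets) - max(clutter)
```

The min/max version is the more conservative confidence measure: it is positive only when every target cell outranks every clutter cell.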
Discrimination gain comparison
• Method 1
  – Behavior easily explainable
  – State probabilities must reach a certain value for the sensor to move on
  – Sensors can "stick"
    • Poor PE performance
    • Poor separation
• Method 2
  – Behavior is more complex
  – Sensors move on without reaching a set value
    • Do not "stick"
    • More intelligent behavior
    • May move on too quickly
    • Can cause lower separation performance
[Table: example ∆D sequences for methods 1 and 2. For observations k through k + m in a cell with P(Sc = target) = 0.8 throughout, method 1 gives the same ∆D (0.1) at k and k + m, while method 2 gives different values (0.1 at k, 0.06 at k + m)]
Performance with uncertain calculation PD/PF: method 1
• Discrimination-directed search under uncertainty performs as well as or worse than discrimination-directed search with no uncertainty
• Low values of n (higher uncertainty) may lead to severe performance degradation (as seen in the second figure) when more targets are present
[Figures: Performance under uncertainty with 1 sensor and 1 target, and with 1 sensor and 5 targets (Pe vs. number of measurements); curves: disc 0 dB with n = 100, 10, 5; direct 0 dB, 3 dB, 6 dB]
Performance with uncertain calculation PD/PF: method 1
• Drastic differences in the separation for varying levels of uncertainty
  – The more uncertainty, the worse the separation
• Poor separation performance means that decisions (target or no target) may not be made as confidently
[Figures: Separation under uncertainty with 1 sensor and 1 target, and with 1 sensor and 5 targets (separation vs. number of measurements); curves: disc 0 dB with n = 100, 10, 5; direct 0 dB, 3 dB, 6 dB]
Analysis of results: method 1
• With 1 sensor and 1 target, the PE performance under uncertainty degrades only slightly
  – Uncertainty primarily affects the length of time spent observing a likely target cell
  – With only one target, that target will be found in about the same time even with uncertainty
• With 1 sensor and 5 targets, PE performance is severely degraded with low values of n (high uncertainty)
  – Sensor becomes stuck observing a single cell repeatedly and never finds the rest of the targets
Theoretical analysis of results: method 1
• We can calculate the expected number of observations made in a cell when that cell is first observed
  E[obs] = Σ_{n=1}^{K} Σ_{r=0}^{n} n · T(r, n) · N(r, n) · P(r, n)

  T(r, n) = 1 if r and n result in a terminating observation sequence, and 0 otherwise

  N(r, n) = C(n, r) - Σ_{i=1}^{n-1} Σ_{j=0}^{min(i, r)} T(j, i) N(j, i) C(n-i, r-j)

  N(r, n) gives the number of possible observation sequences for the given r and n

  P(r, n) = P(X_{c,n} | r "target" observed, n - r "no target" observed)

  Set K to a large enough value to achieve the desired precision
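The N(r, n) recursion can be implemented directly once a termination indicator T is supplied. The i = 1 to n - 1 bounds are a reconstruction of the garbled slide equation, and any concrete termination rule passed in is an assumption; the slides' rule depends on the ∆D stopping behavior.

```python
# Count observation sequences with r "target present" results in n
# observations that have not terminated at an earlier step, given a
# termination indicator T(r, n) in {0, 1}.
from math import comb
from functools import lru_cache

def make_N(T):
    @lru_cache(maxsize=None)
    def N(r, n):
        total = comb(n, r)                  # all C(n, r) sequences
        for i in range(1, n):               # remove sequences that already
            for j in range(min(i, r) + 1):  # terminated at some step i < n
                total -= T(j, i) * N(j, i) * comb(n - i, r - j)
        return total
    return N
```

With T identically 0 (nothing ever terminates), N(r, n) reduces to C(n, r); with a rule that terminates every sequence at the first observation, no longer sequences survive.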
Analysis of results: method 1
• With low enough n, the expected number of observations required before the sensor will move on becomes extremely high, resulting in the sensor becoming “stuck”
• “Sticking” when only one target is present will not harm performance since there are no other targets for the sensor to miss while stuck
[Figures: Expected observations for 1 sensor with 1 target, and with 5 targets (E(obs) vs. n); curves: uncertain, certain]
Theoretical analysis of results: method 1
• We may also calculate the expected separation after cells have been observed for the first time
  E[sep] = Σ_{n=1}^{K} Σ_{r=0}^{n} S(r, n, s) · T(r, n) · N(r, n) · P(r, n)

  T(r, n) = 1 if r and n result in a terminating observation sequence, and 0 otherwise

  N(r, n) = C(n, r) - Σ_{i=1}^{n-1} Σ_{j=0}^{min(i, r)} T(j, i) N(j, i) C(n-i, r-j)

  N(r, n) gives the number of possible observation sequences for the given r and n

  P(r, n) = P(X_{c,n} | r "target" observed, n - r "no target" observed)

  S(r, n, s) = P(S_c = s | X_{c,n} with r "yes" and n - r "no" observations)

  Set K to a large enough value to achieve the desired precision
Analysis of results: method 1
• At high values of n, the separation performance is comparable to the separation achieved in the certain case
[Figures: Expected separation for 1 sensor with 1 target, and with 5 targets (E(sep) vs. n); curves: uncertain, certain]
Performance with uncertain calculation PD/PF: method 2
• Discrimination-directed search under uncertainty performs as well as or worse than discrimination-directed search with no uncertainty
• Performance degradation is more severe with smaller values of n (that is, with less prior knowledge about PD and PF), but is never as degraded as was observed with method 1
[Figures: Performance under uncertainty with 1 sensor and 1 target, and with 1 sensor and 5 targets (Pe vs. number of measurements); curves: disc 0 dB with n = 100, 10, 5; direct 0 dB, 3 dB, 6 dB]
Performance with uncertain calculation PD/PF: method 2
• Drastic differences in the separation for varying levels of uncertainty
  – The more uncertainty, the worse the separation
• Poor separation performance means that decisions (target or no target) may not be made as confidently
[Figures: Separation under uncertainty with 1 sensor and 1 target, and with 1 sensor and 5 targets (separation vs. number of measurements); curves: disc 0 dB with n = 100, 10, 5; direct 0 dB, 3 dB, 6 dB]
Theoretical analysis of results: method 2
• Separation performance appears comparable
• PE performance is noticeably improved over method 1, especially in the 5-target case
  – Sensor moves on and does not become "stuck"
• As before, the expected number of observations and the expected separation for the first observations made in the cells may be calculated
Analysis of results: method 2
• Sensors observe in the target cell for about as long as in the certain case; when there are five targets, the sensor moves on more quickly than it would in the certain case
• Sensors will not “stick”
[Figures: Expected observations for 1 sensor with 1 target, and with 5 targets (E(obs) vs. n); curves: uncertain, certain]
Analysis of results: method 2
• The separation under uncertainty never reaches the separation performance in the certain case; it is always significantly smaller
[Figures: Expected separation for 1 sensor with 1 target, and with 5 targets (E(sep) vs. n); curves: uncertain, certain]
Separation comparison
• Neither method is clearly better in terms of separation performance
  – With 1 target (when method 1 doesn't stick), method 1 outperforms method 2
  – With 5 targets (when method 1 does stick), method 2 is much better
[Figures: Separation comparison for methods 1 and 2 with 1 sensor and 1 target, and with 1 sensor and 5 targets (separation vs. number of measurements); curves: method 1, method 2]
Uncertain PD and PF
• Let the calculation PD and PF be certain again
  – Original SM mathematics apply
• Each cell has a PD or PF drawn from a beta density
  – Data will be generated using the PD or PF for that cell

Progress when PD and PF are uncertain:
                   Calculation certain   Calculation uncertain
  Data certain         finished               finished
  Data uncertain         next                    -
Performance with uncertain data PD/PF: method 1
• Small values of n cause a sharp degradation in performance
• Even small amounts of uncertainty have a more pronounced effect than previously
[Figures: Performance under uncertain data with 1 sensor and 1 target, and with 1 sensor and 5 targets (Pe vs. number of measurements); curves: disc 0 dB with n = 100, 10, 5; direct 0 dB, 3 dB, 6 dB]
Performance with uncertain data PD/PF: method 1
• Separation performance is fairly good even for large amounts of uncertainty
[Figures: Separation under uncertain data with 1 sensor and 1 target, and with 1 sensor and 5 targets (separation vs. number of measurements); curves: disc 0 dB with n = 100, 10, 5; direct 0 dB, 3 dB, 6 dB]
Performance with uncertain data PD/PF: method 2
• Small values of n cause a sharp degradation in performance
• Even small amounts of uncertainty have a more pronounced effect than previously
[Figures: Performance under uncertain data with 1 sensor and 1 target, and with 1 sensor and 5 targets (Pe vs. number of measurements); curves: disc 0 dB with n = 100, 10, 5; direct 0 dB, 3 dB, 6 dB]
Performance with uncertain data PD/PF: method 2
• Separation performance is fairly good even for large amounts of uncertainty
[Figures: Separation under uncertain data with 1 sensor and 1 target, and with 1 sensor and 5 targets (separation vs. number of measurements); curves: disc 0 dB with n = 100, 10, 5; direct 0 dB, 3 dB, 6 dB]
Uncertain PD and PF
• Now let both calculation and data PD and PF be uncertain
  – Return to the complex SM mathematics derived for the uncertain calculation case

Progress when PD and PF are uncertain:
                   Calculation certain   Calculation uncertain
  Data certain         finished               finished
  Data uncertain       finished                 next
Performance with uncertain data and calculation PD/PF: method 1
• Low values of n degrade performance as expected
• The five-target case performs very poorly at low values of n; these results are due to the sensor "sticking"
[Figures: Performance with uncertain data and processing with 1 sensor and 1 target, and with 1 sensor and 5 targets (Pe vs. number of measurements); curves: disc 0 dB with n = 100, 10, 5; direct 0 dB, 3 dB, 6 dB]
Performance with uncertain data and calculation PD/PF: method 1
• Separation performance is rather poor for high uncertainty cases
[Figures: Separation with uncertain data and processing with 1 sensor and 1 target, and with 1 sensor and 5 targets (separation vs. number of measurements); curves: disc 0 dB with n = 100, 10, 5; direct 0 dB, 3 dB, 6 dB]
Performance with uncertain data and calculation PD/PF: method 2
• Uncertainty degrades performance as expected
[Figures: Performance with uncertain data and processing with 1 sensor and 1 target, and with 1 sensor and 5 targets (Pe vs. number of measurements); curves: disc 0 dB with n = 100, 10, 5; direct 0 dB, 3 dB, 6 dB]
Performance with uncertain data and calculation PD/PF: method 2
• Separation performance is better with method 2 than it is with method 1
[Figures: Separation with uncertain data and processing with 1 sensor and 1 target, and with 1 sensor and 5 targets (separation vs. number of measurements); curves: disc 0 dB with n = 100, 10, 5; direct 0 dB, 3 dB, 6 dB]
Model Parameter Estimation – Alternative to Uncertainty Mitigation
• Response model (ROC) defined by two parameters: slope and median/offset/bias
• Need to estimate the parameters adaptively
• Comparison of two methods for parameter estimation:
  – Bayesian adaptive estimation
  – Theory of Optimal Experiments
• Dual adaptation: ROC parameter estimation and sensor parameters/sensor movement/sensor selection
Bayesian adaptive parameter estimation
• Use Bayes rule to calculate the probability of a set of model parameters λ given the response r (Pd) at a specific input intensity x (Pfa):

  prob(λ | r, x) = prob(r | x, λ) prob(λ) / Σ_λ prob(r | x, λ) prob(λ)

• Select the next input to minimize the expected entropy:

  E[H(x)] = Σ_r H_r(x) prob(r | x),  where  H_r(x) = -Σ_λ prob(λ | r, x) log prob(λ | r, x)

  x(t+1) = argmin_x E[H(x)]

• Update the probability of the model parameters using the outcome of the current trial:

  prob_{t+1}(λ) = prob(λ | r(t) = r, x(t) = x)
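A minimal grid-based sketch of this procedure. The logistic-style response model, the parameter grid, and the candidate input set are illustrative assumptions; the slides specify only a two-parameter (slope/median) response model.

```python
# Bayesian adaptive estimation over a discrete grid of (slope, median)
# parameters: Bayes-rule posterior, expected posterior entropy, and
# argmin-entropy input selection.
import math

SLOPES = [0.5, 1.0, 2.0, 4.0]            # assumed parameter grid
MEDIANS = [0.2, 0.4, 0.6, 0.8]
LAMBDAS = [(s, m) for s in SLOPES for m in MEDIANS]
INPUTS = [i / 10 for i in range(1, 10)]  # assumed candidate inputs x

def response(x, lam):
    """Assumed psychometric-style model: P(r = 1 | x, lambda)."""
    slope, median = lam
    return 1.0 / (1.0 + math.exp(-slope * (x - median) * 10))

def posterior(prior, r, x):
    """prob(lam | r, x) via Bayes rule over the parameter grid."""
    like = [response(x, lam) if r == 1 else 1.0 - response(x, lam)
            for lam in LAMBDAS]
    unnorm = [l * p for l, p in zip(like, prior)]
    z = sum(unnorm)
    return [u / z for u in unnorm]

def expected_entropy(prior, x):
    """E[H(x)] = sum_r H_r(x) prob(r | x)."""
    eh = 0.0
    for r in (0, 1):
        like = [response(x, lam) if r == 1 else 1.0 - response(x, lam)
                for lam in LAMBDAS]
        p_r = sum(l * p for l, p in zip(like, prior))
        if p_r > 0.0:
            post = posterior(prior, r, x)
            h = -sum(p * math.log(p) for p in post if p > 0.0)
            eh += p_r * h
    return eh

def next_input(prior):
    """x(t+1) = argmin_x E[H(x)]."""
    return min(INPUTS, key=lambda x: expected_entropy(prior, x))
```

The expected posterior entropy never exceeds the prior entropy, so each adaptively chosen trial is expected to tighten the parameter estimate.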
Parameter estimation using the Theory of Optimal Experiments
• Find the parameter values that minimize the squared error between the collected data r and the model n(x, λ):

  λ̂ = argmin_λ Σ_x ( r - n(x, λ) )²

• Select the next input to maximize the information gained, calculated using

  x(t+1) = argmax_x log | 1 + F^T B^{-1} F |

  where F is a column vector with F(i) = ∂n(x, λ)/∂λ_i, and B is the Fisher information matrix in the form

  B(i, j) = Σ_t [ ∂n(x(t), λ)/∂λ_i ] [ ∂n(x(t), λ)/∂λ_j ]
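A sketch of the selection rule for a two-parameter model, using numerical gradients and a small ridge term to keep B invertible. The response model and all names are illustrative assumptions, not the slides' implementation.

```python
# Theory-of-Optimal-Experiments input selection: maximize log|1 + F^T B^-1 F|
# where F is the model gradient at a candidate x and B is the Fisher
# information accumulated over past inputs. Two-parameter case only.
import math

def response(x, lam):
    """Assumed two-parameter (slope, median) response model n(x, lambda)."""
    slope, median = lam
    return 1.0 / (1.0 + math.exp(-slope * (x - median) * 10))

def grad(x, lam, eps=1e-6):
    """Column vector F with F(i) = dn(x, lambda)/dlambda_i (central diff)."""
    g = []
    for i in range(len(lam)):
        hi = list(lam); hi[i] += eps
        lo = list(lam); lo[i] -= eps
        g.append((response(x, tuple(hi)) - response(x, tuple(lo))) / (2 * eps))
    return g

def fisher(xs, lam):
    """B(i, j) = sum_t F_t(i) F_t(j) over past inputs xs (2 x 2)."""
    B = [[0.0, 0.0], [0.0, 0.0]]
    for x in xs:
        f = grad(x, lam)
        for i in range(2):
            for j in range(2):
                B[i][j] += f[i] * f[j]
    return B

def info_gain(x, xs, lam, ridge=1e-6):
    """log|1 + F^T B^-1 F| for candidate x; ridge keeps B invertible."""
    B = fisher(xs, lam)
    B[0][0] += ridge; B[1][1] += ridge
    det = B[0][0] * B[1][1] - B[0][1] * B[1][0]
    Binv = [[B[1][1] / det, -B[0][1] / det],
            [-B[1][0] / det, B[0][0] / det]]
    f = grad(x, lam)
    quad = sum(f[i] * Binv[i][j] * f[j] for i in range(2) for j in range(2))
    return math.log(1.0 + quad)

def next_input(xs, lam, candidates):
    """x(t+1) = argmax_x log|1 + F^T B^-1 F|."""
    return max(candidates, key=lambda x: info_gain(x, xs, lam))
```

Because B is positive definite (after the ridge), the quadratic form is nonnegative and the gain is always at least zero; only the argmax matters for input selection.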
Simulation Result: Estimating ROC Parameters
[Figures: bias and variance in median and slope estimation (as a percentage of the true parameter value) vs. trial, comparing Bayesian adaptive estimation and the Theory of Optimal Experiments]
Observations
• Relative performance depends on the parameters of the simulation – some regions of slope/bias space are better for one or the other
• Overall, similar performance
• TOE is 5 times faster to compute than the optimal Bayesian approach
Simulation Results – Grid Search with Adaptive Estimation of Sensor Performance

[Figure: Pe vs. time; curves: known Pd/Pfa, 20 iters, 10 iters, 5 iters]
Georgia Tech Multi-Sensor Data
• Subsampled the collection grid into a 9 x 9 grid
• Initial work with the first 2 collections – less complicated, fewer interactions
• Used previously developed decision statistics as the sensor outputs in each grid
• 81 samples available in each grid
• Initially, each sensor has the same cost and the same detection performance
Results – Collection 1
[Figure: Pe vs. time, performance of discrimination-based and direct search techniques; curves: Discrimination Search: S1, S2, S3; Direct Search: S1, S2, S3; Direct Search: S1; Direct Search: S2; Direct Search: S3]
Results - Collection 2
[Figure: Pe vs. time, performance of discrimination-based and direct search techniques; curves: Discrimination Search: S1, S2, S3; Direct Search: S1, S2, S3; Direct Search: S1; Direct Search: S2; Direct Search: S3]
Application to AMDS Data
AMDS CT1, YPG, Lanes 15, 16, 21, 23, 25 – Summary
• 209 spots investigated
• SIMS are not included in the ROC curves generated by IDA
• 67 mines: 11 M AT, 17 MM AT, 6 M AP, 33 MM AP
• 81 clutter items scored: 25 casings, 43 fragments, 13 flechettes
• 40 blanks
• Weather: bright sun, 60-70 F, winds gusted to 30 mph at times
• Ground truth released 6/30/05
AMDS Data
• Cyterra radar
  – Various processors, including one from Duke
• Cyterra MD
  – Energy processor (prescreening)
  – Feature-based processor
• NIITEK radar
  – Various processors, including one from Duke
Duke-Cyterra MD Only CT1, YPG 4/2005, Each Point Rounded.
[Figure: ROC, Pd vs. P others; series: Pd vs. Nonmines, Pd vs. Pba, Pd vs. Clutter]
Duke-NIITEK CT1, YPG 4/2005, Each Point Rounded.
[Figure: ROC, Pd vs. P others; series: Pd vs. Nonmines, Pd vs. Pba, Pd vs. Clutter]
Duke3 Clipped-NIITEK/CyTerra Hybrid, CT1, YPG 4/2005.
[Figure: ROC, Pd vs. P others; series: Pd vs. Nonmines, Pd vs. Pba, Pd vs. Clutter]
Blind Test Results Generated by IDA – Ground Truth Released 6/30/05
• Cyterra MD Energy
• NIITEK Radar
• RVM Fusion of NGPR, CMD
Approach
• Assume 3 processes to optimize: MD prescreener, MD feature-based processor, GPR
• RVM takes 209*3 = 627 operations to achieve a specific performance; the initial MD prescreener requires 209 operations
• Can the discrimination-based search optimize the use of the 3 processes to achieve the same performance in fewer operations?
• Only one performance point considered to date, since ground truth has only been available for approximately a month
Results
• Goal: 100% Pd at P others (Pfa) of 0.2
• RVM (direct search) requires 627 operations
• All operations assumed to be equal cost
• Discrimination search required 418 operations
• Assumes known Pd, Pf for each sensor
Duke3 Clipped-NIITEK/CyTerra Hybrid, CT1, YPG 4/2005.

[Figure: ROC, Pd vs. P others; series: Pd vs. Nonmines, Pd vs. Pba, Pd vs. Clutter]
Accomplishments
• Implemented and evaluated all modifications to discrimination-based search algorithms
• Implemented and evaluated adaptive estimation approaches as an alternative to incorporating uncertainty into the search methodology
• Initial implementations tested on GT and AMDS field data, full versions of both approaches to be tested in the near future on both data sets
Future Work
• Test adaptive estimation techniques on GT data
• Test uncertainty-based techniques on GT data
• Test all techniques on other field data sets such as the AMDS data