Target Tracking: Lecture 5
Multiple Target Tracking: Part II
Gustaf Hendeby ([email protected])
Div. Automatic Control, Dept. Electrical Engineering
Linköping University
December 10, 2014
Lecture Outline
1. Conceptual MHT
   • Fundamental Components
   • Simplifications
   • Summary
2. Hypothesis-Based MHT
   • Assignment Problem
   • Algorithm
3. Track-Based MHT
   • Implementation Details
   • Summary
4. User Interaction
5. Examples
   • SUPPORT
   • ADABTS
6. Summary
   • Concluding Remarks
   • Learn More. . .
Last Lecture
Intro multi-target tracking (MTT)
Single hypothesis tracker (SHT)
• Global nearest neighbor (GNN)
• Joint Probabilistic Data Association (JPDA)
Auction algorithm
Fundamental theorem of target tracking
Multiple Hypothesis Tracking (MHT)
MHT: consider multiple association hypotheses over time
Originated with the conceptual MHT (Reid, 1979)
Integrated track initialization
Two principal implementations
• hypothesis based
• track based
Conceptual MHT
Conceptual MHT: basic idea
Described in Reid (1979)
Intuitive, hypothesis-based, brute-force implementation
Between consecutive time instants, different association hypotheses, $\{\Theta_{k-1}^i\}_{i=1}^{N_h}$, are kept in memory
Idea: generate all possible hypotheses, and then prune to avoid combinatorial hypothesis growth
Hypothesis limiting techniques:
• clustering
• pruning low probability hypotheses
• N-scan pruning
• combining similar hypotheses
• . . .
Conceptual MHT Fundamental Components
Representing Hypotheses
Each hypothesis, $\{\Theta_{k-1}^i\}_{i=1}^{N_h}$, is characterized by the number of targets (tracks) and their corresponding sufficient statistics
[Figure: hypothesis $\Theta_{k-1}^1$ with probability $P(\Theta_{k-1}^1)$ contains two tracks, with predicted measurements $y_{k|k-1}^1$ and $y_{k|k-1}^2$; hypothesis $\Theta_{k-1}^2$ with probability $P(\Theta_{k-1}^2)$ contains a single track, with predicted measurement $y_{k|k-1}^1$. A second frame adds the received measurements $y_k^1$, $y_k^2$, $y_k^3$.]
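To make the bookkeeping concrete, here is a minimal sketch (not from the lecture) of how a hypothesis could be represented in code; the class names Track and Hypothesis and the Gaussian sufficient statistics (mean, covariance) are illustrative assumptions.

```python
from dataclasses import dataclass, field
import numpy as np

@dataclass
class Track:
    """Sufficient statistics of one target (here: a Gaussian state estimate)."""
    mean: np.ndarray   # state estimate x_{k|k}
    cov: np.ndarray    # covariance P_{k|k}

@dataclass
class Hypothesis:
    """One association hypothesis Theta_k^i: a set of tracks and its probability."""
    tracks: list = field(default_factory=list)   # list of Track objects
    prob: float = 1.0                            # P(Theta_k^i | y_{0:k})

    @property
    def num_targets(self) -> int:
        return len(self.tracks)
```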
Conceptual MHT Fundamental Components
Generating Hypotheses
Form $\Theta_k^\ell \triangleq \{\theta_k, \Theta_{k-1}^i\}$
[Figure: for each prior hypothesis, a tree of association hypotheses over the measurements $y_k^1$, $y_k^2$, $y_k^3$; each measurement is either associated with one of the existing targets (T1, T2 for $\Theta_{k-1}^1$; T1 for $\Theta_{k-1}^2$), declared a false alarm (FA), or used to start a new target (NT), with each target used at most once per hypothesis.]
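The brute-force generation step can be illustrated with a small sketch (not from the lecture) that assigns each measurement to an unused track, a false alarm (FA), or a new target (NT); the function name and data layout are assumptions made for this example, and the printed count shows how quickly the hypothesis tree grows.

```python
def enumerate_associations(num_meas, tracks):
    """Recursively assign each measurement to an unused track, 'FA', or 'NT'.

    Returns a list of association events theta_k, each a list with one entry
    per measurement. Illustrates the combinatorial growth of conceptual MHT.
    """
    hypotheses = []

    def recurse(i, used, partial):
        if i == num_meas:
            hypotheses.append(list(partial))
            return
        for t in tracks:                        # associate measurement i with an unused track t
            if t not in used:
                recurse(i + 1, used | {t}, partial + [t])
        recurse(i + 1, used, partial + ["FA"])  # measurement i is a false alarm
        recurse(i + 1, used, partial + ["NT"])  # measurement i starts a new target

    recurse(0, frozenset(), [])
    return hypotheses

# Example: 3 measurements, prior hypothesis with two tracks T1, T2
print(len(enumerate_associations(3, ["T1", "T2"])))   # 44 branches from a single prior hypothesis
```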
Conceptual MHT Fundamental Components
Computing Hypothesis Probabilities
Let $\Theta_k^\ell \triangleq \{\theta_k, \Theta_{k-1}^i\}$; then (using the “Fundamental Theorem of TT”)
\[
\begin{aligned}
P(\Theta_k^\ell \mid y_{0:k}) &\propto p(y_k \mid \Theta_k^\ell, y_{0:k-1})\, P(\theta_k \mid \Theta_{k-1}^i, y_{0:k-1})\, P(\Theta_{k-1}^i \mid y_{0:k-1}) \\
&\propto \beta_{\mathrm{fa}}^{m_k^{\mathrm{fa}}}\, \beta_{\mathrm{nt}}^{m_k^{\mathrm{nt}}} \Bigl[\prod_{j \in J_D^i} P_D^j\, p_{k|k-1}^j\bigl(y_k^{\theta_k^{-1}(j)}\bigr)\Bigr] \Bigl[\prod_{j \in J_{ND}^i} \bigl(1 - P_D^j P_G^j\bigr)\Bigr] P(\Theta_{k-1}^i \mid y_{0:k-1})
\end{aligned}
\]
Note
The sets $J_D^i$ and $J_{ND}^i$ depend on $\Theta_{k-1}^i$! The number of targets and target estimates usually differ between hypotheses.
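As a rough illustration of the formula above (not from the lecture), the sketch below evaluates the unnormalized probability of one association event, assuming Gaussian predicted measurement densities and a common detection probability P_D and gate probability P_G for all tracks; the function name and argument layout are assumptions made for this example.

```python
import numpy as np
from scipy.stats import multivariate_normal

def hypothesis_weight(events, meas, pred_means, pred_covs,
                      beta_fa, beta_nt, P_D, P_G, prev_prob):
    """Unnormalized P(Theta_k^l | y_0:k) for one association event theta_k.

    events[i] is 'FA', 'NT', or a track index j for measurement meas[i];
    pred_means[j], pred_covs[j] define the (Gaussian) predicted measurement
    density p^j_{k|k-1} of track j.
    """
    m_fa = sum(e == "FA" for e in events)
    m_nt = sum(e == "NT" for e in events)
    w = beta_fa**m_fa * beta_nt**m_nt * prev_prob

    detected = {e for e in events if e not in ("FA", "NT")}
    for i, e in enumerate(events):
        if e in ("FA", "NT"):
            continue
        # detected track j = e contributes P_D * p^j_{k|k-1}(y_k^i)
        w *= P_D * multivariate_normal.pdf(meas[i], pred_means[e], pred_covs[e])
    for j in range(len(pred_means)):
        if j not in detected:
            # track in J_ND: not detected inside its gate
            w *= (1.0 - P_D * P_G)
    return w

# Example: measurement 0 is a false alarm, measurement 1 is associated with track 0
w = hypothesis_weight(["FA", 0], [np.array([1.0]), np.array([0.1])],
                      [np.array([0.0])], [np.array([[1.0]])],
                      beta_fa=1e-3, beta_nt=1e-4, P_D=0.9, P_G=0.99, prev_prob=0.5)
print(w)
```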
Conceptual MHT Fundamental Components
System Overview
[Block diagram: the new set of measurements $\{y_k^i\}_{i=1}^{m_k}$ and the set of hypotheses $\{\Theta_{k-1}^i\}_{i=1}^{N_h}$ enter “Generate New Hypotheses” $\{\Theta_k^i\}_{i=1}^{N_h}$, followed by “Calculate Hyp. Probabilities” $\{P(\Theta_k^i)\}_{i=1}^{N_h}$ and “Reduce Number of Hypotheses”; the reduced set is fed back through a one-step delay ($z^{-1}$) and passed to the “User Presentation Logic”.]
Conceptual MHT Simplifications
Reducing Complexity
Clustering
Pruning of low probability hypotheses
N-scan pruning
Merging similar hypotheses
Conceptual MHT Simplifications
Clustering
Group targets without common measurements, and handle the groups separately
[Figure: the hypotheses $\Theta_{k-1}^1$ and $\Theta_{k-1}^2$ with predicted measurements $y_{k|k-1}^1$, $y_{k|k-1}^2$ and received measurements $y_k^1$, $y_k^2$, $y_k^3$; a second frame groups the tracks and measurements into Cluster-1 and Cluster-2.]
Conceptual MHT Simplifications
Clustering: cluster management
When targets get closer
• If a measurement falls inside the gates of tracks in different clusters, merge the clusters
• The hypotheses for each cluster are combined into super-hypotheses
When targets separate
• If a group of tracks in a cluster does not share measurements with the other tracks in the cluster (for a period of time), split the cluster
• Hypotheses for the cluster are also divided into smaller hypotheses corresponding to the two smaller clusters
A sketch of grouping tracks by shared measurements is given below.
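A minimal sketch (not from the lecture) of the grouping step, using a union-find structure to collect tracks whose gates (transitively) share measurements; the function name and the gates input format are assumptions.

```python
def cluster_tracks(gates):
    """Group tracks that (transitively) share gated measurements.

    gates: dict track_id -> set of measurement indices inside the track's gate.
    Returns a list of clusters, each a set of track ids.
    """
    parent = {t: t for t in gates}

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path compression
            x = parent[x]
        return x

    def union(a, b):
        parent[find(a)] = find(b)

    # tracks whose gates contain a common measurement end up in the same cluster
    meas_to_track = {}
    for t, ms in gates.items():
        for m in ms:
            if m in meas_to_track:
                union(t, meas_to_track[m])
            else:
                meas_to_track[m] = t

    clusters = {}
    for t in gates:
        clusters.setdefault(find(t), set()).add(t)
    return list(clusters.values())

# Example: T1 and T2 share measurement 2, T3 is isolated -> two clusters
print(cluster_tracks({"T1": {1, 2}, "T2": {2, 3}, "T3": {5}}))
```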
Conceptual MHT Simplifications
Clustering: process clusters separately (1/2)
Hypotheses generation
Form $\Theta_k^\ell \triangleq \{\theta_k, \Theta_{k-1}^i\}$ for each cluster as if the other clusters do not exist.
[Figure: without clustering, the full association tree for $\Theta_{k-1}^1$ spans all three measurements $y_k^1$, $y_k^2$, $y_k^3$; with clustering, Cluster-1 only considers $y_k^1$, $y_k^2$ (targets T1, T2) and Cluster-2 only considers $y_k^3$ (FA or NT), so the trees are much smaller. A second slide shows the corresponding trees for $\Theta_{k-1}^2$.]
Conceptual MHT Simplifications
Clustering: process clusters separately (2/2)
Hypotheses reduction
For each cluster:
Delete hypotheses with probability below a threshold, $\gamma_p$ (e.g., $\gamma_p = 0.001$)
  Deletion condition: $P(\Theta_k^i) < \gamma_p$
Keep only the most probable hypotheses with a total probability mass above a threshold, $\gamma_c$ (e.g., $\gamma_c = 0.99$)
  Deletion condition:
  \[
  \sum_{j=1}^{i} P\bigl(\Theta_k^{\ell_j}\bigr) > \gamma_c,
  \]
  where $\ell_j$ is a sequence such that $P(\Theta_k^{\ell_j}) \geq P(\Theta_k^{\ell_{j+1}})$ (i.e., hypotheses sorted in decreasing probability; those beyond the first $i$ are deleted). Both rules are sketched in code below.
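A minimal sketch (not from the lecture) of the two deletion rules, the individual threshold γp and the cumulative mass γc; in a real tracker the remaining probabilities would be renormalized afterwards, and names and defaults are illustrative.

```python
def prune_hypotheses(probs, gamma_p=0.001, gamma_c=0.99):
    """Return the indices of hypotheses to keep after the two pruning rules.

    probs  : list of hypothesis probabilities P(Theta_k^i) for one cluster.
    Rule 1 : drop hypotheses with probability below gamma_p.
    Rule 2 : keep the most probable hypotheses until their total mass
             exceeds gamma_c, drop the rest.
    """
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    keep, mass = [], 0.0
    for i in order:
        if probs[i] < gamma_p or mass > gamma_c:
            break                # every remaining hypothesis is even less probable
        keep.append(i)
        mass += probs[i]
    return keep

# Example
print(prune_hypotheses([0.6, 0.3, 0.05, 0.04, 0.0005, 0.0055]))
```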
Conceptual MHT Simplifications
N-scan Pruning
This scheme assumes that any uncertainty is perfectly resolved after N time steps
A common rule of thumb is to choose N ≥ 5 (situation dependent)
The N last ancestors of each hypothesis must be stored (a pruning sketch follows the figure)
Case N = 2
[Figure: a hypothesis tree over time steps 0–4; at each step the best hypothesis is identified and all branches that do not descend from its ancestor N = 2 steps back are pruned, so only that ancestor’s subtree survives.]
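A minimal sketch (not from the lecture) of N-scan pruning on a hypothesis tree stored with parent pointers; the node representation (dicts with 'prob' and 'parent') is an assumption made for this example.

```python
def n_scan_prune(leaves, best, N=2):
    """Keep only the leaf hypotheses that share the best leaf's ancestor N steps back.

    Each hypothesis is a dict with keys 'prob' and 'parent' (None at the root),
    so the last N ancestors are reachable through the parent pointers.
    """
    def ancestor(node, steps):
        for _ in range(steps):
            if node["parent"] is None:
                break
            node = node["parent"]
        return node

    root = ancestor(best, N)
    return [h for h in leaves if ancestor(h, N) is root]

# Tiny example: two branches descending from different time-(k-2) ancestors
a = {"prob": 1.0, "parent": None}
b = {"prob": 1.0, "parent": None}
h1 = {"prob": 0.5, "parent": {"prob": 0.7, "parent": a}}
h2 = {"prob": 0.1, "parent": {"prob": 0.3, "parent": b}}
print(len(n_scan_prune([h1, h2], best=h1, N=2)))   # only the branch through 'a' survives -> 1
```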
Conceptual MHT Simplifications
Hypothesis Merging
Reid’s original paper suggests checking for hypothesis pairs with:
the same number of targets (tracks)
similar track estimates
If these conditions are satisfied:
merge the hypotheses
assign the new hypothesis the sum of the combined hypotheses’ probabilities
Conceptual MHT Summary
Summary
Attractive method since each hypothesis is
• an alternative representation of reality
• easily interpreted
Drawback: generating all possible hypotheses only to discard (most of) them is inefficient
Some hypotheses contain the same track; hence there are fewer unique tracks than hypotheses
Track-based methods were popular until an efficient way to implement a hypothesis-based MHT was given by Cox and Hingorani (1996)
Hypothesis-Based MHT
Hypothesis-Based MHT
Proposed by Cox and Hingorani (1996)
Generate only the best hypotheses; skip hypotheses that will be deleted
Use the N-best solutions to the assignment problem (introduced last lecture with GNN)
• Murty’s method, 1968
Find the $N_h$ best hypotheses, generating as few unnecessary hypotheses as possible
Hypothesis reduction techniques still apply
Hypothesis-Based MHT Assignment Problem
Assignment Problem: repetition (1/2)
Let $\Theta_k^\ell \triangleq \{\theta_k, \Theta_{k-1}^i\}$.
\[
\begin{aligned}
P(\Theta_k^\ell \mid y_{0:k}) &\propto p(y_k \mid \Theta_k^\ell, y_{0:k-1})\, P(\theta_k \mid \Theta_{k-1}^i, y_{0:k-1})\, P(\Theta_{k-1}^i \mid y_{0:k-1}) \\
&\propto \beta_{\mathrm{fa}}^{m_k^{\mathrm{fa}}}\, \beta_{\mathrm{nt}}^{m_k^{\mathrm{nt}}} \Bigl[\prod_{j \in J_D^i} P_D^j\, p_{k|k-1}^j\bigl(y_k^{\theta_k^{-1}(j)}\bigr)\Bigr] \Bigl[\prod_{j \in J_{ND}^i} \bigl(1 - P_D^j P_G^j\bigr)\Bigr] P(\Theta_{k-1}^i \mid y_{0:k-1})
\end{aligned}
\]
Divide and multiply the right-hand side by
\[
C_i \triangleq \prod_{j=1}^{n_T^i} \bigl(1 - P_D^j P_G^j\bigr) = \prod_{j \in J_D^i} \bigl(1 - P_D^j P_G^j\bigr) \prod_{j \in J_{ND}^i} \bigl(1 - P_D^j P_G^j\bigr)
\]
Hypothesis-Based MHT Assignment Problem
Assignment Problem: repetition (2/2)
\[
P(\Theta_k^\ell \mid y_{0:k}) \propto \beta_{\mathrm{fa}}^{m_k^{\mathrm{fa}}}\, \beta_{\mathrm{nt}}^{m_k^{\mathrm{nt}}} \Bigl[\prod_{j \in J_D^i} \frac{P_D^j\, p_{k|k-1}^j\bigl(y_k^{\theta_k^{-1}(j)}\bigr)}{1 - P_D^j P_G^j}\Bigr] C_i\, P(\Theta_{k-1}^i \mid y_{0:k-1})
\]
Define $\ell_{ij} \triangleq \log \dfrac{P_D^j\, p_{k|k-1}^j(y_k^i)}{1 - P_D^j P_G^j}$; the symbol × represents $-\infty$.
Take logarithms and form the assignment matrices:
A1        T1    T2    fa1       fa2       fa3       nt1       nt2       nt3
y_k^1    ℓ11   ℓ12   log βfa     ×         ×       log βnt     ×         ×
y_k^2    ℓ21    ×      ×       log βfa     ×         ×       log βnt     ×
y_k^3     ×     ×      ×         ×       log βfa     ×         ×       log βnt

A2        T1    fa1       fa2       fa3       nt1       nt2       nt3
y_k^1    ℓ11   log βfa     ×         ×       log βnt     ×         ×
y_k^2    ℓ21    ×        log βfa     ×         ×       log βnt     ×
y_k^3     ×     ×          ×       log βfa     ×         ×       log βnt
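A sketch (not from the lecture) of how the log-reward matrix A_i could be built from the quantities above, assuming Gaussian predicted measurement densities; gating (which would also set some track entries to −∞) is omitted, and the function name and interface are assumptions.

```python
import numpy as np
from scipy.stats import multivariate_normal

def assignment_matrix(meas, pred_means, pred_covs, beta_fa, beta_nt, P_D, P_G):
    """Log-reward matrix with columns [tracks | false alarms | new targets].

    Track entry (i, j) is l_ij = log( P_D p^j_{k|k-1}(y_k^i) / (1 - P_D P_G) );
    the FA and NT blocks are diagonal with log(beta_fa) and log(beta_nt);
    the 'x' entries (impossible assignments) are -inf.
    """
    m, n = len(meas), len(pred_means)
    A = np.full((m, n + 2 * m), -np.inf)
    for i, y in enumerate(meas):
        for j in range(n):
            lik = multivariate_normal.pdf(y, pred_means[j], pred_covs[j])
            A[i, j] = np.log(P_D * lik / (1.0 - P_D * P_G))
        A[i, n + i] = np.log(beta_fa)        # measurement i declared false alarm
        A[i, n + m + i] = np.log(beta_nt)    # measurement i starts a new target
    return A

# Example: 2 measurements, 1 predicted track -> a 2 x (1 + 2*2) matrix
A1 = assignment_matrix([np.array([0.1]), np.array([2.0])],
                       [np.array([0.0])], [np.array([[1.0]])],
                       beta_fa=1e-3, beta_nt=1e-4, P_D=0.9, P_G=0.99)
print(A1.shape)
```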
Hypothesis-Based MHT Assignment Problem
Assignment Problem: N-best solutions
Given an assignment matrix $A_i$, the Auction algorithm (or similar) finds the best assignment in polynomial time
Generalizations of this problem to find the N-best solutions:
• Formulate as several best-assignment problems
• Solve independently using the Auction algorithm
• Murty’s method
Hypothesis-Based MHT Assignment Problem
Assignment Problem: Murty’s Method
Murty’s Method
Given the assignment matrix $A_i$:
Find the best solution using the Auction algorithm.
2nd best solution:
• Express the 2nd best solution as the solution of a number of best-solution assignment problems.
• Find the solution to each of these problems by Auction.
• The solution giving the maximum reward (minimum cost) is the second best solution.
Repeat the procedure for more solutions (a sketch is given below)
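A compact sketch (not from the lecture) of Murty's method; scipy's linear_sum_assignment (a Hungarian/Jonker-Volgenant style solver) is used here in place of the Auction algorithm, a large finite constant stands in for the forbidden (×/−∞) entries, and costs are minimized rather than rewards maximized.

```python
import heapq
import numpy as np
from scipy.optimize import linear_sum_assignment

BIG = 1e9   # stands in for a forbidden assignment (the 'x' / -inf entries)

def murty(cost, N):
    """Return up to N assignments of a cost matrix, cheapest first (Murty sketch)."""
    def solve(C):
        rows, cols = linear_sum_assignment(C)
        total = float(C[rows, cols].sum())
        if total >= BIG:                       # a forbidden entry had to be used
            return None
        return total, list(zip(rows.tolist(), cols.tolist()))

    first = solve(cost)
    if first is None:
        return []
    queue = [(first[0], 0, first[1], cost)]    # (cost, tie-breaker, assignment, matrix)
    tie, out = 1, []
    while queue and len(out) < N:
        total, _, assign, C = heapq.heappop(queue)
        out.append((total, assign))
        for k, (r, c) in enumerate(assign):    # partition around the popped solution
            Ck = C.copy()
            for rf, cf in assign[:k]:          # force all pairings before this one
                Ck[rf, :] = BIG
                Ck[:, cf] = BIG
                Ck[rf, cf] = C[rf, cf]
            Ck[r, c] = BIG                     # forbid the k-th pairing
            sol = solve(Ck)
            if sol is not None:
                heapq.heappush(queue, (sol[0], tie, sol[1], Ck))
                tie += 1
    return out

# Example: 3 measurements x 4 columns, find the 3 best assignments
C = np.array([[4., 1., 3., 9.],
              [2., 0., 5., 8.],
              [3., 2., 2., 7.]])
for total, assign in murty(C, 3):
    print(total, assign)
```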
Hypothesis-Based MHT Algorithm
Algorithm Outline
Aim: Given hypotheses $\{\Theta_{k-1}^i\}_{i=1}^{N_h}$ and measurements $\{y_k^i\}_{i=1}^{m_k}$, find the $N_h$ best hypotheses $\{\Theta_k^i\}_{i=1}^{N_h}$ (avoid generating all hypotheses)
Reminder of Hypothesis Probability
\[
P(\Theta_k^\ell \mid y_{0:k}) \propto \underbrace{\beta_{\mathrm{fa}}^{m_k^{\mathrm{fa}}}\, \beta_{\mathrm{nt}}^{m_k^{\mathrm{nt}}} \Bigl[\prod_{j \in J_D^i} \frac{P_D^j\, p_{k|k-1}^j\bigl(y_k^{\theta_k^{-1}(j)}\bigr)}{1 - P_D^j P_G^j}\Bigr]}_{\text{Assignment dependent}}\; \underbrace{C_i\, P(\Theta_{k-1}^i \mid y_{0:k-1})}_{\text{Legacy}}
\]
Find $\{\Theta_k^\ell\}_{\ell=1}^{N_h}$ that maximize $P(\Theta_k^\ell \mid y_{0:k})$.
Two steps:
• Obtain the solution from the assignment (Murty’s method)
• Multiply the obtained quantity by the previous-hypothesis-dependent terms
Hypothesis-Based MHT Algorithm
Generating the Nh-best Hypotheses
Input: $\{\Theta_{k-1}^i\}_{i=1}^{N_h}$, $\{P(\Theta_{k-1}^i \mid y_{0:k-1})\}_{i=1}^{N_h}$, and $\{y_k^i\}_{i=1}^{m_k}$
Output: HYP-LIST ($N_h$ hypotheses, in decreasing probability) and PROB-LIST (matching probabilities)
1. Initialize all elements in HYP-LIST and PROB-LIST to ∅ and −1
2. Find the assignment matrices $\{A_i\}_{i=1}^{N_h}$ for $\{\Theta_{k-1}^i\}_{i=1}^{N_h}$
3. For $j = 1, \dots, N_h$
   1. For $i = 1, \dots, N_h$
      1. For the assignment matrix $A_i$, find the $j$th best solution $\Theta_k^{ji}$
      2. Compute the probability $P(\Theta_k^{ji})$
      3. Update HYP-LIST and PROB-LIST: if the new hypothesis enters the list, discard the least probable entry
      4. If $P(\Theta_k^{ji})$ is lower than the lowest probability in PROB-LIST, discard $\Theta_k^{ji}$ and never use $A_i$ again in subsequent iterations
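A sketch (not from the lecture) of the loop above in code; the solver callables are assumed to return the j-th best assignment log-reward for each matrix A_i (e.g., backed by Murty's method as sketched earlier), and the heap-based bookkeeping stands in for HYP-LIST/PROB-LIST.

```python
import heapq

def nh_best_hypotheses(prior_logp, assignment_solvers, N_h):
    """Keep the N_h best posterior hypotheses without enumerating all of them.

    prior_logp         : list of previous hypothesis log-probabilities log P(Theta_{k-1}^i).
    assignment_solvers : one callable per prior hypothesis; solver(j) returns the
                         j-th best assignment log-reward for that hypothesis's
                         matrix A_i, or None when no more solutions exist.
    Returns a list of (log_prob, i, j) for the kept hypotheses, best first.
    """
    best = []                                  # min-heap of size <= N_h
    active = set(range(len(prior_logp)))
    for j in range(N_h):                       # j-th best solution per matrix
        for i in list(active):
            reward = assignment_solvers[i](j)
            if reward is None:
                active.discard(i)
                continue
            log_prob = reward + prior_logp[i]  # assignment-dependent part + legacy part
            if len(best) < N_h:
                heapq.heappush(best, (log_prob, i, j))
            elif log_prob > best[0][0]:
                heapq.heapreplace(best, (log_prob, i, j))
            else:
                active.discard(i)              # further solutions for A_i only get worse
    return sorted(best, reverse=True)

# Toy example: two prior hypotheses whose j-th best assignment rewards are tabulated
rewards = {0: [5.0, 3.0, 1.0], 1: [4.5, 4.0, 0.5]}
solvers = {i: (lambda j, r=r: r[j] if j < len(r) else None) for i, r in rewards.items()}
print(nh_best_hypotheses([0.0, -0.1], solvers, N_h=3))
```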
Track-Based MHT
Track-Based MHT: motivation
Hypotheses usually contain identical tracks; there are significantly fewer unique tracks than hypotheses
Idea: store tracks, $T^i$, not hypotheses, $\Theta^i$, over time
[Figure: the same two hypotheses as before, $\Theta_{k-1}^1$ (two tracks, predicted measurements $y_{k|k-1}^1$, $y_{k|k-1}^2$) and $\Theta_{k-1}^2$ (one track, predicted measurement $y_{k|k-1}^1$).]
Track-Based MHT
Track-Based MHT: principle
Tracks at time $k$: $\{T_k^i\}_{i=1}^{N_t}$
Track scores: $Sc(T_k^i)$
Form a track tree, not a hypothesis tree
Delete tracks with low scores
[Figure: track list at time k. Each old track is extended with either no measurement (NM) or a gated measurement: $T_{k-1}^1$ with NM or $y_k^1$, and $T_{k-1}^2$ with NM, $y_k^1$, or $y_k^2$; in addition, new tracks are started from each of the measurements $y_k^1$, $y_k^2$, $y_k^3$. This gives the track list $T_k^1, \dots, T_k^8$.]
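A sketch (not from the lecture) of how the track list could be grown one scan: each old track is continued with “no measurement” or one of its gated measurements, and each measurement may also start a new track; the function name and input format are assumptions, and the example gating roughly matches the figure above.

```python
def expand_track_list(old_tracks, gates, measurements):
    """Grow the track list one scan, as in the track-tree figure.

    old_tracks   : list of track ids from time k-1.
    gates        : dict track id -> list of measurement ids inside its gate.
    measurements : list of measurement ids at time k.
    Returns a list of (parent, association) pairs; parent is an old track id,
    or None for a newly started track, and association is 'NM' or a measurement id.
    """
    new_list = []
    for t in old_tracks:
        new_list.append((t, "NM"))          # missed-detection continuation
        for m in gates.get(t, []):
            new_list.append((t, m))         # continuation with measurement m
    for m in measurements:
        new_list.append((None, m))          # each measurement may start a new track
    return new_list

# Example roughly matching the figure: 2 + 3 + 3 = 8 tracks
print(expand_track_list(["T1", "T2"],
                        {"T1": ["y1"], "T2": ["y1", "y2"]},
                        ["y1", "y2", "y3"]))
```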
Track-Based MHT Implementation Details
Hypotheses Generation
Hypothesis: a collection of compatible tracks, e.g., $\Theta_k^1 = \{T_k^1, T_k^5, T_k^8\}$, $\Theta_k^2 = \{T_k^2, T_k^3, T_k^7, T_k^8\}$
Generating hypotheses is needed to further reduce the number of tracks and for user presentation
Use only tracks with high score
Keep track compatibility information (e.g., in a binary matrix)
(rows/columns: $T_k^1, \dots, T_k^8$; upper triangle shown, 1 = compatible)

       T1  T2  T3  T4  T5  T6  T7  T8
T1      0   0   0   1   1   0   1   1
T2          0   1   1   1   1   1   1
T3              0   0   0   0   1   1
T4                  0   0   1   1   1
T5                      0   1   0   1
T6                          0   1   1
T7                              0   1
T8                                  0
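A brute-force sketch (not from the lecture) of forming hypotheses from the compatibility matrix: every collection of mutually compatible tracks is enumerated (a real implementation would restrict this to high-score tracks and use clustering); the helper names are assumptions, and the compatible pairs below are read off the matrix above.

```python
def make_compat(pairs, tracks):
    """Build a symmetric compatibility lookup from the set of compatible pairs."""
    compat = {a: {b: False for b in tracks} for a in tracks}
    for a, b in pairs:
        compat[a][b] = compat[b][a] = True
    return compat

def enumerate_hypotheses(tracks, compat):
    """Enumerate all collections of mutually compatible tracks (brute force)."""
    hyps = [set()]
    for t in tracks:
        hyps += [h | {t} for h in hyps if all(compat[t][u] for u in h)]
    return hyps

tracks = [f"T{i}" for i in range(1, 9)]
pairs = [("T1", "T4"), ("T1", "T5"), ("T1", "T7"), ("T1", "T8"),
         ("T2", "T3"), ("T2", "T4"), ("T2", "T5"), ("T2", "T6"), ("T2", "T7"), ("T2", "T8"),
         ("T3", "T7"), ("T3", "T8"), ("T4", "T6"), ("T4", "T7"), ("T4", "T8"),
         ("T5", "T6"), ("T5", "T8"), ("T6", "T7"), ("T6", "T8"), ("T7", "T8")]
hyps = enumerate_hypotheses(tracks, make_compat(pairs, tracks))
# The two example hypotheses from the slide are among the compatible collections
print({"T1", "T5", "T8"} in hyps, {"T2", "T3", "T7", "T8"} in hyps)
```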
Track-Based MHT Implementation Details
Track Scores and Hypotheses Probabilities
Track probability:
\[
P(T_k^i) = \sum_{\Theta_k^j \,:\, T_k^i \in \Theta_k^j} P(\Theta_k^j)
\]
Hypothesis score:
\[
Sc(\Theta_k^i) = \sum_{T_k^j \in \Theta_k^i} Sc(T_k^j)
\]
Hypothesis probability:
\[
P(\Theta_k^i) = \frac{\exp\bigl(Sc(\Theta_k^i)\bigr)}{1 + \sum_{j=1}^{N_h} \exp\bigl(Sc(\Theta_k^j)\bigr)}
\]
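A direct transcription of the formulas above into code (not from the lecture); the track scores in the example are made up, and the function names are assumptions.

```python
import math

def hypothesis_probabilities(hypotheses, track_scores):
    """Hypothesis probabilities from track scores, as in the formulas above.

    hypotheses   : list of hypotheses, each a collection of track ids.
    track_scores : dict track id -> score Sc(T_k^i).
    """
    sc = [sum(track_scores[t] for t in hyp) for hyp in hypotheses]   # Sc(Theta_k^i)
    denom = 1.0 + sum(math.exp(s) for s in sc)
    return [math.exp(s) / denom for s in sc]                         # P(Theta_k^i)

def track_probability(track, hypotheses, hyp_probs):
    """P(T_k^i): total probability of the hypotheses that contain the track."""
    return sum(p for hyp, p in zip(hypotheses, hyp_probs) if track in hyp)

# Example with the two hypotheses from the slide above (scores are made up)
scores = {"T1": 2.0, "T2": 1.5, "T3": 0.5, "T5": 1.0, "T7": 0.8, "T8": 1.2}
hyps = [{"T1", "T5", "T8"}, {"T2", "T3", "T7", "T8"}]
p = hypothesis_probabilities(hyps, scores)
print(p, track_probability("T8", hyps, p))
```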
Track-Based MHT Implementation Details
Complexity Reducing Techniques
Cluster incompatible tracks for efficient hypothesis generation
Apply N-scan pruning to the track trees
Merge tracks with common recent measurement history
Track-Based MHT Summary
System Components
[Block diagram: the new set of measurements $\{y_k^i\}_{i=1}^{m_k}$ and the set of tracks $\{T_{k-1}^i\}_{i=1}^{N_t}$ enter “Generate New Tracks” $\{T_k^i\}_{i=1}^{N_t}$, followed by “Discard Low Score Tracks”, “Generate Hypotheses” $\{\Theta_k^i\}_{i=1}^{N_h}$, “Calculate Track Probabilities”, and “Discard Low Probability Tracks”; the surviving tracks are fed back through a one-step delay ($z^{-1}$) and passed to the “User Presentation Logic”.]
User Interaction
User Presentation Logic
Maximum probability hypothesis: simplest alternative
• Possibly jumpy; the maximum probability hypothesis can change erratically
Show track clusters: (weighted) mean, covariance, and expected number of targets
Keep a separate track list: update at each step with a selection of tracks from different hypotheses
Consult Blackman and Popoli (1999) for details
Examples SUPPORT
Example: harbor protection (SUPPORT)
Examples ADABTS
Example: busy indoor environments
Summary Concluding Remarks
Which Multi-TT Method to Use?
Computation \ SNR     Low              Medium           High
Low                   Group TT / PHD   GNN              GNN
Medium                MHT              GNN or JPDA      GNN
High                  TrBD / MHT       MHT              Any
GNN and JPDA are very bad in low SNR.
When using GNN, one generally has to enlarge the overconfident covariances to account for the neglected data association uncertainty.
JPDA suffers from track coalescence and should not be used with closely spaced targets; see the “coalescence avoiding” versions.
MHT requires a significantly higher computational load, but it is said to work reasonably under 10–100 times worse SNR.
Summary Learn More. . .
Learning More (1/2)
Samuel S. Blackman. Multiple hypothesis tracking for multiple target tracking. IEEE Aerospace and Electronic Systems Magazine, 19(1):5–18, January 2004.
Samuel S. Blackman and Robert Popoli. Design and Analysis of Modern Tracking Systems. Artech House radar library. Artech House, Inc., 1999. ISBN 1-58053-006-0.
Ingemar J. Cox and Sunita L. Hingorani. An efficient implementation of Reid’s multiple hypothesis tracking algorithm and its evaluation for the purpose of visual tracking. IEEE Transactions on Pattern Analysis and Machine Intelligence, 18(2):138–150, February 1996.
Ingemar J. Cox and Matthew L. Miller. On finding ranked assignments with application to multitarget tracking and motion correspondence. IEEE Transactions on Aerospace and Electronic Systems, 31(1):486–489, January 1995.
Summary Learn More. . .
Learning More (2/2)
Ingemar J. Cox, Matthew L. Miller, Roy Danchick, and G. E. Newnam. A comparison of two algorithms for determining ranked assignments with application to multitarget tracking and motion correspondence. IEEE Transactions on Aerospace and Electronic Systems, 33(1):295–301, January 1997.
Roy Danchick and G. E. Newnam. Reformulating Reid’s MHT method with generalised Murty K-best ranked linear assignment algorithm. IEE Proceedings - Radar, Sonar and Navigation, 153(1):13–22, February 2006.
Matthew L. Miller, Harold S. Stone, and Ingemar J. Cox. Optimizing Murty’s ranked assignment method. IEEE Transactions on Aerospace and Electronic Systems, 33(3):851–862, July 1997.
Donald B. Reid. An algorithm for tracking multiple targets. IEEE Transactions on Automatic Control, 24(6):843–854, December 1979.