Target Tracking and Fusion using Imaging Sensor and Ground based Radar Data

VPS Naidu* and Girija G.†
National Aerospace Laboratories, Bangalore, India

and

J. R. Raol‡

National Aerospace Laboratories, Bangalore, India

An image centroid tracking algorithm (ICTA) has been implemented in PC MATLAB for target tracking based on the data obtained from an imaging sensor. The gray level image is converted into a binary image using the segmentation technique and reduced into clusters using the nearest neighbor criterion. The nearest neighbor Kalman filter and the probabilistic data association filter are employed for state estimation using the centroid measurements of the clusters. The simulation results showed that it is possible to achieve tracking accuracies of 0.52 pixels root mean square error in position and 0.2 pixels/frame in velocity. The performance of the algorithm is evaluated using several statistical criteria. The ICTA has been extended to a multi-sensor scenario, in which the trajectories obtained from the imaging sensor are fused with those from a ground based radar using a track-to-track fusion technique.

Nomenclature

d_p = proximity distance
Im(i,j) = gray image
I_L = lower target limit
I_k = kth pixel intensity
I_U = upper target limit
k = pixel index
m = number of pixels in the frame
m_s = number of pixels covered by the target
N = number of pixels in the cluster
n_i = noise intensity
p(i,j) = detection probability of pixel (i,j)
p_FA = probability of false alarm
p_t = detection probability of target pixel
p_v = detection probability of noise pixel
Q = process noise covariance matrix
R = measurement noise covariance matrix
r = frame signal-to-noise ratio (SNR)

* Scientist, MSDF Lab, FMCD, [email protected].
† Scientist & Group Head, MSDF Lab, FMCD, [email protected]
‡ Scientist & Head, FMCD, [email protected]

American Institute of Aeronautics and Astronautics


AIAA Guidance, Navigation, and Control Conference and Exhibit, 15 - 18 August 2005, San Francisco, California

AIAA 2005-5842

Copyright © 2005 by the authors. Published by the American Institute of Aeronautics and Astronautics, Inc., with permission.

r' = average pixel SNR
s = total target-related intensity
s_i = target intensity
µ_s = average target intensity
T = sampling period; superscript T denotes transpose
V = volume
v = centroid measurement noise
w = process noise
X = state vector
x_nc = nth coordinate of the cluster centroid
x_nk = nth coordinate of pixel k
x_p = x-direction position
x_v = x-direction velocity
y_p = y-direction position
y_v = y-direction velocity
z = measurement vector
µ_FA = number of false alarms
µ_t = target mean
σ_t² = target variance
µ_n = noise mean
σ_n² = noise variance
λ = spatial density
β(i,j) = binary image

I. Introduction

In image-based air traffic control or air defense systems, automatic target recognition (ATR) is extremely important for safety and for early warning of a perceived threat. The ATR process involves automatic target acquisition, identification and tracking by processing a sequence of images. Thus, ATR requires algorithms for detection, segmentation, feature computation, selection, classification and tracking. An important application of ATR is in guiding pilots of high-performance aircraft flying close to the ground during bad weather or at night, and in air traffic management (ATM)1.

In such scenarios, the sensor images could often be cluttered, dim, spurious or noisy. Tracking moving targets using image data involves processing images from a target of interest and producing, at each time step, an estimate of the target's current position and velocity vectors. Uncertainties in the target motion and in the measured values, usually modeled as additive random noise, lead to corresponding uncertainties in the target state. There is also additional uncertainty regarding the origin of the received data, which may or may not include measurements from the targets and may be due to random clutter (false alarms). This leads to the problem of data association1. In this situation, tracking algorithms have to include information on detection and false alarm probabilities. A comparison of the commonly used algorithms for data association and tracking, namely the nearest neighbor Kalman filter (NNKF) and the probabilistic data association filter (PDAF), is made in Ref. 2 for single target tracking in clutter. In this paper, the performance of these two data association algorithms on the image centroid tracking problem is presented.

The main focus in this paper is on the implementation and validation of the algorithm of Ref. 3 for precision tracking with segmentation from imaging sensors. This "image centroid tracking algorithm" (ICTA) is independent of the size of the target and is less sensitive to its intensity. In general, typical characteristics of the target obtained by motion recognition or by object (pattern) recognition methods are used in associating images to the target being tracked. Motion recognition characteristics of a target are its location, velocity and acceleration (i.e. state vector), which can be generated using data from successive frames (inter-scan level). Object (pattern) recognition characteristics are its geometric structure (shape, size) and energy level distribution (i.e. different gray levels in the image) in one or more spectral bands, which are obtained using image data at the intra-scan level.

The ICTA combines both object and motion recognition methods for practical target tracking from imaging sensors. The characteristics of the image considered are the intensity and size of the cluster. The pixel intensity is discretized into several layers of gray level intensities and it is assumed that sufficient target pixel intensities are within the limits of certain target layers. The ICTA implementation involves the conversion of the data from the image into a binary image by applying upper and lower threshold limits for the “target layers”. The binary target image is then converted to clusters using nearest neighbor criterion. If the target size is known, the information is used to set limits for removing those clusters that differ sufficiently from the size of the target cluster to reduce computational complexity. The centroid of the clusters is then calculated and this information is used for tracking the target. Thus, ICTA involves the following steps:

Intra-scan (single image) level:
i. Identifying potential targets by image segmentation methods
ii. Calculation of the centroid of the identified targets

Inter-scan (between images) level:
iii. Tracking the centroid using single or multiple target tracking techniques
iv. Separation of the true and false targets by association based on both motion and object characteristics

In this paper, the results of tracking the centroid of single and multiple synthetic images in clutter are presented. The tracking performance of the ICTA is evaluated in terms of percentage fit error (PFE), root mean square position error (RMSPE), and root mean square velocity error (RMSVE).

If the target whose centroid is being tracked is also being observed by ground based radar, the data from the radar could be used together with the image sensor data. A state vector fusion algorithm4 is used and the results are presented. Fusion of data from various sensors observing the targets would help in building more robust ATR systems.

II. Segmentation and Centroid Estimation Technique

Segmentation means decomposition of the image under consideration into different regions. In the ICTA, particle segmentation is used to separate the target (object of interest) from the background when the target is not fully visible3. It is assumed that the pixel intensities are discretized into 256 gray levels. Particle segmentation is done in two steps: i) the gray level image is transformed into a binary image using lower and upper threshold limits of the target; these thresholds are determined using the pixel intensity histograms of the target and its surroundings; ii) the detected pixels are grouped into clusters with the nearest neighbor technique3.

The gray image Im(i,j) is converted into a binary image β(i,j) by a hard limit on the intensity:

β(i,j) = 1 if I_L ≤ Im(i,j) ≤ I_U, and 0 otherwise   (1)

where I_L and I_U are the lower and upper threshold limits of the target intensity.

The detection probability of pixel (i,j) can be defined as:

P{β(i,j) = 1} = p(i,j)
P{β(i,j) = 0} = 1 − p(i,j)   (2)

where

p(i,j) = (1/(σ√(2π))) ∫ from I_L to I_U of e^(−(x−µ)²/(2σ²)) dx

considering the gray image as having a Gaussian distribution with mean µ and variance σ².

The binary image is then grouped into clusters using the nearest neighbor technique. A pixel is considered as belonging to a cluster only if the distance between this pixel and at least one other pixel of the cluster is less than the proximity distance d_p. The d_p is chosen as:

1/p_t < d_p < 1/p_v   (3)

where p_t and p_v are the detection probabilities of target and noise pixels, respectively. By choosing the proximity distance as in Eq. (3), fewer noise clusters are obtained. The d_p affects the size, shape and number of clusters obtained by clustering.

The centroid of the cluster is determined using:

x_nc = ( Σ from k=1 to N of I_k x_nk ) / ( Σ from k=1 to N of I_k )   (4)

where x_nk is the nth coordinate of pixel k, N is the number of pixels in the cluster, k is the pixel index, and I_k is the kth pixel intensity.
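The segmentation, clustering and centroid steps of Eqs. (1)-(4) can be sketched in Python (the paper's own implementation is in PC MATLAB); the function names and the simple O(n²) merge-based clustering are illustrative choices, not the authors' code:

```python
import numpy as np

def binarize(im, i_low, i_up):
    """Eq. (1): hard-limit the gray image into a binary image."""
    return ((im >= i_low) & (im <= i_up)).astype(np.uint8)

def cluster(binary, d_p=2.0):
    """Nearest neighbor clustering: a pixel joins a cluster if it lies
    within the proximity distance d_p of at least one of its pixels.
    Clusters bridged by a new pixel are merged (O(n^2) sketch)."""
    clusters = []
    for p in [tuple(q) for q in np.argwhere(binary == 1)]:
        near = [c for c in clusters
                if any(np.hypot(p[0] - q[0], p[1] - q[1]) <= d_p for q in c)]
        merged = [p]
        for c in near:
            merged.extend(c)
            clusters.remove(c)
        clusters.append(merged)
    return clusters

def centroid(cluster_pts, im):
    """Eq. (4): intensity-weighted centroid of one cluster."""
    pts = np.array(cluster_pts)
    w = im[pts[:, 0], pts[:, 1]].astype(float)
    return (w[:, None] * pts).sum(axis=0) / w.sum()
```

For a frame containing a bright 2×2 block and one isolated bright pixel, `cluster` with d_p = 2 returns two clusters, and `centroid` of the block returns its geometric center when the intensities are equal.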

III. Image Centroid Tracking Algorithm

In this paper, the performance of two commonly used algorithms for tracking targets in the presence of background clutter is compared for the centroid tracking application: the nearest neighbor Kalman filter (NNKF) and the probabilistic data association filter (PDAF). In general, gating and data association enable tracking in a multi-target scenario. Gating helps in deciding whether an observation (which may include clutter, false alarms and electronic countermeasures) is a probable candidate for track maintenance or track update. Data association is the step of assigning the measurements to the targets with certainty when several targets are in the same neighborhood. Two approaches to data association are possible: i) the nearest neighbor (NN) approach, in which a unique pairing is determined so that at most one observation can be paired with a previously established track; the method is based upon likelihood theory, and the goal is to minimize an overall distance function that considers all observation-to-track pairings that satisfy a preliminary gating test; ii) the probabilistic data association (PDA) algorithm, in which a track is updated by a weighted sum of the innovations from multiple validated measurements. The NNKF or PDAF is necessary for the centroid tracking application because, in the neighborhood of the predicted location of the target centroid during tracking, several centroids could be found due to splitting of the target cluster or due to noise clusters.
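A minimal sketch of the gating and nearest-neighbor selection described above, using the normalized innovation squared against a chi-square gate; the 99% two-degree-of-freedom threshold of 9.21 and the function signature are illustrative assumptions, not the paper's code:

```python
import numpy as np

def nn_associate(z_pred, S, measurements, gate=9.21):
    """Nearest-neighbor gating/association sketch: among measurements that
    fall inside the chi-square gate on the normalized innovation squared
    d2 = nu' S^-1 nu (S = innovation covariance), pick the one closest to
    the predicted measurement z_pred. gate=9.21 is the 99% chi-square
    threshold for 2 DOF. Returns the index of the selected measurement,
    or None if no measurement falls inside the gate."""
    S_inv = np.linalg.inv(S)
    best, best_d2 = None, gate
    for idx, z in enumerate(measurements):
        nu = np.asarray(z) - z_pred  # innovation
        d2 = float(nu @ S_inv @ nu)
        if d2 < best_d2:
            best, best_d2 = idx, d2
    return best
```

The PDAF would instead weight all gated innovations by their association probabilities rather than committing to a single one.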

IV. Models for Data Generation

A. Synthetic Image Data Generation
The mathematical basis for generation of the synthetic image5 is briefly described below. Consider a two-dimensional array of

m = m_ξ × m_η   (5)

pixels, where each pixel is represented by a single index i = 1, …, m, and the intensity I_i of pixel i is given by

I_i = s_i + n_i   (6)

where s_i is the target intensity and n_i is the noise intensity in pixel i, which is assumed to be Gaussian with zero mean and variance σ². The total target-related intensity is given by:

s = Σ from i=1 to m of s_i   (7)

If the number of pixels covered by the target is denoted by m_s, then the average target intensity over its extent is given by:

µ_s = s / m_s   (8)

The average pixel SNR (over the extent of the target) is:

r' = µ_s / σ   (9)

Using Eqs. (5)-(9), synthetic images (rectangular or circular) in a frame can be generated using the following inputs:
(i) Target pixel intensity s_i: N(µ_t, σ_t²)
(ii) Noise pixel intensity n_i: N(µ_n, σ_n²)
(iii) Target type: 1 - 'Rectangle' (base NX and height NY); 2 - 'Circle' (radius)
(iv) Position of the target in each scan (x-position and y-position)
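A sketch of a synthetic frame generator with the parameters used later in the simulation (64×64 frame, 9×9 rectangular target, target pixels N(100, 10²), noise pixels N(50, 50²)); overlaying the target pixels on the noise field, and the function name and signature, are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)  # fixed seed for reproducibility

def synth_frame(shape=(64, 64), target_pos=(10, 10), target_size=(9, 9),
                mu_t=100.0, sig_t=10.0, mu_n=50.0, sig_n=50.0):
    """Generate one frame per Eq. (6): every pixel carries noise intensity
    drawn from N(mu_n, sig_n^2); a rectangular target whose pixel
    intensities are drawn from N(mu_t, sig_t^2) is overlaid at target_pos
    (top-left row, column)."""
    frame = rng.normal(mu_n, sig_n, shape)
    r, c = target_pos
    h, w = target_size
    frame[r:r + h, c:c + w] = rng.normal(mu_t, sig_t, (h, w))
    return frame
```

Moving `target_pos` from scan to scan according to the kinematic model below yields the full image sequence.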

In order to simulate the motion of the target in the frame, kinematic models of target motion are used. A constant-velocity kinematic model is used to generate the data, determining the position of the target in each scan.

A.1 State Model The following state and measurement models are used to describe the constant velocity target motion. State model:

)(

02

0

0

02

)(

1000100

0010001

)1( 2

2

kw

T

TT

T

kXT

T

kX

⎥⎥⎥⎥⎥⎥

⎢⎢⎢⎢⎢⎢

+

⎥⎥⎥⎥

⎢⎢⎢⎢

=+ (10)

where state ( ) [ , , , ]p v p vX k x x y y= , =sampling period and is the zero mean Gaussian noise with

variance Q .

T )(kw

A.2 Measurement Model

z(k+1) = | 1  0  0  0 | X(k+1) + v(k+1)   (11)
         | 0  0  1  0 |

where v(k) is the centroid measurement noise, which is zero-mean Gaussian with covariance matrix:

R = | σ_x²   0   |
    |  0    σ_y² |   (12)

The process noise and the centroid measurement noise are assumed to be uncorrelated.
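Under the models of Eqs. (10)-(11), one propagation/measurement step can be written as follows (the matrix names F, G and H are conventional choices, not from the paper):

```python
import numpy as np

T = 1.0  # sampling period; the simulation uses a 1 frame/s rate

# Eq. (10): constant-velocity transition and process-noise input matrices,
# for the state X = [x_p, x_v, y_p, y_v]^T
F = np.array([[1, T, 0, 0],
              [0, 1, 0, 0],
              [0, 0, 1, T],
              [0, 0, 0, 1]], dtype=float)
G = np.array([[T**2 / 2, 0],
              [T,        0],
              [0, T**2 / 2],
              [0,        T]])

# Eq. (11): only the positions (the cluster centroid) are measured
H = np.array([[1, 0, 0, 0],
              [0, 0, 1, 0]], dtype=float)

def step(x, w=np.zeros(2), v=np.zeros(2)):
    """Propagate the state one scan (Eq. 10) and produce a centroid
    measurement (Eq. 11); w and v are the process and measurement noises."""
    x_next = F @ x + G @ w
    z = H @ x_next + v
    return x_next, z
```

With zero noise, a state starting at position (10, 10) with velocity (0.99, 0.99) moves by one velocity increment per scan.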

A.3 Clutter Model
In a practical scenario, the image measurements could contain clutter caused by interference from other targets and by the limited resolution of the sensor. Clutter is simulated by assuming a probability of false alarm and a Poisson distribution. Assume the sensor has N resolution cells (pixels); a detection is declared in a cell if its output exceeds a certain threshold. Due to sensor noise or background noise, the sensor may report detections even when it points to a region where there are no targets. The number of false alarms m in these cells is distributed as:

µ_FA(m) = e^(−λV) (λV)^m / m!   (13)

where V is the volume of the N cells under consideration and λ is the spatial density, given by:

λ = N p_FA / V   (14)

where p_FA is the false alarm probability.
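The clutter model of Eqs. (13)-(14) can be sampled as follows; scattering the false alarms uniformly over the frame is an assumption consistent with the Poisson model, and the function name is hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_false_alarms(n_cells, p_fa, frame_shape):
    """Eqs. (13)-(14): with spatial density lambda = N * p_fa / V and V the
    frame volume, the expected number of false alarms per frame is
    lambda * V = N * p_fa. Draw that Poisson count and scatter the false
    alarms uniformly over the frame; returns an (n, 2) array of (x, y)."""
    n_fa = rng.poisson(n_cells * p_fa)
    xs = rng.uniform(0, frame_shape[1], n_fa)
    ys = rng.uniform(0, frame_shape[0], n_fa)
    return np.column_stack([xs, ys])
```

For a 64×64 frame (N = 4096 cells) and p_FA = 0.01, the mean count over many frames approaches N·p_FA ≈ 41 false alarms per frame.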

V. Results and Discussions

A two-dimensional array of 64×64 pixels, modeled as a white Gaussian random field with mean µ_t and variance σ_t², is used to generate a rectangular target of size 9×9. The image is converted into a binary image using the upper (I_U = 110) and lower (I_L = 90) limits of a target layer (Eq. 1), and then grouped into clusters by the nearest neighbor technique using the optimal proximity distance (d_p = 2).

The initial state vectors of target 1 and target 2 in the image frame are:

Track 1: [10 0.99 10 0.99]^T and Track 2: [20 0 60 −0.99]^T

The total number of scans is fifty and the image frame rate is 1 frame/s. The target and noise parameters used in this simulation are as follows3: target pixel intensity N(100, 10²) and noise pixel intensity N(50, 50²).

The centroid of each cluster is calculated and used for state estimation in the measurement update part of the NNKF or PDAF filter to track the target in clutter. The NNKF and PDAF algorithms include track initiation and track deletion features, which are essential in multi-target tracking in clutter1. The ICTA was first validated with single-target data. It was found that the performance of the NNKF and PDAF filters was similar for tracking the centroid of the target in clutter. The performance of the two algorithms is evaluated in terms of5:

i. The percentage fit error (PFE) in x and y:

PFEx = 100 × norm(x − x̂) / norm(x)  and  PFEy = 100 × norm(y − ŷ) / norm(y)   (15)

ii. The root mean square position error:

RMSPE = sqrt( (1/N) Σ from i=1 to N of [ (x_i − x̂_i)² + (y_i − ŷ_i)² ] )   (16)

iii. The root mean square velocity error, computed as in Eq. (16) with the velocity components in place of the positions:

RMSVE = sqrt( (1/N) Σ from i=1 to N of [ (x_v,i − x̂_v,i)² + (y_v,i − ŷ_v,i)² ] )   (17)

iv. The root sum square position error:

RSSPE = sqrt( (x − x̂)² + (y − ŷ)² )   (18)

v. The root sum square velocity error:

RSSVE = sqrt( (x_v − x̂_v)² + (y_v − ŷ_v)² )   (19)

vi. Whether the state errors fall within the theoretical bounds dictated by P, the state error covariance matrix

where x and y are the measurements, and x̂ and ŷ are the estimated target locations in the x and y coordinates, respectively.
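The evaluation metrics of Eqs. (15)-(19) translate directly into code; this is a sketch in which the (N, 2) array layout is an assumed convention:

```python
import numpy as np

def pfe(true, est):
    """Eq. (15): percentage fit error along one axis (1-D arrays)."""
    return 100.0 * np.linalg.norm(true - est) / np.linalg.norm(true)

def rms_error(true_xy, est_xy):
    """Eqs. (16)-(17): root mean square error over N scans. Pass positions
    for the RMSPE or velocities for the RMSVE; arrays are shaped (N, 2)."""
    d = true_xy - est_xy
    return np.sqrt(np.mean(d[:, 0]**2 + d[:, 1]**2))

def rss_error(true_xy, est_xy):
    """Eqs. (18)-(19): per-scan root sum square error, one value per scan."""
    d = true_xy - est_xy
    return np.sqrt(d[:, 0]**2 + d[:, 1]**2)
```

The same two helpers produce all four RMS/RSS numbers reported in the tables, depending on whether position or velocity trajectories are passed in.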

Fig. 1 shows the frame, which includes the estimated and true data of two crossing targets in clutter. The frame shows the background clutter and the two synthetic target images at the 50th scan. The red line shows the true data and the blue line shows the trajectory estimated using the PDAF algorithm. Figs. 2a and 2b show the estimated positions and velocities compared with the true values. The state errors with their bounds, and the RSSPE and RSSVE values for the two targets, are shown in Figs. 3a and 3b. The percentage fit errors in x- and y-position and the root mean square position errors for the two targets are within the acceptable range, as shown in Table-1. State errors and root sum square errors in position and velocity from twenty-five Monte Carlo simulations are shown in Fig. 4. The state errors are within the theoretical bounds and the root sum square errors are a fraction of a pixel, which shows the robustness of the algorithm.

VI. Track-to-Track Fusion

Having validated the performance of the ICTA for multiple-target tracking, the algorithm is used to provide input to state vector fusion when position data from a ground based radar are available in the Cartesian coordinate frame. The flow diagram for fusion of data from the imaging sensor and the ground based radar is shown in Fig. 5. The two state vectors are fused using the following relations6.

The tracks, which are the state vector estimates from the imaging sensor (track i) and the ground based radar (track j), and their covariance matrices at scan k, are:

Track i: x̂_i(k), P_i(k)
Track j: x̂_j(k), P_j(k)   (21)

The fused state estimate is given by:

x̂_c(k) = x̂_i(k|k) + P_i(k|k) P_ij(k)⁻¹ [x̂_j(k|k) − x̂_i(k|k)]   (22)

The combined covariance matrix associated with the estimate of Eq. (22) is given by:

P_c(k) = P_i(k|k) − P_i(k|k) P_ij(k)⁻¹ P_i(k|k)   (23)

where P_ij is the cross covariance between x̂_i(k|k) and x̂_j(k|k), given by:

P_ij(k) = P_i(k|k) + P_j(k|k)   (24)

The true, estimated and fused trajectories are shown in Fig. 6, from which it is clear that the fused trajectory matches the true trajectory. A measurement loss in the imaging sensor is simulated from 15 s to 25 s, and in the ground based radar from 30 s to 45 s. Track extrapolation has been done during these periods; the resulting track deviation can be observed in Fig. 7. The percentage fit errors in the x and y directions and the root mean square errors in position and velocity before and after fusion are shown in Table-2. From Fig. 7 and Table-2, it is observed that fusion of the data gives better results when there is a measurement loss in either of the sensors, demonstrating the robustness and the better accuracy achieved because of fusion.
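Equations (22)-(24) amount to a covariance-weighted combination of the two track estimates; a direct transcription (the function name is hypothetical):

```python
import numpy as np

def fuse_tracks(x_i, P_i, x_j, P_j):
    """Track-to-track fusion per Eqs. (22)-(24): combine the imaging-sensor
    estimate (x_i, P_i) with the radar estimate (x_j, P_j) at one scan.
    P_ij = P_i + P_j as in Eq. (24)."""
    P_ij = P_i + P_j                   # Eq. (24)
    K = P_i @ np.linalg.inv(P_ij)      # fusion gain P_i * P_ij^-1
    x_c = x_i + K @ (x_j - x_i)        # Eq. (22)
    P_c = P_i - K @ P_i                # Eq. (23): P_i - P_i P_ij^-1 P_i
    return x_c, P_c
```

When the two tracks carry equal covariances, the fused state is the average of the two estimates and the fused covariance is halved, as expected from Eq. (23).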

VII. Concluding Remarks

An image centroid tracking algorithm (ICTA) has been implemented in PC MATLAB for accurate target tracking based on the data obtained from imaging sensors when the target is not fully visible during tracking. Using the segmentation technique, the gray level image has been converted into a binary image and reduced into clusters using the nearest neighbor (NN) criterion. Tracking filters (NNKF, PDAF) were employed for state estimation using the centroid measurements of the clusters. The state estimates obtained using the ICTA are fused with the state vector obtained by processing the data from a ground based radar that is also tracking the target.

References
1Bar-Shalom, Y. and Fortmann, T. E., Tracking and Data Association, Academic Press, New York, 1988.
2Girija, G. and Raol, J. R., "Comparison of Methods for Association and Fusion of Multi-Sensor Data for Tracking Applications," AIAA Guidance, Navigation, and Control Conference & Exhibit, AIAA-2001-4106, Montreal, Quebec, Canada, Aug. 2001.
3Oron, E., Kumar, A. and Bar-Shalom, Y., "Precision Tracking with Segmentation for Imaging Sensors," IEEE Transactions on Aerospace and Electronic Systems, Vol. 29, 1993, pp. 977-987.
4Chang, K. C., Saha, R. K. and Bar-Shalom, Y., "On Optimal Track-to-Track Fusion," IEEE Transactions on Aerospace and Electronic Systems, Vol. 33, 1997, pp. 1271-1276.
5Kumar, A., Bar-Shalom, Y. and Oron, E., "Precision Tracking Based on Segmentation with Optimal Layering for Imaging Sensor," IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 17, 1995, pp. 182-188.
6Naidu, V. P. S., Girija, G. and Raol, J. R., "Evaluation of Data Association and Fusion Algorithms for Tracking in the Presence of Measurement Loss," AIAA Guidance, Navigation, and Control Conference & Exhibit, Austin, TX, USA, Aug. 2003.


Table-1 Percentage fit error and root mean square errors

         PFEx   PFEy   RMSPE   RMSVE
Track1   1.34   1.41   0.52    0.25
Track2   2.19   3.39   0.92    0.20

Figure 1. Tracking of two targets in the presence of clutter (x-axis vs. y-axis; clutter, Track-1, Track-2, Target-1 and Target-2 shown).

Table-2 Percentage fit error and root mean square error with radar, ICTA and fusion (ND = 50 samples)

                   Duration of measurement loss   PFEx   PFEy   RMSPE   RMSVE
Radar              30 to 45 sec                   2.56   2.59   1.39    0.24
Imaging sensor     15 to 25 sec                   4.29   2.86   1.98    0.37
Combined (fused)                                  1.28   1.17   0.65    0.41


Figure 2a. True and estimated positions and velocities for track-1.
Figure 2b. True and estimated positions and velocities for track-2.

Figure 3a. State errors in position and velocity, RSSPE and RSSVE for Target-1.
Figure 3b. State errors in position and velocity, RSSPE and RSSVE for Target-2.


Figure 4. State errors and root sum square errors in positions and velocities using twenty-five Monte Carlo simulations

Figure 5. Flow diagram for fusion of data from imaging sensor and ground based radar:
FLIR branch: Image data → Segmentation & Clustering → Centroid detection → Measurement-to-track association → Tracking: NNKF/PDAF
Radar branch: Measurement (r, φ) → Polar to Cartesian conversion → Measurement-to-track association → Tracking: NNKF/PDAF
Both branches: → Track-to-track fusion → Estimated Track


Figure 6a. True and estimated positions and velocities from imaging sensor.
Figure 6b. True and estimated positions and velocities from ground based radar.

Figure 6c. Fused trajectory (X-axis vs. Y-axis; legend: fused, radar, ICTA, true).

Figure 7. Fused trajectory in the presence of measurement loss (X-axis vs. Y-axis; legend: fused, radar, ICTA, true; the radar data loss and imaging data loss intervals are indicated).
