Software Requirements Specification
for the
Java Quasi-Connected Components (JQCC) Software
William P. Champlin
University of Colorado at Colorado Springs (UCCS)
March 30, 2010
Version | Release Date | Responsible Party | Major Changes
0.1 | 03/30/10 | William P. Champlin | Initial Document Release
Table of Contents

1. Introduction
1.1 Purpose
1.2 Scope
1.3 Acronyms
1.4 References
2. The General Description
2.1 Product Perspective
2.2 Product Functions
2.3 User Characteristics
2.4 General Constraints
2.5 Assumptions and Dependencies
3. Specific Requirements
3.1 Functional Requirements
3.1.1 Camera Simulator Function (CAMSIM)
3.1.1.1 Purpose
3.1.1.2 Inputs
3.1.1.3 Operations
3.1.1.4 Outputs
3.1.2 Tracker Function
3.1.2.1 Purpose
3.1.2.2 Inputs
3.1.2.3 Operations
3.1.2.4 Outputs
3.1.3 Display Function
3.1.3.1 Purpose
3.1.3.2 Inputs
3.1.3.3 Operations
3.1.3.4 Outputs
3.1.4 Logging Function
3.1.4.1 Purpose
3.1.4.2 Inputs
3.1.4.3 Operations
3.1.4.4 Outputs
3.2 External Interface Requirements
3.3 Performance Requirements
3.4 Design Constraints
3.5 Quality Characteristics
3.6 Other Requirements
4. Supporting Information
1. Introduction
The following subsections provide an overview of this document.
1.1 Purpose
The purpose of this SRS is to define the requirements for the Java Quasi-Connected
Components (JQCC) program. The requirements defined here will assist maintainers in
further improvement and maintenance of the JQCC software, and users who need to
understand what JQCC does.
1.2 Scope
JQCC is based on the Lehigh Omni-Directional Tracking System (LOTS). LOTS was
developed under the lead of Dr. Terrance Boult, initially at the Lehigh University Vision and
Software Technology (VAST) lab and now at the UCCS VAST lab. It was primarily used
for tracking human motion, such as tracking snipers in 360° (omni-directional) images.
It has been and continues to be adapted to new domain areas, such as tracking of ships and
unmanned aerial vehicles (UAVs). The goal of LOTS was to detect objects that blend well
into a background with significant clutter, where targets may move only occasionally (i.e.,
movement is detectable only every nth frame).
LOTS employs identification of primary and secondary backgrounds, which allows
thresholding to be performed more accurately so that fewer background pixels are thresholded,
thus reducing target false alarms. It also provides a unique dual-thresholding technique instead
of a single threshold, which allows more pixels to be associated with targets, improving the
probability of detection and tracking while also reducing false alarms. The technique, called
Quasi-Connected Components (QCC), connects target pixels above a high threshold with those
that are below it, yet above the background, thus putting more pixels on target.
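As an illustration of the dual-thresholding idea only (not the LOTS implementation itself), a minimal Java sketch might label pixels as follows; the class and method names, the use of a single global low threshold, and the 4-neighbor growth rule are all simplifying assumptions:

```java
// Illustrative sketch of dual thresholding: pixels above the high threshold
// seed targets, then targets grow into neighbors above the low threshold.
public class DualThreshold {
    /** Returns a mask of target pixels for a background-subtracted frame. */
    public static boolean[][] label(int[][] diff, int high, int low) {
        int rows = diff.length, cols = diff[0].length;
        boolean[][] target = new boolean[rows][cols];
        // Pass 1: mark pixels above the high threshold.
        for (int r = 0; r < rows; r++)
            for (int c = 0; c < cols; c++)
                target[r][c] = diff[r][c] > high;
        // Pass 2: grow into 4-neighbors that are above the low threshold.
        boolean grew = true;
        while (grew) {
            grew = false;
            for (int r = 0; r < rows; r++)
                for (int c = 0; c < cols; c++) {
                    if (target[r][c] || diff[r][c] <= low) continue;
                    if ((r > 0 && target[r - 1][c]) || (r + 1 < rows && target[r + 1][c])
                            || (c > 0 && target[r][c - 1]) || (c + 1 < cols && target[r][c + 1])) {
                        target[r][c] = true;
                        grew = true;
                    }
                }
        }
        return target;
    }
}
```

A pixel between the two thresholds thus ends up on target only when it touches a high-threshold pixel, which is what "putting more pixels on target" refers to above.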
The objective of JQCC is to extract and re-host the core functionality of LOTS in Java,
which includes the following mechanisms employed by LOTS and defined in [1] and [2]:
Backgrounding – identifying one or more backgrounds for the purposes of subtracting
them to leave only candidate object intensity values
Thresholding – techniques for separating background objects from moving target objects
Pixel labeling – classifying an individual pixel in various ways, such as being above a
particular threshold, belonging to a background, or being a target object pixel
Target pixel grouping – clustering the pixels that belong to target objects
Target object centroiding – determining the center or location of a target object
Target association across frames – tracking an object as it moves across pixel space and
frame time
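Two of these primitives, backgrounding by frame differencing and target object centroiding, can be sketched in Java. The class and method names are hypothetical and the logic is deliberately simplified relative to LOTS:

```java
// Illustrative tracker primitives: background subtraction and centroiding.
public class TrackerPrimitives {
    /** Per-pixel absolute difference between an input frame and a background reference. */
    public static int[] subtract(int[] frame, int[] reference) {
        int[] diff = new int[frame.length];
        for (int i = 0; i < frame.length; i++)
            diff[i] = Math.abs(frame[i] - reference[i]);
        return diff;
    }

    /** Centroid {x, y} of the pixels flagged as one target (assumes at least one pixel set). */
    public static double[] centroid(boolean[] targetMask, int width) {
        double sumX = 0, sumY = 0;
        int count = 0;
        for (int i = 0; i < targetMask.length; i++) {
            if (targetMask[i]) {
                sumX += i % width;   // column index
                sumY += i / width;   // row index
                count++;
            }
        }
        return new double[]{sumX / count, sumY / count};
    }
}
```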
An additional emphasis of the JQCC project is to determine the applicability of the LOTS
algorithm techniques for tracking objects in a new domain area. The domain selected is that of
tracking astronomical objects such as satellites, debris or other celestial objects. This requires
adapting, modifying and tuning the LOTS algorithm parameters to support this domain area.
1.3 Acronyms
Acronym Description
CAMSIM Camera Simulator
JQCC Java QCC
LOTS Lehigh Omni-Directional Object Tracking System
MMX Multi-Media Extensions
QCC Quasi-Connected Components
QTRACK QCC LOTS C++ Tracking Software
SRS Software Requirements Specification
UAV Unmanned Aerial Vehicle
UCCS University of Colorado at Colorado Springs
VAST Vision and Software Technology
1.4 References
[1] T.E. Boult, R.J. Micheals, X. Gao, M. Eckmann, "Into the woods: visual
surveillance of non-cooperative and camouflaged targets in complex outdoor settings",
Proc. of the IEEE, Oct. 2001.
[2] T.E. Boult, T. Zhang, R.C. Johnson, "Two thresholds are better than one", Proc. of
the 2007 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), pp. 1-8, 2007.
2. The General Description
The following subparagraphs describe the 5 general factors affecting the product and its
requirements.
2.1 Product Perspective
The JQCC product is a maintenance re-host of the QCC portion of the original LOTS software
system (the QTRACK program), which has no supporting requirements, design or user's
documentation and, due to its numerous external library dependencies, requires extensive
effort to re-host to a new platform such as Red Hat Linux. This project therefore reverse
engineers the LOTS requirements and design, then reengineers the design and code in the
more portable Java programming language, which allows the software to execute on numerous
target platforms without recompiling.
2.2 Product Functions
The product functions are broken down into four use cases. A Simulate Camera use case is
provided to remotely read and send images to the Track Objects use case for processing. Use
of a "networked" simulator allows for easy replacement for another source of images, such as a
real camera system, without modifying the JQCC software. Next, the primary use case, Track
Objects, takes images and executes the QCC algorithm and provides the results to the Display
Images use case which provides visual confirmation of tracking to the user. The Display
Images use case provides the user with a view of the raw or intermediate algorithm processed
images in addition to region of interest (ROI) boxes drawn over target detections. The Track
Objects use case also provides both tracked and candidate target information to the Log Target
Data use case which records the information to a log file. The log file contains object
locations, tracked confidence, and pixel area for both candidate targets and targets the software
considers "real". The log file is available to be examined by the user to determine the path the
target moved across the field of view. The user may also want to examine the confidence and
target area to determine if algorithm tuning is required to improve tracking performance.
Figure 1, JQCC Use Case Diagram, depicts the interaction of these 4 capabilities with a user.
Figure 1 JQCC Use Case Diagram
In this SRS, the Simulate Camera use case is allocated to the Camera Simulator function, the
Track Objects use case to the Tracker function, the Display Images use case to the Display
function, and the Log Target Data use case to the Logging function.
2.3 User Characteristics
Because JQCC is implemented in Java, it can be installed and operated by any user with
limited system administration knowledge. A person wishing to perform maintenance on JQCC
should be familiar with the Java programming language, the C++ programming language (if
more legacy LOTS functionality is ported into JQCC), reverse engineering techniques, and
requirements analysis, and should be able to update the documentation.
2.4 General Constraints
Table 1 shows the subset of the existing LOTS capabilities required to be supported by
JQCC. An additional constraint is that JQCC must be developed using the Java
programming language, to determine whether it is up to the task of executing as a high
performance object tracking application and to discover some of its limitations.
Table 1 Ported LOTS Capabilities

Existing LOTS Topic | Description | JQCC Support | Comment / Constraints
Learning mode / learning mode countdown | Used to blend input image pixels quickly into the background | Partial | Learning mode is turned off since tracking does not occur with it enabled, although the code has been converted and images are blended when it is enabled
Blurring images for thresholding | Used to average region pixel values for thresholding. Applied to initial images when establishing initial reference images and then on a periodic basis | Yes |
Image/video stabilization | Image "pyramids" used to reduce pixel resolution to allow for jitter at the individual pixel level | No |
Kalman filter processing | Detections generated from an error model built up over time from actual track positions | No | Not implemented/working in QTRACK
Grayscale images | Image pixel values input in range 0-255 | Yes |
Color images (YUV, RGB, etc.) | Number and type of pixel bands is 3 (3 bytes per pixel) for two different color modes | Partial | The camera simulator will convert RGB/BGR 3 band images to grayscale before sending. Users can convert any image type, e.g. YUV to RGB or grayscale, using offline tools
Video lost processing | Special processing to keep checking, based on a countdown, whether input video fails | No | Appears to be a workaround for "bad" input images. The camera simulator would have to support this if implemented in the future. Some MPEG images appear to experience this and must be converted to JPEG frame sequences
Video tamper processing | Perform total re-initialization if too many pixels are changed for a "moving camera" | Partial | Converted but never debugged/tested
Image "chips" | Small JPEG images of alarm ROIs sent out across the network | No | Requires remote network access to a client
Alarms | Generated based on map location, shape, and other rules | No |
Event history | Defines a list of alarm events that have occurred | No |
Special rectangles | Only perform tracking inside of designated rectangles | Yes |
MMX | Hardware acceleration that uses single instruction, multiple data (SIMD) | No |
Image logging | Logs whole or sub-regions of an image. Logs raw, variance, all reference, parent, etc. | No |
Video 4 Linux | Video capture | No |
Display of different output frame types, e.g. ref, ref2, thresholded, etc. | Different frames put up for debugging | Yes | Not all frame types in QTRACK appear to generate a non-zero image, e.g. parent before or after QCC connect. JQCC will only match the existing capability
Support for movie files | AVI files | Partial | The current QTRACK uses the FFMPEG API to read AVI files. There is no good AVI reader for Java, so images must be converted to MPEG 1.0 or JPEG image sequences using the FFMPEG program or another offline utility
Use of cords | Specify the valid image region to track; used for omni images to discard everything outside of a given radius | No | Two mechanisms are used to define areas to track within: special rectangles and cords. Currently no omni camera images are supplied for testing this
Keep track points | Trace of tracking history over the image | No |
Calibration | Calibrates image locations based on lat/lon coordinates | No |
Pan-Tilt-Zoom (PTZ) overlay text | Annotation text overlaid on the image display for PTZ cameras | No |
Control of contrast, brightness and whiteness | Scaling into some optimal range | Partial | Disabled by default in QTRACK; a note says it confuses people. A gamma scale slider is to be used on the JQCC display
Remove lighting effects | Changes to the entire image or just sub-regions due to gradual or abrupt lighting changes | Yes |
Use of edge data | Makes tracking less sensitive to lighting changes | No |
Fake targets | Inject fake targets into tracked regions | Partial | Fake targets appear in "canned" images but are not dynamically written over input images
Set region priorities | Set priority based on confidence, which is then used to adjust the threshold lower around higher priority regions | Partial | Priorities are set, but "critical" priorities are not leveraged to reduce the threshold since they are tied to unsupported alarm processing
Dynamic update of variance frame | Causes variance to be updated: increased if a pixel is above threshold, decreased if below | Yes |
Image unwarping | Flatten out omni-directional images | No | No omni-directional camera images were tested for JQCC
Multiple display support | Displaying different tracking regions on up to 5 different displays | No |
Down sampling | Rows and columns down sampled to reduce resolution and increase tracking performance | No |
2.5 Assumptions and Dependencies
JQCC will track objects in canned images in a manner consistent with the original LOTS
(QTRACK) program. Therefore, the QTRACK program will provide the "gold standard" by
which JQCC is to be evaluated and any failure to meet requirements in the legacy QTRACK
system will be deemed acceptable in JQCC also.
3. Specific Requirements
3.1. Functional Requirements
3.1.1. Camera Simulator Function (CAMSIM)
3.1.1.1. Purpose
The purpose of this function is to simulate the function of providing images to the
Tracker function to allow for execution and verification of the JQCC program as a whole.
3.1.1.2. Inputs
This function allows the user to select a set of frames, in the form of a movie file or in an
image sequence and then "play" or transmit those images into the JQCC tracking function
at a user specifiable rate. This function also allows the user to suspend and resume image
frame transmission. Table 2 describes each input.
Table 2 Camera Simulator Function Inputs

Input | Source | Constraints
Image frames | User | MPEG 1.0 movies may have to be created from existing AVI or other files using a third party tool, e.g. FFMPEG, ImageJ, QuickTime Professional, etc. JPEG image sequences must have a sequentially incrementing integer identifier with a constant number of digits in the file names, e.g. image000.jpg, image001.jpg, etc. MPEG frames must be 3 band BGR, and JPEG sequences can be either 1 band grayscale or 3 band RGB
Transmit rate | User / Display Function | MPEG rates are 0-25 frames per second (fps) and JPEG rates are 0-50 fps
Transmit / Resume | User / Display Function | Transmit must not be enabled until all images are loaded
Pause | User / Display Function | Pause must not be enabled unless transmitting images is currently in progress
3.1.1.3. Operations
Operations performed by this function are defined as follows:
1. JQCC shall provide a GUI control which allows a user to select an input set of
images<3.1.1.3.1>.
a. The input set of images shall be in the form of an MPEG 1.0 movie file or a
sequence of JPEG images<3.1.1.3.1.a>.
i. MPEG 1.0 movie files shall contain 3 band BGR images<3.1.1.3.1.a.i>.
ii. JPEG image sequences shall contain either 1 band grayscale images or 3
band RGB images<3.1.1.3.1.a.ii>.
2. JQCC shall read the selected set of images into memory<3.1.1.3.2>.
3. JQCC shall provide a GUI control which allows a user to transmit the selected set of
input images<3.1.1.3.3>.
a. Transmission shall be the sequential writing of each image file across a TCP/IP
socket to the Tracker function<3.1.1.3.3.a>.
4. JQCC shall provide a GUI control which allows a user to set the rate of image
transmission<3.1.1.3.4>.
a. The rate shall vary from 0 fps to 25 fps for MPEG 1.0 images<3.1.1.3.4.a>.
b. The rate shall vary from 0 fps to 50 fps for JPEG images <3.1.1.3.4.b>.
5. JQCC shall provide a GUI control which allows a user to begin or resume image
transmission<3.1.1.3.5>.
a. Image transmission shall be enabled only after the movie file or images are
loaded or after a pause action has occurred<3.1.1.3.5.a>.
6. JQCC shall provide a GUI control which allows a user to pause (stop) image
transmission<3.1.1.3.6>.
a. The pause ability shall be enabled only during image transmission<3.1.1.3.6.a>.
7. The JQCC CAMSIM function shall provide a "server" TCP/IP socket connection that can
be connected to by a Tracker function client immediately after program
startup<3.1.1.3.7>.
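The "server" socket and sequential-transmission requirements above might be sketched as follows. This is an illustrative assumption rather than the JQCC design: the class name, the length-prefixed framing, and the fixed sleep used to approximate the frame rate are all inventions for the example:

```java
import java.io.DataOutputStream;
import java.io.IOException;
import java.net.ServerSocket;
import java.net.Socket;
import java.nio.file.Files;
import java.nio.file.Path;

// Hypothetical CAMSIM server sketch: accept one Tracker client, then write
// each image file in sequence, pacing output to the requested frame rate.
public class CamSimServer {
    public static void serve(int port, Path[] frames, int fps)
            throws IOException, InterruptedException {
        try (ServerSocket server = new ServerSocket(port);
             Socket client = server.accept();
             DataOutputStream out = new DataOutputStream(client.getOutputStream())) {
            long periodMs = fps > 0 ? 1000L / fps : 0;
            for (Path frame : frames) {
                byte[] bytes = Files.readAllBytes(frame);
                out.writeInt(bytes.length);   // simple length-prefixed framing (an assumption)
                out.write(bytes);
                out.flush();
                Thread.sleep(periodMs);       // crude rate control for the sketch
            }
        }
    }
}
```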
3.1.1.4. Outputs
CAMSIM outputs are described in Table 3.
Table 3 Camera Simulator Outputs

Output | Destination(s) | Constraints
Image frames | Tracker Function | Image size must be at least 40x40 pixels at a transmission rate between 0 and 50 fps. Frame rates beyond 50 fps are allowed but are limited by the Tracker Function's ability to process them
3.1.2. Tracker Function
3.1.2.1. Purpose
The purpose of this function is to track low contrast objects across image frames supplied
by CAMSIM and pass every nth frame to the Display Function with object ROIs drawn
in the image surrounding each tracked target. This function also sends tracking data to
the Logging Function for data recording. This function contains the QCC algorithm
functionality.
3.1.2.2. Inputs
This function inputs the information defined in Tables 4-5.
Table 4 Calibration File

Input | Source | Constraints
Comment | Calibration text file | "#"<text>, where <text> is a string of up to 132 printable ASCII characters. A comment can occur on any line in the file. Only entry on the line
Scenario | Calibration text file | "SCENARIO"<whitespace><scenario name>, where <scenario name> is a predefined name hardcoded in the application, e.g. "SCENARIO_WATERSIDE". Only entry on the line
Sensor Type | Calibration text file | "SENSOR_TYPE"<whitespace><sensor type #>, where <sensor type #> is a predefined type hardcoded in the application. Only entry on the line
Image Width | Calibration text file | "IMAGE_WIDTH"<whitespace><input image width>, where <input image width> has range 40-1024. Only entry on the line
Image Height | Calibration text file | "IMAGE_HEIGHT"<whitespace><input image height>, where <input image height> has range 40-1024. Only entry on the line
Update Rate | Calibration text file | "UPDATE_RATE"<whitespace><rate>, where <rate> is an integer 1-10000. Only entry on the line
Tracking Separation | Calibration text file | "TRACKING_SEPARATION"<whitespace><separation time in ms>, where <separation time in ms> has range 0-100000. Only entry on the line
Threshold | Calibration text file | "THRESHOLD"<whitespace><threshold>, where <threshold> has range 1-99. Only entry on the line
High Delta | Calibration text file | "HIGH_DELTA"<whitespace><high delta value>, where <high delta value> has range 1-100. Only entry on the line
Min Size | Calibration text file | "MIN_SIZE"<whitespace><min target pixels>, where <min target pixels> has range 1 to 1024^2. Only entry on the line
Max Size | Calibration text file | "MAX_SIZE"<whitespace><max target pixels>, where <max target pixels> has range 1 to 1024^2. Only entry on the line
Quite Level | Calibration text file | "QUITE_LEVEL"<whitespace><quite level>, where <quite level> has range -99 to 999. Only entry on the line
Number of Initial Reference Frames | Calibration text file | "INITIAL_FRAMES"<whitespace><number>, where <number> is greater than or equal to 1
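For illustration, a calibration file conforming to Table 4 might look like the following; every value here is a made-up example for the sketch, not a recommended tuning:

```
# Example calibration file (values are illustrative only)
SCENARIO SCENARIO_WATERSIDE
SENSOR_TYPE 1
IMAGE_WIDTH 640
IMAGE_HEIGHT 480
UPDATE_RATE 100
TRACKING_SEPARATION 1000
THRESHOLD 20
HIGH_DELTA 40
MIN_SIZE 10
MAX_SIZE 5000
QUITE_LEVEL 0
INITIAL_FRAMES 5
```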
Table 5 Rectangle File

Input | Source | Constraints
Comment | Rectangle file | "#"<text>, where <text> is a string of up to 132 printable ASCII characters. A comment can occur on any line in the file
Number of Rectangles | Rectangle file | Positive integer. Only entry on the line
Rectangle Definition | Rectangle file | <X coord><whitespace><Y coord><whitespace><Width><whitespace><Height>. All 4 entries must appear on one line. This entry is repeated "Number of Rectangles" times
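A rectangle file conforming to Table 5 might look like this; the coordinates are arbitrary examples:

```
# Track only inside these two regions (X Y Width Height)
2
0 0 320 240
320 240 100 100
```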
3.1.2.3. Operations
Operations performed by this function are defined as follows:
1. JQCC shall track targets as they move within a sequence of image frames over time
using the QCC technique<3.1.2.3.1>.
a. Tracking shall be the identification, display and recording of the target’s position
as it moves across a sequence of image frames<3.1.2.3.1.a>.
i. Identification shall be the association of target labels over time<3.1.2.3.1.a.i>.
1. A label shall be a unique integer identifier assigned to all pixels of the
same target<3.1.2.3.1.a.i.1>.
ii. Association shall be the correlation of targets over time<3.1.2.3.1.a.ii>.
iii. Association shall be done for both overlapping and non-overlapping
targets<3.1.2.3.1.a.iii>.
iv. Display of a target’s position shall consist of drawing a region of interest (ROI) box
around the candidate target’s pixels in an output image frame while erasing the ROI
box drawn at the target’s previous position<3.1.2.3.1.a.iv>.
v. Recording of a target’s position shall consist of sending, for each candidate target:
ID (label), confidence, X/Y position of the target centroid, starting row, ending row,
starting column, ending column and pixel area to the Logging
Function<3.1.2.3.1.a.v>.
vi. A sequence of image frames shall be a movie in MPEG 1.0 format input from a file
or a series of JPEG image files that have sequential numbers in their
filenames<3.1.2.3.1.a.vi>.
vii. A target shall be a group of pixels (a.k.a. a blob) which move together across a
series of image frames relative to a stationary background<3.1.2.3.1.a.vii>.
2. JQCC shall track targets by:
a. Computing an initial reference frame as the median of the first n input frames,
where n is a user settable value<3.1.2.3.2.a>.
b. Computing a high per pixel threshold that is a user configurable amount above
the primary background<3.1.2.3.2.b>.
c. Identifying candidate target pixels as those exceeding the high
threshold<3.1.2.3.2.c>.
d. Computing, for each pixel in the surrounding candidate target pixels region, a
lower per pixel threshold that is below the high threshold and at or above the
primary background<3.1.2.3.2.d>.
e. Grouping candidate high threshold pixels with those neighboring pixels that are
below the high threshold and exceed the low threshold<3.1.2.3.2.e>.
f. Dynamically adjusting the high per pixel threshold when a configurable
maximum number of pixels exceeds the current threshold and repeating steps 2c
through 2e<3.1.2.3.2.f>.
g. Centroiding each resulting connected group of pixels (a target) to determine
target X,Y positions<3.1.2.3.2.g>.
h. Classifying a target as a track if it has a pixel count within a user configurable
min to max number of pixels, inclusive<3.1.2.3.2.h>.
i. Assigning labels to target pixels<3.1.2.3.2.i>.
j. Blending in input images into primary and secondary background reference
frames by computing the difference between the input frame and each
background frame<3.1.2.3.2.j>.
3. Image pixels shall be 8 bits representing grayscale intensity values<3.1.2.3.3>.
4. One output frame shall be sent to the Display function for each nth input image
where n is a user settable “skip frames” parameter<3.1.2.3.4>.
5. Output frames shall consist of those output images defined in Table 6<3.1.2.3.5>.
6. JQCC shall perform tracking with rectangular regions defined in a rectangle
file<3.1.2.3.6>.
7. JQCC shall initialize tracking tuning parameters from the calibration text file
<3.1.2.3.7> to:
a. Increase/decrease high threshold (Th)<3.1.2.3.7.a>.
b. Increase/decrease blending of new frames into reference frames by:
i. Setting the alpha blending threshold for using fast blending
technique<3.1.2.3.7.b.i>.
ii. Set the eta blending value for using slow blending technique<3.1.2.3.7.b.ii>.
c. Set the dynamic threshold constant (cu)<3.1.2.3.7.c>.
d. Increase/decrease the minimum target area<3.1.2.3.7.d>.
e. Increase/decrease target confidence before considering a target a
track<3.1.2.3.7.e>.
f. Set the output “skip frames” parameter<3.1.2.3.7.f>.
g. Disable or enable tracking<3.1.2.3.7.g>.
8. The difference frame shall be a frame where each pixel is the difference between the
reference image and the current input image<3.1.2.3.8>.
9. The label image frame shall be a frame where all pixels belonging to the same target
have the same values<3.1.2.3.9>.
10. The reference image frame shall be the primary background frame<3.1.2.3.10>.
11. The variance image frame shall be a frame containing all pixels exceeding the pixel
high threshold value<3.1.2.3.11>.
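Two of the steps above, the median initial reference frame (2a) and the high per-pixel threshold (2b), can be sketched in Java. This is a simplified illustration under assumed names (`medianReference`, `highThreshold`), not the JQCC implementation:

```java
import java.util.Arrays;

// Sketch of initial reference frame computation and the high per-pixel threshold.
public class ReferenceFrame {
    /** Initial reference: per-pixel median of the first n input frames (step 2a). */
    public static int[] medianReference(int[][] frames) {
        int pixels = frames[0].length;
        int[] reference = new int[pixels];
        int[] samples = new int[frames.length];
        for (int p = 0; p < pixels; p++) {
            for (int f = 0; f < frames.length; f++)
                samples[f] = frames[f][p];
            Arrays.sort(samples);
            reference[p] = samples[samples.length / 2];   // median of this pixel over time
        }
        return reference;
    }

    /** High per-pixel threshold: a configurable delta above the background (step 2b). */
    public static int[] highThreshold(int[] reference, int highDelta) {
        int[] th = new int[reference.length];
        for (int p = 0; p < th.length; p++)
            th[p] = Math.min(255, reference[p] + highDelta);   // clamp to 8-bit range
        return th;
    }
}
```

The per-pixel median makes the initial reference robust to targets that happen to be present in a few of the first n frames, which is why a median rather than a mean is used in step 2a.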
3.1.2.4. Outputs
This function outputs the information defined in Table 6.
Table 6 Tracker Function Outputs

Output | Destination | Constraints
Output images | Display Function | Supported output image types must be 8 bit grayscale and include the raw input frame, variance frame, difference frame, primary reference frame, secondary reference frame and the label frame
Identified target ROIs | Display Function | ROI drawn surrounding all identified target pixels by setting pixels in the raw or variance output frame
Candidate target regions | Logging Function | Logged data includes target confidence, the X,Y pixel centroid location, the row and column bounds and the target's pixel area
3.1.3. Display Function
3.1.3.1. Purpose
The purpose of this function is to accept image frames from the Tracker Function and
output them in a resizable graphical window to the user. This function also provides GUI
controls that allow the user to vary a subset of the tuning parameters described in section
3.1.2 that are used to adjust tracking performance. It also provides a control to adjust
image contrast.
3.1.3.2. Inputs
Inputs to this function are defined in Table 7.
Table 7 Display Function Inputs

Input | Source | Constraints
Output images | Tracker Function | Supported output image types must be 8 bit grayscale and include the raw input frame, variance frame, difference frame, primary reference frame, secondary reference frame and the label frame
3.1.3.3. Operations
Operations performed by this function are defined as follows:
1. JQCC shall display 8 bit grayscale images in a graphical window<3.1.3.3.1>.
a. The graphical window shall be resizable<3.1.3.3.1.a>.
2. JQCC shall provide GUI controls that allow the user to adjust image
contrast<3.1.3.3.2>.
3. JQCC shall provide GUI controls that allow the user to adjust display and tracking
control parameters<3.1.3.3.3> to:
a. Increase/decrease contrast<3.1.3.3.3.a>.
b. Increase/decrease high threshold (Th)<3.1.3.3.3.b>.
c. Increase/decrease target confidence before considering a target a
track<3.1.3.3.3.c>.
d. Set the output “skip frames” parameter<3.1.3.3.3.d>.
e. Disable or enable tracking<3.1.3.3.3.e>.
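The contrast control above presumably rescales 8 bit pixel intensities before display; one plausible sketch, assuming a simple linear scaling about mid-gray (the actual JQCC gamma-slider behavior may differ):

```java
// Illustrative contrast adjustment for 8-bit grayscale display images.
public class Contrast {
    /** Scale pixel values about mid-gray by a contrast factor, clamped to 0-255. */
    public static int[] adjust(int[] pixels, double factor) {
        int[] out = new int[pixels.length];
        for (int i = 0; i < pixels.length; i++) {
            int v = (int) Math.round(128 + (pixels[i] - 128) * factor);
            out[i] = Math.max(0, Math.min(255, v));   // clamp to the 8-bit range
        }
        return out;
    }
}
```

A factor above 1 stretches intensities away from mid-gray (more contrast); a factor below 1 compresses them toward it.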
3.1.3.4. Outputs
This function outputs the information defined in Table 8, one output entry per line.
Table 8 Display Function Outputs

Output | Destination | Constraints
Tracking images | User | Supported output image types are displayed in grayscale and include the raw input frame, variance frame, difference frame, primary reference frame, secondary reference frame and the label frame. Images are resized to fit the current GUI frame size
Current user settable values | User | Values include those user settable values defined in the Operations section
3.1.4. Logging Function
3.1.4.1. Purpose
The purpose of this function is to record data associated with each candidate track to a
log file. A user can use the recorded results to determine if tracking parameters need
further tuning. The logged data also provides a history of the movement of tracked
objects over the course of execution.
3.1.4.2. Inputs
Inputs to this function are defined in Table 9.
Table 9 Logging Function Inputs

Input | Source | Constraints
Candidate target regions | Tracker Function | Logged data includes target confidence, the X,Y pixel centroid location, the row and column bounds and the target's pixel area
3.1.4.3. Operations
Operations performed by this function are defined as follows:
1. JQCC shall create a new track log file for each execution of the
program<3.1.4.3.1>.
a. The track log file shall be an ASCII text file<3.1.4.3.1.a>.
2. The track log file name shall contain the date and time of initial program
execution<3.1.4.3.2>.
3. JQCC shall record candidate target regions to the track log file when received
from the Tracker Function<3.1.4.3.3>.
4. Candidate target regions shall consist of target confidence, the object centroid X,Y
location, the starting and ending row of the object, the starting and ending column
of the object, and the pixel area covered by the object<3.1.4.3.4>.
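The logging operations above can be sketched in Java; the file-name pattern, field order and whitespace-delimited record format are assumptions made for illustration only:

```java
import java.io.IOException;
import java.io.PrintWriter;
import java.nio.file.Files;
import java.nio.file.Path;
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;
import java.util.Locale;

// Hypothetical track-log writer: a fresh ASCII file per run, date/time in
// the file name, one candidate target region per line.
public class TrackLogger {
    private final PrintWriter out;
    public final Path logFile;

    public TrackLogger(Path dir) throws IOException {
        String stamp = LocalDateTime.now()
                .format(DateTimeFormatter.ofPattern("yyyyMMdd_HHmmss"));
        logFile = dir.resolve("track_" + stamp + ".log");
        out = new PrintWriter(Files.newBufferedWriter(logFile));
    }

    /** Writes one candidate target region as a single whitespace-delimited record. */
    public void record(int id, double confidence, double cx, double cy,
                       int startRow, int endRow, int startCol, int endCol, int area) {
        out.printf(Locale.ROOT, "%d %.2f %.1f %.1f %d %d %d %d %d%n",
                id, confidence, cx, cy, startRow, endRow, startCol, endCol, area);
        out.flush();
    }

    public void close() { out.close(); }
}
```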
3.1.4.4. Outputs
This function outputs the information defined in Table 10, Track Log File. Note that all
output entries exist on the same line as a single record and each record can be repeated
any number of times.
Table 10 Track Log File

Output | Destination | Constraints
Candidate target regions | User / Track Log File | Logged data includes target confidence, the X,Y pixel centroid location, the row and column bounds and the target's pixel area
3.2. External Interface Requirements
User interface requirements are defined within each function in section 3.1, as applicable.
3.3. Performance Requirements
Performance requirements are as follows:
1. JQCC shall run on one desktop or laptop running the operating system specified in
section 3.6<3.3.1>.
2. JQCC shall support one user at time<3.3.2>.
3. JQCC shall process 640x480 frames input at a rate of up to 17 Hz with a goal of 34
Hz<3.3.3>.
4. JQCC shall process 720x320 frames input at a rate of 25 Hz with a goal of 50
Hz<3.3.4>.
5. JQCC shall support input images with resolutions up to and including 1024x1024
pixels<3.3.5>.
3.4. Design Constraints
JQCC shall be developed using the Java programming language exclusively<3.4>.
3.5. Quality Characteristics
This paragraph is not applicable to this project.
3.6. Other Requirements
1. JQCC shall execute under both MS Windows and Linux operating systems<3.6.1>.
2. JQCC shall track astronomical objects<3.6.2>.
4. Supporting Information
This paragraph is not applicable to this project.