A Flexible Evaluation Framework for Collaborative Layered Sensing Systems

Adam Langdon, Dr. Praveen Chawla
EDAptive Computing Inc.
Dayton, Ohio

Abstract—In this paper, we describe a robust framework for developing and evaluating layered sensing systems, and specifically the fusion algorithms used for collaborative sensing. We discuss how our rapidly customizable analysis framework for systems-of-systems, EDAptive® Syscape™, provides a foundation for modeling collaborative sensor configurations and analyzing their performance in terms of information fusion and cognitive processing.

I. INTRODUCTION

The task of identifying and tracking threats and producing timely situational awareness requires a vast network of sensors, software, and other resources working together flawlessly [1]. The ability of these components to intelligently collaborate greatly increases the opportunity for success. One valuable form of collaboration is sensor fusion, in which sensors must exchange and merge data to form a more complete picture of the situation.

With up-to-date, fused information on threats and targets, operators can make quick, informed decisions. Creating this accurate picture requires many different types of sensors and related systems dispersed across the battlespace. However, translating all of this data into concise, useful, and timely information remains extremely difficult. Sensor fusion provides the means to take disparate data and form a more complete set of information with a higher degree of confidence. Through data fusion, sensors can collaborate, share information, and exploit each other's strengths to produce improved situational awareness.
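To make this intuition concrete, the toy calculation below (our own illustration, not taken from the paper; the sensor roles and confidence values are assumed) shows how combining two independent detection reports yields a higher fused confidence than either sensor alone:

```java
// Illustrative only: a naive fusion of two independent detection
// confidences. Real fusion algorithms (tracking filters, Bayesian or
// Dempster-Shafer combination) are far richer; this simply shows why
// combining independent sensor evidence raises overall confidence.
public class FusionToyExample {
    public static void main(String[] args) {
        double pRadar = 0.70;   // assumed confidence from sensor A
        double pEoIr  = 0.60;   // assumed confidence from sensor B

        // Probability that at least one sensor detects the target,
        // assuming the two observations are independent.
        double fused = 1.0 - (1.0 - pRadar) * (1.0 - pEoIr);

        System.out.printf("Sensor A alone: %.2f%n", pRadar);
        System.out.printf("Sensor B alone: %.2f%n", pEoIr);
        System.out.printf("Fused (independent): %.2f%n", fused); // 0.88
    }
}
```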

New types of sensor fusion are required as threats evolve and new sensor capabilities become available. Current fusion methods typically target specific sensors and are difficult to integrate. Furthermore, as new algorithms are developed, custom methods are needed to evaluate them. As a result, the impact of sensor fusion on a layered sensing system as a whole is difficult to assess. A common framework is needed to represent and evaluate fusion methods within the context of a complex system-of-systems.

Sensor fusion continues to be a highly researched area, and many different approaches have emerged to fit different scenarios. These solutions range from low-level algorithms to high-level visualization methodologies that aid a warfighter in decision-making. One such approach focuses on visualization cues to provide feedback to the fusion process [2]. In this way, users can guide the fusion process by assigning weights to particular features in a visual manner. Other approaches also focus on visualization as a way to make sensor fusion and decision making more intuitive. One particular effort attempts to reproduce the human ability to produce three-dimensional images from stereo views [3]. In this way, it is thought that users can better detect targets if they can utilize texture and depth.

In addition to a focus on visualization, sensor fusion solutions are being developed to deal with the increasing size and complexity of sensor networks. These solutions attempt to account for the additional constraints that such networks impose, including latency and resource availability. One such effort embraces a web-centric approach, combining local sensor data with other types of information available remotely to create an enhanced battlespace picture [4]. In general, these types of solutions demonstrate the importance of both visualization and a system-of-systems approach to sensor fusion. We will show how our solution incorporates both of these aspects to provide a framework for dealing with the increasing complexity of sensor networks. We first describe the analysis framework and the specific features that enable the modeling of layered sensing systems. We then discuss the benefits of such an approach and areas for further research.

II. SENSOR SYSTEM AND FUSION MODELING

A. Analysis Framework

EDAptive Computing Inc. (EDAptive) has leveraged its mature system-of-systems analysis framework, EDAptive® Syscape™, to develop an intuitive modeling environment for creating and flexibly executing models. Syscape, in conjunction with reusable libraries of parameterizable, executable models and custom plug-in modules for analysis, provides a strong starting point that can be rapidly customized for varying analysis needs.

Syscape is a highly flexible and customizable framework technology. Syscape provides a system designer the ability to capture the structure and behavior of a system-of-systems and then perform analysis on that structure and behavior to aid decision making. Hierarchical system structure is captured using an intuitive graphical user interface with drag-and-drop capabilities. The components of a system may be customized in appearance and graphically interconnected to capture relationships and dependencies. Syscape is domain independent and enables users to specify component or system behavior in any format that may be represented as a computer language or data file. This behavior is associated with a system or component as attachments. The attachments in Syscape may be grouped according to user-defined views; each view represents a different aspect of the system and provides the user with a multi-domain view of the system. Functional aspects such as behavior may be easily combined with cost, schedule, and risk when performing system analysis. The analysis capabilities are enabled through a well-defined programming interface through which users can write custom Java plug-ins. Plug-ins may be developed to perform specific analysis or execution as needed by the user for a particular problem. This flexibility gives users in different disciplines the freedom and power to create analyses and executions that inform their decision-making process.
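The paper does not publish the plug-in API itself, so the following Java sketch is purely hypothetical: the Component, SystemModel, and AnalysisPlugin types are invented to illustrate what such a plug-in contract could look like; they are not Syscape's actual interfaces.

```java
// Hypothetical sketch only: invented types illustrating a Java analysis
// plug-in contract for a system-of-systems tool. Not Syscape's real API.
import java.util.List;
import java.util.Map;

interface Component {
    String name();
    Map<String, Object> properties();   // user-defined parameters
    List<Component> children();         // hierarchical structure
}

interface SystemModel {
    Component root();
}

/** Contract that a custom analysis plug-in might implement. */
interface AnalysisPlugin {
    String name();
    /** Walk the model, run the analysis, and return named results. */
    Map<String, Double> analyze(SystemModel model);
}

/** Example plug-in: count components and sum a "cost" property where present. */
class CostRollupPlugin implements AnalysisPlugin {
    @Override public String name() { return "Cost roll-up"; }

    @Override public Map<String, Double> analyze(SystemModel model) {
        double[] totals = new double[2];   // [0] component count, [1] total cost
        visit(model.root(), totals);
        return Map.of("componentCount", totals[0], "totalCost", totals[1]);
    }

    private void visit(Component c, double[] totals) {
        totals[0] += 1;
        Object cost = c.properties().get("cost");
        if (cost instanceof Number n) {
            totals[1] += n.doubleValue();
        }
        for (Component child : c.children()) {
            visit(child, totals);
        }
    }
}
```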

The five major parts of the Syscape framework technology, as shown in Fig. 1, are:

1) Library Browser – An Explorer-like management system permits users to create and organize reusable libraries of component models. Datastores enable the categorization of libraries and models into relevant domains, while the drag-and-drop interface enhances familiarity of use. A Design Browser view shows the system-of-systems as a hierarchical tree structure which may be used for navigation.

2) Menus and Plug-Ins – A well-defined API provides users the ability to create plug-ins that simulate a system representation, perform trade-off analysis, and exchange information with third-party tools. Together with a set of models, custom plug-ins and menus may be packaged for release as Syscape Modules (for a specific technology area), similar in scope to Simulink™ toolboxes.

3) Hierarchical System Design Editor – The core component of Syscape provides a graphical view of the system and permits the user to explore the system hierarchically. Systems may be built from models in the reusable libraries and from connections that form the relationships between subsystems. Syscape supports the creation of customized blocks, complete with unique shapes, colors, and icons, giving flexibility of design similar to tools such as Microsoft Visio or Mathworks Simulink. However, unlike Visio, Syscape provides a means to execute and analyze the system, and it is not limited to behavior expressed in proprietary formats like Simulink.

4) Views and Attachments – Attachments are the mechanism by which information and data are associated with the various models within a system. These attachments provide the behavior for Syscape models and give Syscape its domain independence, which is one of its greatest strengths as a system capture and analysis tool. Attachments may be viewed, edited, or executed within Syscape or with the attachment's native application. Views allow the user to categorize the attachments on a model such that there is a separation of information when presented to the user or operated upon by a plug-in.

5) Properties and Graphics – Embedded properties provide the user with a means to modify a design from its default specification and values. This gives Syscape a powerful parametric analysis capability, enabling users to perform large, complex analyses with but a few settings. Properties allow the user to specify technology alternatives, value ranges, or equation relationships for each model within the system. A full range of standard graphical properties is also available to facilitate the customization of designs according to user needs. A simplified sketch of how such attachments, views, and properties might be organized appears after this list.
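As a concrete reading of items 4 and 5, the hypothetical Java data model below (all type names are invented; this is not Syscape's internal representation) shows one way attachments could be grouped into views and properties could carry ranges and alternatives:

```java
// Hypothetical data model, for illustration only: it mirrors the ideas of
// attachments grouped into views and of properties with alternatives or
// value ranges, but does not reflect Syscape's actual internal classes.
import java.nio.file.Path;
import java.util.List;
import java.util.Map;

/** A file associated with a model (behavior, cost sheet, test data, ...). */
record Attachment(String name, Path file, String format) {}

/** A named grouping of attachments, e.g. "Behavior", "Cost", "Risk". */
record View(String name, List<Attachment> attachments) {}

/** A user-defined property: a default value plus a range and alternatives. */
record Property(String name, double defaultValue,
                double min, double max, List<Double> alternatives) {}

/** A model element carrying views, properties, and child elements. */
record ModelElement(String name,
                    Map<String, View> views,
                    Map<String, Property> properties,
                    List<ModelElement> children) {}

class ViewsAndPropertiesDemo {
    public static void main(String[] args) {
        Attachment behavior = new Attachment("radar-behavior", Path.of("radar.xml"), "XML");
        Attachment costSheet = new Attachment("radar-cost", Path.of("radar-cost.csv"), "CSV");

        ModelElement radar = new ModelElement(
            "SearchRadar",
            Map.of("Behavior", new View("Behavior", List.of(behavior)),
                   "Cost",     new View("Cost", List.of(costSheet))),
            Map.of("rangeKm", new Property("rangeKm", 80.0, 40.0, 120.0,
                                           List.of(60.0, 80.0, 100.0))),
            List.of());

        // A plug-in operating on the "Cost" view sees only cost attachments.
        System.out.println("Cost view files: " + radar.views().get("Cost").attachments());
        System.out.println("rangeKm alternatives: " + radar.properties().get("rangeKm").alternatives());
    }
}
```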

Figure 1: Syscape Analysis Framework

Using these features, the overall concept for simulation and analysis has three primary steps at its core:

1) Constructing a Structural Representation of the System – This step is performed at the start of an analysis cycle. Components from the model library are used to create a hierarchical model of the system. These components are instantiated with key performance parameter values.

2) Analyzing the System – This step is a composition of what could be many interrelated analyses. Analyses of time and cost savings, resource allocation, key performance parameters, process configuration, and underlying infrastructure are a few examples of the types of analysis that may be performed. Unique analysis needs may require customization of model behavior and development of new plug-in modules. Optimization capabilities permit automatic rank ordering of various choices, given a goodness criterion expressed mathematically (a simplified sketch of this rank-ordering idea follows this list).

3) Producing Reports – Once analysis is complete, the user is able to create standardized reports in pre-determined formats, such as charts, plots, and other custom visualizations. This capability can be used to create real-time dashboards if live data feeds are used to drive model execution.
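The rank ordering in step 2 can be illustrated with the following minimal Java sketch; the Candidate record and the example goodness expression are assumptions chosen for readability, not the framework's actual optimizer.

```java
// Minimal sketch of rank-ordering candidate configurations by a goodness
// criterion. The Candidate record and the example criterion are invented
// for illustration; a real tool would evaluate a user-supplied expression
// against simulation results.
import java.util.Comparator;
import java.util.List;
import java.util.function.ToDoubleFunction;

record Candidate(String name, double coveragePct, double latencySec, double cost) {}

class RankOrderingDemo {
    public static void main(String[] args) {
        List<Candidate> candidates = List.of(
            new Candidate("2 UAVs + ground radar",  82.0, 4.5, 3.2),
            new Candidate("3 UAVs",                 90.0, 6.0, 4.1),
            new Candidate("1 UAV + 2 ground radars", 76.0, 3.0, 2.5));

        // Example goodness criterion: reward coverage, penalize latency and cost.
        ToDoubleFunction<Candidate> goodness =
            c -> c.coveragePct() - 5.0 * c.latencySec() - 4.0 * c.cost();

        candidates.stream()
            .sorted(Comparator.comparingDouble(goodness).reversed())
            .forEach(c -> System.out.printf("%-25s goodness = %.1f%n",
                                            c.name(), goodness.applyAsDouble(c)));
    }
}
```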

[Figure 1 panels: (1) Library Browser; (2) Menus and Plug-Ins; (3) Hierarchical Design Editor; (4) Views and Attachments; (5) Properties and Graphics; (6) Visualization and Optimization (animate simulation results; optimize properties based on specific constraints).]


B. Layered Sensing Simulation

The framework we have described can enable the evaluation of collaborative sensor systems and provide intuitive mechanisms for visualizing the battlespace. The Syscape framework can also be used for rapidly evaluating sensor fusion methods. The solution employs system-of-systems modeling to allow for the analysis of sensor fusion effectiveness and its impact on system performance. Reusable sensor models can be instantiated and configured to construct a complex, hierarchical sensor system, as shown in Fig. 2.
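The modeling pattern just described, reusable sensor components assembled into a hierarchy with fusion captured on the connections between them, might be sketched as follows; all types here are invented for illustration and are not Syscape model code.

```java
// Illustrative sketch only: invented types showing reusable sensor
// components assembled into a hierarchy, with a fusion algorithm attached
// to the connection between two sensor nodes.
import java.util.ArrayList;
import java.util.List;

interface FusionAlgorithm {
    /** Combine confidence reports from two nodes into a fused confidence. */
    double fuse(double confidenceA, double confidenceB);
}

class SensorNode {
    final String name;
    final double rangeKm;          // key performance parameter
    final List<SensorNode> children = new ArrayList<>();

    SensorNode(String name, double rangeKm) { this.name = name; this.rangeKm = rangeKm; }
    void add(SensorNode child) { children.add(child); }
}

record FusionLink(SensorNode a, SensorNode b, FusionAlgorithm algorithm) {}

class LayeredSensingModelDemo {
    public static void main(String[] args) {
        SensorNode layer = new SensorNode("Airborne layer", 0);
        SensorNode radar = new SensorNode("GMTI radar", 120);
        SensorNode eoIr  = new SensorNode("EO/IR pod", 30);
        layer.add(radar);
        layer.add(eoIr);
        System.out.println(layer.name + " has " + layer.children.size() + " child sensors");

        // Fusion captured on the connection: here a simple independent-
        // evidence combination stands in for a real fusion algorithm.
        FusionLink link = new FusionLink(radar, eoIr,
            (ca, cb) -> 1.0 - (1.0 - ca) * (1.0 - cb));

        double fused = link.algorithm().fuse(0.7, 0.6);
        System.out.printf("Fused confidence on %s-%s link: %.2f%n",
                          link.a().name, link.b().name, fused);
    }
}
```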

Figure 2. Layered Sensing Simulation in Syscape (callouts: reusable, customizable sensor components; fusion algorithms captured as connections between sensor nodes; dynamic execution logic)

One of the most beneficial aspects of a model-based framework is its flexibility. Our approach allows developers to model sensor systems at various levels of abstraction and to capture different views of the system. Developers can attach data in various formats to the models to represent all the knowledge needed for proper fusion and, ultimately, effective decision making. These models can then be reused to quickly capture new sensor configurations. Developers can modify model parameters such as range or tracking requirements to evaluate different configurations. They can also modify characteristics of the environment, such as the detection background. Based on the goals of the evaluation, developers can combine both low-fidelity statistical sensor models and high-fidelity physics-based models with fusion algorithms that target features across several levels of detail. In the same way, new types of sensors can easily be captured and inserted into the system model. New and evolving sensor fusion algorithms can also be captured and inserted into the system model; these algorithms can be represented as part of the connections between the various sensors within the network. In addition to knowledge representation, this visual framework facilitates the integration of dynamic execution logic, allowing for further customization.
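As one example of the kind of low-fidelity statistical sensor model referred to above (a simplified sketch of our own; the logistic form and its constants are assumptions, not a validated sensor model), detection probability can be approximated as a smooth roll-off with target range:

```java
// Simplified, illustrative low-fidelity sensor model: probability of
// detection falls off smoothly with range. The logistic form and its
// constants are assumptions chosen for readability.
class StatisticalSensorModel {
    private final double nominalRangeKm; // range at which Pd is about 0.5
    private final double steepness;      // how quickly Pd rolls off

    StatisticalSensorModel(double nominalRangeKm, double steepness) {
        this.nominalRangeKm = nominalRangeKm;
        this.steepness = steepness;
    }

    /** Probability of detecting a target at the given range. */
    double probabilityOfDetection(double targetRangeKm) {
        double x = (nominalRangeKm - targetRangeKm) / steepness;
        return 1.0 / (1.0 + Math.exp(-x));   // logistic roll-off
    }

    public static void main(String[] args) {
        StatisticalSensorModel radar = new StatisticalSensorModel(100.0, 15.0);
        for (double rangeKm = 40; rangeKm <= 160; rangeKm += 40) {
            System.out.printf("Pd at %3.0f km: %.2f%n",
                              rangeKm, radar.probabilityOfDetection(rangeKm));
        }
    }
}
```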

Another key benefit of such a simulation framework is the cost and time savings of testing new layered sensing architectures. Even low-fidelity simulations, which can be constructed relatively quickly with such a framework, can help rule out architectures that will not meet basic mission requirements. This activity narrows the space of potential solutions and reduces the cost of full-scale testing. Modeling also provides access to more rigorous types of testing, such as model checking and theorem proving. Formal methods such as these help evaluate trust by proving mathematically that certain properties of a layered sensing system will always hold.

To design and evaluate layered sensing systems and fusion methods, developers can define specific features and metrics for comparison. Based on these features, developers can compare sensor fusion algorithms and the structures of the sensor network. Such evaluation metrics may include the time needed to classify and identify a target or the confidence values associated with these results. Developers can also examine the fault tolerance of a sensor network given a specific fusion algorithm. For example, if a specific sensor is required for another task, developers can measure whether the information from the remaining available resources can be properly fused to maintain tracking operations. Syscape provides a customizable visualization dashboard for viewing these metrics during simulation for rapid comparison and evaluation (Fig. 3).
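Such metrics are straightforward to derive once simulation output is available; the sketch below assumes a simple, invented classification-event log format (not the framework's dashboard API) and computes time-to-classify and mean confidence from it.

```java
// Illustrative metric extraction from simulated classification events.
// The event record, target identifiers, and threshold are assumptions
// made for the example.
import java.util.List;

record ClassificationEvent(String targetId, double simTimeSec, double confidence) {}

class EvaluationMetricsDemo {
    public static void main(String[] args) {
        List<ClassificationEvent> events = List.of(
            new ClassificationEvent("T1", 12.5, 0.55),
            new ClassificationEvent("T1", 18.0, 0.82),
            new ClassificationEvent("T2", 25.0, 0.74));

        double confidenceThreshold = 0.8;

        // Time to classify: first event for target T1 above the threshold.
        double timeToClassifyT1 = events.stream()
            .filter(e -> e.targetId().equals("T1") && e.confidence() >= confidenceThreshold)
            .mapToDouble(ClassificationEvent::simTimeSec)
            .min()
            .orElse(Double.NaN);

        // Mean confidence across all reported classifications.
        double meanConfidence = events.stream()
            .mapToDouble(ClassificationEvent::confidence)
            .average()
            .orElse(Double.NaN);

        System.out.printf("Time to classify T1: %.1f s%n", timeToClassifyT1);
        System.out.printf("Mean confidence:     %.2f%n", meanConfidence);
    }
}
```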

Figure 3. Visualization Dashboard for System Evaluation

The Syscape framework also supports analysis of the communication structures connecting sensors and other resources, and of how these structures affect fusion. Based on various dependencies, features can be extracted locally and exchanged remotely among sensors. Developers can employ Syscape to quickly model different types of structures, including trees and partially connected graphs. They can then simulate these structures to determine how decision making is affected at each node.
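One way to compare such structures (our own sketch; the topology and latency values are assumed) is to represent the communication network as a weighted graph and compute each sensor's latency to the fusion node:

```java
// Illustrative comparison of communication structures: nodes and weighted
// links form a graph, and Dijkstra's algorithm gives each node's latency
// to the fusion node. Topology and latencies are made up for the example.
import java.util.Comparator;
import java.util.HashMap;
import java.util.Map;
import java.util.PriorityQueue;

class CommunicationStructureDemo {
    record QueueEntry(String node, double dist) {}

    public static void main(String[] args) {
        // Adjacency map: node -> (neighbor -> link latency in milliseconds).
        Map<String, Map<String, Double>> links = new HashMap<>();
        addLink(links, "fusion", "radar", 20);
        addLink(links, "fusion", "relay", 15);
        addLink(links, "relay", "uav1", 40);
        addLink(links, "relay", "uav2", 45);
        addLink(links, "uav1", "uav2", 10);  // extra edge: partially connected graph, not a tree

        Map<String, Double> latency = shortestPathsFrom("fusion", links);
        latency.forEach((node, ms) ->
            System.out.printf("%-6s -> fusion: %.0f ms%n", node, ms));
    }

    static void addLink(Map<String, Map<String, Double>> g, String a, String b, double ms) {
        g.computeIfAbsent(a, k -> new HashMap<>()).put(b, ms);
        g.computeIfAbsent(b, k -> new HashMap<>()).put(a, ms);
    }

    /** Dijkstra shortest paths from the given source over the weighted graph. */
    static Map<String, Double> shortestPathsFrom(String source,
                                                 Map<String, Map<String, Double>> g) {
        Map<String, Double> dist = new HashMap<>();
        dist.put(source, 0.0);
        PriorityQueue<QueueEntry> queue =
            new PriorityQueue<>(Comparator.comparingDouble(QueueEntry::dist));
        queue.add(new QueueEntry(source, 0.0));
        while (!queue.isEmpty()) {
            QueueEntry entry = queue.poll();
            if (entry.dist() > dist.getOrDefault(entry.node(), Double.POSITIVE_INFINITY)) {
                continue;  // stale queue entry
            }
            for (Map.Entry<String, Double> e : g.getOrDefault(entry.node(), Map.of()).entrySet()) {
                double candidate = entry.dist() + e.getValue();
                if (candidate < dist.getOrDefault(e.getKey(), Double.POSITIVE_INFINITY)) {
                    dist.put(e.getKey(), candidate);
                    queue.add(new QueueEntry(e.getKey(), candidate));
                }
            }
        }
        return dist;
    }
}
```

Rerunning the same computation with the tree topology (the extra uav1–uav2 edge removed) gives one simple, quantitative basis for comparing candidate communication structures.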

III. CONCLUSION

Meeting the layered sensing vision will require adaptable, collaborative systems-of-sensors. In addition, the information produced by such systems must be properly exploited to achieve relevant situation awareness. We have demonstrated a flexible modeling and simulation framework that can aid developers in the design and evaluation of these types of sensor systems. By providing better methods for representing complex sensor architectures and data, new fusion techniques can be evaluated quickly to select the right type of fusion for the right situation. This framework provides a simulation capability that encompasses a comprehensive, system-of-systems model well suited to layered sensing evaluation. In addition, its flexibility reduces the time needed to conduct such evaluations. As a result, a better understanding of how sensor deployments affect the integrated battlespace picture will be possible.

REFERENCES

[1] M. Bryant, P. Johnson, B. Kent, M. Nowak, and S. Rogers, "Layered Sensing: Its Definition, Attributes, and Guiding Principles for AFRL Strategic Technology Development," Sensors Directorate, Air Force Research Laboratory, Wright-Patterson Air Force Base, OH, 2008.

[2] G. Y. Tian and D. Gledhill, "Visualisation Based Feedback Control for Multiple Sensor Fusion," in Proc. Tenth International Conference on Information Visualisation (IV'06), 2006, pp. 553-556.

[3] W. R. Watkins, V. G. CuQlock-Knopp, J. B. Jordan, A. J. Marinos, M. D. Phillips, and J. O. Merritt, "Sensor Fusion: A Preattentive Vision Approach," in Targets and Backgrounds VI: Characterization, Visualization, and the Detection Process (R. W. Wendell, C. Dieter, and W. R. Reynolds, Eds.), Proc. SPIE, vol. 4029, pp. 59-67, Jul. 2000.

[4] J. L. Paul, "Smart Sensor Web: Web-based exploitation of sensor fusion for visualization of the tactical battlefield," IEEE Aerospace and Electronic Systems Magazine, vol. 16, no. 5, pp. 29-36, May 2001.


