
Development of a Mission Simulator for Design and Testing of C2 Algorithms and HMI Concepts Across Real and Virtual Manned-Unmanned Fleets⋆

Oktay Arslan⋆⋆, Bahadir Armagan⋆⋆⋆, and Gokhan Inalhan†

Controls and Avionics Laboratory, Istanbul Technical University, {oktay.arslan,bahadir.armagan,inalhan}@itu.edu.tr

http://cal.uubf.itu.edu.tr

Abstract. The increasing use of unmanned vehicles in civilian (metropolitan traffic monitoring, rapid assessment of disaster areas) and military (reconnaissance, target identification, tracking and engagement) domains has driven critical requirements for the interoperability of manned-unmanned systems. In this work, we provide the design and the development of a flight mission simulator structured for joint real-time simulation across manned-unmanned fleets and the mission control center. The hardware structure within the network simulator is tailored to mimic the distributed nature of each vehicle's processors and communication modules. The open-source flight simulation software FlightGear is modified for networked operations and used as the 3D visualization element for the pilot and the mission controls. The UAV dynamics and low-level control algorithms are embedded within the xPC target computers. Equipped with 3D flight simulation displays and a touch-screen C2 interface at the desktop pilot level, the platform also allows us to rapidly prototype and test pilot-unmanned fleet supervisory control and pursuit-evasion game designs. In addition, the unique design enables seamless integration of real unmanned air vehicles within a simulated scenario. Hardware-in-the-loop testing of network-bus-compatible mission computers and avionics systems provides us with validation of the C2 architectures and the hardware designs on a realistic lab-scale platform before the actual flight experiments.

⋆ A part of this work is funded under the DPT HAGU program administered through ITU ROTAM.

⋆⋆ Research Assistant, Controls and Avionics Laboratory
⋆⋆⋆ Research Engineer, Controls and Avionics Laboratory

† Corresponding Author: Assistant Professor, Faculty of Aeronautics and Astronautics, Istanbul Technical University


1 Introduction

During the last decade, Unmanned Air Vehicles (UAVs) have enjoyed practical success in scenarios involving reconnaissance, surveillance and active tracking. Independent of the role of the unmanned system (ranging from support scenarios to scenarios that are only achievable by customized unmanned vehicles), there are two major drivers that make such systems favorable over manned vehicles. These are the physical working conditions (such as remote locations with harsh terrain, or chemical or radioactive spills) and the scenario constraints (such as uninterrupted day-long service) that make the operation of manned systems or vehicles impractical, risky or just cost-ineffective.

With the ever-growing involvement of UAVs in complex application areas (such as dynamically changing urban rescue operations), the types and the number of tasks easily outgrow the limited capabilities of one vehicle (or of a set of UAV operators' command and control) and thus require a fleet of UAVs to work autonomously in a collaborative fashion to achieve the desired operational goals. In addition, the increasing use of unmanned vehicles in civilian and military airspace domains has driven critical requirements for the interoperability of manned-unmanned systems – whether for pure collision-avoidance purposes (sense-avoid) or for achieving common mission goals that require significant coordinated actions (mark-image) [7].

In order to benefit from all of these growing capabilities of UAVs, it is essential to develop new control structures and algorithms that allow manned-unmanned system interoperability [4], as abstracted in Figure 1. In addition, the increasing complexities in missions and the safety-critical requirements on the avionics systems demand that all the control and coordination algorithms, the avionics hardware and the vehicles be tested on realistic testbeds [2, 9] before using them in real mission scenarios [6, 19].

Towards this end, an experimental network mission simulator, as shown in Figure 2, is developed for joint real-time simulation across manned-unmanned fleets and the mission control center. The in-house developed mission simulator allows rapid prototyping, software-in-the-loop (SIL) and hardware-in-the-loop (HIL) testing of various coordination algorithms among manned and unmanned fleets and the mission control center. The open-source flight simulation software FlightGear [16] is modified for networked operations and used as the 3D visualization element for the pilot and the mission controls. The manned vehicle dynamics, UAV dynamics and low-level control algorithms are embedded within the xPC computers using the Matlab/Simulink rapid prototyping technique for real-time execution of the mathematical models and control algorithms. Equipped with a touch-screen C2 interface at the pilot station, the platform also allows us to rapidly prototype and test pilot-unmanned fleet supervisory control designs [3].

The mission simulator is structured around two distinct bus systems represented by the visualization and the mission layer. In addition, the hardware structure within the network simulator is tailored to mimic the distributed nature of each vehicle's processors and communication modules. This allows rapid integration of new hardware elements to both the simulation units and the simulation scenario. As such, hardware-in-the-loop testing of network-bus-compatible in-house developed micro-avionics systems [14] is carried out at the flight controls and mission planning levels. One distinct feature of the mission simulator is the ability to integrate real unmanned air or ground vehicles within the simulation scenario, coupled with virtual unmanned and manned vehicles. This allows in-flight testing of vehicles in simulated scenarios.

Fig. 1. An overview of the distributed command-control (C2) problem in which a mission is carried out by human operators and unmanned-manned vehicles. All entities, both humans and vehicles, can be considered members of a cooperative team working towards accomplishing a common goal. From a software engineering point of view, entities carry common software objects; for example, human operators (whether airborne co-pilots or commanders in the mission control center) can be represented by Command and Control Interfaces (C2I). This functional commonality is carried over to the mission simulator design requirements.

However, these capabilities require data exchange not only across similar units (UAVs), but also across dissimilar units (ground stations, manned vehicles). In addition, the communication and data distribution in the mission layer of the cooperative unmanned-manned vehicle network is complicated by mission-driven information flow requirements. While achieving fleet-level coordination, the unmanned-manned airborne fleet network must also remain in communication with non-fleet entities such as commander units, as well as ground stations and information nodes [13], which forward requests and enhance situational awareness.

Fig. 2. General architecture of the Cooperative Manned-Unmanned Network Simulator

Fig. 3. A conceptual view of the communication and information distribution problem, showing vehicles as nodes, embedded computers and sensors as local information sources, and communication modules.

Figure 3 illustrates the cooperative unmanned-manned vehicle network from a communication and information distribution perspective. Each entity within this network, such as an unmanned-manned vehicle or ground asset, has several local information sources such as sensors, embedded computers or C2 interfaces. This information may be purely payload data about the environment or could involve the states of the vehicle. In addition, this information can be propagated to the network for data fusion, or the data set may be commands or mission-related requests that need to be delivered to an individual vehicle or a group of vehicles in the network in order to achieve the predefined mission collaboratively. Because of such mission-driven information flow requirements, a standalone, multi-functional module is needed for communication and information distribution with the other entities in the overall network. Driven by this need, we have developed a standardized Communication and Information Distribution (CID) module as part of the mission simulator hardware. Figure 4 shows the working concept diagram of this module. The module has several communication interfaces, including Ethernet, CAN and serial links, allowing rapid integration of a variety of additional piloted simulation vehicles and virtual or real unmanned vehicles into the existing simulation at both the visualization and the mission layer.

Fig. 4. Working concept of the Communication and Information Distribution Module

The organization of this work is as follows. In Section 2, we present the general design and architecture of the mission simulator. First, the unique elements of the visualization layer, the simulation control center and the networked FlightGear implementation, are introduced. Second, the CID module, which plays an important role within the mission layer, is described. In Section 3, the manned simulator and its software/hardware elements are introduced. In this section, as part of the C2 implementations, the prototype of a Human-Machine Interface display system is described. These units not only serve as the primary flight display, but also serve as the airborne (in manned simulation) pilot/operator supervisory control device for unmanned air vehicle fleets in coordinated missions. In Section 4, unmanned vehicle simulator components, including virtual and real vehicles, are illustrated. In this section, two distinct mission simulator experiments are described. The first experiment involves hardware-in-the-loop testing of a fleet-level autonomous target/task assignment algorithm implemented across a UAV network [10]. The second experiment is software-in-the-loop testing of real-time probabilistic trajectory planning [11, 12] and mode-based nonlinear control [18] algorithms for a UAV operating within a city-like complex and dense environment. The chapter concludes with the on-going research activities aimed at extending the capabilities of the mission simulator.


2 General Design and Architecture of the Simulator

The general design of the mission simulator is structured around two layers: the visualization and the mission layer. These two layers represent two different data bus structures and data flows. As seen in Figure 2, simulation elements such as the piloted vehicle simulator, unmanned vehicles, real unmanned vehicles (ground vehicles and micro-UAVs), ground stations and the simulation control computers carry distinct wired and wireless connections to these two data layers.

The Visualization Layer entails the passage of the visualization and simulation related data packets (i.e. packets which result in a coherent visual picture of the whole scenario for the operators and the simulator pilot) across the wired Ethernet network using UDP packets. The visualization layer uses the open-source FlightGear flight simulator packet structure to allow direct integration with the flight simulator visualization elements. These visualization elements include the three-panel environment display for the pilot of the manned simulator (as shown in Figure 5) and the pilot/operator panel for tactical/simulation displays. The Mission Layer is accomplished via wireless communications (serial and Ethernet) across each unique entity existing within the simulation, using predefined data packet numbers and structures. Mission-critical data such as target assignments, payload readings, commands and requests are delivered through this wireless mission layer link.

Fig. 5. Multiple monitor visualization as seen by the pilot during a joint mission with unmanned helicopters

The reason for this layered approach stems from the need to differentiate the real operational wireless communication links in mission scenarios from the wired communication links used to obtain visualization and simulation information for the manned vehicle simulator and the operators. This break-up of layers not only represents a realistic mission implementation, but also allows HIL testing and analysis of wireless network structures (point-to-point, point-to-multipoint, ad-hoc) that mimic the real operational communication links as they would be implemented and experienced in a real mission. The modular design of the mission simulator elements is tailored according to the need for joint real-time simulation across groups of manned and unmanned fleets and the mission control centers. As such, the hardware structure within the network simulator is tailored to mimic the distributed nature of each vehicle's processors and communication modules. Note that the layered bus approach to the visualization and the mission layer allows direct integration of new manned and unmanned simulation elements within the scenario once they are linked to the corresponding wired and wireless buses.

In the next two subsections, we review the structure of the simulation control center and the basic building blocks of the visualization and the mission layers. These are the FlightGear software implementation used in the visualization layer, and the communication and information distribution modules used in the mission layer. In the two sections to follow, the hardware and software elements for both manned and unmanned vehicle simulations, and the C2 implementations in typical mission scenarios, will be discussed.

2.1 Simulation Control Center and FlightGear: Visualization Layer Implementation

Network simulator visualization requires integration of components with different specific purposes. Visualization includes cockpit simulation with multiple displays and a tactical interface for visualizing simulation objects on a 2D map. Integration of these components requires multiple-node communication over the Ethernet network. The underlying network packet structure is the key point of this communication. The well-defined packet structure of the network also enables the extension of the visualization with any additional visualization component, such as a 3D projected visualization or another display system for piloted simulation. For this reason, an open-source flight simulator with a networked operation option, FlightGear with its OpenSceneGraph visualization element, is used for both simulation and visualization of the mission scenario. The packet structure within the visualization layer follows the FlightGear native data format, identified by three distinct packet structures: the Dynamic States Packet (D), the FlightGear Multiplayer Packet (M) and the Flight Dynamics Packet (FDM). D packets represent vehicle dynamic states such as position, orientation and velocity information supplied by the computers running the mathematical model of each of the manned and unmanned vehicles. M packets include information about the position, orientation, velocity and model of an aerial vehicle. FDM packets include generic type and configuration information about the aerial or ground vehicles.

The simulation and the visualization are controlled by the simulation control center, consisting of three elements: the simulation operator computer, the world computer and the FlightGear server:

Operator Panel is used for configuring the network settings of the simulator. All information about the simulation (the number of manned-unmanned computers, the IP addresses of the machines) is input from this GUI.

World Computer is used for controlling the packet traffic of the visualization layer. Basically, an in-house developed program called State Sender runs on this computer and converts Dynamic (D) packets to Multiplayer (M) or Flight Dynamics (FDM) packets. M and FDM packets are needed for visualization of the simulation. State Sender, an add-on module to FlightGear, is basically a piece of C++ code that takes the dynamic state information of a vehicle or any other simulation object and, when it receives such information, performs the packet conversion. All the unmanned-manned computers which run the mathematical models of the vehicles (xPC computers) in the simulator send their states as Dynamic (D) packets to these programs at a specific frequency, and the State Sender programs convert the Dynamic (D) packets to Multiplayer (M) packets and send them to the FlightGear server and the 3D multi-monitor visualization systems of the manned simulator. In addition, for the manned simulator, the State Sender program performs a Dynamic (D) to Flight Dynamics (FDM) packet conversion and sends the FDM packets to the 3D multi-monitor visualization system for cockpit visualization of the manned simulator.

Fig. 6. Basic CID module structure: thread interactions and shared data

FlightGear Server (fg server) is “an Internet and LAN based multiplayer server for the FlightGear Project. It provides services to link pilots together so they can see each other when flying”1. Fg server is designed for communicating with FlightGear only. However, if the data structures of FlightGear are implemented, other programs may send packets to, and thus talk with, fg server. Since both FlightGear and fg server are open-source projects, group simulation of aerial and ground vehicles can be accomplished easily with a separate program that carries the main data structures as implemented in FlightGear. The fg server in the mission simulator runs on a separate computer, listening for incoming connections. The protocol for exchanging packets with fg server is UDP. Through the fg server we can not only create a complete list and picture of all the simulation elements within the mission, but also generate tactical display information for the pilot/operator and the simulation controllers by linking to the server.

2.2 Communication and Information Distribution Module (CID): Mission Layer Implementation

The CID module is an in-house developed communication and information distribution module for heterogeneous teams of unmanned and manned vehicles and ground assets. The module implements a customized routing algorithm that takes the data from these different communication interfaces and routes it to the destination addresses according to real-time updated routing tables. The CID module's objective is to gather the control messages sent by other entities in the network from the control channels, and to serve the requests or commands according to the control message types. Figure 4 shows the working concept diagram of this module.

The module has several communication interfaces, including Ethernet, CAN and serial links. Our first hardware implementation of the module is based on a PC-104 structured card stack which includes Ethernet, CAN and serial links. Currently, the module code is being ported to run on Gumstix hardware modules, resulting in a considerable weight advantage as a stand-alone unit. Each local processor unit of a vehicle and the remote CID modules of other vehicles are connected to the module with both data and control channels. Data about the environment, states, or other specific information is gathered from local units and distributed to specific destinations via the data channels. Also, the routing of information flow is changed according to the commands, requests and special messages received via the control channels.

The CID module software is written in C++, and the Boost thread library is used for thread management [8, 17]. The general architecture of the software was developed to be as modular as possible by using a highly object-oriented design methodology. Figure 7 illustrates an overview of the class hierarchy used in the software. Since numerous threads need to be synchronized during execution of the code, the classical thread synchronization problems were encountered (the producers-consumers and readers-writers problems). The monitor concept with Mesa semantics is utilized in order to solve these classic problems.

1 http://fgms.sourceforge.net/


Fig. 7. Level 2 use case diagram for the Communication and Information Distribution Module


Functional Semantics The operation of the CID module can be separated into four systems: the configuring, controlling, forwarding and transporting information systems. There are three main packet types, as depicted in the Figure 7 use case diagram: the configuration packet, the control packet and the data packet. First of all, the CID module has to be configured according to special configuration data which includes the names of the communication interfaces, their parameters, and the addresses of the units and the other CID modules in the network. When the module turns on, the Configure System handles all operations related to the initialization of the connection tables, the creation of the controller objects and all other control communication interfaces.

Basically, the CID module operates in two different planes: the control plane and the forwarding plane. Parsing of the control messages received via the control channels and control of the data channels are handled in the control plane. The Control Information use case handles operations related to the parsing of control messages. A controller object running in this plane manages the preparation of the data communication interface objects and the forwarder objects according to the control messages. This object can control the information flow by updating the routing tables of the forwarder objects. The forwarding plane is responsible for the continuous or periodic routing of the data received via the data channels. The Forwarding Information use case has several subtasks related to the updating of the routing table and the periodic and continuous routing operations. There may be data routing requests from other clients, in which case the controller object can update the routing table by using routines of the forwarding plane. Routing of information may be requested continuously or periodically. These two planes heavily use the low-level functions implemented in the Transport Information use case, and they perform the actual process of moving a packet received on a logical interface to an outbound logical interface. The Transport Information use case includes operations for the configuration of the several physical communication interfaces, including CAN, Ethernet and serial. Low-level send and receive operations are implemented by functions in this use case. The overall module software implementation is tested under several cases transmitting data between not only similar units, but also dissimilar units. The specific test cases used for inter-vehicle and intra-team data include ad-hoc network structures resulting from Ground Station and Mission Coordination Computer connections.


3 Manned Vehicle Simulation

The manned vehicle simulation system, as shown in Figure 8, consists of a generic pilot station and a dedicated computer rack. The generic pilot station is equipped with both standard helicopter (cyclic, pedal, collective, throttle) and aircraft (thrust, flap) pilot input devices. In addition, the pilot can employ a four-degrees-of-freedom joystick as a side stick. The pilot console includes a selection of switches (such as engine, autopilot, and stability augmentation system (SAS) on-off) and warning lights (such as altitude and engine failure) that can be configured for different operational needs and conditions. This is illustrated in Figure 9. The pilot station is equipped with a three-monitor visualization system that provides scenery information of the flight scenario from FlightGear. The computer rack system includes a set of dual-bus (CAN and Ethernet) connected computers for pilot input A/D conversion, real-time execution of the vehicle mathematical model, mission and flight controls, and the C2 pilot/operator interface. Following the schematic in Figure 8, we present in detail each of the elements within the pilot station and the computer rack system, and their respective working schemes.

Fig. 8. Components of the manned simulation

3D Multi-monitor Visualization The multiple display system enables 3D cockpit simulation in the FlightGear environment. Multiple monitor configurations require three separate personal computers connected with each other through the Ethernet network. The multiple display synchronization property of FlightGear provides flight dynamics (whether native to FlightGear or externally driven by a computer) exchange between two or more FlightGear instances running on different computers. The computers used for the multiple display system are equipped with high-capacity graphics cards, and these cards are connected to wide-screen LCD monitors. Each of these monitors is configured and located at a certain position to establish a realistic cockpit simulation. In cockpit simulation mode, two computers run their simulation with the FDM packets that they receive from the computer accepted to be the central (or main) computer. Specifically for the piloted simulator, the central computer running FlightGear (whether in native or external dynamics mode) captures the flight dynamics of the current aircraft and sends them to the left and right computers over the visualization layer.

Fig. 9. The pilot panel and the connection of the pilot controls to the simulator

xPC Computer–Dynamics The mission simulator has the capability of simulating a wide range of vehicle dynamics, including both FlightGear's built-in and in-house developed aircraft and helicopter models. The MathWorks' xPC Target product is used for real-time simulation of different vehicle dynamics and testing of different control algorithms. Therefore, dynamic models of several types of air vehicles such as helicopters (EC-120, Vario) and fighter aircraft (F-16, F-4) have been developed in the Matlab/Simulink environment, and these models are then compiled into C code via Real-Time Workshop. Finally, the automatically generated C code is downloaded, via an Ethernet connection, to a low-cost computer, named the xPC Computer, which runs the xPC Target real-time kernel.

One of the dynamic models frequently used in the mission simulator is a high-fidelity model of an F-16 aircraft from a NASA report [15]. This model includes six-degrees-of-freedom full-envelope flight dynamics, implements engine and atmosphere models, and has the four direct input channels of an aircraft: elevator, aileron, rudder and throttle. In addition, the model is enhanced to include actuator saturations and rate limitations.


Fig. 10. Mode Based Agile Maneuvering

This model can also be used as a good representation of the flight dynamics of an unmanned combat air vehicle. By designing reference-tracking control laws, it is possible to fly agile maneuvers autonomously with the aircraft. In general, it is a very challenging task to describe the general motion of an unmanned air vehicle and design a single control law that handles the complete reference-tracking needs for such maneuvers. From inspection of the well-known smooth aerobatic maneuvers and more complex combat maneuvers, we see that this task can be quantized by decomposing general maneuvers into maneuver modes. Via these modes, both the system dynamics and the control task are simplified. Using these modes, we have developed a structured finite state automaton that spans the full-flight-envelope maneuvers and designed a nonlinear sliding manifold control system for the maneuver sequences resulting from this automaton [18]. Figure 10 represents the agile maneuverability of this control approach, flying the F-16 autonomously through a series of complex combat maneuvers.

xPC Computer/Pilot Input, Control Panel A/D IO This xPC computer employs a set of National Instruments A/D input-output cards, a CAN interface card, and in-house built multiplexers. As illustrated in Figure 9, the control inputs from the pilot, including the flight controls and the console, come in the form of both analog and digital signals. The signals are converted to CAN messages by this unit. The pilot input CAN messages are then fed back to the xPC computer executing the dynamic model.

xPC Computer/Controller This computer provides the ability to implement the controller (stability augmentation or autopilot algorithm) of the manned vehicle external to the dynamic model. Owing to its fast development cycle and rich set of built-in blocks, the Matlab/Simulink environment was chosen for developing the controller algorithms.

Mission Coordination Computer (MCC) This computer is used for coordination with the other manned and unmanned vehicles in the fleet when performing a mission collaboratively. Higher-level planning and coordination algorithms run on this computer. Specific usage of the MCC, which is also an integral component of the unmanned vehicles, is described in the next section. One of the unique applications associated with the mission simulator is the development of new human-machine-interface designs that provide unmanned-vehicle group command and control capability on the pilot/operator screens. An extension of the primary flight displays with this capability is illustrated in the next subsection.

Fig. 11. Event-Driven Decision Support Architecture

3.1 C2 Application: Expansion of the Human-Machine Interface (HMI) to a decision-support system for manned and unmanned fleets

Distributed command and control of vehicles in a dynamically changing environment is a challenging problem as the number of vehicles and the complexity of the mission increase. One approach to this problem is to change the role of the manned vehicle from low-level commanding of the unmanned vehicles to supervisory commanding [3]. However, it is not always feasible to apply basic supervisory command and control to a large number of unmanned vehicles performing complex missions under strict time constraints. Therefore, a decision-support or autonomous decision-making system can be designed for the commander to decrease the workload. Figure 11 illustrates the overall process of such a decision-making support system.

Fig. 12. Primary Flight Display of the manned simulation

A key part of such an event-driven process interaction hinges on the UAVs autonomously coordinating the target/task assignments and distributions. The experimental demonstration of this within the mission simulator is given in the next section. However, it is important to note that supervisory control and interruption are desired in all mission-critical phases. Towards this goal, we have developed human-machine-interface display systems that provide this supervisory control capability over unmanned vehicle networks from the manned simulation platform. In addition to showing the manned vehicle's flight information, this computer also tracks and monitors the actions of the unmanned vehicle fleet within the joint mission.

On the primary pilot interface, flight and mission information is displayed on five separate screens: MAP, Primary Flight Display (PFD), Synthetic Vision System (SVS), Engine Display Unit (EDU), and Health and Usage Monitoring System (HUMS). The primary display functionality is illustrated in Figure 12. The SVS [1, 5] screen provides significant situational awareness with a 3D view of the terrain and aerial vehicles. A view of the SVS system is shown in Figure 14.

In addition, a C2 pilot/operator interface module is designed as a GUI-based program that interacts with the pilot. This module is shown in Figure 13.

Fig. 13. Command and Control Interface GUI

Fig. 14. SVS screen view

Basically, this interface takes the dynamic-model packets of each vehicle and other commands from different components of the network simulator as inputs, and then presents this information to the pilot on a 2D map as simply as possible. The interface has two main parts: a 2D map that shows the other vehicles, waypoints, and trails, and a command panel that contains several commands, such as assigning a vehicle to a specific task. It also supports interactive assignment of the next waypoint by simply "touching" an aerial vehicle in view on the screen.
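The "touch to assign" interaction reduces to a hit test against the vehicle icons drawn on the 2D map. A minimal sketch, with an illustrative pick radius and screen-pixel coordinates:

```python
def pick_vehicle(touch_xy, vehicles, radius=20.0):
    """Return the id of the vehicle icon nearest to a touch point,
    or None if nothing lies within `radius` pixels (value illustrative).

    vehicles -- dict mapping vehicle id -> (x, y) screen position
    """
    tx, ty = touch_xy
    best, best_d2 = None, radius * radius
    for vid, (x, y) in vehicles.items():
        d2 = (x - tx) ** 2 + (y - ty) ** 2
        if d2 <= best_d2:           # keep the closest icon in range
            best, best_d2 = vid, d2
    return best
```

The selected vehicle id would then be passed to the command panel for waypoint or task assignment.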

The display hardware consists of a PC with touchscreen LCD displays. The PC has a TCP port for communication between the display and the mission simulator, and USB ports are used for various data loads, such as streaming video and terrain databases, and for data recordings such as flight data. The touchscreen input capability provides smooth operator interfacing and a flexible software design environment. The interface software consists of separately developed graphical objects tightly coupled with each other based on a state machine; every object in the object-oriented hierarchy is run by a state machine. The operator can navigate through the screens via an adaptive menu, which changes its navigation button positions and functions.

Current work on the interfaces is aimed at a) increasing the ergonomics of tactical operations and the navigational experience by means of a 3D artificial tunnel-in-the-sky with real-time reshaping capability for pursuit-evasion and navigation scenarios, and b) enabling voice-control commands for the C2 interfaces across manned-unmanned joint operations.

4 Unmanned Vehicle Simulation

Unmanned vehicle simulation integration within the mission simulator can beachieved through virtual vehicles and real unmanned vehicles. The virtual un-manned simulation is run on a rack of standard PCs which has four compo-nents: mission coordination computer (MCC), xPC computer/controller, xPCcomputer/dynamics, and CID module. As there are two types of informationflow in the network, namely mission information and visualization information,there are two different communication buses: CAN and Ethernet bus. CAN busis used for mission related information flow, while Ethernet is used for data flowto the visualization system. The components of the virtual unmanned vehicle(VUV) simulation is shown in the Figure 15. The real unmanned vehicles con-nect to the mission layer through their own CID module and vehicle informationdata is transported to the visualization layer through the mission simulator’sown dedicated CID module which collects data from micro-air or ground vehi-cles naturally flying or driving outside the lab environment.
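The two-bus split described above amounts to a routing rule keyed on message type. A toy sketch (the message-type names are placeholders; only the CAN-for-mission, Ethernet-for-visualization split comes from the text):

```python
# Illustrative bus routing for the VUV simulation network; the
# message-type names are placeholder assumptions.
MISSION_TYPES = {"waypoint", "task_assignment", "vehicle_status"}
VISUAL_TYPES = {"pose_packet", "camera_stream"}

def route(msg_type):
    """Return which bus a message type travels on."""
    if msg_type in MISSION_TYPES:
        return "CAN"
    if msg_type in VISUAL_TYPES:
        return "Ethernet"
    raise ValueError(f"unknown message type: {msg_type}")
```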

In the next subsections, we review the C2 mission algorithmic implementations on both the virtual and the real unmanned vehicles.

Fig. 15. Components of Virtual Unmanned Vehicle Simulation

4.1 Virtual Unmanned Air Vehicle

The simulation rack structure of the virtual unmanned air vehicle is identical to the manned system except for the pilot controls and the C2 interface computers. The system consists of four computers, including the CID module. Each individual vehicle has to use a CID module in order to join the mission network. The CID module implements a simple communication protocol that is used among the components of the network for communication and information distribution. The xPC Computer/Dynamics executes the dynamics algorithm of the unmanned vehicle in real-time, while the control computer provides the closed-loop control and command functionality. The Mission Coordination Computer handles coordination with the other manned and unmanned vehicles in the fleet when performing a mission collaboratively. The program implemented on the MCC has three sub-modules: a communication module, a flight mission module, and a task module. As shown in Figure 16, the Flight Mission Module is responsible for data-flow control and sequencing between tasks. The CID module is used for communication with other MCCs, and the task modules may be any special coordinated planning algorithm, such as task assignment or a specific pursuit-and-evasion implementation between two vehicles.
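The Flight Mission Module's role of sequencing task modules and threading data between them can be sketched as a small task queue. The class and task names are illustrative, not the actual MCC software:

```python
# Minimal sketch of flight-mission-module task sequencing; names are
# placeholder assumptions, not the actual MCC program structure.
class FlightMissionModule:
    def __init__(self):
        self.tasks = []                # ordered (name, callable) modules

    def add_task(self, name, fn):
        self.tasks.append((name, fn))

    def run(self, state):
        """Run the task modules in sequence, threading state through,
        and return the final state plus an execution log."""
        log = []
        for name, fn in self.tasks:
            state = fn(state)
            log.append(name)
        return state, log
```

A pursuit-evasion or task-assignment module would plug in as one of the callables, receiving and returning the shared mission state.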

Fig. 16. Mission Coordination Computer Program Architecture

In the next subsection, we review a specific implementation of an autonomous large-scale target-task assignment algorithm implemented across a UAV network within the mission simulator.

4.2 C2 Implementation: Hardware-in-the-loop Testing of a Large-scale Autonomous Target-Task Assignment Problem for a UAV Network

As a first step of coordination-algorithm implementation within the mission simulator, we considered one of the basic problems: n targets are to be visited by m UAVs, and the vehicles should autonomously find the waypoint selections that result in the minimum total path traveled by the UAV fleet. In addition, all mission coordination and communication between the units has to be done autonomously, without any human intervention. To address this problem, we have developed a large-scale distributed task/target assignment method that allows autonomous and coordinated task assignment in real-time for a UAV fleet. By using a delayed column generation approach on the most primitive non-convex supply-demand formulation, a computationally tractable distributed coordination structure (i.e., a market created by the UAV fleet for tasks/targets) is exploited. This particular structure is solved via a fleet-optimal dual simplex ascent in which each UAV updates its respective flight-plan costs with a linear update of waypoint task values as evaluated by the market. The complete theoretical treatment and algorithmic structure of this problem can be found in [10]. The basic implementation within the network simulator is given in Algorithm 1:

Algorithm 1 Distributed Synchronized Implementation of the Autonomous Target-Task Selection Algorithm for the ith UAV
Input: n waypoint positions and the ith UAV's position.
Output: The fleet-shortest-path optimal waypoint assignments for the ith UAV.

PHASE I: Initialization
1: Compute the distances between the waypoints and form the n × n matrix V
2: Compute the distances to the waypoints and form the n × 1 matrix d_i
3: for every possible path the ith UAV can travel do
4:   Add the path to the matrix D_i
5:   Add the cost of this path to the vector c_i
6: end for
7: Read the (n + m) × (n + m) initial predefined route matrix B from memory
8: Communicate the costs of the corresponding paths in the initial B matrix to start the master iteration

PHASE II: Dual Simplex Ascent
9: repeat
10:   Solve the restricted master iteration and find the dual-variable vector: [π µ]′ = [c̄′_1 ... c̄′_m]′ B⁻¹
11:   Remove the ineffective flight path (if any) with maximum cost
12:   Solve the sub-problem corresponding to the ith UAV by updating the flight costs: c_i^q(updated) = c_i^q − π d_i^q − µ_i
13:   Select the minimum negative c_i^q(updated) (sub-problem cost) and the associated path
14:   Exchange the selected path and its flight cost with the paths and associated costs generated by the other vehicles
15:   Insert the first feasible flight path with the minimum-cost solution and its cost into the restricted master problem
16: until there are no paths with negative sub-problem costs
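The per-UAV sub-problem step of Algorithm 1 (updating each candidate path's cost with the master's dual prices π and µ_i, then keeping the most negative reduced cost) can be sketched as follows; the data layout is an assumption for illustration:

```python
def best_entering_path(path_costs, path_demands, pi, mu_i):
    """Sub-problem step of Algorithm 1 for one UAV (illustrative layout).

    path_costs[q]   -- raw flight cost of candidate path q
    path_demands[q] -- 0/1 list: which waypoints path q covers
    pi              -- dual prices of the waypoint-covering constraints
    mu_i            -- dual price of this UAV's convexity constraint

    Returns (q, reduced_cost) for the most negative candidate, or None
    when no candidate has a negative reduced cost (this UAV is optimal).
    """
    best = None
    for q, (c, d) in enumerate(zip(path_costs, path_demands)):
        reduced = c - sum(p * a for p, a in zip(pi, d)) - mu_i
        if reduced < 0 and (best is None or reduced < best[1]):
            best = (q, reduced)
    return best
```

When every UAV returns None, the termination condition of line 16 is met and the restricted master's solution is fleet-optimal.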

In this particular experiment, we use the Mission Coordination Computers (MCCs) to embed the real-time implementation of the task/target assignment algorithm and run the simulations as if they were a real mission. Vehicle dynamics and the low-level control algorithms for each UAV are embedded within their own xPC target modules. These modules simulate an autonomous UAV that receives a sequence of targets over a UDP-based communication channel and executes these commands to reach the targets. The MCCs run a C program that serves as a bridge between the communication layer of the UAVs and the Matlab-based computation layer of each UAV. Each UAV's MCC sends its respective targets to the UAV control systems during the execution of the target assignment algorithm. The world computer, which is responsible for simulating the mission, informs the MCCs when a new target appears in the scenario. Figure 17 provides a detailed look at the computational behavior of the algorithm. The data points correspond to average values over 600 different simulation scenarios. Note that a total of 250 targets can be solved across 5 UAVs in approximately 50 seconds. This illustrates the real-time implementation capability of the algorithm in comparison to the only offline-capable standard integer programming approach [10]. The numeric solution to a scenario covering 500 waypoints with dynamic pop-ups is given in Figure 18. Figure 19 provides extensive numerical verification of the restricted-horizon polynomial-time computational-growth property of this distributed formulation.

Fig. 17. The computational behavior of the algorithm shows strong correlation with the number of waypoints per UAV for a particular snapshot of the scenario

4.3 Real Unmanned Air and Ground Vehicle Simulation

One of the unique features of the mission simulator is that the existing real unmanned micro-air and ground vehicles within the laboratory can connect to the mission simulation scenario using their customized CID modules. This feature not only provides in-flight verification capability for mission coordination algorithms, but also creates a financially less cumbersome testing environment in which some of the vehicles are simulated instead of being flown. Figure 20 shows how the laboratory's unmanned air and ground vehicles are perceived within the mission network once connected to the mission simulator through their respective CID modules.

Fig. 18. The waypoint routes for a random pop-up task-target assignment for four vehicles. The algorithm is further implemented in receding-horizon mode for five hundred waypoints.

Fig. 19. The numeric computational-complexity analysis suggests polynomial-time growth for restricted time horizons.

Fig. 20. Components of Real Unmanned Simulation

Mission Coordination Computer In the avionics system of the unmanned vehicles, instead of a desktop PC, an embedded PC104 computer and an ARM processor are used for the mission planning algorithms.

MPC555/Controller Similarly, in the real avionics system, an embedded MPC555 computer is used for running the real-time controller algorithms. Since this embedded processor can also be programmed from Matlab/Simulink, almost no extra effort is needed to modify the controller algorithms used on the xPC computer in the virtual simulations. As a result, software-in-the-loop verified algorithms can be ported directly to the real hardware for hardware-in-the-loop and real flight testing.

Fig. 21. Cross-platform compatible micro-avionics system and the hardware-in-the-loop integration to the mission simulator

Real UAV/UGV Avionics System and hardware-in-the-loop integration to the mission simulator A collection of in-house modified real UAVs/UGVs (including a Vario acrobatic helicopter, a Trainer 60 fixed-wing platform, and a 1/8-scale Humvee [9]), each equipped with sensors and embedded processors, can be connected to the mission simulator during simulated scenarios. The cross-platform compatible architecture of the in-house developed micro-avionics system [14] is illustrated in Figure 21; it is used on all the unmanned platforms with minor modifications. This bus-based micro-avionics system is structured around a main CAN bus, and all the components, including the flight and mission controller boards and the sensors, communicate with each other via this bus. Since each sensor has a different type of communication interface, such as serial or I2C links, a simple circuit named SmartCAN, with a universal interface, was designed to connect these sensors to the CAN bus. SmartCAN basically performs packet conversion between the different types of communication interfaces and the CAN interface. These vehicles can be operated autonomously or remotely by a human pilot. The operation mode selection is achieved by means of a switch board that allows switching of the controls. In addition to the standard hardware-in-the-loop testing capability for flight controllers, the mission simulator also allows debugging of flight-test problems by hooking up the CAN Simulator to the network. The CAN Simulator machine is capable of playing back and emulating the sensory information from flight-test data collected during the flight tests.
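A SmartCAN-style packet conversion can be illustrated by fragmenting an arbitrary serial payload into fixed 8-byte CAN frames. The 1-byte fragment index and zero padding below are assumptions for the sketch, not the actual SmartCAN protocol:

```python
def serial_to_can(sensor_id, payload):
    """Split a serial sensor payload into 8-byte CAN frames: one byte
    of fragment index plus up to seven data bytes, zero-padded.
    The framing scheme is an illustrative assumption."""
    frames = []
    for i in range(0, len(payload), 7):
        chunk = payload[i:i + 7]
        data = bytes([i // 7]) + chunk        # fragment index + data
        frames.append((sensor_id, data.ljust(8, b"\x00")))
    return frames
```

The receiving node would reassemble fragments by index; the reverse direction (CAN to serial) follows symmetrically.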

Ground Station The ground station computer, shown in Figure 22, is used for tracking the vehicle and mission data, creating new missions, and sending special commands to any vehicle while the mission is being performed. This computer is also connected to the network via a CID module, like the other vehicles.

Fig. 22. Ground Station GUI

The complete integration of the real unmanned vehicle platforms (including the ground stations) also enables us to perform complete mission-related performance tests of the real-time control algorithms on actual hardware, not only outdoors but also in the laboratory. For complex and risky tests involving agile maneuvers in city-like dense environments, the mission simulator is used for software-in-the-loop testing of path planning and control algorithms. In the next subsection, we review two probabilistic flight trajectory planning and control algorithms designed as part of the mission simulator development.

4.4 Software-in-the-loop Testing: Flight trajectory planning and controls in complex environments

Motion planning and trajectory-tracking control system design for the autonomous execution of maneuvers in complex environments, in search of tasks/targets, is a low-level enabling technology for UAV operations driven by performance goals. Towards this goal, we have developed a probabilistic trajectory planner that uses distinct flight modes from which almost any aggressive maneuver (or a combination thereof) can be created. This allows significant reductions in the control-input space, and thus in the search dimensions, resulting in a natural way to design controllers and implement trajectory planning using the closed-form flight modes.

Fig. 23. Mode-based trajectory generation strategy and a complete solution of the dynamically feasible path planner

For fixed-wing UAVs, a three-step probabilistic trajectory planner, as shown in Figure 23, is utilized to solve the motion planning problem. In the first step, the algorithm rapidly explores the environment through a randomized reachability tree (RRT) search using approximate line-segment models. To decrease the computational time, we disregard the vehicle's dynamic constraints and use the RRT to search only the configuration space of the vehicle. The resulting connecting path is converted into flight waypoints through a line-of-sight segmentation. These waypoints are refined with a single-query Probabilistic Road Map (PRM) planner that creates dynamically feasible flight paths by applying the feasibility constraints of the aircraft model. These feasibility constraints are characterized by a hybrid representation of the aircraft dynamics: the general motion of the aircraft is divided into a discrete set of maneuver modes, each with continuous flight dynamics and associated feasibility constraints such as flight-envelope and actuator-saturation limitations. The problematic issue of narrow passages is addressed through non-uniformly distributed capture regions, which prefer state solutions that align the vehicle to enter the milestone region in line with the next milestone to come. Detailed explanations of the related algorithms for this implementation can be found in [12].
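The line-of-sight segmentation step can be sketched as a greedy pruning of the tree path: from each kept node, jump to the farthest node still visible. The `clear` visibility test would be supplied by the environment model and is a placeholder here:

```python
def los_prune(path, clear):
    """Prune a piecewise path to fewer waypoints whose connecting
    segments pass the `clear(a, b)` visibility test.

    Adjacent nodes are assumed connectable (they came from the tree),
    so the inner scan never drops below i + 1.
    """
    pruned = [path[0]]
    i = 0
    while i < len(path) - 1:
        # greedily jump to the farthest node visible from path[i]
        j = len(path) - 1
        while j > i + 1 and not clear(path[i], path[j]):
            j -= 1
        pruned.append(path[j])
        i = j
    return pruned
```

With an obstacle-free environment, the whole path collapses to its two endpoints; with limited visibility, intermediate waypoints survive at the occlusions.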

Fig. 24. Probabilistic B-spline trajectory generation strategy and an example solution path for an unmanned helicopter

For helicopter-like Vertical Take-Off and Landing (VTOL) systems, a variant two-step planner, illustrated in Figure 24, is used to solve the motion planning problem. In the first step, the planner explores the environment using the RRT algorithm. The resulting connecting path is converted into flight waypoints through a line-of-sight segmentation that filters out long detours. The remaining points, referred to as waypoints, generally appear at the entry and exit regions of the narrow passages formed between the obstacles. By adding waypoints at these locations, we segment the hard path planning problem into a series of small path-generation problems. In the second step, each pair of consecutive waypoints is connected with third-order B-spline curves that represent constant-inertial-acceleration paths, and these curves are repaired probabilistically until a dynamically feasible path is obtained. Since B-spline curves have the local-support property, these repairs can be confined to the local portion of the path of interest. Detailed demonstrations of this algorithm can be found in [11].
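The local-support property that makes localized repair possible is visible in the evaluation of a single cubic B-spline segment, which depends on only four control points; moving one control point therefore reshapes at most the few neighboring segments. A sketch of the uniform cubic basis (standard textbook form, independent of the planner in [11]):

```python
def cubic_bspline_point(p0, p1, p2, p3, t):
    """Evaluate one segment of a uniform cubic B-spline at t in [0, 1].

    Only these four control points influence the segment (local
    support), so a probabilistic repair of one control point stays
    local to the path. Points are tuples of any dimension.
    """
    b0 = (1 - t) ** 3 / 6.0
    b1 = (3 * t ** 3 - 6 * t ** 2 + 4) / 6.0
    b2 = (-3 * t ** 3 + 3 * t ** 2 + 3 * t + 1) / 6.0
    b3 = t ** 3 / 6.0
    return tuple(b0 * a + b1 * b + b2 * c + b3 * d
                 for a, b, c, d in zip(p0, p1, p2, p3))
```

A feasibility check would sample such segments, test the implied accelerations against the vehicle limits, and perturb the offending control points until the path is flyable.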

With these two distinct flight planning and control approaches, we can test the coordination algorithms while ensuring that the target/task assignments in complex and dense environments can be achieved within the existing flight envelopes of the vehicles.

Conclusions

In this work, an experimental mission simulator has been developed for the joint simulation of manned and unmanned vehicle fleets. This platform not only serves for testing distributed command and control algorithms and interfaces, but also provides real-device hardware-in-the-loop testing and real flight-test vehicle integration capabilities within simulated scenarios. As such, it provides a realistic test platform to demonstrate and validate research on C2 algorithms and enabling technologies for joint manned-unmanned operations. Current research on this platform is focused on extending the hardware and algorithms with STANAG-compliant devices and message formats, adding speech-command capability for human operators and pilots, and bringing an augmented-reality concept to the SVS screen by integrating real-time streaming video from real unmanned vehicles flying in simulated scenarios.

Acknowledgements

The authors would like to thank ASELSAN, Nazim Kemal Ure, Emre Koyuncu, Sertac Karaman, Ahmet Cetinkaya, Mirac Aksugur, Ismail Bayezit, Serdar Ates, Taner Mutlu, and Fatih Erdem Gunduz for their valuable contributions to this work.

References

1. R. Behringer, C. Tam, J. McGee, V. Sundareswaran, and M. Vassiliou, editors. System for synthetic vision and augmented reality in future flight decks, volume 4023. The International Society for Optical Engineering, Proceedings of SPIE, 2000.

2. R. S. Christiansen. Design of an Autopilot for Small Unmanned Aerial Vehicles. MSc thesis, Brigham Young University, 2004.

3. M. L. Cummings and S. Guerlain. An interactive decision support tool for real-time in-flight replanning of autonomous vehicles. AIAA 3rd Unmanned Unlimited Technical Conference, Workshop and Exhibit, pages 1-8, 2004.

4. J. Hollan, E. Hutchins, and D. Kirsh. Distributed cognition: Toward a new foundation for human-computer interaction research. ACM Transactions on Computer-Human Interaction, 7:174-196, June 2000.

5. W. F. Horne, R. R. Tucker, S. Ekiert, C. H. Hannan, J. Radke, and J. A. Zak. Synthetic vision system technology demonstration methodology and results to date. Technical Report 1-19, SAE Technical Paper Series, 1992.

6. J. How, Y. Kuwata, and E. King. Flight demonstrations of cooperative control for UAV teams. In 3rd Unmanned Unlimited Technical Conference, Chicago, Illinois, September 2004.

7. G. Inalhan, D. M. Stipanovic, and C. Tomlin. Decentralized optimization, with application to multiple aircraft coordination. In Proceedings of the 41st IEEE Conference on Decision and Control, Las Vegas, 2002.

8. E. D. Jones, R. S. Roberts, and T. C. S. Hsia. STOMP: a software architecture for the design and simulation of UAV-based sensor networks. In Robotics and Automation, 2003. Proceedings. ICRA '03. IEEE International Conference on, volume 3, pages 3321-3326, 2003.

9. S. Karaman, M. Aksugur, T. Baltaci, M. Bronz, C. Kurtulus, G. Inalhan, E. Altug, and L. Guvenc. Aricopter: Aerobotic platform for advances in flight, vision controls and distributed autonomy. In IEEE Intelligent Vehicles Symposium, Istanbul, Turkey, June 2007.

10. S. Karaman and G. Inalhan. Large-scale task/target assignment for UAV fleets using a distributed branch and price optimization scheme. In Int. Federation of Automatic Control World Congress (IFAC WC'08), Seoul, South Korea, June 2008.

11. E. Koyuncu and G. Inalhan. A probabilistic B-spline motion planning algorithm for unmanned helicopters flying in dense 3D environments. In International Conference on Intelligent Robots and Systems, Nice, September 2008.

12. E. Koyuncu, N. K. Ure, and G. Inalhan. A probabilistic algorithm for mode based motion planning of agile air vehicles in complex environments. In Int. Federation of Automatic Control World Congress (IFAC WC'08), Seoul, South Korea, June 2008.

13. A. A. Makarenko. A Decentralized Architecture for Active Sensor Networks. PhD thesis, The University of Sydney, November 2004.

14. T. Mutlu, S. Comak, I. Bayezit, G. Inalhan, and L. Guvenc. Development of a cross-compatible micro-avionics system for aerorobotics. In IEEE Intelligent Vehicles Symposium, Istanbul, Turkey, June 2007.

15. T. L. Nguyen, M. E. Ogburn, W. P. Gilbert, K. S. Kibler, P. W. Brown, and P. L. Deal. Simulator study of stall/post-stall characteristics of a fighter airplane with relaxed longitudinal static stability. Technical paper, NASA, December 1979.

16. GPL OpenSource. The FlightGear flight simulator, 1996.

17. GPL OpenSource. Boost C++ libraries, 2001.

18. N. K. Ure and G. Inalhan. Mode based hybrid controller design for agile maneuvering F-16 aircraft. In manuscript form for review in Journal of Process Control, 2008.

19. M. Valenti, T. Schouwenaars, Y. Kuwata, E. Feron, and J. How. Implementation of a manned vehicle - UAV mission system. AIAA Guidance, Navigation, and Control Conference and Exhibit, 2004.
