The Unmanned Autonomous Aircraft Control System

(UAACS)

Shivang Patel*, Chris Brinton†, and Bart Gallet‡

Mosaic ATM, Leesburg, VA, 20175

Bonnie Schwartz§

Air Force Research Lab, Wright-Patterson Air Force Base, OH

*System Engineer, Autonomous Systems Group, [email protected], AIAA Member
†President, [email protected], AIAA Member
‡System Engineer, Autonomous System Group, [email protected]
§Computer Engineer, [email protected]

Mosaic ATM is designing a command and control system for Unmanned Aerial Systems (UAS) to facilitate integration of unmanned aircraft into the current and NextGen National Airspace System (NAS). Referred to as the Unmanned Aircraft Autonomous Control System (UAACS), this solution aims to integrate state-of-the-art air traffic control (ATC) automated speech recognition (ASR) with robust planning and guidance for UAS. The system will enable a UAS to listen to and interpret ATC instructions given to it, to perform real-time situation assessment, and to select or recommend the best course of action satisfying all requirements without sacrificing the safety and security of neighboring aircraft. Artificial Intelligence (AI) based planning and hierarchical task networks (HTNs) provide knowledge-based task planning that leverages state-of-the-art routing and guidance algorithms to achieve agile and predictable tactical maneuvering in areas of high traffic congestion and reduced separation distances. UAACS attempts to interface with the UAS at the task planning and sequencing level, but it possesses the ability to manage UAS guidance systems if situational complexity or pending performance demands exceed the known capabilities of the UAS.

I. Introduction

The current command and control paradigm for managing air traffic is highly dependent on voice communication. This approach has benefited the development of ATC over the last century in a number of ways, including a low level of required aircraft equipage and the ability to handle contingency situations and adapt to new requirements easily, owing to the flexibility and adaptability of human air traffic controllers and pilots. However, the reliance on voice communication in ATC operations presents challenges to the use of a UAS [1].

The current operation of UAS requires consistent and significant activity on the part of the human remote pilot, who provides all communication with ATC and directs the aircraft. The remote pilot must interpret each ATC command, direct the aircraft accordingly, and communicate the appropriate response back to ATC. Further, if the ATC-to-human-operator communication link fails, there is currently no capability for direct ATC-to-UAS command and control, and thus no ability to preserve safety and maintain maximum mission effectiveness during such a communications failure. Conforming to FAA regulations during autonomous flight is critical for integration in the NAS [2].

We have developed a prototype system that combines ASR with artificial intelligence planning to solve the autonomous aircraft planning problem. We are using the Simple Hierarchical Ordered Planner (SHOP) [3] to provide goal-oriented task planning capabilities to control a simulated UAV. We have created a synthetic simulation environment to simulate the flight of an autonomous aircraft and the communication between the aircraft and ATC. The UAACS prototype uses the Sphinx speech recognition engine [4, 5] to provide recognition of ATC communication. We have also demonstrated the ability to perform task planning and replanning based on simulated environmental and situational conditions.

Presented at AIAA Infotech@Aerospace 2011, 29-31 March 2011, St. Louis, Missouri, as paper AIAA 2011-1457.
Copyright © 2011 by the American Institute of Aeronautics and Astronautics, Inc. All rights reserved.


II. System Overview

The UAACS prototype consists of multiple components that provide planning, execution, and various other inputs and outputs to the system. The components can be broken into four categories, as shown in Figure 1: the core components, the audio input/output (I/O) components, the display and visualization components, and the simulation components.

Figure 1. UAACS system concept diagram; similar components are colored and grouped together

The core UAACS components are the essential components of UAACS; they provide the planning and control for the entire system. The core components include the various planners and the Executor, the component responsible for carrying out the plan. The audio I/O component represents the speech recognition and text-to-speech components that are used for communication with ATC. One alternative to the audio I/O component for the NextGen NAS is a data link that provides direct ATC communication to the UAS. The display and visualization components serve as the ground control station for UAACS, providing an interface to command UAACS and a visual representation of the aircraft's state. The simulation components are the components needed for our synthetic demonstration environment. These components are needed to simulate the behavior of the UAS and the behavior of other aircraft in the surrounding airspace. In a real flight test these components would be replaced by an actual aircraft autopilot that receives commands from UAACS and by automatic dependent surveillance-broadcast (ADS-B), which provides information about all the aircraft in the surrounding area.

The planning system for UAACS is divided into multiple planners, each solving one specific subproblem of the autonomous aircraft planning problem. By dividing the planning problem into smaller subproblems, each sub-planner can be reused and the planning system can be reconfigured to solve a different type of planning problem. All the subplans are later combined by the Executor to create the final plan. Once the Executor assembles the final plan, it is responsible for executing each task in the plan. During the execution process the Executor monitors changes to the aircraft's sensor values to determine whether and when the aircraft deviates from the plan and becomes out of conformance.
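To make this execution and conformance-monitoring loop concrete, the sketch below shows one way such an executor could be structured. It is an illustrative Python sketch, not the UAACS code; the Task fields, tolerance handling, and replan callback are assumptions introduced for the example.

# Illustrative executor loop: run the tasks of an assembled plan and request a
# replan when observed sensor values fall outside the tolerances the plan expects.
# The Task fields, tolerance handling, and replan callback are hypothetical,
# not taken from the UAACS code base.
from dataclasses import dataclass, field
from typing import Callable, Dict, List


@dataclass
class Task:
    name: str
    execute: Callable[[], None]           # command handed to the guidance layer
    expected_state: Dict[str, float]      # sensor values the plan assumes after the task
    tolerance: Dict[str, float] = field(default_factory=dict)  # allowed deviation per sensor


class Executor:
    def __init__(self, replan: Callable[[Dict[str, float]], List[Task]]):
        self.replan = replan              # planner callback returning a fresh task list

    def in_conformance(self, sensors: Dict[str, float], task: Task) -> bool:
        return all(abs(sensors[key] - value) <= task.tolerance.get(key, 0.0)
                   for key, value in task.expected_state.items())

    def run(self, plan: List[Task], read_sensors: Callable[[], Dict[str, float]]) -> None:
        queue = list(plan)
        while queue:
            task = queue.pop(0)
            task.execute()
            sensors = read_sensors()
            if not self.in_conformance(sensors, task):
                # Out of conformance: ask the planning system for a new plan
                queue = self.replan(sensors)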


III. Results

A. Speech Recognition Performance

The potential benefits of speech recognition in air traffic control are significant, especially when applied to UAS to reduce remote pilot workload. Several factors can make real-world speech recognition difficult, including variability between speakers, variations in the communication channel, and the presence of noise [6].

Mosaic conducted performance tests to compute the accuracy of the speech recognition engine under real-world conditions. Two test sets were created from ATC audio: one set from audio purchased from the Linguistic Data Consortium (LDC), and a second test set from ATC audio collected at Dallas/Fort Worth (DFW) with significant radio noise.

Mosaic also assembled a number of different acoustic models to determine which model performs best with each test set. Mosaic created an ATC-specific acoustic model from 70 hours of the LDC-purchased data and another model from three hours of the DFW data. The other models used in testing were the Wall Street Journal (WSJ) and VoxForge models, which are freely available and built on various news and media sources.

Figure 2. Results of running DFW test set against various acoustic models

Figure 2 shows the accumulated accuracy results of the DFW test set using the four 8 kHz acoustic models. Two different language models were used: one created from the DFW training data and the other from the LDC training data. The results showed that the DFW acoustic model with the DFW language model performed the best. The LDC acoustic model with the DFW language model also did well, but between transmissions 200 and 400 it had a large error. These transmissions contain a female controller, while the LDC acoustic model was trained mostly on male controllers. Figure 3 shows the accuracy results of the LDC test set: the LDC acoustic model performed the best, since it was specifically trained on the LDC data. For both test sets, the non-ATC acoustic models performed poorly relative to the ATC-specific acoustic models.
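The paper reports accumulated accuracy per transmission but does not detail the scoring procedure. A common ASR metric is word accuracy derived from the word-level edit distance between the reference transcript and the recognizer output; the Python sketch below accumulates that metric over a sequence of transmissions. It is a generic illustration, not the evaluation code used in these tests.

# Illustrative word-accuracy accumulation over ATC transmissions.
# Accuracy here is 1 - word error rate (WER), with WER computed from the
# Levenshtein distance over words. This is a generic metric sketch only.
def word_edit_distance(ref: list, hyp: list) -> int:
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[len(ref)][len(hyp)]


def accumulated_accuracy(pairs):
    """pairs: iterable of (reference_text, hypothesis_text), one per transmission."""
    errors = words = 0
    curve = []
    for ref_text, hyp_text in pairs:
        ref, hyp = ref_text.split(), hyp_text.split()
        errors += word_edit_distance(ref, hyp)
        words += len(ref)
        curve.append(1.0 - errors / max(words, 1))  # running accuracy after each transmission
    return curve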

B. Demo Scenarios

Mosaic created a number of demo scenarios to test and demonstrate the capabilities of UAACS. For all the scenarios the configuration of the UAACS prototype never changes; only the inputs to the aircraft change, such as the ATC instruction, the environmental conditions, and the positions of other aircraft. The five demo scenarios are listed in sections 1-5 with details on the performance of the UAACS prototype for each scenario.


Figure 3. Accuracy results of LDC test data

1. Scenario 1: No Replanning

The first scenario is the case in which there are no exceptions and the aircraft flies the plan that was created during preflight planning. The scenario starts with the aircraft outside the Martinsburg Class D airspace with the current wind conditions at Martinsburg airport. An initial preflight plan is created, picking a runway and pattern entry point based on the wind conditions. The plan also includes loitering outside the Class D airspace until the aircraft receives a clearance into the airspace, as shown in Figure 4. Once the aircraft receives clearance into the Class D airspace it flies across the boundary and receives a clearance to enter the pattern with the same pattern entry point and runway that it chose during preflight planning; it therefore continues on its plan.

Figure 4. Scenario 1 - no replanning and loitering outside of the boundary waiting for clearance
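The preflight choice of runway and pattern entry is driven by the wind conditions. The paper does not give the selection rule; a common heuristic is to prefer the runway with the largest headwind component, as in the illustrative Python sketch below (the runway headings and wind values in the example are hypothetical, not UAACS data).

# Illustrative preflight choice of runway from wind data. Picking the runway
# with the largest headwind component is a standard heuristic; the actual
# UAACS planner logic may differ.
import math


def headwind_component(runway_heading_deg: float, wind_from_deg: float,
                       wind_speed_kt: float) -> float:
    # Positive value means the wind opposes the landing direction (headwind).
    return wind_speed_kt * math.cos(math.radians(wind_from_deg - runway_heading_deg))


def choose_runway(runways: dict, wind_from_deg: float, wind_speed_kt: float) -> str:
    # runways: e.g. {"26": 260.0, "08": 80.0}, mapping runway name to magnetic heading
    return max(runways,
               key=lambda r: headwind_component(runways[r], wind_from_deg, wind_speed_kt))


# Example (hypothetical numbers): wind from 250 degrees at 8 kt favors runway 26
print(choose_runway({"26": 260.0, "08": 80.0}, wind_from_deg=250.0, wind_speed_kt=8.0))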

2. Scenario 2: Replanning with a Different Runway Clearance

In the second scenario, the aircraft again starts outside the Martinsburg boundary and plans for a pattern entry and runway based on the current wind conditions. The aircraft will fly into the boundary once it receives clearance to enter the Class D airspace. This time, when the aircraft receives its clearance to enter the pattern from the controller, it is for a different runway and pattern entry point than the one chosen during preflight planning. UAACS detects the difference from the plan and begins to replan using the newly assigned runway and pattern entry point, as shown in Figure 5. Once the new expanded plan has been created with the assigned entry point and runway, the aircraft completes its flight. This demonstrates the ability of UAACS to recognize new, unexpected sensor data and replan based on the new data.

Figure 5. Scenario 2 - flight path before and after replanning using the runway and pattern entry point given by ATC

3. Scenario 3: Out-of-Bounds ATC Instruction

The third scenario is the case in which the air traffic controller gives a command that cannot be performed by the aircraft. For this scenario we used a speed command to keep the example simple. When the aircraft receives the command, it performs a replan to check whether it can perform the command. During the planning process the planner checks whether the new speed is within the bounds of the aircraft's flight dynamics and whether the speed change conflicts with its current speed.

As the aircraft is flying to Martinsburg airport to land, its current speed is 50-55 knots. The controller then gives a "decrease speed to six-five" command to the aircraft; since this is an unexpected command, UAACS issues a replan. The planner realizes that the aircraft is flying slower than the speed requested by the controller and therefore cannot decrease its speed, so it returns a text-to-speech command notifying the controller that it cannot decrease its speed because it is currently flying slower.

The controller then gives an "increase speed to seven-zero" command to the aircraft. Again UAACS issues a replan since it received an unexpected command. Since the maximum speed of the aircraft is 65 knots, the planner returns a text-to-speech command notifying the controller that the requested speed is beyond the maximum speed of the aircraft. This demonstrates the ability of UAACS to use the planner to verify sensor data as part of the planning process. The scenario also demonstrates the ability of UAACS to override commands from ATC when the command would be harmful to the UAS or surrounding aircraft.
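The bounds checking in this scenario can be illustrated with a small sketch. The function below is not the UAACS planner logic; it simply applies the two checks named in the text, an envelope check and a comparison against the current speed. The 65-knot maximum comes from the scenario, while the 35-knot minimum is an assumed placeholder.

# Illustrative validation of an ATC speed instruction against the aircraft's
# flight envelope and current state. Limits and response strings are examples.
def check_speed_command(direction: str, target_kt: float,
                        current_kt: float, min_kt: float, max_kt: float) -> str:
    if not (min_kt <= target_kt <= max_kt):
        return f"Unable, {target_kt:.0f} knots is outside the aircraft speed envelope"
    if direction == "decrease" and target_kt >= current_kt:
        return f"Unable, already flying slower than {target_kt:.0f} knots"
    if direction == "increase" and target_kt <= current_kt:
        return f"Unable, already flying faster than {target_kt:.0f} knots"
    return "Wilco"


# Scenario 3: cruising at 52 kt with an assumed 35 kt minimum and a 65 kt maximum
print(check_speed_command("decrease", 65, current_kt=52, min_kt=35, max_kt=65))  # already slower
print(check_speed_command("increase", 70, current_kt=52, min_kt=35, max_kt=65))  # beyond maximum speed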

4. Scenario 4: Replanning Due to Traffic

The fourth scenario performs replanning in the presence of a second aircraft in the pattern. As the aircraft enters the pattern, it is sequenced behind a second aircraft. The UAS must perform evasive maneuvers to avoid the conflicting aircraft and be correctly sequenced; in this scenario the UAS will perform S-turns to arrive behind the second aircraft. When the second aircraft is initiated, its location is continuously sent to UAACS using the traffic sender component, which simulates ADS-B. When the second aircraft is detected by UAACS, UAACS issues a replan with the location of the second aircraft and the sequencing order. The planner returns a plan that includes performing S-turns to sequence behind the second aircraft, as shown in Figure 6. This demonstrates the ability of UAACS to detect other aircraft in the airspace, integrate their positions into the plan for safety, and provide sequencing in the pattern.

Figure 6. Scenario 4 - replanning with traffic in the pattern
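The paper does not specify how the number of S-turns is chosen; one simple way to derive it is to compare estimated arrival times at the pattern entry and insert delaying legs until the UAS would arrive behind the traffic with adequate spacing. The sketch below illustrates that idea only; the spacing buffer and per-S-turn delay are assumed values, not UAACS parameters.

# Illustrative pattern-sequencing check: if the UAS would reach the pattern
# entry before the traffic it is sequenced behind (plus a spacing buffer),
# add delaying S-turn legs until its arrival is late enough.
import math


def eta_seconds(distance_nm: float, ground_speed_kt: float) -> float:
    return distance_nm / ground_speed_kt * 3600.0


def s_turns_needed(own_dist_nm: float, own_speed_kt: float,
                   traffic_dist_nm: float, traffic_speed_kt: float,
                   spacing_s: float = 60.0, s_turn_delay_s: float = 45.0) -> int:
    own_eta = eta_seconds(own_dist_nm, own_speed_kt)
    traffic_eta = eta_seconds(traffic_dist_nm, traffic_speed_kt)
    deficit = (traffic_eta + spacing_s) - own_eta   # how much later the UAS still needs to arrive
    if deficit <= 0:
        return 0
    return math.ceil(deficit / s_turn_delay_s)      # number of delaying S-turn legs to insert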

5. Scenario 5: Incorrect ATC Command

The last scenario demonstrates how the planner can be used to verify an ATC command based on the current state of the aircraft. In this scenario the controller accidentally gives a cleared-to-land command to the UAS when it is outside the Class D boundary. Since UAACS receives an unexpected ATC command, it performs a replan. The planner recognizes that the aircraft is outside the Class D airspace and should not hear a cleared-to-land command; it returns a plan that contains a text-to-speech error notifying the controller that it is not ready for a cleared-to-land command. This again demonstrates the ability of UAACS to verify commands from ATC.
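As a minimal illustration of this kind of state-based command check (not the planner's actual rule set), a cleared-to-land instruction could be validated as follows; the function name and the single boolean for airspace membership are simplifications introduced for the example.

# Illustrative check that an ATC clearance is consistent with the aircraft's
# current state: "cleared to land" is only accepted once the aircraft is
# inside the Class D airspace (reduced to a single boolean here).
def validate_clearance(command: str, inside_class_d: bool) -> str:
    if command == "cleared_to_land" and not inside_class_d:
        # The returned text would be spoken back to ATC via text-to-speech
        return "Unable, not yet inside the Class D airspace"
    return "Accepted"


print(validate_clearance("cleared_to_land", inside_class_d=False))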

C. Mission Planning

Figure 7. Mission domain specifications, including the size of each mission and fuel rates

Mosaic investigated building a mission domain that tries to solve some of the more complex, higher-level planning problems. The role of the mission planner is to make decisions on which missions should be executed, the order of execution, and details about how each mission should be executed. Figure 7 shows the specifications for the mission planner for the UAACS prototype. The UAACS mission planner must pick from a number of different surveillance missions, the type of surveillance to perform, and when, if required, the UAS should return to refuel.

The mission planner associates a cost with each possible plan it computes. For the UAACS prototype the cost is a function of the fuel used by the plan; other parameters, such as power and time, could also be used to compute the cost. The cost was also used during the planning process to reduce the required search space by eliminating plans whose total cost would exceed a given threshold.

The amount of fuel consumed by each mission is determined by the distance the UAS must fly to reach the mission, the size of the surveillance area, and the surveillance method used. The prototype system implements three types of surveillance methods: spirals, S-turns, and grid, as shown in Figure 8. Each surveillance method has its benefits and drawbacks: some methods use less fuel but produce a coarser surveillance trajectory, which may not always be desirable.

Figure 8. Three surveillance methods used by the UAACS mission planner
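As an illustration of the cost model described above (not the prototype's actual fuel tables), the sketch below combines transit fuel, area-coverage fuel scaled by a per-method factor, and a threshold-based pruning step. The method factors and parameter names are assumptions; the real mission sizes and fuel rates appear in Figure 7.

# Illustrative fuel-cost model: transit fuel to reach the mission area plus
# coverage fuel scaled by a per-method factor (finer coverage burns more fuel).
# The factors and rates below are placeholder assumptions.
METHOD_FACTOR = {"spiral": 1.0, "s_turns": 1.3, "grid": 1.6}


def mission_fuel(transit_nm: float, area_sq_nm: float, method: str,
                 cruise_gal_per_nm: float, survey_gal_per_sq_nm: float) -> float:
    return (transit_nm * cruise_gal_per_nm
            + area_sq_nm * survey_gal_per_sq_nm * METHOD_FACTOR[method])


def prune(candidate_plans, cost_threshold: float):
    """Drop candidate plans whose total cost exceeds the threshold, mirroring
    how a cost bound can shrink the planner's search space."""
    return [(plan, cost) for plan, cost in candidate_plans if cost <= cost_threshold]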

Totals:  Plans  Mincost  Maxcost  Expansions  Inferences  CPU time  Real time
         486    538.56   2021.49  8023        113527      31.372    43.290

Plans:

Plan 1:
(!EXECUTE-MISSION SUAV SURVEY-2 SPIRAL)
(!EXECUTE-MISSION SUAV SURVEY-1 SPIRAL)
(!EXECUTE-MISSION SUAV TARGET-1 SPIRAL)
(!EXECUTE-MISSION SUAV TARGET-2 SPIRAL)

Plan 2:
(!EXECUTE-MISSION SUAV SURVEY-2 SPIRAL)
(!EXECUTE-MISSION SUAV SURVEY-1 SPIRAL)
(!EXECUTE-MISSION SUAV TARGET-1 SPIRAL)
(!EXECUTE-MISSION SUAV TARGET-2 S-TURNS)

Plan 3:
(!EXECUTE-MISSION SUAV SURVEY-2 SPIRAL)
(!EXECUTE-MISSION SUAV SURVEY-1 SPIRAL)
(!EXECUTE-MISSION SUAV TARGET-1 SPIRAL)
(!REFUEL SUAV)
(!EXECUTE-MISSION SUAV TARGET-2 GRID)

...

Listing 1. Example mission plans

Listing 1 shows three of the 486 possible plans that the UAACS mission planner found, listed in order of total cost. Each plan shows the order in which the missions should be executed and which surveillance method should be used. The third plan also includes a refuel before executing the last mission. The plans also show the use of mission precedence when ordering the missions; in this case a Target mission cannot be executed before its matching Survey mission has been executed.
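This precedence constraint can be expressed as a simple check over an ordered plan, as in the illustrative sketch below. The mission names follow the SURVEY-n / TARGET-n pattern of Listing 1, but the check itself is an example, not the HTN method actually used by the planner.

# Illustrative precedence check over an ordered mission plan: a Target mission
# may only appear after its matching Survey mission has been executed.
def respects_precedence(ordered_missions: list) -> bool:
    surveyed = set()
    for mission in ordered_missions:
        if mission.startswith("SURVEY-"):
            surveyed.add(mission.split("-", 1)[1])
        elif mission.startswith("TARGET-"):
            if mission.split("-", 1)[1] not in surveyed:
                return False
    return True


print(respects_precedence(["SURVEY-2", "SURVEY-1", "TARGET-1", "TARGET-2"]))  # True
print(respects_precedence(["TARGET-1", "SURVEY-1"]))                          # False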

One area of the mission planner that Mosaic did not have time to investigate was the ability to determine which sensor package to use based on the current environmental conditions. For example, consider an aircraft carrying two surveillance sensors, infrared (IR) and electro-optical (EO): the IR sensor can see in day or night but has a lower resolution, while the EO sensor has a higher resolution but can only be used in daylight. The choice of the surveillance method will be heavily dependent on the type of sensor package used and the desired sensor resolution.
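A future sensor-selection capability of the kind described could reduce to a small decision rule, sketched below for illustration only; the inputs and the preference for EO in daylight are assumptions, not an implemented UAACS feature.

# Illustrative sensor-package choice based on time of day and required
# resolution: EO offers higher resolution but only works in daylight, while
# IR works day or night at lower resolution.
def choose_sensor(is_daylight: bool, need_high_resolution: bool) -> str:
    if not is_daylight:
        return "IR"                                     # EO is unusable at night
    return "EO" if need_high_resolution else "IR"       # in daylight, prefer EO only when needed


print(choose_sensor(is_daylight=False, need_high_resolution=True))   # IR (no EO at night)
print(choose_sensor(is_daylight=True, need_high_resolution=True))    # EO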

IV. Conclusion

Mosaic has successfully implemented and demonstrated a UAS control and planning prototype system in a simulated environment. We have shown the ability to use ASR to recognize ATC communication and to incorporate the recognized ATC instructions into the control and planning of the aircraft. We have tested the UAACS prototype against a variety of scenarios and documented the behavior of the system.


We have demonstrated the ability to perform planning and replanning based on ATC commands as well as information about the surrounding airspace.

The UAACS system will significantly improve the utility of UAS in military operations by reducing the workload of a remote UAS pilot. Clearly, the operation of UAS will continue to require the involvement of a human decision-maker; however, the proposed system will allow UAS to meet many mission requirements autonomously, thereby increasing the efficiency of UAS operations.

References

[1] de Cordoba, R., Ferreiros, J., San-Segundo, R., Macias-Guarasa, J., Montero, J., Fernandez, F., D'Haro, L., and Pardo, J., "Air traffic control speech recognition system cross-task & speaker adaptation," IEEE Aerospace and Electronic Systems Magazine, Vol. 21, No. 9, 2006, pp. 12-17.

[2] Ravich, T. M., "The Integration of Unmanned Aerial Vehicles into the National Airspace," North Dakota Law Review, Vol. 95, No. 3, 2009, p. 597.

[3] Nau, D., Ilghami, O., Kuter, U., Murdock, J. W., Wu, D., and Yaman, F., "SHOP2: An HTN planning system," Journal of Artificial Intelligence Research, Vol. 20, 2003, pp. 379-404.

[4] Walker, W., Lamere, P., Kwok, P., Raj, B., Singh, R., Gouvea, E., Wolf, P., and Woelfel, J., "Sphinx-4: A flexible open source framework for speech recognition," Tech. rep., 2004.

[5] Lamere, P., Kwok, P., Walker, W., Gouvea, E., Singh, R., Raj, B., and Wolf, P., "Design of the CMU Sphinx-4 decoder," 8th European Conference on Speech Communication and Technology (Eurospeech), 2003.

[6] Fernandez, F., Ferreiros, J., Pardo, J., Sama, V., de Cordoba, R., Macias-Guarasa, J., Montero, J., San Segundo, R., D'Haro, L., Santamaria, M., and Gonzalez, G., "Automatic Understanding of ATC Speech," IEEE Aerospace and Electronic Systems Magazine, Vol. 21, No. 10, 2006, pp. 12-17.
