JOHNS HOPKINS APL TECHNICAL DIGEST, VOLUME 23, NUMBERS 2 AND 3 (2002)

Engineering Visualization

David E-P. Colbert and R. Edward Ralston

Physics-based, high-fidelity modeling and simulation (M&S) tools that the engineering community develops and employs to analyze the performance of modern weapon systems are necessarily becoming more complex. As the fidelity of these tools increases, so does the volume of pertinent information that they generate. To interpret the wealth of resulting engineering information, APL developed "engineering visualization" as a tool to facilitate a better understanding of weapon systems and their respective models and simulations. Engineering visualization allows the engineer to present the results of M&S-based analysis to the sponsor with unrivaled clarity and efficacy. By providing the highest level of insight possible to both engineer and sponsor, engineering visualization has established itself as an essential systems engineering tool. This article describes the history, development, implementation, and future of engineering visualization in the Laboratory's Air Defense Systems Department.

INTRODUCTION

An important role of a weapon system engineer has been to interpret analytical results and present those results to the sponsor. Many early engineering efforts at APL produced tabular engineering information printouts, which an engineer pored over, looking for patterns and trends in the array of numbers. Later, two-dimensional (2D) plotting tools became readily available. The engineer now had the ability to study a single metric versus another metric in a graphical representation, looking for trends or anomalies in the graph. As time progressed, three-dimensional (3D) plotting applications became available to the weapon systems engineer that facilitated the ability to study three-parameter modeling outputs, either as a line or a surface in 3D space. Adding a fourth dimension, usually time, to these plots enabled a unique quality of animation that allowed visualization of four-parameter engineering information.

The volume and complexity of the engineering information resulting from these weapon systems models grow directly as the complexity of the threat increases. Also, with recent trends in DoD research focusing more on using statistical computer models of weapon systems, the weapon systems engineer has enormous amounts of information to analyze and interpret. In addition, these analytical results must be communicated to the sponsor in a clear and concise manner.

Commercially available engineering analysis tools are not capable of studying all aspects of a weapon system or presenting comprehensive analysis results succinctly. Often, complex analysis can result in a presentation with hundreds of slides and can require several hours to present to the sponsor. To make this problem manageable, APL has developed "engineering visualization" as a tool to study such voluminous amounts of complex engineering information and to present findings to the sponsor efficiently and effectively.

EVOLUTION

Single-Screen Nondistributed Visualization

The first class of engineering visualization, single-screen nondistributed visualization, has been used to analyze several weapon systems. This section describes the genesis of this fundamental visualization tool.

Visualization Entities

The visualization software's basic purpose is to take weapon system simulation data and efficiently build a 3D representation of those data within a visualization scene (Fig. 1). Thus mapping the data of an object (e.g., a missile) to a meaningful visual representation of the object is the fundamental objective. The core C++ object within the visualization software is, therefore, the entity object, which contains a reference to the graphical representation of the object within the visualization scene, a reference to the data source, and the mapping of the data to the graphical object's various controls (such as position, orientation, and articulations). In the visualization software, anything that has a visual representation and is driven by data is represented as an entity object. The visualization software consists of about 250 C++ classes and contains a visualization scene; a Motif interface; a math library; networking code, camera, and event scripts; a 3D graphical user interface; and 2D overlays.
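As a rough illustration of this design, the sketch below shows how such an entity object might be structured; the GraphicObject, DataFrame, and DataSource types and their members are hypothetical stand-ins for illustration, not the actual APL classes.

```cpp
#include <functional>
#include <memory>
#include <string>
#include <utility>
#include <vector>

// Hypothetical scene node exposing the controls the article mentions:
// position, orientation, and named articulations.
struct GraphicObject {
    void setPosition(double, double, double) {}
    void setOrientation(double, double, double) {}
    void setArticulation(const std::string&, double) {}
};

// Hypothetical per-time-step record from a weapon system simulation output.
struct DataFrame {
    double time = 0.0;
    std::vector<double> values;   // e.g., position, attitude, fin deflections
};

// Hypothetical data source that returns the record valid at a given time.
struct DataSource {
    DataFrame current;
    const DataFrame& frameAt(double /*time*/) const { return current; }
};

// Sketch of an entity object: it references one graphic object and one data
// source and applies a list of data-to-control mappings each time step.
class Entity {
public:
    using Mapping = std::function<void(const DataFrame&, GraphicObject&)>;

    Entity(std::shared_ptr<GraphicObject> g, std::shared_ptr<DataSource> d)
        : graphic_(std::move(g)), data_(std::move(d)) {}

    void addMapping(Mapping m) { mappings_.push_back(std::move(m)); }

    // Called once per visualization time step.
    void update(double sceneTime) {
        const DataFrame& f = data_->frameAt(sceneTime);
        for (const auto& m : mappings_) m(f, *graphic_);
    }

private:
    std::shared_ptr<GraphicObject> graphic_;
    std::shared_ptr<DataSource> data_;
    std::vector<Mapping> mappings_;
};
```

A mapping registered this way might, for example, read a fin-deflection column from the data frame and write it to the corresponding articulation on the graphic object.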

Defending Missile Entity

Air defense systems that use a defensive missile have detailed physics-based models of that missile, and these models produce information that must be correctly depicted. Visualizing the important aspects of the defending missile begins by creating a 3D representation. To create such a representation, the engineer must obtain engineering diagrams, blueprints, photographs, computer-aided design models, and detailed descriptions of the missile to the highest possible fidelity. Using all of these resources, he creates via software a polygonal 3D representation (or framework) of the missile to scale. Next, photographs are used to create computer images, known as textures, that mimic the surface appearance of the missile body. With the textures mapped onto the polygonal 3D representation or "wireframe," the engineer can effectively reproduce the size, shape, and appearance of the defending missile body. Then additional texture-wrapped polygonal representations of each stage's rocket motor flames may be attached to the tail of each respective missile stage. Finally, each part of the missile that is to be driven by modeling data (such as aero-control surfaces) must have an "articulation," which describes its location with respect to the center of gravity of the missile and its range of motion relative to the missile's axes (Fig. 2).
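A minimal sketch of what such an articulation record might hold is shown below; the field names, units, and the clamping helper are illustrative assumptions rather than the actual APL data structure.

```cpp
#include <algorithm>
#include <array>

// Illustrative articulation record: where a movable part sits relative to the
// missile's center of gravity, which body axis it rotates about, and the range
// of motion it is allowed. Field names are assumptions, not the APL originals.
struct Articulation {
    std::array<double, 3> offsetFromCg;   // meters, missile body frame
    std::array<double, 3> rotationAxis;   // unit vector, missile body frame
    double minAngleDeg;                   // lower limit of travel
    double maxAngleDeg;                   // upper limit of travel
    double currentAngleDeg = 0.0;

    // Drive the articulation from simulation data, clamped to its range of motion.
    void setFromData(double commandedAngleDeg) {
        currentAngleDeg = std::clamp(commandedAngleDeg, minAngleDeg, maxAngleDeg);
    }
};

// Example: a tail fin hinged 2.1 m aft of the center of gravity, rotating about
// the body y axis, limited to +/-30 degrees of deflection (values are made up).
static Articulation fin1{{-2.1, 0.0, 0.0}, {0.0, 1.0, 0.0}, -30.0, 30.0};
```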

Figure 1. The weapon system simulation provides data to the entity, which attaches the data to the graphics object and its articulation in the visualization screen.

The resultant texture-wrapped polygonal representation can, however, be so highly detailed that computer graphics hardware becomes saturated, especially if numerous such representations are simultaneously displayed in a visualization scene. Although the performance of the visualization software is greatly enhanced by object culling (removing "hidden" objects), most of the graphics optimization comes from polygon reduction. APL achieved polygon reduction by creating several versions of each graphic object, each with a different level of detail. When rendering a scene, the appropriate version is chosen based on the distance of the object from the viewer: the farther an object is from the viewer, the simpler the graphic object can be.
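The distance-based selection described above might look roughly like the following sketch; the version list and distance thresholds are hypothetical.

```cpp
#include <vector>

// One pre-built version of a graphic object at a given polygon budget, valid
// out to a maximum viewing distance (the thresholds are illustrative).
struct LodVersion {
    int    polygonCount;
    double maxDistanceMeters;  // use this version while the viewer is closer than this
};

// Pick the most detailed version whose distance budget covers the current range.
// Versions are assumed to be ordered from most to least detailed, with
// increasing maxDistanceMeters.
inline const LodVersion& selectLod(const std::vector<LodVersion>& versions,
                                   double distanceToViewerMeters) {
    for (const LodVersion& v : versions) {
        if (distanceToViewerMeters <= v.maxDistanceMeters) return v;
    }
    return versions.back();  // beyond all thresholds: fall back to the coarsest model
}
```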

The engineer must next focus on the need to visualize from the perspective of the six-degree-of-freedom (6-DOF) model of the defending missile. The primary parameters that must be visualized are the data that describe the actual translation and rotation of the missile at a fine time-step granularity (i.e., time, position, velocity, attitude, and acceleration). By visualizing these parameters, the engineer can clearly see for each time step where the missile is, where it is heading, and where it is pointing (Fig. 3).

Figure 2. Polygonal (a), surface (b), and textured (c) representations of the missile are shown, as well as a missile graphic object in the process of being constructed (d).

Figure 3. The missile entity, in this case a Standard Missile-3.

Event information also facilitates the visualization of how the body of the missile changes size and shape as the missile continues along its flight path. Event information includes the missile launch time, each missile stage's rocket motor burnout and separation times, the ejection time of minor elements of the missile body (e.g., nosecone, dome cover, clasps or nuts, bolts), impact events (e.g., debris clouds, explosions), and translucent spheres around the missile that represent its lethal kill radius.

Additional parameters that the engineer must visualize are the thrust and aero-control surface data. These include the main thrust of the missile, the deflections of the missile's fins, and any attitude control thrusters on the missile. With these parameters added to the missile visualization, the engineer can scale the rocket motor flames in proportion to the main thrust data; rotate each fin according to its fin deflection data; and turn on, turn off, and scale the attitude control thruster flames according to the attitude control thruster data. Having incorporated the thrust parameters and aero-control surface data into the visualization, the engineer can visualize the effects of the forces that affect the flight of the defending missile. To add realism to the visualization, the engineer may even add scaled smoke trails to the rocket motor flames to simulate plume from the rocket motors.

After obtaining the missile body representation and corresponding data driving each element of the body, the engineering information that describes the missile's onboard sensor must be added. The onboard sensor usually comprises a protective cover and the sensor itself. A translucent cone or beam emanating from the sensor face is used to represent its field of view. Sensor information includes the half-power beamwidth of the sensor, the range (if appropriate) of the sensor, the azimuth and elevation of the beam at all times, and the sensor activation time. Combining the protective cover ejection time from the stage data with the parameters for the sensor beam, the engineer can visualize the precise location, size, shape, and pointing angle of the sensor beam at all times, as well as any entities which happen to be contained within its field of view.
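Determining which entities fall inside such a sensor cone reduces to a simple geometric test; the sketch below assumes the cone is described by its apex, a unit pointing vector, a half-angle, and a maximum range (all hypothetical names).

```cpp
#include <cmath>

struct Vec3 { double x, y, z; };

inline Vec3   sub(const Vec3& a, const Vec3& b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
inline double dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
inline double norm(const Vec3& a)               { return std::sqrt(dot(a, a)); }

// Sketch: true if a point (e.g., another entity's position) lies inside a sensor's
// field-of-view cone. apex = sensor face position, axis = unit pointing vector,
// halfAngleRad = half the beamwidth, maxRange = instrumented range of the sensor.
bool insideSensorCone(const Vec3& point, const Vec3& apex, const Vec3& axis,
                      double halfAngleRad, double maxRange) {
    const Vec3 toPoint = sub(point, apex);
    const double range = norm(toPoint);
    if (range == 0.0) return true;          // coincident with the sensor face
    if (range > maxRange) return false;     // beyond the sensor's range
    const double cosOffAxis = dot(toPoint, axis) / range;
    return cosOffAxis >= std::cos(halfAngleRad);
}
```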

Threat Entity

The threat visualization process closely resembles the process for a defending missile except that intelligence information is now a critical input. The 3D representation of the threat is again created from any available engineering diagrams, blueprints, etc., pertaining to the threat, either from freely available or, more often, intelligence sources. Using these resources, the engineer generates via software a technically accurate wireframe of the threat and maps textures onto the polygons that mimic the threat's appearance. Finally, the engineer adds flames to the rocket motors and articulations to any elements of the threat body, both of which are dynamically driven by data. The threat's launch vehicle can also be visualized using the same process.

The flight of the threat for each time step can be visualized from data on its position, velocity, attitude, acceleration, main thrust, attitude control thrusters, and stage events. The engineer follows a visualization process similar to the one described for the defending missile to produce the trajectory, forces, and stage events for the threat. The engineer may now see where the threat is, where it is going, how it is oriented along its flight path, whether it is speeding up or slowing down, what forces are acting upon it, and how its size and shape change as its flight progresses (Fig. 4).

Ship Entity

The engineer must next add the launch platform for the defending missile, typically a ship. Using available Navy ship engineering diagrams, blueprints, etc., a 3D representation of the launching ship is produced in a similar manner as the defending missile and the threat. The trajectory for the ship has been less complicated to date than for the missile and threat because it can be described by a single latitude, longitude, and heading (Fig. 5). Adding roll, pitch, and yaw as well as surge, sway, and heave is simply a matter of attaching data from a ship motion model to the ship entity.

Figure 4. The threat entity, shown here as a target test vehicle.

Radar Entity

Housed on the ship entity, the radar entity (typically a phased array SPY face) must now be added to the visualization to show search and track parameters. The metrics for the radar include the azimuth, elevation, and instrumented range extent of the volume in which the radar searches for the threat, as well as the half-power beamwidth, instrumented range extent, and dynamic pointing angles of the beam(s) with which the radar searches the volume. With this information, the engineer may construct 3D representations of the radar beams and search volume. These 3D representations are created by constructing polygonal representations of the beams and volume, adding color and translucency so that the visualization simulates peering into the beams and volume to determine what entities (such as threats) are present.

The engineer must attach the beams and volume to the appropriate SPY radar face on the representation of the ship using information from the detached radar model (typically the SPY Firm Track model) to describe their motion. The search volume frame time determines how long it takes the radar beam to scan the entire search volume. Radar beam azimuth and elevation describe exactly where each beam is pointed at each moment in the visualization. The radar model (Fig. 6) also collects the state, type, group, cluster, object, and correlation information pertaining to the tracks for all threats that the radar is tracking.

Fire Control Entity

Also attached to the ship entity is the fire control entity, or in some cases the illuminator(s). This entity is represented as a set of beams and volumes similar to the radar beams and volumes. The illuminator beams typically have different dimensions than those for the radar entity. Illuminator beams are attached to the illuminator dishes on the ship, and the beams are pointed according to the desired pointing angle of the dish. The illuminator dishes on the ship must also be rotated toward the threat because they are mechanically aimed (Fig. 7).

Aircraft Entity

All 3D representations of both friendly and threat aircraft that the engineer visualizes are created in the same way as a defending missile or threat is created. The position and attitude of the aircraft at a fine time step are needed to visualize its flight. The visualization now shows where the aircraft is at each point in time, as well as how it is oriented at that time. For aircraft with an onboard sensor, the 3D representations for the sensor typically include the sensor itself, the sensor's search volumes, and the sensor's search beams. The beams and volumes for the sensor are similar to those of the radar and illuminator, except that they have different sizes and shapes (Fig. 8).

Figure 5. The ship entity, an Aegis cruiser in this visualization.

Figure 6. The outer search volume, the inner search beam, and the threat can be seen in this image of a radar entity.


Figure 7. The two beams emanating from the ship toward the threats on the horizon are the illuminators of the fire control entity.

Figure 8. The aircraft entity, in this case an E-2C.

Celestial Entity

If the defending missile guides to the threat with a seeker that may be affected by stars entering its field of view, then celestial entities are important. Celestial entities are represented as emissive points on a celestial sphere that is at a great distance from the Earth. The instantaneous star and planet locations on the celestial sphere are calculated from their right ascension and declination according to the U.S. Naval Observatory's Multiyear Interactive Computer Almanac for the specific launch date and time. Using this information, the engineer constructs a geocentric celestial sphere at a distance of 10 RE (Earth radii) from the Earth.

Terrain Entity

The engineer must also create the terrain over which all other entities are located. The terrain entity is either a 3D representation of the entire Earth or a small patch of it. Terrain entities may be generated in several different ways and to several different fidelity levels, depending on the specific requirements of the visualization.

Several geographic datasets are used to create a terrain visualization. One source is digital terrain elevation data (DTED) from the National Imagery and Mapping Agency (NIMA). DTED datasets contain information for most of the Earth on ground-level elevation above mean sea level. This information is obtained via satellite. Another source is digital elevation map data from NIMA, which includes similar elevation information in a different file format and for different resolutions than DTED information. In addition, digital feature analysis datasets contain cultural features for most of the Earth, e.g., buildings, roads, canals, railroads, airports, etc. The engineer uses several software packages to generate terrains and integrates the data from several such geographical datasets to create any of three terrain classes.

The first class is low-fidelity terrain, which is typically used for visualization of Anti-Air Warfare analysis where scenarios usually occur over a small area and are generally not near major landmasses. Several steps are necessary to generate this type of terrain. First, the appropriate DTED data for the area of the world to be visualized are loaded into polygon reduction software to construct the initial flat-Earth 3D polygonal representation of the location. Next, the engineer creates a low-fidelity image texture from the DTED or digital elevation map dataset using any of several software packages and then maps the texture onto the polygons. The engineer must now convert the terrain from its flat-Earth representation into a non-flat-Earth model, which is usually the 1984 World Geodetic System's ellipsoid Earth. In this manner, the engineer creates a terrain that contains elevation information, not only in the polygons that determine the shape of the terrain but also in the texture that is mapped onto the polygons (Fig. 9).

The second class, medium-fidelity terrain, has a relatively low-fidelity polygonal representation but a relatively high-fidelity texture mapped onto these polygons. This terrain class has been used mainly for Theater Ballistic Missile Defense (TBMD) analysis, which takes place far above the Earth and thus does not require a high-fidelity polygonal representation of the Earth's surface. However, visualization cameras are often in the exo-atmosphere looking down on the Earth, which requires a large and relatively high-fidelity texture to cover the camera's field of view. To generate a terrain of medium fidelity, the engineer has to follow steps that are slightly modified from those for the low-fidelity terrain. First he loads the appropriate DTED data for the pertinent area of the world into polygon reduction software. He may need to load several datasets into the software and concatenate them into one large flat-Earth terrain to cover the entire area. Following this, either DTED or digital elevation map datasets are loaded into the software, which reads the datasets and converts them into images. In this way, the engineer creates a relatively high-resolution texture wrap for the terrain. Finally, the terrain is converted from its flat-Earth representation into the particular coordinate system and Earth model. The result is a terrain similar to the low-fidelity terrain, except that the medium-fidelity terrain covers a much larger area of the Earth (Fig. 10).
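The flat-Earth-to-ellipsoid conversion mentioned for both terrain classes amounts to mapping each vertex's latitude, longitude, and elevation to Earth-centered coordinates on the WGS-84 ellipsoid. A standard form of that transformation is sketched below; it treats elevation above mean sea level as height above the ellipsoid for simplicity, and the structure names are illustrative.

```cpp
#include <cmath>

struct Ecef { double x, y, z; };  // Earth-centered, Earth-fixed coordinates, meters

// WGS-84 ellipsoid constants.
constexpr double kWgs84A  = 6378137.0;                   // semi-major axis, meters
constexpr double kWgs84F  = 1.0 / 298.257223563;         // flattening
constexpr double kWgs84E2 = kWgs84F * (2.0 - kWgs84F);   // first eccentricity squared

// Convert a terrain vertex given as geodetic latitude/longitude (radians) and
// height above the ellipsoid (meters) into WGS-84 ECEF coordinates.
Ecef geodeticToEcef(double latRad, double lonRad, double heightM) {
    const double sinLat = std::sin(latRad);
    const double cosLat = std::cos(latRad);
    // Radius of curvature in the prime vertical at this latitude.
    const double n = kWgs84A / std::sqrt(1.0 - kWgs84E2 * sinLat * sinLat);
    return {(n + heightM) * cosLat * std::cos(lonRad),
            (n + heightM) * cosLat * std::sin(lonRad),
            (n * (1.0 - kWgs84E2) + heightM) * sinLat};
}
```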

Figure 9. The terrain in this image is a low-fidelity graphic representation of Hawaii, which was created for an Anti-Air Warfare reconstruction visualization. Kauai may be seen just behind the ship.

Figure 10. This middle-fidelity terrain of the Persian Gulf was created for a single-screen nondistributed TBMD visualization.

The third class of terrain, and by far the most complicated to generate, is the high-fidelity terrain. This terrain has a high-fidelity polygonal representation and a high-fidelity texture. It is used only when it is necessary to have all of the characteristics of the medium-fidelity terrain plus the highest geometrical accuracy and precision possible at the surface of the Earth. These terrains have been generated for joint mission analysis of Overland Cruise Missile Defense (OCMD) and TBMD, which involve defending against exo-atmospheric theater ballistic missiles and terrain-hugging overland cruise missiles, respectively.


To build a terrain of this class, the engineer must first determine what computer resources are available on the visualization platform, including main memory and geometry as well as the texture limits of the graphics card. The terrain is designed based on these hardware constraints. To generate a high-fidelity terrain the engineer must first follow the steps for building a medium-fidelity terrain, then begin again with the same DTED datasets and batch-process the terrain as a set of rectangular tiles with far more polygonal detail. The terrain is divided into this set of rectangular tiles so that hardware resources are conserved by only having the hardware render high-fidelity tiles which are closest to the viewpoint. Similar to the level of detail optimization used for entities such as the missile and target, simpler versions of the terrain tiles are used as the distance from the viewer increases.

Even with this terrain optimization, the engineer still must perform more optimizations to ensure that the capabilities of the target platform's graphics hardware are not exceeded. Therefore, a critical step in generating a high-fidelity terrain is selecting an appropriate polygon reduction method and reduction parameters for the chosen method, customized to the specific capabilities of the computer platform. The engineer then maps the highest-fidelity texture onto the tiles. If cultural features for the terrain are required, then the digital feature analysis dataset overlays cultural features onto the terrain as a polygonal representation of the selected features (Fig. 11).

2D Overlays

Since not all data of interest can be meaningfully shown as a 3D object, the engineer needs to also support other forms of data display. This is accomplished by overlaying 2D graphic representations of the data such as 2D plots, rotary dials, scales, compasses, and attitude indicators onto the 3D scene, thereby allowing visual correlation. In addition to displaying the 2D data as graphical overlays, the data can also be displayed as text. Although this is not the optimal display choice for most data items, it is essential to be able to display text for such information as titles, time of flight, miss distance, and missile stage. It is also useful to have text displays in the 3D visualization environment such as annotations for track numbers attached to threats and labels attached to stars in the visualization. The text is rendered using a texture-mapped font for efficiency and to give the engineer the flexibility of editing the pixmap-based fonts.

Configuration Files

As the visualization software matured and APL amassed a large library of terrains and graphic models, users began to realize that most of the subsequent work in visualizing new input datasets would entail manipulating the input data rather than developing new features or graphic models. The input data come from a multitude of sources and are usually given to APL in a format that needs to be manipulated, resampled, or combined before they can be used by the visualization software. APL improved the process of visualizing new datasets somewhat by integrating a library of classes that provided useful mathematical operations (e.g., unit conversions, coordinate transformations, and quaternion-based spherical interpolations). However, the availability of certain data would sometimes necessitate the modification of the C++ visualization code. For example, if APL received data that included an articulation that had not been previously visualized, then the software would have to be modified so that these data could drive the articulation control points of the graphic representation.

The solution was to develop a configuration file that would fully describe the data and the control points of the graphic representation. This configuration file would also specify how the data were to be mapped to the corresponding graphic representation. Developing a visualization was then a matter of modifying the configuration file instead of modifying the visualization code.
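The idea can be illustrated with a small sketch: a table parsed from a configuration file names a data column and the graphic control point it should drive, so adding a new articulation means adding a line to the file rather than recompiling. The line format and structure names below are invented for illustration and are not APL's actual configuration syntax.

```cpp
#include <istream>
#include <string>
#include <vector>

// One line of a hypothetical configuration file, e.g.:
//   fin1_deflection -> articulation:fin1 scale=1.0
// naming the data column, the graphic control point it drives, and a scale factor.
struct ControlMapping {
    std::string dataColumn;    // column name in the input data file
    std::string controlPoint;  // articulation or control on the graphic object
    double scale = 1.0;        // unit conversion applied before driving the control
};

// Parse the illustrative "column -> control scale=x" lines into mapping records.
std::vector<ControlMapping> parseMappings(std::istream& config) {
    std::vector<ControlMapping> mappings;
    std::string column, arrow, control, scaleToken;
    while (config >> column >> arrow >> control >> scaleToken) {
        ControlMapping m{column, control, 1.0};
        if (scaleToken.rfind("scale=", 0) == 0)
            m.scale = std::stod(scaleToken.substr(6));
        mappings.push_back(m);
    }
    return mappings;
}
```

In APL's implementation this role is played by an embedded scripting language rather than a fixed format, as described next.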

APL also wanted the configuration file to be flexible enough to allow it to read each data file in its native format. Rather than writing a static configuration file format, the Laboratory decided to embed a scripting language into the visualization software. This would allow the flexibility to perform arbitrary data manipulations, define the control points, and specify the data-to-graphic mapping within a single configuration file. Perl was chosen as the scripting language because it is optimized for scanning arbitrary text files such as APL's visualization data files, has object-oriented support, can be embedded into existing applications with relative ease, and has an established community that has developed a voluminous software code base from which APL draws.

Figure 11. These eight images display the level of precision to which a high-fidelity terrain must be generated while still covering a large area. In (a) and (b), the relatively small distance between the missile and the mountain ridge over which it has just flown may be seen. In (c)–(e), the red profile of the cruise missile is seen weaving through valleys (light green) in the terrain. The scenario as seen from space is depicted in (f)–(h). This high-fidelity terrain was generated for a mixed-mission TBMD and OCMD multiscreen distributed visualization, another class of visualization which is discussed later in this article.

Visualization Products

Because visualizations allow for user interaction, they are most useful when viewed from a display while it is being rendered directly from the computer. Being able to interact with the visualization (e.g., viewing a scene from varying angles, changing the playback speed, enlarging screens on a subdivided display) enables the engineer or sponsor to focus on any aspect of the visualization desired.

Since the host platform for the visualization software is too large to permit ease of physical portability, APL has used several methods of capturing the visualization onto analog or digital media to create portable versions for distribution. The simplest but least preferred method is an analog VHS video capture of the visualization. To create the VHS videotape, a video camera is mounted directly facing the computer monitor of a running visualization.

An alternative and preferred method of capturing to VHS videotape is to first create a digital video capture and then create the videotape from the digital version. To make a digital capture of the visualization, APL added code to step through the visualization (usually at 30 frames per second), capturing each individual frame to disk. After the visualization has saved all of the frames to disk, movie-editing software is used to assemble the frames and compress them into a single quicktime movie. Once the quicktime movie has been generated, it may be postprocessed to include titles, sound, labels, transitions, and other effects which improve the quality of the movie. The final movie is then written to videotape.
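A sketch of that fixed-rate capture loop is shown below; renderScene and writeFrameToDisk stand in for the actual rendering and image-writing calls, which are not named in the article, and the filename pattern is arbitrary.

```cpp
#include <cstdio>
#include <string>

// Placeholder stubs for the real rendering and image-writing calls; a real
// build would supply the actual implementations.
void renderScene(double /*sceneTime*/) {}
bool writeFrameToDisk(const std::string& /*filename*/) { return true; }

// Step through the visualization at a fixed frame rate (30 frames per second in
// the article), saving one numbered image per frame for later movie assembly.
void captureVisualization(double startTime, double endTime, double framesPerSecond) {
    const double dt = 1.0 / framesPerSecond;
    int frameIndex = 0;
    for (double t = startTime; t <= endTime; t += dt, ++frameIndex) {
        renderScene(t);
        char name[64];
        std::snprintf(name, sizeof(name), "frame_%06d.ppm", frameIndex);
        if (!writeFrameToDisk(name)) break;  // stop if the disk write fails
    }
}
```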

To produce an even higher-quality visualization product, the quicktime movie may be written to a DVD-R disk for playback on a television with a DVD player. For the highest-quality portable visualization product, the engineer writes the quicktime movie to a compact disk for playback on a desktop or laptop computer.

The digital capture process can be lengthy because, although the visualization software can display the rendered images at high frame rates, the capture software is limited by the speed at which it can write the images to disk. APL accelerated this process by purchasing specialized computer hardware that could save the images to disk as fast as they could be rendered. The addition of this hardware allows a visualization that had previously required days to be captured to now be captured in a matter of minutes.

To allow digital capture of visualizations, it was also necessary to have scripted interactions (e.g., camera angle changes, screen zooms, pauses, playback speed changes) so that the digital capture could be performed without user interaction. To facilitate this, the visualization software was written with a generic event interface that allowed all actions to be driven either by a user interactively invoking events through the keyboard and mouse or by inserting fixed events into the configuration script.
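One way such a generic event interface can be organized is as a single time-ordered queue that both interactive input and scripted entries feed into; the sketch below illustrates that pattern and is not the APL implementation.

```cpp
#include <functional>
#include <queue>
#include <utility>
#include <vector>

// A visualization event: an action to run when the scene clock reaches fireTime.
struct Event {
    double fireTime;                 // scene time at which the event triggers
    std::function<void()> action;    // e.g., change camera angle, pause, zoom
};

struct LaterFirst {
    bool operator()(const Event& a, const Event& b) const { return a.fireTime > b.fireTime; }
};

// Time-ordered event queue; scripted events are pushed at startup from the
// configuration script, and interactive events are pushed as the user invokes them.
class EventQueue {
public:
    void push(double fireTime, std::function<void()> action) {
        queue_.push(Event{fireTime, std::move(action)});
    }

    // Called once per frame: fire everything whose time has arrived.
    void dispatchUpTo(double sceneTime) {
        while (!queue_.empty() && queue_.top().fireTime <= sceneTime) {
            queue_.top().action();
            queue_.pop();
        }
    }

private:
    std::priority_queue<Event, std::vector<Event>, LaterFirst> queue_;
};
```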

Reconstruction Visualization

The second class of engineering visualization—reconstruction visualization—is built on the same visualization framework as single-screen nondistributed visualization. Reconstruction visualization allows the weapon system engineer to simultaneously compare the actual performance of real-world weapon systems to the performance predictions of M&S tools. The sources of the engineering information for this visualization class include data from actual weapon systems and telemeters from weapon systems participating in an at-sea or land-based test firing. Reconstruction visualization allows the engineer to validate the M&S tools and present the findings to the sponsor.

The visualization entities in a reconstruction visualization are the same as those for the single-screen nondistributed class of visualization. The data, however, are collected from several of the participating weapon systems in the test. The data germane to the ship and shipboard weapon systems—ship location, ship heading, raw radar track from the SPY radar, filtered radar track from the weapon control system (WCS), and illuminator information from the fire control system—are all collected by the ship systems and sent to APL via a secure link to the Aegis performance assessment network. Engineering information germane to the defending missile system—the missile state bilevels, position, attitude, velocity, acceleration, and gimbal angles from the seeker—are collected from telemeters onboard the defending missile and transferred to APL via a secure network link with the Naval Warfare Assessment Station. The engineer must use several data fusion techniques to overcome the disparities in the data and integrate all of the engineering information into a single visualization.

Telemetry data for the reconstruction is collected from separate physical telemeters, all of which have unique clock and coordinate systems. To integrate these data sources, the engineer must align all telemeter clocks to a standard clock (usually Greenwich Mean Time) and resample all of the variable frequency data to a single, fixed sampling rate. Then he must align the coordinate systems. The coordinate system of the data collected from the missile telemeter is usually missile body frame, whereas the coordinate system of the data collected from the ship is usually either downrange-crossrange-up or east-north-up. To import these data into the visualization, the engineer must transform the data from their native coordinate systems into the coordinate system of the visualization and then merge the processed data from the various telemeter files into separate data files for each visualization entity.
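The clock alignment and fixed-rate resampling step might look like the sketch below, which shifts each sample onto a common time base and linearly interpolates onto a uniform grid; the sample structure and the use of linear interpolation are assumptions for illustration.

```cpp
#include <cstddef>
#include <vector>

// One telemetered sample: time on the telemeter's own clock, plus a value
// (a position component, a gimbal angle, etc.).
struct Sample {
    double time;
    double value;
};

// Shift a telemeter's samples onto the standard clock (e.g., Greenwich Mean Time)
// given the known offset between the two clocks, then resample the variable-rate
// series onto a fixed rate by linear interpolation between bracketing samples.
std::vector<Sample> alignAndResample(std::vector<Sample> raw,
                                     double clockOffsetToStandard,
                                     double fixedRateHz) {
    for (Sample& s : raw) s.time += clockOffsetToStandard;

    std::vector<Sample> out;
    if (raw.size() < 2 || fixedRateHz <= 0.0) return out;

    const double dt = 1.0 / fixedRateHz;
    std::size_t i = 0;
    for (double t = raw.front().time; t <= raw.back().time; t += dt) {
        while (i + 1 < raw.size() && raw[i + 1].time < t) ++i;  // advance the bracket
        const Sample& a = raw[i];
        const Sample& b = raw[i + 1];
        const double w = (b.time > a.time) ? (t - a.time) / (b.time - a.time) : 0.0;
        out.push_back({t, a.value + w * (b.value - a.value)});
    }
    return out;
}
```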

The next step is for the engineer to collect the data from actual sensors in real-world weapon systems. These data will inherently contain noise and singularities due to the physical characteristics of the sensor, so the engineer must remove the singularities from the data or replace them with an average of surrounding points. Noise from the data must be removed with filtering techniques appropriate to the type of trajectory that is being reconstructed. Low-resolution, high-altitude, radially inbound threat tracks require different noise filters than high-resolution, low-altitude, weaving threats. The engineer determines the proper filter based on the quality of the track, the dynamics of the track, and any known peculiarities specific to the particular tracking sensors and telemeters recording the data.
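The singularity-replacement step described above can be as simple as substituting each flagged sample with the mean of its nearest valid neighbors, as in this illustrative sketch (the flagging criterion itself is assumed, not taken from the article).

```cpp
#include <cstddef>
#include <vector>

// Replace flagged samples (singularities such as sensor dropouts or wild points)
// with the average of the nearest valid neighbors on either side. The caller is
// assumed to have flagged bad samples already, e.g., by a threshold test.
void replaceSingularities(std::vector<double>& values, const std::vector<bool>& isBad) {
    for (std::size_t i = 0; i < values.size(); ++i) {
        if (!isBad[i]) continue;
        double sum = 0.0;
        int count = 0;
        // Search outward for the nearest good sample before and after index i.
        for (std::size_t j = i; j-- > 0;)
            if (!isBad[j]) { sum += values[j]; ++count; break; }
        for (std::size_t j = i + 1; j < values.size(); ++j)
            if (!isBad[j]) { sum += values[j]; ++count; break; }
        if (count > 0) values[i] = sum / count;
    }
}
```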


To produce a best estimate of the actual filtered trajectories, the engineer works with several quantities and weighs them based on confidence in the data. For example, if there is high confidence in the actual intercept point of the threat and missile, the engineer may choose to fix the final point in both trajectories to the actual intercept point and filter the position data from either the missile telemeter or the SPY radar track data back to the trajectories' respective origins. However, if there is no confidence in the actual intercept point but instead confidence in the launch position and the velocities of the two trajectories, the engineer may fix the launch locations and determine the position of the two entities using the velocity from either the SPY radar track data or the missile telemeter data.

It is obvious from these two examples alone that the process of converting raw real-world data collected from sensors and telemeters into a filtered best estimate of the actual trajectory is a delicate and sensitive one in which data integrity must be maintained yet known anomalies filtered out. Through optimization and automation techniques, the time required to process these data from the time that they arrive at APL has been shortened from several weeks to about a day.

The visualizations of this real-world engineering information are used to analyze the actual performance of the SPY radar, illuminators, defending missile, and threat weapon systems participating in an actual at-sea test. Reconstruction visualization may be compared to M&S predictions for the at-sea test performed before and after the actual test. By overlaying all of this engineering information into a single visualization, the engineer can simultaneously observe APL's pretest modeled performance prediction, the actual performance of the actual weapon system from the at-sea test, and the posttest modeled performance prediction (Figs. 12 and 13).

Figure 12. To the left is a reconstruction from the threat perspective. The beams are from the ship's illuminators. To the right is the same reconstruction from the missile's perspective. The green cone emanating from the missile is its seeker.

Figure 13. A reconstruction of the successful Standard Missile-2 engagement of a Lance target over the White Sands Missile Range in 1997. The SM-2 telemetry data were analyzed to determine the relative range, velocity, and attitude of the missile and target at endgame and were then used as input to the visualization. The visualization screen is divided into two parts: the first screen shows a scene of the endgame from a virtual camera oriented according to the SM-2's infrared (IR) camera gimbal angles, and the second screen shows the seeker IR images captured from the actual flight test. By setting the field of view of the virtual camera and orienting the camera according to the actual IR seeker's orientation, as reported in the telemetered data, APL was able to visually confirm that the visualized geometry was correct. The visualization was then used to extrapolate past the last available telemetry data point to not only conclude that the engagement was a "direct hit" but also pinpoint where on the target the missile collided.

Quicktime Visualization

The third class of visualization, quicktime visualization, is inherently distinct from the other classes. It is by nature a 2D visualization and may be created offline by virtually any third-party software package that is used to analyze engineering information. APL developed quicktime visualizations to meet a requirement for integrating 2D engineering information into 3D visualizations and synchronizing the playback of the information to the executing quicktime. Examples of such 2D image sequences are returns from onboard sensors, results of finite element analysis of intercepts, results of discrimination and handover algorithm analysis, quad charts displaying engineering information exported from third-party applications, and slide show presentations (Fig. 14).


Figure 14. Three examples of quicktime visualizations: the left image is a visualization of the handover and discrimination process, the center image is a visualization of the finite element analysis of intercept, and the right image is a quad chart of some metrics relevant to the radar entity.

To generate the quicktime visualization, the frames of the quicktime movie are produced using third-party software applications. Next the images and times for each respective frame of the visualization are collected and the images are compressed into a quicktime movie. The engineer must associate a time to each of the frames in the quicktime movie. Then he begins playback of the quicktime visualization and synchronizes the current frame time of the running quicktime movie to the current time in the running master visualization. Quicktime visualizations may either be viewed as a stand-alone visualization or as a node in a multiscreen distributed visualization, which is described in the following section.
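The synchronization step amounts to picking, for the master visualization's current time, the movie frame whose timestamp most recently passed; a minimal sketch of that lookup is shown below, assuming the per-frame times are stored sorted.

```cpp
#include <algorithm>
#include <cstddef>
#include <iterator>
#include <vector>

// Given the per-frame timestamps associated with a quicktime movie (sorted, in
// master-visualization time) and the master visualization's current time, return
// the index of the frame that should currently be displayed.
std::size_t frameForMasterTime(const std::vector<double>& frameTimes, double masterTime) {
    if (frameTimes.empty()) return 0;
    // First frame whose timestamp is greater than the current master time...
    auto it = std::upper_bound(frameTimes.begin(), frameTimes.end(), masterTime);
    // ...so the frame to show is the one just before it (clamped to the first frame).
    if (it == frameTimes.begin()) return 0;
    return static_cast<std::size_t>(std::distance(frameTimes.begin(), it) - 1);
}
```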

Multiscreen Distributed Visualization

The fourth class of visualization that APL has developed is a multiscreen visualization that is distributed across several computers. The Laboratory had a requirement to simultaneously visualize several weapon systems in great detail. This could not be accomplished with a single screen, so the visualization was designed to span multiple screens. Instead of attempting to render the visualization onto multiple screens using a single computer, multiple computers were used, with each computer rendering to an individual display. This allowed APL to take advantage of the resources of several midrange visualization computers for the multiscreen visualization that would otherwise require a single high-end visualization computer.

Multiscreen distributed visualization is accomplished by building a visualization for each midrange computer that displays a subset of the visualization windows and then running the visualizations simultaneously on each computer. The visualization software contains network software that enables APL to keep the visualizations synchronized, creating the appearance of a single, coherent visualization (Fig. 15).
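One common way to keep several rendering computers in step—offered here purely as an illustration, not as the APL protocol, which the article does not detail—is for a master node to broadcast the current playback time each frame while the other nodes slave their scene clocks to it. The UDP sketch below assumes an arbitrary port and identical byte order on all nodes.

```cpp
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>

// Master side: broadcast the current playback time once per frame so that every
// renderer in the multiscreen visualization draws the same scene time.
// (Illustrative only; a production version would keep one socket open.)
void broadcastSceneTime(double sceneTime) {
    int sock = socket(AF_INET, SOCK_DGRAM, 0);
    if (sock < 0) return;
    int enable = 1;
    setsockopt(sock, SOL_SOCKET, SO_BROADCAST, &enable, sizeof(enable));

    sockaddr_in dest{};
    dest.sin_family = AF_INET;
    dest.sin_port = htons(5005);                     // hypothetical sync port
    dest.sin_addr.s_addr = htonl(INADDR_BROADCAST);  // local-network broadcast

    sendto(sock, &sceneTime, sizeof(sceneTime), 0,
           reinterpret_cast<sockaddr*>(&dest), sizeof(dest));
    close(sock);
}

// Renderer side: block for the next time packet and return it; each display
// node then advances its local copy of the visualization to this scene time.
double receiveSceneTime(int boundSocket) {
    double sceneTime = 0.0;
    recvfrom(boundSocket, &sceneTime, sizeof(sceneTime), 0, nullptr, nullptr);
    return sceneTime;
}
```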

Figure 15. A multiscreen distributed visualization of the performance evaluation of the same scenario with three separate coordination algorithms. By visualizing all three algorithms simultaneously, APL was able to clearly see how each algorithm affected the overall performance.

The synchronization software used for the multiscreen visualizations also enables synchronization of other software with the visualization. Most notably, it allows for the integration of the quicktime visualization captured from other sources.

ARTEMIS Three-Screen, Nine-Window Visualization

APL developed the most recent visualization class—the APL Area/Theater Engagement Missile/Ship Simulation (ARTEMIS)1 three-screen, nine-window visualization class—to realize specific visualization goals for a new type of M&S effort in the Navy Theater Wide TBMD program. ARTEMIS is a high-level-architecture federation of engineering models that integrates existing high-fidelity weapon system models into a distributed architecture. These distributed models, or federates, exchange engineering information among themselves as they execute, thus creating a closed-loop, end-to-end simulation.

The ARTEMIS simulation consists of several separate federates, each requiring a unique window in the visualization. To simultaneously visualize all of these federates, APL designed the ARTEMIS visualization for viewing in the System Concept Development Laboratory as a three-screen display in which the left and right screens are subdivided into quadrants. The center screen is a 3D window that shows a wide-area view of the entire scenario; each of the quarter-screen windows on the left and right shows information pertaining to a single federate. Taking advantage of new high-end computers with advanced graphics capabilities, APL has developed a single visualization for ARTEMIS that simultaneously displays each federate, allowing engineers and sponsors to view the enormous amount of engineering information from an ARTEMIS simulation run (Fig. 16).

To best represent each individual federate, APL decided that several federates would be shown as 2D plots or as text readouts instead of the 3D-rendered images for which the visualization software was geared. The solution was to write these non-3D federate windows using the Motif widget set and combine the widgets with the 3D software. The Motif-based federate windows were written into a library separate from the core 3D visualization software so that these federates could run in a separate process, taking advantage of APL's multiprocessor system and minimizing the impact on the 3D rendering software. Shared memory was used for communication between the Motif and the 3D processes and for integrating the two processes onto a single display by reparenting the 3D windows into the Motif screen.
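A minimal sketch of how two cooperating processes can share a small state block (for example, the current scene time) through POSIX shared memory is shown below; the segment name and structure are hypothetical, and the APL implementation on its platform may have used a different shared-memory interface.

```cpp
#include <fcntl.h>
#include <sys/mman.h>
#include <unistd.h>

// Hypothetical block shared between the Motif process and the 3D rendering
// process: the current scene time plus a pause flag.
struct SharedState {
    double sceneTime;
    int paused;
};

// Create (or open) a named shared-memory segment and map the state block into
// this process. Both processes call this with the same name to see one block.
SharedState* openSharedState(const char* name) {
    int fd = shm_open(name, O_CREAT | O_RDWR, 0600);
    if (fd < 0) return nullptr;
    if (ftruncate(fd, sizeof(SharedState)) != 0) { close(fd); return nullptr; }
    void* mem = mmap(nullptr, sizeof(SharedState), PROT_READ | PROT_WRITE,
                     MAP_SHARED, fd, 0);
    close(fd);  // the mapping stays valid after the descriptor is closed
    return mem == MAP_FAILED ? nullptr : static_cast<SharedState*>(mem);
}
```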

Even with three 1280 × 1024 pixel screens, the display real estate is at a premium when trying to simultaneously visualize all of the federates; therefore, any of the quarter screens can be interactively selected and enlarged to take up one, two, or all three screens. When a federate window is enlarged, the engineer populates the remaining window area with additional metrics and other engineering information pertinent to the window's federate. To conserve screen space, displayed data are dynamically changed through the use of Motif selection lists, dynamic and logarithmic scales, scrollable text lists, and context-sensitive popups that can display extended data.

The global window is the center visualization window for ARTEMIS. It is the only window that occupies one entire screen and is similar to a single-screen nondistributed visualization. The global window visualizes the ARTEMIS run from a fixed view far above the Earth. It contains the terrain visualization, the ship from which the defending missile is launched, the threat and its launcher, and any relevant beams or volumes from the SPY radar or infrared (IR) seeker onboard the fourth stage of Standard Missile-3 (SM-3). It also contains readouts for metrics such as time and launch position for the missile and threat.

Figure 16. The ARTEMIS three-screen, nine-window visualization. From upper left to bottom right: the scenario manager federate window, systems engineering window, missile guidance federate window, missile signal processor federate window, global window, threat federate window, SPY radar federate window, command and decision federate window, and WCS federate window.

In the upper left-hand quadrant of the left screen, the engineer visualizes engineering information from the scenario manager federate. This window also shows the message traffic exchanged among all of the ARTEMIS federates. As the visualization progresses, federate-to-federate message metrics scroll by in the scenario manager window. The window contains the sender of the message, the receiver of the message, the message type, the time at which the message was sent, and plots of the message traffic rates. Also, if the message contents are available, the user can select messages and display their contents in a pop-up window.

To the right of the scenario manager federate window is the systems engineering window, which displays a slide show that is synchronized with the visualization. As the visualization progresses, the slide show presents critical events, such as launching, staging, and intercepting, as they occur.

In the lower left quadrant of the left screen is the missile guidance federate window, which displays a 3D visualization of the missile guidance federate and is similar to the missile entity single-screen visualizations for which the software was originally written. Entities displayed in this window are the terrain, the ship from which the missile is launched, the defending missile itself, the beam representing the seeker onboard the fourth stage of SM-3, and the threat as it enters into view during endgame.

To the right of the missile guidance federate window is the missile signal processor federate window, which contains engineering information that is germane to the missile signal processor, including the azimuth and elevation of the IR returns from the SM-3's fourth stage IR seeker, the current mode of the seeker, and raw and stabilized animated plots of the reported elevation and azimuth.

In the upper left quadrant of the right screen is the threat federate window. Like the missile guidance federate window, the threat federate window displays a 3D visualization similar to the original single-screen visualizations. It contains the terrain entity, the threat and launcher entity, and the missile entity as it enters into view during its fourth stage. It also provides menu-selectable options to enable subwindows, which display the current IR signature and radar cross-section signature that the threat is projecting to the defending weapon systems.

To the right of the threat window is the SPY radar federate window, which contains search and track information from the SPY radar federate. Here, a 3D visualization of the SPY-filtered threat track and the ground-truth threat track provides a visual representation of the SPY track errors. These errors are also displayed in three overlay strip charts to better show their scale and direction.

In the lower left quadrant of the right screen is the window for the command and decision federate. This window visualizes the engageability tests performed by the command and decision federate preceding the missile engage order. The instantaneous results of engageability tests for engagement quality, altitude check, screens, and intercept point evaluation are presented as a table of pass/fail bars. To the immediate right of the table, any of the individual engageability parameters may be plotted.

To the right of the command and decision federate window is the WCS federate window. This window shows such metrics as prelaunch calculations performed by the WCS federate; metrics from the midcourse guidance of the missile, which is handled by the WCS; and metrics from the handover event in which the WCS "hands over" information to the missile signal processor. To the immediate right of the tabular data, the user may view plots for any of the federate's metrics.

THE FUTURE

APL intends to develop many additional visualization capabilities and incorporate them into the current engineering visualization:

• Reconstructions: Include video footage from the launch site as well as footage from airborne and onboard missile/threat video cameras and engineering information from the ground station and satellites in the visualization.

• Communications: When relevant to the visualization, add a display of communications as they pass back and forth among the visualization entities.

• IR signatures: Develop a method by which instantaneous IR signatures of the threat can be dynamically mapped onto its skin as it flies.

• Finite element: Improve upon the Sphinx hydrocode tool's 2D visualizer used at APL to evaluate postintercept lethality of threats through finite element analysis.

• ARTEMIS: Make APL a visualization federate in the ARTEMIS federation of models, enabling users to visualize ARTEMIS concurrent to the simulation's execution.

• Undersea: Develop an undersea visualization capability that will allow APL to study undersea-launched weapon systems.

• Multimission: Develop an engineering visualization for nodal analysis tools, such as the APL Coordinated Engagement Simulation or the Extended Air Defense Simulation.

• Radar: Add fuller functionality to the SPY radar visualization in a scenario, including error ellipsoids, launch event correlations, and increased precision for search beam locations, clusters, groups, and any other metrics which may be pertinent.

• Particle system: Develop a multiprocess particle system for efficiently rendering great numbers of particles for improved smoke and debris representation in the visualization.


SUMMARY

High-fidelity engineering information is critical to facilitate the design, analysis, and M&S of complex weapon systems to defend against threats to our nation. The technical community and the associated DoD sponsors must have confidence in that engineering information and must understand it at various levels, depending on the functionality of the technical community and sponsors in the overall acquisition process. Contemporary engineering information is becoming even more complicated because of advances in threat technology, which drive the need for more complex weapon systems to defeat them. Also, the more recent DoD-wide focus on a distributed simulation capability, which integrates physically separate high-fidelity physics-based simulations, is also contributing to the increasing complexity of the engineering information. APL has developed a state-of-the-art engineering visualization capability which has become an essential systems engineering tool by providing confidence in and comprehension of complex engineering information.

REFERENCE

1 Pollack, A. F., and Chrysostomou, A. K., "ARTEMIS: A High-Fidelity End-to-End TBMD Federation," Johns Hopkins APL Tech. Dig. 22(4), 508–515 (2001).

ACKNOWLEDGMENTS: The authors wish to acknowledge Bernie Kraus, R. Kent Koehler, Kevin Wilmore, Doug Ousborne, Dave Wu, Phil Miller, Richard Freas, Bill Critchfield, Nancy Crowley, Chad Bates, and Simon Moskowitz, as well as the myriad APL modelers who provided their expertise and data to us.

THE AUTHORS

DAVID E-P. COLBERT received a B.S. in engineering physics from the University of Illinois at Urbana-Champaign in 1996. He joined the Air Defense Systems Engineering Group of ADSD at APL in 1996. Since then, Mr. Colbert has been developing visualizations of naval weapon systems for Area and Self-Defense Anti-Air Warfare, Area TBMD, Navy Theater Wide TBMD, Overland Cruise Missile Defense, Cooperative Engagement Capability, and Joint Mission Defense. He is currently working on oceanographic surface wave analysis, oceanographic environmental analysis, and engineering visualization as a member of the APL Associate Professional Staff. His e-mail address is david.colbert@jhuapl.edu.

R. EDWARD RALSTON is a member of APL's Senior Professional Staff in the Air Defense Systems Engineering Group of ADSD. He received a B.A. in mathematics from St. Mary's College of Maryland in 1992 and an M.S. in mathematics from Texas Tech University in 1995. Previously he worked for Veridian and Electronic Arts as a graphics software engineer. He first came to APL as a subcontractor in 1996 and was tasked with redesigning existing visualization software into a reusable object-oriented framework. Mr. Ralston became an APL staff member in 2001 and is currently developing testbed software for the MESSENGER team. His e-mail address is [email protected].

