Personal Panoramic Perception

Terry Boult
tboult@eecs.lehigh.edu
Vision and Software Technology Lab, EECS Department, Lehigh University

Appeared in the 1999 Proc. Int. Conf. on Imaging Science, Systems And Technology and is copyrighted.

Abstract

For a myriad of military and educational situations, video imagery provides an important view into a remote location. These situations range from remote vehicle operation, to mission rehearsal, to troop training, to route planning, to perimeter security. These situations require a large field of view and most would benefit from the ability to view in different directions.

Recent research has led to the development of new technologies that may radically alter the way we view these situations. By combining a compact omni-directional imaging system and a body-worn display, we can provide a new window into the remote environment: personal panoramic perception (P³). The main components of a P³ system are the omni-directional camera, a body-worn display and, when appropriate, a computer for processing the video.

This paper discusses levels of immersion and their associated display/interface "needs". It also looks at the capture-system issues, including resolution issues, and the associated computational demands. Throughout the discussion we report on details of, and experiences from, using our existing P³ systems.

1 Introduction

The ability to generate panoramic video has been around for years, e.g. see [1, 2], but it has seen limited usage. What has changed recently, and is driving a growing interest, is the combination of simultaneously decreased size and increased quality in collection systems, coupled with low-cost means of presenting/processing this data to provide perspective images. This paper looks at the component technologies and the systems issues involved in supporting a personal panoramic perception (P³) system, where a user has a personal system for viewing different areas within a panoramic video stream. Unlike remote pan-tilt camera based systems, P³ supports multiple users simultaneously looking in different directions, which makes it ideal for team-oriented exercises.

The main components of a P³ system are the omni-directional camera (with video recording or transmission), a body-worn display and, when appropriate, a computer for processing the video. Let us begin with an overview of these components of the system.

The paracamera-based collection system, pioneered by S. Nayar, is a compact camera system that images a hemisphere or more while maintaining a single perspective viewpoint [3]. It captures the entire viewing hemisphere in a single image, and the images can be processed to produce a proper perspective image in any direction; see figure 3.

Figure 1. An example car-mount paracamera.

The paracameras can vary in size from small transmitting systems (about 9 cm tall by 6 cm in diameter), to compact recording systems, to self-contained underwater recording, to intensified night-vision systems; see figures 1-2 for some examples. Supporting geometrically correct, live omni-directional video in a small package is a key constraint for most of the aforementioned applications.

Figure 2. Second generation underwater paracamera. System dimensions are 25cm x 20cm x 18cm (plus 16cm for arm).

For the body-worn display we have been experimenting with different ways of displaying the information, including direct paraimages, panoramic views, and user-directed perspective windows.

The display device can range from an immersive HMD with head-tracking, to a small monocular HMD, to hand-held displays or even commercial TVs.

Our current systems use COTS frame grabbers/processors. On a 233 MHz x86 processor our Remote Reality software allows the HMD to view 30 frame-per-second (fps) video of the remote site in whatever "direction" the user looks or directs. The system is capable of updating its viewing direction with only a 30 to 60 millisecond (15-30 fps) delay.

This paper begins by examining different levels of immersion and their associated display/interface "needs", then looks at the capture-system issues (including resolution issues) and ends with a look at computational demands.

2 Levels of Immersion and User Interface

While there are many potential applications, we use the desired level of "immersion" to separate our discussion into three main groups:

- highly immersive: giving the user the impression they are at the remote location.

- informative: giving the user access to remote "information" in any or all directions, while still maintaining the user's local situational awareness.

- augmentive: enhancing either of the above interfaces with overlaid contextual information. This reduces immersion and adds complexity to the system, but it can increase situational awareness.

We briefly discuss each of these approaches.

2.1 High Immersion: Remote Reality

Our first interface is immersive, as in many virtual reality systems, but because it provides video access to a remote location we refer to it as Remote Reality. This interface uses a bi-ocular HMD with a head tracker, see figure 4. The head tracker provides orientation information which is used to determine the viewing direction for the unwarping map. As the HMD turns (or if the user requests a software "zoom") the virtual viewpoint is stationary; only the direction of the virtual "imaging array" is moved. We briefly look at the significant issues for this type of interface.
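To make the head-tracked geometry concrete, here is a minimal sketch, not the system's actual code: the Euler convention, the names, and the pinhole model for the virtual "imaging array" are assumptions on our part. It turns tracker yaw/pitch/roll into per-pixel viewing rays whose origin never moves, only their direction.

```c
#include <math.h>

/* Build a rotation matrix from tracker yaw/pitch/roll (radians).
   The 3-2-1 (yaw-pitch-roll) convention is an assumption; the real
   system would use whatever convention its tracker reports.          */
static void orientation_to_matrix(double yaw, double pitch, double roll,
                                  double R[3][3])
{
    double cy = cos(yaw),   sy = sin(yaw);
    double cp = cos(pitch), sp = sin(pitch);
    double cr = cos(roll),  sr = sin(roll);

    R[0][0] = cy*cp;  R[0][1] = cy*sp*sr - sy*cr;  R[0][2] = cy*sp*cr + sy*sr;
    R[1][0] = sy*cp;  R[1][1] = sy*sp*sr + cy*cr;  R[1][2] = sy*sp*cr - cy*sr;
    R[2][0] = -sp;    R[2][1] = cp*sr;             R[2][2] = cp*cr;
}

/* Viewing ray for output pixel (u,v) of the virtual perspective camera
   with principal point (cx,cy) and focal length f (pixels).  Only the
   ray's direction changes with head motion; the viewpoint is fixed.  */
static void pixel_to_ray(const double R[3][3], double u, double v,
                         double cx, double cy, double f, double d[3])
{
    double x = u - cx, y = v - cy, z = f;
    double n = sqrt(x*x + y*y + z*z);
    x /= n; y /= n; z /= n;
    for (int i = 0; i < 3; i++)
        d[i] = R[i][0]*x + R[i][1]*y + R[i][2]*z;
}
```

Each frame, rays like these would be fed through the mirror projection sketched in section 4 to find which paraimage pixels to sample.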

While any panoramic image generation process might be used for this type of immersive display, our work has concentrated on paracamera systems. In principle any other collection system that maintains a single perspective viewpoint, e.g. [4], could be used, but most of them are larger and more difficult to calibrate or build.¹ If the viewpoint is not constant (or at least constrained to be in a very small volume), the result is a lurching or bending in the images as the HMD changes orientation. Such artifacts significantly reduce the immersion.

¹ In [5] the complete class of possible lens & (single) mirror based systems that produce omni-directional images was investigated to see which satisfy the single-viewpoint assumption.

With the single-viewpoint imaging and an HMD with head-tracking, we can produce a system that provides a very smooth and very natural visual change. However, maintaining the illusion of immersion also depends on acceptable system response time. Making the system fast enough took a few, but straightforward, tricks: fixed-point math for most computations and table lookup for the expensive operations. Because we can bound the size of all inputs and addresses, we can bound the calculation operations, including table-lookup-based division, and limit errors to less than 1/16 pixel using only 32-bit integer operations. With this, a 233 MHz x86 processor can update the view maps at 15-30 fps (depending on other system load).
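As an illustration of the general technique only (a sketch, not the system's code; the Q16 format and table size are assumptions), fixed-point arithmetic with a reciprocal table looks roughly like this in C:

```c
#include <stdint.h>

#define FRAC_BITS 16                    /* Q16 fixed point               */
#define FIX_ONE   (1 << FRAC_BITS)
typedef int32_t fix_t;

/* Q16 multiply: a 32x32 -> 64 bit widening multiply (a single MUL on
   x86), then drop the extra fractional bits.                           */
static inline fix_t fix_mul(fix_t a, fix_t b)
{
    return (fix_t)(((int64_t)a * b) >> FRAC_BITS);
}

/* Division replaced by a table lookup: recip32[d] ~ 2^32 / d, so a/d
   becomes one multiply and a shift.  Because image coordinates and
   projection denominators are bounded, the table stays small and the
   rounding error stays far below the 1/16-pixel budget quoted above.  */
#define MAX_DENOM 2048
static uint32_t recip32[MAX_DENOM];

static void init_recip(void)
{
    for (uint32_t d = 1; d < MAX_DENOM; d++)
        recip32[d] = (d == 1) ? 0xFFFFFFFFu
                              : (uint32_t)((1ULL << 32) / d);
}

/* a / d for a >= 0 and 1 <= d < MAX_DENOM: no divide instruction.      */
static inline fix_t fix_div_by_int(fix_t a, uint32_t d)
{
    return (fix_t)(((uint64_t)a * recip32[d]) >> 32);
}
```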

Figure 4. An immersive interface: Remote Reality head-tracked HMD. User is holding an early car-mounted para-camera.

To maintain the immersion, the displayed horizontal field of view (HFOV) needs to be reasonably matched to the display's visual extent, and the user should see nothing but the projected remote video. Since most HMDs only have a 30-50 degree HFOV, the result is a little like looking through goggles. If a significantly larger physical HFOV is mapped into the small display, the user will perceive an unnatural warping or wobbling as they change their head position. While our prototype setup approximately matches the visual and physical sensations, it does limit situational awareness since there is no peripheral vision. With better HMDs, the potential exists to have a much larger FOV and include peripheral vision.

We also note that the users need to turn their head, not just their eyes, to see in a new direction. While this initially detracts from their immersion, the user very quickly becomes acclimated to this constraint.


Figure 3. An omnidirectional (ParaImage) taken from a car. Note the "struts" are from an early version of the car mount; newer versions have only 1 (smaller) strut.

The high immersion of Remote Reality precludes the user from seeing their local environment; thus it is appropriate only for applications where the user is active in their observation but passive with respect to their own environment. If used in a tele-operation scenario, the user can control a remote vehicle's motion. For other users, it is as if they are passengers at the remote location. Some obvious applications for immersive remote reality are tele-operation, education, training and mission rehearsal. Except for tele-operation, the point is to acquaint the user with a remote environment, acquiring knowledge and experience, and hence these applications lend themselves to recorded remote reality. A few less obvious applications include recording/replaying for cataloging the state/contents of complex facilities such as ships or pipe complexes, and security surveys of a route or building.

2.2 Informative

For other situations, e.g. police or military operations in urban terrain, it is not acceptable for the user to be completely immersed. Instead the user must be aware of, and often moving within, their local environment while they simultaneously expand their situational awareness of the remote location. Thus we have been investigating different types of informative, but minimally invasive, interfaces. These interfaces use one of two display devices. The first is a small unobtrusive monocular HMD, see figure 5. The second is a hand-held device such as the portable TV in figure 6. (Of course, higher price/quality models of both of these types of display devices exist.)

In the immersive interface the head-tracker provided a very natural means for the user to choose a direction to view.


Even if the display is unobtrusive, as in figure 5, the need to use one's head to choose a viewing direction is impractical while walking or taking part in almost any "local" event. One of the most difficult aspects of the informative displays is how, or if, to choose a viewing direction.

Figure 5. An informative monocular display (with a track-ball pointer).

A direct analogue of the head-tracked display is to provide the user with some type of pointing device, e.g. the belt-worn track-ball in figure 5. With the pointing device the user can choose any direction of interest. The advantage of this is that they can maximize the displayed resolution (many small LCDs can only display 320x240 true pixels) and, when needed, can choose new viewpoints. The disadvantage is that choosing a view requires a free hand and some practice to get used to the interface. It can be effective for team operations where someone is tasked with a particular view direction. Since this interface requires both an interaction device and reasonable CPU power, a machine supporting it can also support the following two interfaces, and one could trade off between the three.

The remaining informative displays are what we call information overview; they provide information on the entire scene at one time. The most obvious informative overview display is to generate a panoramic view. Unfortunately the aspect ratio of a panorama is far from that of most display technologies, and direct display would result in very poor visible resolution. There is also the question of the type of panorama to show (spherical, cylindrical, or some custom version). To help with the resolution issues we display the scene in a split view, with a panorama for the forward view (with respect to the vehicle) and one for the rear view (with left-right reversed, as in a rear-view mirror). These are then stacked to provide full coverage in a 4x3-aspect-ratio display. We have experimented with various types of panorama and are currently using one where the azimuth angle grows linearly. We have found this provides a good tradeoff between resolution in the regions of most importance and perceived image distortion. Note that this interface requires little training and no user interaction, but places the highest demands on the computing and I/O subsystem (we warp the full 640x480 image) and display resolution. A sketch of this layout follows.
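The sketch below is only illustrative: the strip sizes, angular conventions, mirror orientation of the rear strip, and the sample_paraimage helper are our assumptions, not the system's code. It shows the stacked forward/rear layout with azimuth linear in the column index.

```c
#include <math.h>
#include <stdint.h>

#define PAN_W   640             /* output width (4:3 display)          */
#define PAN_H   480             /* two stacked 640x240 strips          */
#define STRIP_H (PAN_H / 2)

/* Hypothetical helper: sample the paraimage for a direction given as
   azimuth (radians, 0 = vehicle forward, positive to the left) and
   elevation above the horizon.  See the projection sketch in Sec. 4.  */
extern uint8_t sample_paraimage(double azimuth, double elevation);

/* Top strip: forward 180 degrees.  Bottom strip: rear 180 degrees,
   left-right reversed like a rear-view mirror.  Azimuth is linear in
   the column; elevation is linear in the row within each strip.       */
void build_split_panorama(uint8_t out[PAN_H][PAN_W],
                          double min_elev, double max_elev)
{
    for (int row = 0; row < PAN_H; row++) {
        int rear = (row >= STRIP_H);
        int r    = rear ? row - STRIP_H : row;
        double elev = max_elev
                    + (min_elev - max_elev) * r / (double)(STRIP_H - 1);
        for (int col = 0; col < PAN_W; col++) {
            /* forward strip: +90 deg (left) at col 0, -90 deg at right */
            double az = M_PI_2 - M_PI * col / (double)(PAN_W - 1);
            if (rear)
                az = M_PI - az;  /* mirror convention for the rear strip */
            out[row][col] = sample_paraimage(az, elev);
        }
    }
}
```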

The "simplest" interface is simply to broadcast the paraimage to a display device. This approach has three primary advantages:

1. There is no user need to "point", as the display shows all directions at once.

2. There are no added "computational" requirements.

3. The direction within the image is the actual direction from the camera to the object of interest (a sketch follows below).

The primary disadvantage is that the interpretation of the image is not as intuitive. As can be seen in figures 3 and 6, the lower part of the image is relatively easy to understand (front of vehicle), but objects behind the vehicle are upside down. With a little training, however, it becomes quite understandable (and is now the interface preferred by my students and me for operations in complex environments). If upside-down viewing is a problem, hand-held displays can be rotated as needed, or inexpensive video flippers could be used.
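As a hedged illustration of advantage 3 (the function name and the assumption that the mirror axis is vertical and the mirror image is centred at (u0, v0) are ours, not the paper's): with the camera mounted axis-up, the bearing to an object is just the angle of its pixel around the image centre.

```c
#include <math.h>

/* Bearing (radians) of an object seen at paraimage pixel (u, v), given
   the centre (u0, v0) of the mirror image.  With the mirror axis
   vertical, this angle is the real azimuth from the camera to the
   object, up to a fixed offset and sign set by how the camera is
   mounted on the vehicle -- no unwarping is required.                 */
double paraimage_bearing(double u, double v, double u0, double v0)
{
    return atan2(v - v0, u - u0);
}
```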

2.3 Augmentive displays

The final type of interface, or more appropriately interface option, is being developed for applications where the user needs to augment their reality, rather than supplant it. The goal here is to add information, based on additional sensors and collateral data, to the video stream the user is seeing. The applications here include remote vehicle operation and urban police actions. Both ground and helicopter-based systems are being developed/tested.

For vehicle operation (as opposed to remote observation) it is generally not sufficient to immerse oneself in the video at the remote location. While the head-tracking interface is natural for view pointing, the user needs additional information such as speed and status; at a minimum they should be able to see their "dashboard". In addition it might be helpful if they could see vehicle position and direction with respect to a map. This type of augmentation is what one would expect in vehicle operation, and like existing systems we are developing systems to use remote GPS (or DGPS) and inertial navigation. Initially we anticipate the vehicle pilot will be at a safe location and will use the bi-ocular HMD with head tracking for setting the view direction, leaving their hands free to operate the vehicle.


Figure 6. A hand-held display (low cost TV) showing a raw paraimage.

An added type of augmentation, currently only effective when the vehicle has stopped, is for us to provide a tracking system to warn the user of motion within the scene; see [6] for details on the algorithm. This is currently being added to the informative "overview" types of displays. (On a directed-view interface we would have to provide a means for the user to locate the target, or to understand the new viewing direction if it is automatically provided.) We note that this can add significantly to the computational demands of the system, but can still be accomplished at 15-30 fps with COTS hardware (high power drain) or 5-10 fps on more power-efficient hardware.

2.4 So what interface to use?

In urban maneuvers, a driver can pilot the vehicles from a relatively safe location, but other team members need to follow them for the clearing/security activities. The vehicles can transmit (encrypted if needed) omni-directional video while team members use augmenting remote reality to look for potential threats around the vehicle's location. Unlike what could be done with a pan-tilt system, the team members can simultaneously look in different directions; a soldier can watch his own back. Additionally, no team member needs to transmit to the vehicle to control the pan/tilt viewing direction; the forward team can all be radio silent.

Informal observations show that for simple environments, pilots using the immersive HMD spend most of their time facing directly ahead, but as the environment becomes more complex and the desired path includes many turns, the pilots increasingly use their freedom of viewing direction. Other than the speed of response, using remote reality for a solo pilot is not significantly different from having a remote pan/tilt unit. The difference becomes apparent when the pilot or other team members need to navigate while also locating significant non-navigational features within the environment.

Preparations are under way for formal evaluations of this hypothesis, as well as a subjective comparison of the different interfaces for a collection of Military Operations in Urban Terrain (MOUT) type tasks. These will include both driving, target localization/identification (by the driver) and target localization/identification by teams. The experiments will use a tele-operated vehicle, our RROVer (Remote Reality Omni-Vehicle); see figure 7.

3 Systems issues

The first prototype immersive system strove to minimize cost while maintaining acceptable quality. Thus the system uses COTS parts. Our current data collection system was approximately $4K (+$1K for underwater) and the computing/HMD play-back system was about $3K. The system uses a 233 MHz K6 CPU (running Linux) and a $300 video capture card. The system computes biocular 320x240, 30 fps NTSC video. This resolution is reasonably matched to the HMD used, which is currently Virtual I-O glasses. The VIO built-in head tracker provides yaw, pitch and roll, with updates to the viewing direction at 15-30 fps. With a better head tracker (e.g. Intersense IS300) and a 300 MHz CPU we can ensure a consistent 30 fps update of both viewpoint and video data. Better HMDs are also commercially available, at costs ranging from $2K to $10K for low to medium volume usage and $20K for very rugged high-volume usage. We are now porting to use a 640x480 resolution HMD and better head trackers, and expect to demo this improved system at CISST.

We note that the above-described hardware is not "wearable", but is suitable for a desktop/remote driver. Unfortunately none of the commercially available wearable computers have the video I/O bandwidth and resolution necessary for the 640x480, 30 fps video processing. We have assembled a wearable version using a PC104+ based CPU with a BT848 video capture card. This operates at 30 fps, but draws significant power (25-30 W). A second version (lower power, lower speed and lower cost) uses a Netwinder(TM) and operates at 8 fps. The limiting factor in these systems is the I/O requirement of full-resolution video, not the actual computations needed for the different user interfaces. A wearable version is needed only for the immersive display; for the dual driving panoramas, the computer can be on the vehicle and transmit the processed video, or a separate machine can receive the raw video and retransmit the processed views.


Figure 7. The Remote Reality Omni-Vehicle, a testbed for our study of personal panoramic perception.

4 Para-Cameras and Resolution

While remote reality systems could be built with a multitude of cameras at the remote location, central to the design was the omni-directional camera designed by Shree Nayar [5]. This camera directly captures a full hemisphere (or more) while maintaining a single perspective viewpoint, allowing it to be used for full-motion video. Furthermore, placing two of these paracamera systems back-to-back allows a true viewing sphere, i.e. 360 x 360 viewing. Unlike fish-eye lenses, each image in the paracamera system can be processed to generate geometrically correct perspective images in any direction within the viewing hemisphere.

The paracamera's omni-directional imager combines a telecentric/orthographic lens and a parabolic mirror, with the axis of the parabolic mirror parallel to the optic axis of the lens system. The orthographic lens results in the entering rays being parallel. Rays parallel to the axis reflect off the parabolic surface at an angle such that they virtually intersect at the focus of the parabolic surface. Thus the focus of the paracamera provides a single "virtual" viewpoint. The single virtual viewpoint is critical for the Remote Reality system as it allows for consistent interpretation of the world, with a very smooth transition as the user changes the viewing direction. While there are other systems with large or even hemispheric fields of view, as shown in [7], fish-eye lenses and hemispherical mirrors do not satisfy the single-viewpoint constraint.
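The single-viewpoint geometry has a convenient closed form: for a parabolic mirror viewed orthographically, a ray leaving the focus at angle theta from the mirror axis lands at image radius R * tan(theta/2), where R is the pixel radius of the horizon circle. Below is a minimal sketch of that mapping; the function name, image-axis conventions, and the calibration parameters (u0, v0, R) are our assumptions, not the paper's code.

```c
#include <math.h>

/* Map a unit viewing ray to paraimage pixel coordinates.
   (dx,dy,dz): unit direction from the single viewpoint (the parabola's
               focus), dz being the component along the mirror axis
               toward the zenith.
   (u0,v0):    pixel centre of the mirror image.
   R:          pixel radius of the horizon circle (dz = 0).
   Using r = R*tan(theta/2) = R*sin(theta)/(1 + cos(theta)) gives the
   simple closed form below.  Returns 0 for rays in the blind region
   near the camera (dz -> -1).                                         */
int para_project(double dx, double dy, double dz,
                 double u0, double v0, double R,
                 double *u, double *v)
{
    double denom = 1.0 + dz;
    if (denom < 1e-6)
        return 0;                 /* outside the mirror's field of view */
    *u = u0 + R * dx / denom;
    *v = v0 + R * dy / denom;     /* image y-axis sign is an assumption */
    return 1;
}
```

Composed with the head-tracked rays sketched in section 2.1, this is the shape of an unwarping map: each output pixel's ray is projected and the corresponding paraimage pixel is sampled, with fixed-point machinery of the kind shown earlier avoiding per-pixel floating point.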

Because omni-directional imaging compresses a viewing hemisphere into a small image, maintaining resolution and captured image quality is quite important, and takes careful design. While the process scales to any size imager, the current systems use NTSC (640x480) or PAL (756x568) cameras. Note that the "spatial resolution" of the paraimage is not uniform. While it may seem counterintuitive, the spatial resolution of the paraimage is greatest along the horizon, just where objects are most distant. If we image the whole hemisphere, the spatial resolution along the horizon is (480π pixels) / (360 degrees) ≈ 4.2 pixels per degree (5.1 PAL), which is 14.3 arc-minutes per pixel (11.8 PAL). If we zoom in on the mirror, cutting off a small part of it, to increase the captured mirror diameter to 640 pixels (756 PAL), we can achieve 10.7 arc-minutes per pixel, i.e. 5.5 pixels per degree (6.6 PAL).
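For concreteness, a small illustrative computation (not from the paper) of the horizon resolution for the frame sizes quoted above; small differences from the paper's rounded PAL figures come down to which active-line count is assumed.

```c
#include <math.h>
#include <stdio.h>

/* Pixels per degree along the horizon when the captured mirror image
   has diameter d pixels: the horizon circle has circumference pi*d
   pixels spread over 360 degrees.                                     */
static double horizon_px_per_deg(double d) { return M_PI * d / 360.0; }

int main(void)
{
    /* Whole hemisphere: mirror diameter limited by the image height.  */
    printf("NTSC 480: %.1f px/deg (%.1f arc-min/px)\n",
           horizon_px_per_deg(480), 60.0 / horizon_px_per_deg(480));
    printf("PAL  568: %.1f px/deg (%.1f arc-min/px)\n",
           horizon_px_per_deg(568), 60.0 / horizon_px_per_deg(568));
    /* Zoomed so the captured mirror diameter fills the image width.   */
    printf("NTSC 640: %.1f px/deg (%.1f arc-min/px)\n",
           horizon_px_per_deg(640), 60.0 / horizon_px_per_deg(640));
    printf("PAL  756: %.1f px/deg (%.1f arc-min/px)\n",
           horizon_px_per_deg(756), 60.0 / horizon_px_per_deg(756));
    return 0;
}
```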

As a point of comparison, let us consider a traditional "wide-angle" perspective camera, such as those used in building "multi-camera" panoramic systems. If we allow for a small overlap in fields of view, to support blending at the seam, it would take 3 cameras with about a 130-degree horizontal field-of-view (FOV) to form a panorama. Note that each of these would have 640 pixels / 130 degrees ≈ 4.9 pixels per degree, i.e. about the same as the paracamera. Clearly, the traditional cameras would need more hardware and computation.

The paracamera's unique design yields what may be a new Pareto-optimal design choice in the resolution/FOV trade-off. We have the horizontal resolution of a 130-degree camera but cover the full 360 degrees of the horizon. The "lost pixels" occur in the region above the horizon, where the para-camera's resolution goes down, while traditional cameras have increasing overlap.


As an informal point on the "quality", we note that some graphics/VR-oriented people hear about the output resolution, 320x240 16-bit color, used in the immersive display, and want to dismiss it as inadequate. However, the initial system has been demonstrated to a large number of people, e.g. see [8], [9] and [10], with very positive feedback from most of them. Even the "skeptics" who have tried it admitted they were surprised at the quality. While the resolution is far from that of high-end graphics systems, the naturalness of objects, fluidity of motion and the complex/subtle textures (even at low resolution) of the video seem to make up for the pixel loss.

We note that Cyclovision now sells a 1Kx1K still camera version, and we have built a 1Kx1K system that operates (but cannot record) at 5 fps. Higher resolution/speed systems are being developed, though they will be considerably more expensive than those based on consumer cameras.

5 Camera issues

While a number of paracamera models are commercially available from www.cyclovision.com, for most of our remote reality systems we have developed our own smaller custom designs, directly incorporating camcorders rather than cameras, e.g. see figure 1. (Note that small 9 cm tall systems are now commercially available from Cyclovision.) The development of the underwater cameras and vehicle cameras involved solving both optical and mechanical design problems. We are currently working on an omnidirectional system for helicopters and one to be carried underwater by a dolphin.

Figure 1 shows some custom car mounts for omni-cameras. The early vehicle mounts (left) used the Cyclovision paracameras and a separate tape recorder inside the vehicle. They can be attached to the car windshield or roof via suction cups and straps and, while large and obtrusive, were quite functional. The second generation uses our custom design with optical folding and an integrated camcorder. This puts the user and camera back behind the mirror and inside the vehicle. To use it one only needs to "pop up" the mirror above a sunroof. The inset shows a side view. In both cases, damping vehicular vibrations is an issue.

From our experience there are 3 main issues in omni-directional camera design for these types of applications:

1. Resolution limits imposed by optical components (lenses and mirrors).

2. Resolution limits imposed by camera electronics, including pixel counts, light sensitivity and readout electronics. The single most significant camera issue, because of the unwarping, is interlace vs. progressive scan. The second is camera pixel counts and general CCD/color "resolution" issues.

3. Mechanical mounting: even small vibrations introduce blurring.

6 Conclusion

This paper has discussed some of the major issues in developing a personal panoramic perception system. Individual applications will need to tailor the concept to their situations, but the paper should provide a good starting point for the user interface issues, the imaging issues and some systems issues.

When combined with the small size and ease of use of the paracamera-based capture devices, personal panoramic perception and remote reality offer significant advantages in a number of domains where simulated worlds or simple video are currently being used.

References

[1] D. Rees, "Panoramic television viewing system." United States Patent No. 3,505,465, April 1970.

[2] J. Charles, R. Reeves, and C. Schur, "How to build and use an all-sky camera," Astronomy Magazine, April 1987.

[3] S. Nayar, "Omnidirectional video camera," in Proceedings of the 1997 DARPA Image Understanding Workshop, May 1997.

[4] V. Nalwa, "A true omnidirectional viewer," tech. rep., Bell Laboratories, Holmdel, NJ 07733, USA, February 1996.

[5] S. Nayar, "Catadioptric omnidirectional camera," in Proceedings of the 1997 Conference on Computer Vision and Pattern Recognition, pp. 482-488, June 1997.

[6] T. Boult, A. Erkin, P. Lewis, R. Micheals, C. Power, C. Qian, and W. Yin, "Frame-rate multi-body tracking for surveillance," in Proc. of the DARPA IUW, 1998.

[7] S. K. Nayar and S. Baker, "Complete class of catadioptric cameras," in Proc. of the DARPA Image Understanding Workshop, May 1997.

[8] T. Boult, C. Qian, W. Yin, A. Erkin, P. Lewis, C. Power, and R. Micheals, "Applications of omnidirectional imaging: Multi-body tracking and remote reality," in Proc. of the IEEE Workshop on Computer Vision Applications, Oct. 1998.

[9] T. Boult, "Remote reality demonstration," in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 1998. Technical Demonstration.

[10] T. Boult, "Remote reality," in Proc. of ACM SIGGRAPH 1998, 1998. Technical Sketch.


