
Demo: BiFocus – Using Radio-Optical Beacons for An Augmented Reality Search Application

Ashwin Ashok†, Chenren Xu†, Tam Vu†, Marco Gruteser†, Richard Howard†, Yanyong Zhang†

Narayan Mandayam†, Wenjia Yuan‡, Kristin Dana‡

†WINLAB, Rutgers University, ‡Department of ECE, Rutgers University
{aashok, lendlice, tamvu, gruteser, reh, yyzhang, narayan}@winlab.rutgers.edu†,

[email protected]‡, [email protected]

ABSTRACT
Augmented Reality (AR) applications benefit from accurate detection of the objects that are within a person's view. Typically, it is not only desirable to identify what is currently within view, but also to navigate the user's view to the item of interest - for example, finding a misplaced object. In this paper we demonstrate a low-power hybrid radio-optical beaconing system, where objects of interest are tagged with battery-powered RFID-like tags equipped with infrared light-emitting diodes (LEDs) that emit periodic infrared beacons. These beacons are used to accurately estimate the angle and distance from the object to the receiver so as to locate it. The beacons are synchronized using the radio link, which is also used to convey the object's unique ID.

Categories and Subject Descriptors
C.2.1 [Computer-Communication Networks]: Network Architecture and Design—Wireless Communication

Keywords
Augmented Reality, Radio, RFID, Optical, Infrared, Angular Estimation, Ranging

1. INTRODUCTION
We often need to find a specific item in a rather chaotic setting – for example, a toy in a house, items in an office room, or a product in the grocery store. In this application it is not only desirable to identify what is currently within view, but also to direct the user's view to the item of interest. The problem can be reduced to determining the angle of the object from the person's direction of view as well as the distance of the object from the viewer.

Presumably, the object that is closest to the direction of view (smallest angle) and closest to the viewer (smallest distance) is the one the person is looking at. When the right object is within the user's concentrated view (the human eye has concentrated vision within its fovea [3]), it is important that the angular resolution is sufficient to discriminate among these objects. With 50 cm spacing between objects and a 3 m distance between the user and the objects, the required angular resolution is about 10 degrees; with 10 cm spacing and a 1-3 m distance, the required angular resolution can be as small as 2 degrees.
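These resolution figures follow from simple trigonometry: the angle separating two adjacent objects, as seen from the viewer, is roughly arctan(spacing/distance). A quick sketch of that arithmetic, using the numbers from the text:

```python
import math

def required_resolution_deg(spacing_m: float, distance_m: float) -> float:
    """Angle (in degrees) separating two adjacent objects as seen by the viewer."""
    return math.degrees(math.atan(spacing_m / distance_m))

# 50 cm spacing at 3 m: about 10 degrees
print(round(required_resolution_deg(0.5, 3.0), 1))
# 10 cm spacing at 3 m: about 2 degrees
print(round(required_resolution_deg(0.1, 3.0), 1))
```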

Radio–Optical Beaconing Approach: Current barcode tagging approaches for augmented reality are very limited in reading range, while active RFID tags suffer from imprecise angle-of-arrival estimates [2]. To fill this void, we designed a low-power hybrid radio-optical beaconing system, where active RFID-like tags, enhanced with an IR LED, emit a high-energy light beacon which is detected and sampled by the photodiode elements at a receiver IR unit. The received IR signal energy is used to estimate the angle and distance between the transmitter and receiver with high accuracy, while the RF link provides the tag ID and receiver synchronization of the extremely short IR pulses to lower energy consumption.

Figure 1: Illustration of the system demonstration

Copyright is held by the author/owner(s).
MobiSys'13, June 25–28, 2013, Taipei, Taiwan.
ACM 978-1-4503-1672-9/13/06.
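The paper does not spell out the estimator, but the idea of recovering bearing and range from two photodiode energies can be illustrated with a toy model: assume each photodiode has a cosine (Lambertian-like) angular response and inverse-square path loss, with the two diodes tilted ±α off the boresight. The energy ratio then fixes the bearing θ, and the energy sum (with a calibration constant) gives the distance. The tilt angle, calibration constant, and response model below are all assumptions for illustration, not the authors' implementation:

```python
import math

ALPHA = math.radians(30.0)  # assumed tilt of each photodiode off boresight
K_CAL = 1.0                 # assumed calibration constant (emitter power x gains)

def received_energy(theta: float, d: float, tilt: float) -> float:
    """Toy model: cosine angular response and inverse-square path loss."""
    return K_CAL * math.cos(theta - tilt) / d**2

def estimate(e_left: float, e_right: float) -> tuple:
    """Recover (bearing_rad, distance_m) from the two photodiode energies."""
    r = e_left / e_right
    # cos(t - a) / cos(t + a) = r  =>  tan(t) = (r - 1) / ((r + 1) * tan(a))
    theta = math.atan((r - 1.0) / ((r + 1.0) * math.tan(ALPHA)))
    d = math.sqrt(K_CAL * (math.cos(theta - ALPHA) + math.cos(theta + ALPHA))
                  / (e_left + e_right))
    return theta, d

# Simulate a tag 2 m away, 10 degrees off boresight, then invert the model.
theta_true, d_true = math.radians(10.0), 2.0
e_l = received_energy(theta_true, d_true, +ALPHA)  # diode tilted toward +alpha
e_r = received_energy(theta_true, d_true, -ALPHA)  # diode tilted toward -alpha
theta_est, d_est = estimate(e_l, e_r)
```

Under this idealized model the inversion is exact; a real receiver would additionally have to handle noise, diode field-of-view limits, and multipath reflections.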

2. DEMONSTRATION SETUP
We will demonstrate our system using a receiver apparatus where the radio-IR receiver with two photodiodes is mounted onto eyeglasses along with a heads-up display (HUD) [1].

The radio transceiver also periodically uploads the object position data to a web server. We will tag objects with our transmitter tags (each bearing a unique ID) and let attendees find the objects through the navigation aids we will provide through an app on the heads-up display. We will also demonstrate the details of the system functions using a calibrated setup. Figure 1 shows an example screenshot of the visualization along with the glasses and transmitter tags.
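The periodic upload can be pictured as one small JSON record per detected tag, POSTed to the server. The endpoint URL, field names, and record layout below are placeholders for illustration, not the demo's actual interface:

```python
import json
import urllib.request

SERVER_URL = "http://example.com/positions"  # placeholder endpoint, not the real server

def make_record(tag_id: str, angle_deg: float, distance_m: float, ts: float) -> dict:
    """Build one position record for a detected tag (field names are assumed)."""
    return {
        "tag_id": tag_id,
        "angle_deg": angle_deg,
        "distance_m": distance_m,
        "timestamp": ts,
    }

def upload_position(record: dict):
    """POST a single position record as JSON to the web server."""
    req = urllib.request.Request(
        SERVER_URL,
        data=json.dumps(record).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    return urllib.request.urlopen(req)
```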

3. REFERENCES
[1] Recon mod-live heads-up-display. http://www.reconinstruments.com/products/snow-heads-up-display.
[2] Artag. http://www.artag.net/, 2009.
[3] C. Yuodelis and A. Hendrickson. A qualitative and quantitative analysis of the human fovea during development. Vision Research, 26(6):847–855, 1986.
