
TaxSeeMe: A Taxi Administering System for the Visually Impaired

S M Towhidul Islam, Bezawit Woldegebriel, Ashwin Ashok
Department of Computer Science, Georgia State University, Atlanta, USA

{sislam9,bwoldegebriel1}@student.gsu.edu, [email protected]

Abstract—This work aims at providing complete independence to visually impaired persons traveling with public commuting services. To this end, we demonstrate an assistance system that helps administer a taxi ride for a visually impaired person. Our system consists of a wearable camera fitted on the user that communicates with a smartphone app. The camera device uses computer vision to detect obstacles and navigate the person, and relays this information to the smartphone app. The app integrates that information with a taxi scheduling service on the phone. The interaction between the user and the phone is carried out through speech and audio instructions. The app administers a taxi for pickup when the user touches the screen. The person walks to the designated pick-up point, and the taxi is identified using pre-registered WiFi and/or Bluetooth IDs.

I. INTRODUCTION

Enabling visually impaired persons to ‘see’ the world around them has long been a topic of research across various disciplines. While the pure science community has been working to understand the causes and cures of vision impairment from fundamental biological and psychological perspectives, the technology community, particularly in computer science and engineering, has been developing solutions that assist visually impaired persons in conducting daily activities. With the proliferation and cheap deployment of sensing systems in recent times, there has been a significant amount of research and product development in building machine-assistive systems for the visually impaired that sense obstacles in their immediate environment, particularly using computer-vision-assisted cameras [1], [2], [3] and ranging systems (which determine the distance between objects) based on ultrasound [4], [5], light [6] and radio beaconing [7], [8]. However, a holistic solution that lets visually impaired persons perceive exactly what their eyes, if not impaired, would see is yet to be found.

In this work, we present TaxSeeMe, an assistance system that administers a taxi ride for the visually impaired. The goal of this system is to give visually impaired persons complete independence in using generally available commuting services. The functionality and the key components of the system are shown in Fig. 1 and Fig. 2, respectively.

II. PRIOR AND RELATED ART

Most visually impaired persons today rely on human and/or animal assistance for navigation. No assistive system available today clearly enables complete independence for visually impaired persons while commuting.

Fig. 1: An illustration of the TaxSeeMe system functionality.

Fig. 2: The system components include a Raspberry Pi with Pi Camera, Android App and a battery pack.

2018 IEEE Vehicular Networking Conference (VNC)

978-1-5386-9428-2/18/$31.00 ©2018 IEEE


Recent advances in deep-learning-based computer vision have enabled highly accurate obstacle detection and 3D scene estimation, which are particularly useful for self-driving vehicles and drones. However, these systems [9] typically require heavy computation on a GPU-enabled device and thus do not translate to general-purpose assistance for visually impaired persons.

To the best of our knowledge, there is no app currently on the market that provides independent commute assistance to visually impaired persons. Rideshare taxi services such as Uber [10] and Lyft [11] provide extra assistance to visually impaired riders, but they do so by enlisting appropriate drivers and educating them on the special needs of the rider. Research has proposed a variety of smart gadgets, such as smart canes and Bluetooth beacons for navigation; however, the robustness of these systems during real-world navigation by visually impaired users is yet to be improved. Keeping these realistic issues in consideration, we take a step forward in building a realistic assisted taxi service system for the visually impaired.

III. SYSTEM COMPONENTS AND FUNCTIONALITY

The TaxSeeMe system consists of a wearable camera unit, a cloud database and the TaxSeeMe Android app. The wearable unit is a Raspberry Pi fitted with its compatible camera. We use a Raspberry Pi 3 B+ (the latest release) and a Pi Camera. We 3D-printed a housing for it so that it can be worn by the user like a badge. In future versions, we will integrate our system on a miniature version of the Raspberry Pi, the Raspberry Pi Zero, and develop an appropriate form-factor device for the user to wear as a badge. We developed an app on an Android tablet that integrates the information from the Pi with the tablet's direction navigation system using its built-in maps. We use Google's Maps API to build the navigation on the Android device and integrate it with the information from the Pi through the cloud database. We use Google's Firebase database service as our cloud database server.
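The Pi-to-app data path described above can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the database path, field names and the 1.5 m obstacle threshold are all assumed, and the endpoint shown is the generic Firebase Realtime Database REST pattern.

```python
import json
import time

# Hypothetical Firebase Realtime Database path (assumption, not from the
# paper); Firebase's REST API accepts PUT/POST to <path>.json endpoints.
FIREBASE_URL = "https://taxseeme-demo.firebaseio.com/users/{uid}/obstacle.json"

def obstacle_message(distance_m, threshold_m=1.5):
    """Build the message the wearable unit would push to the cloud DB.

    The STOP/WALK signal follows the paper's description: warn the user
    to STOP when an obstacle is near, otherwise let them WALK.
    The 1.5 m threshold is an assumed value, not from the paper.
    """
    signal = "STOP" if distance_m < threshold_m else "WALK"
    return {
        "signal": signal,
        "distance_m": round(distance_m, 2),
        "timestamp": int(time.time()),
    }

def push_to_firebase(uid, message):
    """Sketch of the upload step; requires the third-party `requests` package."""
    import requests
    requests.put(FIREBASE_URL.format(uid=uid), data=json.dumps(message))
```

The Android app would subscribe to the same database path and convert the signal to a spoken instruction.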

The functionality of the system is as follows:

1) The user wears the Raspberry Pi camera device and uses the TaxSeeMe app on an Android smartphone or tablet device. The user enables the app using a voice command.

2) The user is instructed to touch the screen and feed in the destination address through voice. The system uses GPS coordinates to locate the current address of the user. The app also contains pre-defined locations such as home, work and gym.

3) The user is instructed to walk towards the taxi pick-up point. Walking directions are provided by the app, along with STOP warnings when the person nears an obstacle. We consider that the user will move to the left or right once instructed to STOP, and will continue walking when signaled WALK.

4) When the user reaches the designated pick-up point, the taxi is identified using its WiFi Direct ID and/or Bluetooth ID. As a secondary identification procedure, the system uses computer vision to identify the taxi through object recognition and marker/text identification. We consider that these IDs are registered in the app's cloud server database.
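The identification in step 4 amounts to matching radio IDs seen at the pick-up point against the IDs registered in the cloud database. A minimal sketch, with illustrative function names and ID formats (the paper does not specify them):

```python
def identify_taxi(scanned_ids, registered_taxis):
    """Match WiFi Direct / Bluetooth IDs from a local radio scan against
    the IDs registered in the app's cloud server database.

    scanned_ids: iterable of MAC-style ID strings seen at the pick-up point.
    registered_taxis: dict mapping a taxi identifier -> set of its radio IDs.
    Returns the first taxi whose registered IDs overlap the scan, or None,
    in which case the app would fall back to the computer vision
    marker/text recognition described in the paper.
    """
    scanned = {i.upper() for i in scanned_ids}
    for taxi, ids in registered_taxis.items():
        if scanned & {i.upper() for i in ids}:
            return taxi
    return None
```

Normalizing case before comparison avoids spurious mismatches, since Bluetooth and WiFi stacks report MAC addresses in inconsistent casing.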

IV. DISCUSSION

We discuss the highlights of our system across different usability and impact metrics:

Innovation and Technical focus. Navigating the visually impaired outdoors and enabling a commuting service is very challenging; no app today has successfully accomplished this. Our system takes a step forward in addressing this issue. Our approach blends technical depth in IoT, computer vision and human-computer interaction systems.

Broader impact. Our system will be highly beneficial to the visually impaired community, to persons with challenged vision, and even to elderly persons. In this way, there is a strong value proposition for our system.

User Interface and User Experience. The key idea in our approach is to avoid any graphical visualization, as it is not useful to a visually impaired person. Instead, we focus our efforts on building a robust audio-based interactive system.
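An audio-only interface of this kind is naturally organized as a small dialogue state machine, where each state has one spoken prompt and touch or voice events drive transitions. The states, prompts and event names below are illustrative assumptions, not taken from the paper:

```python
# Illustrative dialogue states and prompts for an audio-only UI
# (assumed names; the paper does not specify its dialogue structure).
PROMPTS = {
    "IDLE": "Touch the screen to book a taxi.",
    "AWAIT_DESTINATION": "Please speak your destination.",
    "NAVIGATING": "Walking directions active. Listen for STOP and WALK.",
    "AT_PICKUP": "You have arrived. Scanning for your taxi.",
}

TRANSITIONS = {
    ("IDLE", "screen_touch"): "AWAIT_DESTINATION",
    ("AWAIT_DESTINATION", "destination_spoken"): "NAVIGATING",
    ("NAVIGATING", "reached_pickup"): "AT_PICKUP",
}

def next_state(state, event):
    """Advance the dialogue; unrecognized events keep the current state,
    so stray speech or touches never leave the user in an unknown state."""
    return TRANSITIONS.get((state, event), state)
```

On Android, each state's prompt would be spoken through the platform text-to-speech service rather than rendered on screen.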

REFERENCES

[1] A. S. Rao, J. Gubbi, M. Palaniswami, and E. Wong. A vision-based system to detect potholes and uneven surfaces for assisting blind people. In 2016 IEEE International Conference on Communications (ICC), pages 1–6, May 2016.

[2] V. C. Sekhar, S. Bora, M. Das, P. K. Manchi, S. Josephine, and R. Paily. Design and implementation of blind assistance system using real time stereo vision algorithms. In 2016 29th International Conference on VLSI Design and 2016 15th International Conference on Embedded Systems (VLSID), pages 421–426, Jan 2016.

[3] R. Tapu, B. Mocanu, and T. Zaharia. A computer vision system that ensure the autonomous navigation of blind people. In 2013 E-Health and Bioengineering Conference (EHB), pages 1–4, Nov 2013.

[4] D. T. Batarseh, T. N. Burcham, and G. M. McFadyen. An ultrasonic ranging system for the blind. In Proceedings of the 1997 Sixteenth Southern Biomedical Engineering Conference, pages 411–413, Apr 1997.

[5] S. Ram and J. Sharf. The people sensor: a mobility aid for the visually impaired. In Digest of Papers. Second International Symposium on Wearable Computers (Cat. No.98EX215), pages 166–167, Oct 1998.

[6] H. E. Kallmann. Optar, a method of optical automatic ranging, as applied to a guidance device for the blind. Proceedings of the IRE, 42(9):1438–1446, Sept 1954.

[7] M. R. B. Kumar and C. M. Sibu. Design for visually impaired to work at industry using RFID technology. In 2015 International Conference on Electronic Design, Computer Networks Automated Verification (EDCAV), pages 29–33, Jan 2015.

[8] B. Ando, S. Baglio, V. Marletta, and N. Pitrone. A mixed inertial and RFID orientation tool for the visually impaired. In 2009 6th International Multi-Conference on Systems, Signals and Devices, pages 1–6, March 2009.

[9] Beyond Star Trek's VISOR: How a GPU-powered visual aid allows the blind to see. https://blogs.nvidia.com/blog/2017/04/30/gpu-powered-glasses/.

[10] Uber. https://ubr.to/2Pc0xAs.

[11] Lyft is making the app more accessible for visually impaired riders and here is why it is important. https://bit.ly/2B4QQtu.


