

Exploiting SenseCam for Helping the Blind in Business Negotiations

Shuaib Karim, Amin Andjomshoaa, A Min Tjoa
(skarim, andjo, amin@ifs.tuwien.ac.at)

Institute of Software Technology & Interactive Systems
http://www.ifs.tuwien.ac.at/

SemanticLIFE Project (http://storm.ifs.tuwien.ac.at/)
Vienna University of Technology

14.7.2006 ICCHP 2006 2

Motivation

The availability of automatic data capture devices and Semantic Web technology can greatly enhance knowledge accessibility for people with special needs


Meeting Room

• Consists of a meeting place, meeting time, participants, meeting room objects, an agenda AND the interplay between all of these

• The meaningful gestures and movements of participants possess valuable information


Some screenshots of business meetings
• Ref: www.travelershub.com


Disadvantage for visually impaired participants

– Unable to capture the gestures made by other participants

– Unable to see the meaningful movements

If this information is made available to blind participants, they can better plan their future meetings


Suggested solution

• Capture the meeting proceedings using devices like SenseCam

• Make associations between meeting constituents
– Associations can be static (meeting <-> meeting place), or dynamic: changing over time or triggered by some event (meeting <-> participant's presence, meeting <-> discussed issue, etc.)
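The static/dynamic distinction could be modelled roughly as follows. This is a minimal Python sketch; the class, the field names, and the example identifiers are our own illustration, not part of SemanticLIFE:

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class Association:
    """Link between two meeting constituents.

    A static association (e.g. meeting <-> meeting place) has no
    validity interval; a dynamic one (e.g. meeting <-> participant's
    presence) is bounded by the events that created and ended it.
    """
    subject: str
    predicate: str
    obj: str
    valid_from: Optional[datetime] = None   # None on both => static
    valid_to: Optional[datetime] = None

    @property
    def is_static(self) -> bool:
        return self.valid_from is None and self.valid_to is None

# Static: the meeting is held in a fixed room.
venue = Association("meeting-42", "heldIn", "room-HA-101")

# Dynamic: a participant was present for part of the meeting.
presence = Association("meeting-42", "attendedBy", "alice",
                       datetime(2006, 7, 14, 9, 0),
                       datetime(2006, 7, 14, 9, 45))
```

A query layer could then filter associations by their validity interval to answer "who was present when issue X was discussed".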


Introduction to SenseCam – 1/2 (Ref: Microsoft Research)

- Size: badge-sized wearable camera
- Storage: 128 MByte FLASH memory
- Recording mechanism: sensors trigger a new recording automatically. Triggers are of various types, such as elapsed time, sudden movement, a person nearby, a light transition, a change due to another person coming into the room, a door opening or closing, or a posture change such as sitting down, standing up, or running


Introduction to SenseCam – 2/2

- Capture rate: 2000 VGA images per day. Sensor data such as movement, light level, and temperature is recorded every second. Manual capture is also possible using a hand gesture.
- Configurable sensors: via XML configuration files
- GPS and continuous audio recording capability
- Ability to detect other SenseCams in the vicinity
- The data is stored in SQL Server and is accessible via the API provided with the device
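The trigger-driven recording described above can be sketched as a simple threshold check over per-second sensor readings. This is a hypothetical illustration of the idea only; the real device uses its own firmware and the vendor-provided API, and all sensor names, units, and thresholds below are invented:

```python
def should_capture(prev: dict, curr: dict, thresholds: dict) -> bool:
    """Fire a capture when any sensor reading changes by more than
    its configured threshold between two consecutive samples."""
    return any(abs(curr[k] - prev[k]) > thresholds[k]
               for k in thresholds)

# Illustrative thresholds (units are made up for the sketch).
thresholds = {"light": 50.0, "accel": 0.5, "temp": 2.0}

prev = {"light": 300.0, "accel": 0.1, "temp": 22.0}
curr = {"light": 120.0, "accel": 0.1, "temp": 22.1}  # lights dimmed

captured = should_capture(prev, curr, thresholds)  # light fell by 180
```

In the real device such thresholds would correspond to the XML-configurable sensor settings mentioned above.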


Possible movements capturable by SenseCam

– leaving or entering the room
– sitting down, standing up
– whispering with someone while leaning
– relaxing in the chair
– sitting alert
– hand gestures by the participants, etc.


Our approach : SemanticLIFE

• The SemanticLIFE research project is an attempt to come a step closer to Vannevar Bush's vision of the Memex: a device in which an individual stores their lifetime information

• SemanticLIFE is a Personal Information Management system that captures user activities such as:
– Browsed web pages
– Emails
– Chat sessions
– Local processes
– Telephone logs
– Appointments

• SemanticLIFE uses Semantic Web technology to glue together events and domain concepts
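The "gluing" of events and domain concepts can be pictured as RDF-style (subject, predicate, object) triples. In this minimal sketch, plain Python tuples stand in for a real triple store, and all identifiers are illustrative:

```python
# A tiny in-memory "triple store": each fact links an event or a
# domain concept to another via a named predicate.
triples = {
    ("email-17",   "sentBy",     "bob"),
    ("email-17",   "mentions",   "project-X"),
    ("meeting-42", "discusses",  "project-X"),
    ("meeting-42", "attendedBy", "bob"),
}

def objects(store, subject, predicate):
    """All objects linked to `subject` via `predicate`."""
    return {o for s, p, o in store if s == subject and p == predicate}

# Cross-event query: which information items relate to project-X?
related = {s for s, p, o in triples if o == "project-X"}
```

The point of the triple representation is exactly this kind of query: an email and a meeting, captured by different plug-ins, become connected through a shared domain concept.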


SemanticLIFE Architecture

(Architecture diagram; main components:)
• Message Bus Plug-in
• Google Explorer Plug-in
• Repository Plug-in (Personal Repository, Ontologies)
• Pipeline Plug-in (Pipelines, Style sheets)
• User Profile Plug-in
• Annotation Plug-in
• Web Service Plug-in (other data feeds)
• Analysis Plug-in
• Visualization Plug-in


Workflow:
• Blind person wears the SenseCam during the meeting
• Pictures are uploaded into the SemanticLIFE repository via our file upload datafeed module
• Retrieval of the day's pictures and identification of participants, either manually by the caregiver or automatically using multimedia analysis plugins
• Annotation of pictures
– Structural enhancement of information items
– Manual associations with other information items
– Dynamic associations
• Enrichment of associations (updating the concerned contact profile)
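The annotation and enrichment steps above might look roughly like this. It is a sketch only, with invented data structures rather than the actual SemanticLIFE plug-in interfaces:

```python
def annotate(pictures, identifications, profiles):
    """Attach identified participants to each picture, and enrich the
    contact profiles with the meetings each person appeared in."""
    for pic in pictures:
        people = identifications.get(pic["id"], [])
        pic["participants"] = people
        for person in people:
            # Enrichment step: update the concerned contact profile.
            profiles.setdefault(person, set()).add(pic["meeting"])
    return pictures, profiles

# Two pictures from one meeting; identifications could come either
# from the caregiver or from a multimedia analysis plug-in.
pictures = [{"id": "img-001", "meeting": "meeting-42"},
            {"id": "img-002", "meeting": "meeting-42"}]
identifications = {"img-001": ["alice", "bob"], "img-002": ["alice"]}

pictures, profiles = annotate(pictures, identifications, {})
```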


Workflow of annotation sub-system


The problem faced:
• The number of pictures to be annotated is too large (about 3600 pictures per hour).

The proposed solution is to categorize the pictures based upon different criteria. This makes it possible to interact with this huge amount of information in the manner of OLAP cubes.


OLAP (OnLine Analytical Processing)

• Multidimensional conceptual view / arrangement of data to allow fast analysis (E.F. Codd & Associates, 1994)
• The multidimensional structure (OLAP cube) provides data views based upon different criteria
• Each dimension is a category
• There can be a hierarchy in each category
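Treating each annotation field as a cube dimension, slicing the picture set might be sketched as follows. The field names and data are illustrative, not the actual categorization criteria:

```python
from collections import defaultdict

# Annotated pictures; each annotation field (hour, speaker, gesture)
# acts as one dimension of the cube.
pictures = [
    {"id": "img-001", "hour": 9,  "speaker": "alice", "gesture": "pointing"},
    {"id": "img-002", "hour": 9,  "speaker": "bob",   "gesture": "nodding"},
    {"id": "img-003", "hour": 10, "speaker": "alice", "gesture": "pointing"},
]

def slice_by(pics, *dims):
    """Group picture ids by a tuple of dimension values: one 'slice'
    of the cube per distinct combination."""
    cube = defaultdict(list)
    for p in pics:
        cube[tuple(p[d] for d in dims)].append(p["id"])
    return dict(cube)

by_speaker = slice_by(pictures, "speaker")
by_hour_gesture = slice_by(pictures, "hour", "gesture")
```

Instead of scanning ~3600 pictures one by one, the user (or caregiver) can drill down along one dimension at a time, as in OLAP browsing.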


Information distribution over multiple axes


The proposed "Project Meeting" ontology


SemanticLIFE UI



Conclusions

• Automatic data capture devices capable of capturing users' activities are available

• A project meeting ontology based on Semantic Web technology can be used to build associations between meeting constituents, which can be very useful for everyone, especially for visually impaired people

• Accessibility criteria are equally useful when making associations and when presenting information

• More information distribution axes to incorporate in the future
• Privacy issues to be investigated


Thanks!