WUD 2010 D.Miniotas - Gaze-Based Interaction


Gaze-Based Interaction

Darius Miniotas

Dept. of Electronic Systems, VGTU

darius.miniotas@el.vgtu.lt

Gaze-Based Interaction: Why?*

• The only option for disabled users

• An option for others in hands-busy situations

• Eye movements

- are extremely fast

- are natural

- require little conscious effort

* Some of the information in these slides is based on the materials presented in a tutorial at NordiCHI 2004 by Kari-Jouko Räihä, Aulikki Hyrskykari and Päivi Majaranta

Demo from the German Research Center for AI

• Text 2.0 [2009] – imagine input devices that could let text know if and how it is being read

- how would this change the reading experience?

http://text20.net/node/4

Technological Challenges

• Cost of equipment

- 2000 – 25 000 EUR

- mass production could lower the cost by an order of magnitude

• Usability of equipment

- remote trackers are convenient but allow only small head movements

- head-mounted trackers more accurate but obtrusive

• Need for calibration

- for every user at the beginning of a tracking session

- often recalibration required during prolonged use
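Calibration amounts to fitting a mapping from raw eye-camera coordinates to screen coordinates using a handful of targets the user fixates at session start. As an illustration only (the slides do not specify any method, and real trackers use richer models), here is a minimal sketch assuming independent per-axis linear fits over a nine-point grid:

```python
def fit_axis(raw_vals, screen_vals):
    """Least-squares fit of screen = a * raw + b for one axis."""
    n = len(raw_vals)
    mx = sum(raw_vals) / n
    my = sum(screen_vals) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(raw_vals, screen_vals))
    var = sum((x - mx) ** 2 for x in raw_vals)
    a = cov / var
    return a, my - a * mx

def calibrate(raw_points, screen_points):
    """Return a function mapping raw eye coordinates to screen coordinates,
    assuming the two axes can be calibrated independently."""
    ax, bx = fit_axis([p[0] for p in raw_points], [s[0] for s in screen_points])
    ay, by = fit_axis([p[1] for p in raw_points], [s[1] for s in screen_points])
    return lambda x, y: (ax * x + bx, ay * y + by)

# Nine-point calibration grid; the raw samples are simulated here with a
# known scale and offset so the recovered mapping can be checked.
raw = [(x, y) for x in (10, 50, 90) for y in (10, 50, 90)]
screen = [(4 * x + 100, 3 * y + 50) for x, y in raw]
to_screen = calibrate(raw, screen)
print(to_screen(50, 50))  # (300.0, 200.0)
```

Recalibration during prolonged use then just means refitting `a` and `b` from fresh fixation targets as the mapping drifts.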

Types of Eye Tracking Applications

• Off-line applications

- visualizing gaze data

- analyzing gaze behavior

- modifying images based on viewing data

• On-line (interactive) applications

- command-based

- attentive

Command-based interaction: challenges

• Eyes are normally used for observation, not for control

- humans are not used to activating objects just by looking at them

• Gaze behaves very differently from other means of controlling computers (hands, voice)

- intentional control of eyes is difficult and stressful

- the gaze is easily attracted by external events

- precise control of eyes is difficult

- poorly implemented eye control can be extremely annoying

“Midas Touch” Problem

• Most of the time the eyes are used for obtaining information with no intent to initiate commands

• Users become wary of looking at the “eye-active” objects or areas of the window

• Using eyes for commands requires development of new forms of interaction

Expanding Targets [CHI 2004]

Selecting Standard-Size Menu Items [ICMI 2005]

Gaze-Aware Applications

• Command-and-Control applications

- typing (conventional)

- typing (using gaze gestures)

- drawing

- other

• Multimodal Applications

• Gaze-Contingent Displays

• Attentive Interfaces

Typing by Gaze

• A typical eye typing system has

- an on-screen keyboard

- an eye tracker to record eye movements

- a computer to analyze gaze behavior

• To type by gaze the user

- focuses on a letter

- gets feedback from the system

- selects the item in focus
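The focus-feedback-select cycle is usually implemented with a dwell-time threshold, which is also the standard answer to the Midas Touch problem above: a key is "typed" only after the gaze has rested on it long enough. A minimal sketch, assuming timestamped `(t, x, y)` gaze samples and rectangular key bounds (the names and the 0.6 s threshold are assumptions, not from the slides):

```python
DWELL = 0.6  # seconds a key must be fixated before it is selected (assumed value)

def dwell_select(samples, keys, dwell=DWELL):
    """Run timestamped gaze samples (t, x, y) through a dwell-time selector;
    `keys` maps a letter to its on-screen rectangle (x0, y0, x1, y1).
    Returns the letters typed."""
    typed = []
    current, since = None, None
    for t, x, y in samples:
        hit = next((k for k, (x0, y0, x1, y1) in keys.items()
                    if x0 <= x <= x1 and y0 <= y <= y1), None)
        if hit != current:                  # gaze moved to a new key (or off all keys)
            current, since = hit, t
        elif hit is not None and t - since >= dwell:
            typed.append(hit)               # dwell threshold reached: "type" the key
            current, since = None, None     # reset so the key is not retyped
    return typed

keys = {'A': (0, 0, 100, 100), 'B': (110, 0, 210, 100)}
fixation = [(i * 0.1, 50, 50) for i in range(8)]  # 0.7 s resting on key A
print(dwell_select(fixation, keys))  # ['A']
```

The dwell value is the usability trade-off the slides hint at: too short and glances trigger commands (Midas Touch), too long and typing becomes tiring.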

EC Key, a typical keyboard

Compact Keyboard Layouts

Dasher

Demo: http://www.youtube.com/watch?v=0d6yIquOKQ0

Using Gaze Gestures for Typing

Drawing with the Eye [2003]

Other eye-controlled applications

• e-mail

• Internet browsing

• accessing online libraries

• games

• interaction with online virtual communities

EyeScroll [2007]

• gaze-enhanced scrolling allows for automatic, adaptive scrolling of the content being viewed on the screen

• supports multiple scrolling modes depending on the user's preference and reading style

• users can read the content as it scrolls smoothly, or have it scroll once the user has reached the bottom of the screen
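As a rough illustration of the smooth mode (the dead-zone fraction, speed cap, and function name are assumptions, not taken from EyeScroll), scrolling speed can grow with how far down the screen the gaze sits:

```python
def scroll_speed(gaze_y, screen_h, dead_zone=0.7, max_speed=200):
    """Smooth scrolling mode: no scrolling while the gaze is in the top
    part of the screen; below the dead zone, speed grows linearly toward
    max_speed (pixels/s) as the gaze nears the bottom edge."""
    frac = gaze_y / screen_h
    if frac <= dead_zone:
        return 0.0
    return max_speed * (frac - dead_zone) / (1.0 - dead_zone)

print(scroll_speed(300, 1000))   # 0.0   (gaze in the upper 70% of the screen)
print(scroll_speed(1000, 1000))  # 200.0 (gaze at the bottom edge)
```

The page-at-a-time mode would instead fire a single full-page scroll when `frac` reaches 1.0.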

EyePassword, EyeSaver [2007]

• gaze-based password/PIN entry: prevents shoulder surfing and generates no keyboard or mouse events, making standard event loggers harder to use

• the screen saver turns on when the user looks away from the screen and off when the user looks back at it
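The EyeSaver behavior can be sketched as a tiny state machine that arms the screen saver only after the gaze has been off-screen for some timeout, and disarms it immediately when the gaze returns (the class name and 5 s timeout are assumptions, not from the paper):

```python
class EyeSaver:
    """Turn the screen saver on after the gaze has been off-screen for
    `timeout` seconds, and off as soon as the gaze returns."""

    def __init__(self, timeout=5.0):
        self.timeout = timeout
        self.saver_on = False
        self._away_since = None  # time when the gaze left the screen

    def update(self, t, gaze_on_screen):
        if gaze_on_screen:
            self._away_since = None
            self.saver_on = False          # looking back: wake up immediately
        else:
            if self._away_since is None:
                self._away_since = t       # gaze just left the screen
            elif t - self._away_since >= self.timeout:
                self.saver_on = True       # away long enough: blank the screen
        return self.saver_on

saver = EyeSaver(timeout=5.0)
print(saver.update(0.0, False))  # False (just looked away)
print(saver.update(5.0, False))  # True  (away for 5 s)
print(saver.update(6.0, True))   # False (looked back)
```

The timeout gives hysteresis, so a brief glance at the keyboard does not blank the screen.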

EyePhone [2010]

• developed at Dartmouth College (USA)

• tracks a person’s eye relative to a phone’s screen

• users activate applications by blinking

Demo: http://www.technologyreview.com/computing/25369/

Interactive Applications

• Command-based interaction

- typing (conventional)

- typing (using gaze gestures)

- drawing

• Gaze-aware interfaces

- multimodal input

- gaze-contingent displays

- attentive interfaces

Gaze-Aware Applications

• Command-and-Control applications

- typing (conventional)

- typing (using gaze gestures)

- drawing

- other

• Multimodal Applications

• Gaze-Contingent Displays

• Attentive Interfaces

Gaze as Mouse Accelerator [1999]

• MAGIC pointing

• Two strategies for warping

- always when the point of gaze moves (“liberal”)

- only after moving the mouse a little (“cautious”)

• Empirical results

- liked by the users

- interaction was slightly slowed down by the cautious strategy, but the liberal strategy was faster than using just the mouse

Gaze + Hotkeys [2007]

• performs basic mouse operations

• reduces or eliminates dependency on the mouse for most everyday tasks such as surfing the web

• a look-press-look-release action allows for increasingly accurate selection

Demo: http://hci.stanford.edu/research/GUIDe/index.html
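One way to read the look-press-look-release cycle (a hypothetical sketch, not the GUIDe implementation): pressing the hotkey magnifies the region around the current gaze point, and the gaze position at release, read inside the magnified view, is mapped back to screen coordinates to place the click:

```python
def look_press_look_release(gaze_at_press, gaze_at_release, zoom=4):
    """Sketch of two-stage gaze refinement: the key press magnifies the
    region around the gaze by `zoom`; the gaze at release, taken inside
    the magnified view, is mapped back to true screen coordinates."""
    cx, cy = gaze_at_press    # centre of the magnified region
    rx, ry = gaze_at_release  # gaze within the zoomed view
    # Dividing the offset by the zoom factor shrinks the tracker's error
    # by the same factor, which is what makes the second look accurate.
    return (cx + (rx - cx) / zoom, cy + (ry - cy) / zoom)

print(look_press_look_release((400, 300), (480, 340)))  # (420.0, 310.0)
```

The magnification step is what lets a noisy, coarse gaze estimate select standard-size targets such as web links.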

Gaze + Speech [2006]

Put-That-There [1982]

• Multimodal input (speech, pointing gestures, gaze)

• Eye gaze used for disambiguation (together with pointing)

• Demo: http://www.poetv.com/video.php?vid=44316

Gaze-Aware Applications

• Command-and-Control applications

- typing (conventional)

- typing (using gaze gestures)

- drawing

- other

• Multimodal Applications

• Gaze-Contingent Displays

• Attentive Interfaces


iDict [2004]

• automatically detects irregularities in the reading process

• consults the embedded dictionaries and provides assistance

Attentive Videoconferencing [1999]

• Multiparty teleconferencing and document sharing system

• Images rotate to show gaze direction (who is talking to whom)

• Document “lightspot” (“look at this” reference)

PONG: The Attentive Robot [2001]

• A robot that understands and reacts to human presence and visual communication messages

• Detects when a human walks sufficiently close and then greets the person verbally and visually by displaying a smile

• Tries to mimic the user’s facial expressions

Attention Sensors: eyePLIANCES

[Images: Eye aRe glasses; eyeContact sensor; light fixture with eyeContact sensor]

Time for a demo: EyeChess