TAUCHI – Tampere Unit for Computer-Human Interaction
Gaze-Based Human-Computer Interaction
Kari-Jouko Räihä, Tampere Unit for Computer-Human Interaction
Gaze-based interaction
• A possibility in hands-busy situations
• Increasing number of computer users suffer from RSI (repetitive strain injury)
• Eye movements
– are extremely fast
– are natural
– require little conscious effort
• Direction of gaze implicitly indicates the focus of attention
Contents
• Eye-tracking technology
• Challenges
– technological
– interaction-related
• Using eye movements for application development
– algorithms for processing eye-movement data
• Examples of applications
Eye-tracking equipment
• Rough taxonomy
– Electronic methods
– Mechanical methods
– Video-based methods
• Single point
• Two point
Electronic methods
• The most common method is to place skin electrodes around the eyes and measure the potential differences across the eye
• Does not constrain head movements
• Poor accuracy
• Better for relative than absolute eye movements
• Mainly used in neurological diagnosis
Mechanical methods
• Based on contact lenses with
– mirror planes reflecting IR light
– a coil in a magnetic field
• Very accurate
• Very uncomfortable for users who are not used to wearing lenses
– usable only for lab studies
Single point video-based methods
• Tracks one visible feature of the eyeball, usually the center of the pupil
• A video camera is focused on one of the user's eyes
• Image-processing software analyzes the video image and traces the tracked feature
• Based on calibration, the system determines where the user is currently looking
• Head movements are not allowed
– a bite bar or head rest is needed
Two point video-based methods
• Same basic principle as in the single-point video-based method
• Two points are now tracked to allow for (restricted) head movements
• Uses infrared light
• Larger-scale head movements require head tracking
(Figure: eye image with the pupil and the corneal reflection marked)
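The point of tracking two features can be sketched in a few lines: a small head translation shifts the pupil and the corneal reflection (glint) together in the camera image, so the pupil-minus-glint vector stays roughly constant for a fixed gaze direction, while a single tracked point would move. The coordinates below are illustrative, not from any real tracker.

```python
# Why two points help: the pupil-glint difference vector is (to a first
# approximation) invariant under small head translations.

def gaze_vector(pupil, glint):
    """Difference vector between pupil center and corneal reflection."""
    return (pupil[0] - glint[0], pupil[1] - glint[1])

v1 = gaze_vector((320, 240), (310, 235))   # original head position
v2 = gaze_vector((340, 250), (330, 245))   # head shifted by (20, 10) px
print(v1, v2)                              # both (10, 5): vector unchanged
```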
The Bright Pupil Effect
• On-axis IR produces a bright pupil image
Camera
IR source reflected beam
(on axis)Eye ball
Bright pupil Glint
The Dark Pupil Effect
• The off-axis IR produces a dark pupil image
Camera
IR source
reflected beam
(off axis)
Eye ball
Dark pupil Glint
ASL (Applied Science Laboratories)
Head-mounted system
Floor-mounted system
SensoMotoric Instruments
EyeLink
iViewX
and many others…
Technological challenges
• High cost of the equipment
– currently on the order of 2,000-50,000 euros
– mass production could bring the cost down to hundreds of euros
• Usability of the equipment
– floor-mounted systems are convenient but restrict the user's movements
– head-mounted systems are reliable but uncomfortable
– size (getting smaller and smaller; can soon be embedded in eyeglasses)
– future: increased use of video analysis
• Need for calibration
– for all users at the beginning of each session
– also during use
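Calibration typically means having the user fixate a grid of known screen targets while the tracker records its raw output, then fitting a mapping from raw to screen coordinates. A minimal sketch, assuming a simple per-axis linear model fitted by least squares (real trackers use richer, e.g. polynomial, models; all names here are illustrative):

```python
# Per-axis linear calibration: fit screen = a*raw + b by least squares,
# independently for x and y, from a 3x3 grid of fixated targets.

def fit_axis(raw, screen):
    """Least-squares fit screen = a*raw + b for one axis."""
    n = len(raw)
    mr, ms = sum(raw) / n, sum(screen) / n
    a = sum((r - mr) * (s - ms) for r, s in zip(raw, screen)) / \
        sum((r - mr) ** 2 for r in raw)
    return a, ms - a * mr

def calibrate(raw_points, screen_points):
    """Both arguments are lists of (x, y) pairs; returns a mapping function."""
    ax, bx = fit_axis([p[0] for p in raw_points], [p[0] for p in screen_points])
    ay, by = fit_axis([p[1] for p in raw_points], [p[1] for p in screen_points])
    return lambda x, y: (ax * x + bx, ay * y + by)

# 3x3 calibration grid: known screen pixels and synthetic raw tracker output
screen = [(sx, sy) for sy in (100, 400, 700) for sx in (100, 600, 1100)]
raw = [(0.5 * sx + 10, 0.5 * sy - 5) for sx, sy in screen]

gaze = calibrate(raw, screen)
print(gaze(310.0, 195.0))   # maps raw coordinates back into screen space
```

The slide's point about calibration drifting "also during use" corresponds to refitting (or offsetting) this mapping when known fixation targets appear during a session.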
Interaction challenges
• Requires the development of new forms of interaction
• Eyes are normally used for observation, not for control
– humans are not used to activating objects just by looking at them
– poorly implemented eye control can be extremely annoying
• The device produces lots of noisy data
– the data stream needs to be compacted to make it suitable for input (fixations, input tokens, intentions)
– the physiological properties of the eye set limits on accuracy that cannot be overcome
Processing of eye-movement data
• Experiment by Yarbus (1967): gaze paths when three different persons answered different questions about the same painting
• Data contains jitter, inaccuracies, and tracking errors
• Raw data must be filtered and the fixations computed in real time
Concepts
• Fixation
– The eyegaze is (almost) still
– All information about the target is perceived during fixations
– Duration varies: 120-1000 ms, typically 200-600 ms
– No more than 3-4 per second
• Saccade
– Movement between fixations
– Typically lasts 40-120 ms
– Very fast, therefore practically no information is perceived during saccades
– Ballistic: the end point cannot be changed after the saccade has begun
Filtering the noisy data
Simple algorithm:
1) A fixation starts when the eye position stays within 0.5° for more than 100 ms (spatial and temporal thresholds filter the jitter)
2) The fixation continues as long as the position stays within 1°
3) Failures to track the eye shorter than 200 ms do not terminate the fixation
(Figure: x-coordinate of the eyegaze plotted over time)
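The three rules above can be sketched as a small dispersion-based filter. This is a minimal illustrative implementation, not the exact TAUCHI code: samples are assumed to be `(t_ms, x_deg, y_deg)` tuples, with `None` marking a tracking failure.

```python
# Dispersion-based fixation filter following the three rules on the slide:
# start within 0.5 deg for >= 100 ms, continue within 1 deg, and tolerate
# tracking losses of up to 200 ms.
import math

def detect_fixations(samples, start_disp=0.5, keep_disp=1.0,
                     min_dur=100, max_gap=200):
    fixations = []   # (start_ms, end_ms, centroid_x, centroid_y)
    window = []      # samples of the current candidate/fixation
    in_fix = False
    gap_start = None

    def dispersion_ok(win, pt, limit):
        return all(math.hypot(pt[1] - x, pt[2] - y) <= limit
                   for _, x, y in win)

    def close_window():
        cx = sum(x for _, x, _ in window) / len(window)
        cy = sum(y for _, _, y in window) / len(window)
        fixations.append((window[0][0], window[-1][0], cx, cy))

    for s in samples:
        if s is None:                               # tracking failure
            if gap_start is None and window:
                gap_start = window[-1][0]
            continue
        if gap_start is not None and s[0] - gap_start > max_gap:
            window, in_fix = [], False              # gap too long: reset
        gap_start = None

        limit = keep_disp if in_fix else start_disp
        if window and not dispersion_ok(window, s, limit):
            if in_fix:
                close_window()                      # fixation ended
            window, in_fix = [], False
        window.append(s)
        if not in_fix and window[-1][0] - window[0][0] >= min_dur:
            in_fix = True                           # temporal threshold met

    if in_fix:
        close_window()
    return fixations

# Two synthetic fixations separated by a saccade of 5 degrees:
samples = ([(t, 10.0, 10.0) for t in range(0, 320, 20)] +
           [(t, 15.0, 10.0) for t in range(400, 720, 20)])
print(detect_fixations(samples))   # two fixations: 0-300 ms and 400-700 ms
```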
Visualizing the fixations
• Circles denote fixations (centered at the point of gaze)
• Radius corresponds to duration
• Lines represent saccades
• Studies of gaze behaviour while driving
Gaze-based input tokens
• The fixations are then turned into input tokens, e.g.
– start of fixation
– continuation of fixation (every 50 ms)
– end of fixation
– failure to locate eye position
– entering monitored regions
• The tokens form eye events
– multiplexed into the event queue with other input events
• The eye events can also carry information about the fixated screen object
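The tokenization step above can be sketched as a generator: each detected fixation `(start_ms, end_ms, x, y)` is expanded into a start token, a continuation token every 50 ms, and an end token, ready to be multiplexed into the event queue. Token names are hypothetical.

```python
# Turning fixations into a stream of input tokens, per the slide:
# FIX_START, FIX_CONT every 50 ms while the fixation lasts, FIX_END.

def eye_events(fixations, cont_every=50):
    for start, end, x, y in fixations:
        yield ("FIX_START", start, x, y)
        t = start + cont_every
        while t < end:
            yield ("FIX_CONT", t, x, y)
            t += cont_every
        yield ("FIX_END", end, x, y)

# A 160 ms fixation yields a start, three continuations, and an end:
tokens = list(eye_events([(0, 160, 10.0, 10.0)]))
print([t[0] for t in tokens])
# ['FIX_START', 'FIX_CONT', 'FIX_CONT', 'FIX_CONT', 'FIX_END']
```

Because the tokens carry the fixation coordinates, a dispatcher can hit-test them against screen objects and attach the fixated object to the event, as the slide suggests.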
Inferring user’s intentions
• Goals
– to refine the data further for recognizing the user's intentions
– to implement a higher-level programming interface for gaze-aware applications
• Eye Interpretation Engine (Greg Edwards, http://eyetracking.stanford.edu/): claims to be able to identify such behaviors as
– the user is reading
– just "looking around"
– starts and stops searching for an object (e.g. a button)
– wants to select an object
Eye as a control device
• Gaze behaves very differently from other means of controlling computers (hands, voice)
– intentional control of the eyes is difficult and stressful; the gaze is easily attracted by external events
– precise control of the eyes is difficult
• The "Midas touch" problem
– Most of the time the eyes are used for obtaining information with no intent to initiate commands
– Users are easily afraid of looking at the "eye-active" objects or areas of the window
Command-based gaze interaction
• Even though eye movements are an old research area, gaze-aware applications are practically nonexistent
• Exception: applications for the disabled
© Erica, Inc. http://www.ericainc.com
Object selection
• The most common task
• How is the selection activated (avoiding the Midas touch)?
– dwell time
– special on-screen buttons
– activation by the eyes (e.g. blink or wink)
– hardware buttons
• Empirical observations:
– selection by gaze can be faster than selection by mouse
– precision is a problem: targets must be large enough
– in general, dwell time seems to be the best option when carefully chosen (not too short, not too long)
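Dwell-time selection can be sketched in a few lines. This is a minimal illustration, assuming a stream of `(t_ms, target_id)` gaze samples with `None` between targets: a target fires once the gaze has rested on it for the dwell threshold, and glancing away resets the timer, which is what keeps casual looks from triggering the Midas touch.

```python
# Dwell-time selection: a target is selected after an uninterrupted
# dwell of dwell_ms; shorter glances are ignored.

def dwell_select(samples, dwell_ms=500):
    selections = []
    current, since = None, None
    for t, target in samples:
        if target != current:
            current, since = target, t      # gaze moved: restart timer
        if current is not None and t - since >= dwell_ms:
            selections.append((t, current))
            current, since = None, None     # require re-entry to re-select
    return selections

stream = ([(t, "OK") for t in range(0, 300, 50)] +     # 250 ms glance: ignored
          [(t, None) for t in range(300, 400, 50)] +
          [(t, "OK") for t in range(400, 1000, 50)])   # long dwell: selects
print(dwell_select(stream))   # [(900, 'OK')]
```

The `dwell_ms` value is exactly the "not too short, not too long" trade-off from the slide: too short reintroduces the Midas touch, too long makes every selection feel sluggish.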
Gaze as mouse accelerator
• MAGIC pointing (Zhai, 1999)
– gaze is used to warp the mouse pointer to the vicinity of the target object
– within a threshold circle, gaze no longer affects the mouse pointer
– fine-grained adjustment is done using the mouse
(Figure: when the gaze position reported by the eye tracker is sufficiently close to the target, the mouse pointer is warped to the gaze position; inside the threshold circle the mouse takes over)
• Two strategies for warping
– always when the point of gaze moves ("liberal")
– only after moving the mouse ("cautious")
• Empirical results
– liked by the users
– interaction was slightly slowed down by the cautious strategy, but the liberal strategy was faster than using just the mouse
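The "liberal" warping strategy can be sketched as follows; the class and method names are illustrative, not from Zhai's implementation. Gaze outside the threshold circle warps the pointer; inside it, only the mouse moves the pointer.

```python
# Sketch of the "liberal" MAGIC pointing strategy: warp on every gaze
# sample that falls outside the threshold circle around the pointer.
import math

class MagicPointer:
    def __init__(self, threshold=120.0):   # threshold radius in pixels
        self.threshold = threshold
        self.pos = (0.0, 0.0)

    def on_gaze(self, gx, gy):
        # Warp only if the gaze has left the threshold circle.
        if math.hypot(gx - self.pos[0], gy - self.pos[1]) > self.threshold:
            self.pos = (gx, gy)

    def on_mouse(self, dx, dy):
        # Fine-grained adjustment always comes from the mouse.
        self.pos = (self.pos[0] + dx, self.pos[1] + dy)

p = MagicPointer()
p.on_gaze(500.0, 400.0)   # far away: pointer warps to the gaze position
p.on_mouse(-8.0, 3.0)     # user fine-tunes with the mouse
p.on_gaze(530.0, 380.0)   # gaze jitter inside the circle: no warp
print(p.pos)              # (492.0, 403.0)
```

The "cautious" strategy would differ only in deferring the warp in `on_gaze` until the next `on_mouse` call, which is why it felt slightly slower in the empirical results.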
Natural interfaces
• “Non-command user interfaces” (Nielsen, 1993)
• Multimodal interfaces are developing towards task- and user-centered operation, instead of command-based operation
• The computer monitors the user’s actions instead of waiting for commands (“proactive computing”)
• The point of gaze can provide valuable additional information
Examples
• Ship database (Jacob, 1993)
iEye
iEye video
Current research
• European Conference on Eye Movements (ECEM)– 11th conference in Turku, August 2001
• Eye Tracking Research and Applications (ETRA)– 2nd conference in New Orleans, March 2002
Wooding: Fixation Maps
1. The original image
2. Map of fixations of 131 traces
3. Corresponding contour plot
4. Image redrawn, with areas receiving more fixations appearing brighter
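The map in step 2 can be sketched as follows, under simplifying assumptions: fixation durations are accumulated into a coarse grid over the image and normalized to 0..1, so brighter cells mark more-fixated regions. (Wooding's fixation maps use Gaussian-weighted accumulation; a plain histogram keeps the sketch short.)

```python
# Histogram-style fixation map: accumulate fixation durations per grid
# cell, then normalize so the hottest cell is 1.0 (brightest).

def fixation_map(fixations, width, height, cols=8, rows=8):
    """fixations: (x, y, duration_ms) tuples; returns a rows x cols grid."""
    grid = [[0.0] * cols for _ in range(rows)]
    for x, y, dur in fixations:
        c = min(int(x * cols / width), cols - 1)
        r = min(int(y * rows / height), rows - 1)
        grid[r][c] += dur
    peak = max(max(row) for row in grid) or 1.0
    return [[v / peak for v in row] for row in grid]

# Two nearby fixations land in one cell; a third falls elsewhere:
fixes = [(100, 100, 300), (110, 95, 200), (700, 500, 250)]
m = fixation_map(fixes, width=800, height=600)
print(m[1][1], m[6][7])   # 1.0 0.5 -- the hottest cell scales to 1.0
```

Steps 3 and 4 of the slide are then just different renderings of this grid: a contour plot of the values, or the original image with each region's brightness scaled by its cell value.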
GAZE Groupware System
• multiparty teleconferencing and document-sharing system
• images rotate to show gaze direction (who is talking to whom)
• document "lightspot" ("look at this" reference)
Fig. 53: GAZE Groupware display
Eye-gaze correction for videoconferencing
• Problem: the camera and the screen are not aligned, so eye contact is lost
• Solution: track the eyes and automatically warp them in each frame to create the illusion of eye contact
Don’t think while you drive
Is this desirable?
• Hard to predict
• Clifford Nass and Byron Reeves, Stanford: CASA (Computers As Social Actors)
– humans tend to treat computers as fellow humans
• BlueEyes project at IBM: in the future, ordinary household devices – such as televisions, refrigerators, and ovens – will do their jobs when we look at them and speak to them.
Thanks
• For slides, thoughts and discussions
– Antti Aaltonen
– Aulikki Hyrskykari
– Päivi Majaranta
– Shumin Zhai