Introduction to Eye Tracking

Antti Aaltonen

17.1.2000

Tampere University Computer Human Interaction Group


Content

• Physiology of Eye (1)

• Eye Movements (3)

• Vision (2)

• Visual Perception (1)

• Terms (1)

• Eye Tracking Methods (6)

• Video-based Eye Tracking Systems (2)


Physiology of Eye

• Cornea is a transparent structure that covers the iris and pupil; it is part of the eye's focusing system.

• Pupil is the adjustable opening at the center of the iris that allows varying amounts of light to enter the eye.

• Lens helps to focus light on the retina.

• Retina includes rods (94%), which are sensitive to light, and cones (6%), which capture colours. Cones are concentrated in the centre of the retina, the fovea.


Eye Movements

• Eyes move all the time (even during sleep)

• Several different movement types, such as

• Pursuit

• Tremor

• Rotation

• Drift

• But the most interesting types are

• Fixation

• Saccade


Fixation

• The eye is (relatively) still and “fixated” on a certain point, e.g. when reading a single word.

• All the information from the scene is (mainly) acquired during fixation.

• Duration varies from 120-1000 ms, typically 200-600 ms.

• Typical fixation frequency is < 3 Hz

• Interspersed with saccades...


Saccade

• “Jumps” which connect fixations

• Very rapid -- duration is typically only 40-120 ms

• Very fast (up to 600°/s), and therefore the visual system is suppressed during the movement

• Ballistic; the end point of a saccade cannot be changed during the movement

• Saccades are used to move the fixation point

• If a movement larger than 30° is required, the head moves along with the eyes
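
The duration and velocity figures above suggest a simple way to separate fixations from saccades in recorded gaze data: classify each sample by its angular velocity. Below is a minimal sketch of such a velocity-threshold classifier; the 30°/s threshold, the sample format, and the function name are illustrative assumptions, not part of the original slides.

```python
# Minimal velocity-threshold classifier: samples whose angular velocity
# stays below a threshold are grouped into fixations; faster samples are
# treated as parts of saccades.
# Assumption: gaze samples are already given in degrees of visual angle,
# and the 30 deg/s threshold is an illustrative choice.

def classify_by_velocity(times_s, x_deg, y_deg, velocity_threshold=30.0):
    """Return a 'fixation'/'saccade' label for each gaze sample."""
    labels = ["fixation"]  # first sample has no velocity estimate
    for i in range(1, len(times_s)):
        dt = times_s[i] - times_s[i - 1]
        dx = x_deg[i] - x_deg[i - 1]
        dy = y_deg[i] - y_deg[i - 1]
        velocity = ((dx ** 2 + dy ** 2) ** 0.5) / dt  # deg/s
        labels.append("fixation" if velocity < velocity_threshold else "saccade")
    return labels

# Example: 50 Hz samples, a small drift followed by a fast jump.
t = [i / 50.0 for i in range(6)]
x = [10.0, 10.1, 10.1, 15.0, 20.0, 20.1]
y = [5.0, 5.0, 5.1, 5.1, 5.2, 5.2]
print(classify_by_velocity(t, x, y))
# ['fixation', 'fixation', 'fixation', 'saccade', 'saccade', 'fixation']
```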


Vision 1

• The visual field is divided into three regions

• Fovea provides the sharpest vision

• Parafovea previews foveal information

• Peripheral vision reacts to flashing objects and sudden movements

• Peripheral vision has 15-50% of the acuity of the fovea and it is also less colour-sensitive


[Figure: foveal (about 0-1°), parafoveal (about 1-5°), and peripheral regions of the visual field, in degrees of visual angle]

Vision 2

• In real life the regions are asymmetric

• E.g., in reading the so-called perceptual span (the size of effective vision) is 3-4 letter spaces to the left of the fixation and 14-15 letter spaces to the right

• 1° of visual angle is roughly equivalent to 3-4 letter spaces
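
The degree figures above can be related to physical size with simple trigonometry: an object of size s viewed from distance d subtends a visual angle of 2·atan(s / 2d). A small sketch of this conversion follows; the 60 cm viewing distance is an assumption chosen only for illustration.

```python
import math

def visual_angle_deg(size_cm, distance_cm):
    """Visual angle subtended by an object of a given size at a given distance."""
    return math.degrees(2 * math.atan(size_cm / (2 * distance_cm)))

def size_for_angle_cm(angle_deg, distance_cm):
    """Physical size corresponding to a given visual angle."""
    return 2 * distance_cm * math.tan(math.radians(angle_deg) / 2)

# At an assumed 60 cm viewing distance, 1 degree of visual angle covers
# roughly 1 cm of the page -- about 3-4 letter spaces of ordinary text.
print(round(size_for_angle_cm(1.0, 60.0), 2))   # ~1.05 cm
print(round(visual_angle_deg(1.05, 60.0), 2))   # ~1.0 degree
```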


Visual Perception

• The perception of a complex scene involves a complicated pattern of fixations, where the eye is held (fairly) still, and saccades, where the eye moves to foveate a new part of the scene

• The main question is “Where to look next?” The answer may be that perception is influenced by

• Cognition, e.g. “meaningful” parts of the scene and parts previewed with parafoveal vision

• Visual cues; visually salient areas noticed by peripheral vision


Eye Tracking Methods

• Rough taxonomy

• Electronic methods

• Mechanical methods

• Video-based methods

– Single point

– Two point


Electronic methods

• The most widely used method is to place skin electrodes around the eyes and measure the potential differences of the eye

• Wide range -- poor accuracy

• Better for relative than absolute eye movements

• Mainly used in neurological diagnosis


Mechanical methods

• Based on contact lenses with

• mirror planes + reflecting IR-light

• coil + magnetic field

• Very accurate

• Very uncomfortable for users who are not used to wearing lenses

• Usable only for lab studies


Single point video-based methods

• Tracking one visible feature of the eyeball, e.g.:

• limbus (boundary of sclera and iris)

• pupil

• A video camera observes one of the user's eyes

• Image processing software analyzes the video image and locates the tracked feature (see the sketch below)

• Based on calibration, the system determines where the user is currently looking

• Head movements not allowed

• Bite bar or head rest is needed
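
As a rough illustration of the feature-tracking step mentioned above, the sketch below finds a pupil candidate in one greyscale video frame as the centroid of the largest dark blob. It uses OpenCV; the threshold value and the overall approach are assumptions for illustration, not the algorithm of any particular tracker. Mapping the returned centroid to a point on the screen would then rely on the calibration mentioned above.

```python
import cv2  # OpenCV 4 assumed for the findContours return signature

def find_pupil_center(gray_frame, threshold=40):
    """Return (x, y) of the largest dark blob in an 8-bit greyscale frame,
    taken here as a rough pupil candidate; None if nothing is found."""
    blurred = cv2.GaussianBlur(gray_frame, (7, 7), 0)
    # Invert-threshold so that dark pixels (the pupil) become white.
    _, binary = cv2.threshold(blurred, threshold, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    pupil = max(contours, key=cv2.contourArea)
    m = cv2.moments(pupil)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])
```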


Two point video-based methods 1

• The same idea as in the single point method, except that now two features of the eye are tracked – typically

• corneal reflection

• pupil

• Uses IR light (invisible to human eye) to

• produce corneal reflection

• cause a bright or dark pupil, which helps the system to recognize the pupil in the video image (see the sketch below)

[Figure: eye image showing the bright pupil and the corneal reflection]
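
One common way to turn the two tracked features into a point of gaze is to use the vector from the corneal reflection to the pupil centre and fit a low-order polynomial mapping to calibration fixations on known targets. A minimal NumPy sketch under those assumptions follows; the quadratic form, the least-squares fit, and the function names are illustrative, not the algorithm of any specific product.

```python
import numpy as np

def fit_gaze_mapping(pupil_cr_vectors, screen_points):
    """Fit a second-order polynomial mapping from pupil-corneal-reflection
    vectors (image coordinates) to screen coordinates, using calibration
    fixations on known targets (at least 6 points are needed)."""
    v = np.asarray(pupil_cr_vectors, dtype=float)  # shape (n, 2): dx, dy
    s = np.asarray(screen_points, dtype=float)     # shape (n, 2): x, y
    dx, dy = v[:, 0], v[:, 1]
    # Design matrix with constant, linear, and quadratic terms.
    A = np.column_stack([np.ones_like(dx), dx, dy, dx * dy, dx**2, dy**2])
    coeffs, *_ = np.linalg.lstsq(A, s, rcond=None)  # shape (6, 2)
    return coeffs

def estimate_gaze(coeffs, dx, dy):
    """Map one pupil-corneal-reflection vector to an estimated screen point."""
    features = np.array([1.0, dx, dy, dx * dy, dx**2, dy**2])
    return features @ coeffs
```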


Two point video-based methods 2

• The optics of the system can be mounted on

• head

• floor

• If optics are floor mounted, the system is not in contact with the user

• Generally head movements are not restricted and they can be separated from eye movements, but...

• With floor mounted optics the system has to track the user’s head in order to keep the eye in the camera’s field of view, which limits head movements. The performance can be improved with

– servo controlled tracking mirrors, or

– a camera taking a wide-angle view of the user’s head; an artificial neural network then searches for the eye in the image.


Some terms

• Accuracy

• The expected difference in degrees of visual angle between true eye position and mean computed eye position during a fixation.

• Because of the visual system and the physiology of the eye, accuracy is usually 0.5-1° (see the sketch below).

• Spatial Resolution

• The smallest change in eye position that can be measured.

• Temporal Resolution (sampling rate)

• Number of recorded eye positions per second.
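
The accuracy figure above can be computed directly from validation data: the user fixates a target at a known position, and that position is compared with the mean of the computed gaze samples. A small sketch follows, assuming gaze samples and target are already expressed in degrees of visual angle; the numbers in the example are made up for illustration.

```python
import math

def accuracy_deg(gaze_samples_deg, target_deg):
    """Accuracy: angular distance between the true target position and the
    mean computed eye position over a fixation (both in degrees)."""
    n = len(gaze_samples_deg)
    mean_x = sum(x for x, _ in gaze_samples_deg) / n
    mean_y = sum(y for _, y in gaze_samples_deg) / n
    tx, ty = target_deg
    return math.hypot(mean_x - tx, mean_y - ty)

# Example: samples recorded while the user fixates a target at (10, 5) degrees.
samples = [(10.4, 5.1), (10.5, 4.9), (10.6, 5.0)]
print(round(accuracy_deg(samples, (10.0, 5.0)), 2))  # ~0.5 degrees
```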


ASL 4000-series (1996)

• Main components

• Floor mounted optics

• Control unit

• 2 computers (control & subject)

• Head movements (partially) compensated with tracking mirrors and extended head movement options

• Temporal resolution 50 Hz

• Spatial resolution 0.5°

• Tracks only one eye

• Poor analysing software and no API


SMI EyeLink (1999)

• Contains

• Head mounted optics

• 2 computers (control & subject)

• Temporal resolution 250 Hz

• Spatial resolution <0.01°

• Tracks both eyes

• Reasonable analysis software

• Windows APIs for Microsoft Visual C++

