
Graphics and Perception

Carol O’Sullivan
Carol.OSullivan@cs.tcd.ie

Trinity College Dublin

Outline

• Some basics
• Why perception is important
  – For Modelling
  – For Rendering
  – For Animation
• Future research: multisensory perception and crowds

Why do we care?

• Rapid developments in graphics:
  – Algorithmic: e.g., fluids, cloth, humans, etc.
  – Hardware: from desktop PCs to low-end mobile devices, PDAs and cell phones
• More than just technical challenges:
  – Fidelity? Plausibility? Presence?
  – Perceptibility of errors? Evaluation? Metrics?

Must consider human perception!

Modelling

Example – Model simplification

http://www.ixbt.com/

How to measure fidelity?

Used experimental measures to evaluate simplification algorithms:
– Naming times
– Ratings
– Forced-choice preferences

Original (top), QSlim at 80% (middle), and Vclust at 80% (bottom)

Watson et al. 2001

• Users can guide the simplification process

Pojar and Schmalstieg 2003; Kho and Garland 2003

• Or salient features can be found automatically, or using an eye-tracker

Howlett et al. 2004; Lee et al. 2005

Perceptual simplification

We used an eye-tracker to determine the prominent features of models

Finding Salient Features

• We gathered information on where a participant was fixating while viewing a set of models.

Evaluation

• We incorporated fixation data to produce a perceptual simplification metric
• Then asked people to name them, match them, and choose between them
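A fixation-weighted simplification cost can be sketched as follows. This is a toy illustration only: the Gaussian saliency estimate, sigma, and alpha weighting are my assumptions, not the actual metric from the eye-tracking work described above.

```python
import numpy as np

def fixation_saliency(vertices, fixations, sigma=0.1):
    """Per-vertex saliency: Gaussian-weighted density of fixation points."""
    d2 = ((vertices[:, None, :] - fixations[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2)).sum(axis=1)

def perceptual_cost(geometric_cost, saliency, alpha=2.0):
    """Scale a base collapse cost (e.g., quadric error) by normalised
    saliency, so that salient vertices are simplified last."""
    s = saliency / (saliency.max() + 1e-12)
    return geometric_cost * (1.0 + alpha * s)

rng = np.random.default_rng(2)
vertices = rng.random((100, 3))
fixations = vertices[:5] + 0.01   # pretend the viewer fixated near 5 vertices
base = np.ones(100)               # uniform base cost, for illustration only
cost = perceptual_cost(base, fixation_saliency(vertices, fixations))
# Vertices near fixations now carry a higher cost, so a greedy
# simplifier would collapse them last.
```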

Rendering

Image Fidelity

• Fidelity to what?
  – Stimulus, response, semantics
• How to assess fidelity?
  – Metrics
• How to apply the notion to image production?
  – Perceptually informed rendering
• To image reproduction?
  – Tone mapping, contrast reduction, display device design
• How to evaluate perceptual methods?

Measuring Error

Problem: model the Human Visual System (HVS)
– VDP: Visible Differences Predictor
– VQEG/Modelfest
– SSIM: Structural Similarity

Solution: Visible Difference Predictors
– Input: two images (e.g., frames of an animation)
– Output: probabilities of difference detection
(Daly/Myszkowski)
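The SSIM idea can be sketched with a single-window score computed over two whole frames. This is a simplification of Wang et al.'s full SSIM, which averages the same formula over a sliding Gaussian window; the constants follow the common 0.01/0.03 convention.

```python
import numpy as np

def global_ssim(x, y, data_range=1.0):
    """Single-window SSIM between two grayscale images in [0, data_range]."""
    c1 = (0.01 * data_range) ** 2   # stabilises the luminance term
    c2 = (0.03 * data_range) ** 2   # stabilises the contrast term
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + c1) * (2 * cov + c2)) / \
           ((mx ** 2 + my ** 2 + c1) * (vx + vy + c2))

rng = np.random.default_rng(0)
frame = rng.random((64, 64))
noisy = np.clip(frame + rng.normal(0, 0.1, frame.shape), 0, 1)
print(global_ssim(frame, frame))   # identical frames score 1.0
print(global_ssim(frame, noisy))   # noise lowers the score below 1
```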

Example: Perceptual Rendering

• Can run user studies to find out which rendering components are perceptually important
• Then develop a metric to guide where to concentrate computationally expensive illumination components with respect to image quality

Ferwerda et al., 2004

High Dynamic Range image reproduction

High Dynamic Range display device

• Seetzen et al. 2004 present two designs for an HDR display device, based on the idea of a modulated backlight:
  – 1) an LED array and 2) a Digital Mirror Device (DMD)
• A standard LCD panel provides colour and further intensity modulation
• Limitations of the eye’s ability to perceive local contrast (a ratio of about 150:1) are taken into account to determine:
  – The minimum number of LEDs necessary in the array
  – Adequate resolution and blur for the DMD projector
  – The optimal number of bits necessary to drive each backlight
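The arithmetic behind dual modulation is simple to sketch. The numbers below are illustrative assumptions, not Seetzen et al.'s published figures: displayed intensity is the product of backlight level and LCD transmittance, so the two contrast ratios multiply, and a just-noticeable-step argument bounds the bits needed per backlight element.

```python
import math

lcd_contrast = 300.0        # typical LCD panel contrast ratio (assumed)
backlight_contrast = 600.0  # modulated-backlight contrast ratio (assumed)

# Intensities multiply, so contrast ratios multiply too.
combined = lcd_contrast * backlight_contrast
print(f"combined contrast ratio ~ {combined:.0f}:1")

# If steps of ~1% in intensity are just noticeable, count how many such
# multiplicative steps span the backlight's range, then take log2 for bits.
step = 1.01
levels = math.log(backlight_contrast) / math.log(step)
bits = math.ceil(math.log2(levels))
print(f"~{levels:.0f} steps of 1% -> {bits} bits per backlight element")
```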

High Dynamic Range image reproduction

[Figure: bilateral filtering. A spatial kernel f and an influence function g in the intensity domain for the central pixel combine to give the weights that map the input image to the output. Images courtesy of Frédo Durand and Julie Dorsey.]
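A brute-force 1D version of the bilateral filter makes the two kernels in the figure concrete. The 1D setting and the parameter values are my simplifications; real tone-mapping implementations work on 2D log-luminance and use fast approximations.

```python
import numpy as np

def bilateral_filter_1d(signal, sigma_s=2.0, sigma_r=0.1, radius=6):
    """Each output sample is a weighted average of its neighbours, weighted
    by a spatial Gaussian f (distance along the signal) times an intensity
    'influence' Gaussian g (difference from the centre sample's value)."""
    out = np.empty_like(signal)
    n = len(signal)
    offsets = np.arange(-radius, radius + 1)
    f = np.exp(-offsets ** 2 / (2 * sigma_s ** 2))   # spatial kernel f
    for i in range(n):
        idx = np.clip(i + offsets, 0, n - 1)         # clamp at the borders
        g = np.exp(-(signal[idx] - signal[i]) ** 2 / (2 * sigma_r ** 2))
        w = f * g                                    # combined weights
        out[i] = np.sum(w * signal[idx]) / np.sum(w)
    return out

# A step edge plus noise: the filter smooths the noise but keeps the edge,
# which is why it is useful for separating base and detail layers.
x = np.concatenate([np.zeros(32), np.ones(32)])
noisy = x + np.random.default_rng(1).normal(0, 0.05, x.size)
smoothed = bilateral_filter_1d(noisy)
```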

Visual Attention and Tasks

Attention is largely controlled by task:
– Scene rendered at low, high, and selective resolution
– Task allocated…
– Differences in quality largely went unnoticed by participants

Selective Quality Image

Cater et al. 2003
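The selective-quality idea can be sketched as a per-pixel quality map that requests full quality only near task-relevant screen locations. This is my own illustration of the principle; the radius and quality values are assumptions, not Cater et al.'s parameters.

```python
import numpy as np

def selective_quality_map(h, w, task_points, radius=40, low=0.25, high=1.0):
    """Quality map in [low, high]: full quality within `radius` pixels of
    any task-relevant point, low quality everywhere else."""
    ys, xs = np.mgrid[0:h, 0:w]
    q = np.full((h, w), low)
    for ty, tx in task_points:
        q[(ys - ty) ** 2 + (xs - tx) ** 2 <= radius ** 2] = high
    return q

# Full quality around one task object at pixel (60, 80); low elsewhere.
q = selective_quality_map(120, 160, [(60, 80)])
```

A renderer could read this map to choose, e.g., sampling density or shading rate per region.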

Three varieties of realism

• Physical realism
  – Same visual stimulation as scene
  – Highly computationally expensive
• Photo-realism
  – Same visual response as scene
  – Takes observer’s visual system into account
• Functional realism
  – Same visual information as scene
  – Information useful for completing a task

Ferwerda 2003

Human facial illustrations
Gooch et al. 2004

Images courtesy of Bruce Gooch

• Presents a new technique for automatic NPR generation of faces from photos
  – Photo → illustration → caricature
• Evaluates functional value of images
  – Recognition task
  – Learning task (slower with photos)
  – Accuracy, speed

Rendering and affect
Duke et al. 2003

• Radar display
  – Triangles
  – Circles
• Which is the threat?
  – The invariant of a sharp shape invokes connotations of threat

Experiments

• Assessment of danger and safety
  – Radar, door, house & trees
• Assessment of strength and weakness
  – Radar, strongest man, weakest man
• Goal-directed interaction
  – Paths, object selection

Results

• Demonstration of how rendering style can convey meaning and influence judgement

• Illustration of how semantics, affect, and other high-level invariants need to be taken into account when analysing rendering methods, not just perceptual adequacy and realism

Distance and Scale in VEs

• What affects perception of distance in VEs?
  – ‘Triangulated’ walking task (Thompson et al. 2003)
  – Image quality? Makes no difference
• Is it the HMD? (Creem-Regehr et al. 2002)
  – No… artificially generating these restrictions in the real world did not produce the same problems
  – No need to see own body (see also Lok et al. 2003)
• However, some cues are important (Hu et al. 2002)
  – Shadows and interreflection affected performance in a placement task

Presence

• Presence = the sense of “being there”
  – In the past, questionnaires and interviews were predominant
  – Physiological measurements have been proposed as a viable alternative
  – The shock of entering a room with a precipice induces a physiological response (Meehan et al. 2002)
  – Perhaps can also be used to measure and predict Breaks in Presence (BIP)

Slater et al. 2004

Animation

Measuring Error

Reitsma and Pollard ’03

Accuracy vs. Plausibility

Chenney and Forsyth ’00

Collision Handling

Collisions and Perception

Evaluating the Visual Fidelity of Physically Based Animations.

Collisions and Attention

Multisensory Perception: 1

The multisensory brain:
• Areas of the brain are not unisensory but also respond to other sensory information
• Growing body of evidence:
  – Visual areas active during tactile perception
  – Visual areas active during tactile object recognition
  – Auditory areas of the brain active during lip reading (no sound)

Multisensory Perception: 2

• You simply cannot predict perception by studying the senses in isolation

The senses influence each other:

Perceptual metrics for crowds

• New metrics to evaluate human simulations, taking account of:
  – Multisensory information: vision, motion and sound
  – Crowd and scene scale
  – Task
• …based on psychophysical and neuroimaging results
• …which will be used to:
  – Devise new multisensory strategies for optimal LOD control
  – Implement new error and comparison measurements for evaluation
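One such LOD strategy can be sketched as distance- and salience-driven level selection per crowd member: characters that are close, or salient to the current task, get a higher level of detail. The salience weighting and distance thresholds below are invented for illustration, not the metrics proposed in the talk.

```python
from dataclasses import dataclass

@dataclass
class Agent:
    distance: float   # distance from the camera, in metres
    salience: float   # task-driven salience in [0, 1] (assumed given)

def choose_lod(agent, thresholds=(5.0, 15.0, 40.0)):
    """Return an LOD index: 0 = full detail … 3 = impostor.
    Salient agents are treated as if they were closer to the camera."""
    effective = agent.distance * (1.0 - 0.5 * agent.salience)
    for lod, limit in enumerate(thresholds):
        if effective < limit:
            return lod
    return len(thresholds)

crowd = [Agent(3.0, 0.0), Agent(20.0, 0.9), Agent(120.0, 0.0)]
print([choose_lod(a) for a in crowd])  # → [0, 1, 3]
```

The salient agent at 20 m is promoted to LOD 1 even though its raw distance alone would place it further down the detail hierarchy.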

State of the Art

Dead Rising (Xbox®360), Capcom, 2006

State of the Art

Madden NFL (Xbox®360), EA, 2006

State of the Art

Project Gotham Racing (Xbox®360), Microsoft, 2005

State of the Art

Assassin’s Creed (Xbox®360), Ubisoft

Tsingos et al. - SIGGRAPH 2004

Sound rendering for crowds

Wand and Straßer, PBG 2005