LInfoVis Winter 2011 Chris Culy
Evaluation of visualizations
Levels: perception and interpretation
- Visual/perceptual: how well can people perceive the distinctions the visualization intends?
- Interpretation: how well can people interpret the visualization?
Levels: Use
- Use in isolation:
  - how accurately can people use the visualization for a particular task in isolation?
  - how quickly can people use the visualization for a particular task in isolation?
- Use as part of a broader goal:
  - how accurately can people use the visualization for a particular task as part of a broader goal?
  - how quickly can people use the visualization for a particular task as part of a broader goal?
Levels: Satisfaction
- how satisfied are people with the visualization? e.g. easy/hard, "cool", etc. (satisfaction ≠ how well they can use it)
- how useful is the visualization for what they want to do? how well do they "get it"?
- people may use/prefer a more difficult tool:
  - if it's "cool"
  - if they already know it (there's a learning curve for a new tool)
  - if it's cheaper
Goals
- Formative evaluation
  - Goal is to get specific information to improve the software
  - Done during the development cycle
  - Often informal
- Summative evaluation
  - Goal is to get general information about how the software performs
  - Done at the end of (a) development cycle
  - Often (more) formal
  - Can also be used to improve the next iteration
Some principles for experiments
- There should be a specific purpose for the experiment: what are you evaluating?
- Experiments should be task based
- The user should be introduced to the software, especially something new
- The user should not be "interfered with" during the experiment
Some techniques for experiments
- "instrumenting" the program:
  - tracking clicks, etc., and then trying to analyse the patterns
  - also useful for tracking speed and accuracy
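A minimal sketch of what "instrumenting" a program could look like: each user interaction is logged with a timestamp so that speed (elapsed time) and accuracy (e.g. a correct/incorrect outcome) can be analysed after the session. All names here (`EventLog`, `record`, etc.) are illustrative, not from any real toolkit.

```python
import time

class EventLog:
    """Collects timestamped interaction events for later analysis."""

    def __init__(self):
        self.events = []  # list of (timestamp, kind, detail) tuples

    def record(self, kind, detail=""):
        # perf_counter gives a monotonic clock suitable for durations
        self.events.append((time.perf_counter(), kind, detail))

    def task_duration(self):
        # elapsed time from the first to the last logged event
        if len(self.events) < 2:
            return 0.0
        return self.events[-1][0] - self.events[0][0]

    def count(self, kind):
        return sum(1 for _, k, _ in self.events if k == kind)

# Hypothetical session: two clicks between task start and end
log = EventLog()
log.record("task_start")
log.record("click", "node A")
log.record("click", "node B")
log.record("task_end", "correct")
print(log.count("click"))  # → 2
```

In a real study the log would be written to a file per subject and task, so that click patterns and timings can be compared across conditions.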
Some techniques for experiments
- "instrumenting" the user: various means for perception
  - eye tracking: shows where people are paying attention
Some techniques for experiments
- Observation, with note taking and timing of the user's actions
- User feedback:
  - pre/post-session questionnaire
  - "think aloud protocol": user explains what they're doing and why as they're doing it
  - explicit questions during demo (informal)
  - post-session interview ("debriefing")
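Questionnaire results are typically summarised per item before any comparison is made. A small illustrative sketch (the items and ratings below are invented, assuming a 1-5 Likert scale):

```python
import statistics

# Hypothetical post-session questionnaire: five subjects rate each
# item on a 1 (disagree) to 5 (agree) scale.
responses = {
    "easy to learn":  [4, 5, 3, 4, 4],
    "visually clear": [5, 4, 4, 5, 3],
    "useful":         [3, 4, 4, 3, 4],
}

for item, scores in responses.items():
    # report mean and median per item; median is more robust
    # for ordinal Likert data
    print(f"{item}: mean={statistics.mean(scores):.1f}, "
          f"median={statistics.median(scores)}")
```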
Formal vs. informal
- Formal:
  - Several subjects (6+)
  - Experimental design (comparison)
  - Often statistical analysis of results
  - Experiment usually recorded
- Informal, "guerrilla" testing:
  - Few subjects (3-4)
  - Quick, informal test to get user feedback on something
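One common form the statistical analysis in a formal experiment takes is comparing task-completion times between two conditions. A sketch using only the standard library, computing Welch's t statistic for two independent samples (the data and condition names are invented for illustration):

```python
import math
import statistics

def welch_t(a, b):
    """Welch's t statistic for two independent samples
    (does not assume equal variances)."""
    m1, m2 = statistics.mean(a), statistics.mean(b)
    v1, v2 = statistics.variance(a), statistics.variance(b)  # sample variances
    return (m1 - m2) / math.sqrt(v1 / len(a) + v2 / len(b))

# Hypothetical completion times (seconds), 6 subjects per visualization
vis_a = [12.1, 10.4, 13.0, 11.2, 12.7, 10.9]
vis_b = [15.3, 14.1, 16.0, 13.8, 15.6, 14.4]

t = welch_t(vis_a, vis_b)
print(round(t, 2))  # negative here: vis_a was faster on average
```

In practice one would also compute a p-value (e.g. with `scipy.stats.ttest_ind(a, b, equal_var=False)`) rather than stopping at the raw statistic.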
Techniques for deployed software
- "Bug" reports (when is a bug not a bug?)
- Feature requests
- Interviews (or narratives) with "customers"
- Case studies (positive ones = "success stories")
- Observation of "customers" using the software
Interpreting users' reactions
- Users can only react in terms of what they know/assume.
- They don't analyse what they do, so their suggestions tend to be too concrete.
- They often make specific suggestions that aren't necessarily the best solutions, since they don't know what the possibilities are.
- They are honestly trying to be helpful.