Evaluation in HCI
Angela Kessell
Oct. 13, 2005
Evaluation
• Heuristic Evaluation – “Discount usability engineering method”
• Measuring API Usability – Usability applied to APIs
• Methodology Matters: Doing Research in the Behavioral and Social Sciences – Designing, carrying out, and evaluating human subjects studies
Heuristic Evaluation
Jakob Nielsen
Most usability engineering methods will contribute substantially to the usability of an interface…
…if they are actually used.
Heuristic Evaluation
• What is it? A discount usability engineering method
- Easy (can be taught in a ½-day seminar)
- Fast (about a day for most evaluations)
- Cheap (e.g. $(4,000 + 600i))
Heuristic Evaluation
• How does it work?
– Evaluators use a checklist of basic usability heuristics
– Evaluators go through an interface twice
• 1st pass: get a feel for the flow and general scope
• 2nd pass: refer to the checklist of usability heuristics and focus on individual elements
– The findings of the evaluators are combined and assessed
Heuristic Evaluation
Usability Heuristics (original, unrevised list)
• Simple and natural dialogue
• Speak the users’ language
• Minimize the users’ memory load
• Consistency
• Feedback
• Clearly marked exits
• Shortcuts
• Precise and constructive error messages
• Prevent errors
• Help and documentation
COMMENTS?
Heuristic Evaluation
• One expert won’t do
• Need 3 – 5 evaluators
• Exact number needed depends on a cost-benefit analysis
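This cost-benefit analysis can be sketched numerically. The sketch below is an assumption-laden illustration, not Nielsen's exact accounting: it reads the $(4,000 + 600i) figure from the earlier slide as a fixed cost plus a per-evaluator cost, with i as the number of evaluators, and uses the aggregate per-evaluator finding rate λ ≈ 0.31 from Nielsen's case studies, so i evaluators are expected to find 1 − (1 − λ)^i of the problems.

```python
# Sketch of the cost-benefit trade-off behind "3 - 5 evaluators".
# Assumptions: i = number of evaluators; lambda_ = probability that a
# single evaluator finds any given problem (~0.31 in Nielsen's aggregate
# data); cost follows the $(4,000 + 600i) estimate.

def found(i, lambda_=0.31):
    """Expected proportion of usability problems found by i evaluators."""
    return 1 - (1 - lambda_) ** i

def cost(i):
    """Estimated cost in dollars of an evaluation with i evaluators."""
    return 4000 + 600 * i

for i in (1, 3, 5, 10):
    print(f"{i:2d} evaluators: {found(i):.0%} of problems found, ${cost(i):,}")
```

With these numbers, five evaluators find about 84% of the problems for $7,000, while doubling to ten costs $3,000 more for only about 13 additional percentage points: diminishing returns are why the analysis typically lands at 3 – 5 evaluators.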
Heuristic Evaluation
• Who are these evaluators?
– Typically not domain experts / real users
– No official “usability specialist” certification exists
• Optimal performance requires “double experts” (evaluators with both usability and domain expertise)
Heuristic Evaluation
• Debriefing session
– Conducted in brain-storming mode
– Evaluators rate the severity of all problems identified
– Use a 0 – 4, absolute scale
• 0: I don’t agree that this is a problem at all
• 1: Cosmetic problem only
• 2: Minor problem – low priority
• 3: Major problem – high priority
• 4: Usability catastrophe – imperative to fix
COMMENTS?
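As a concrete illustration of the debriefing step, the independent severity ratings can be averaged per problem and ranked. The problems and scores below are made up for the example:

```python
from statistics import mean

# Hypothetical findings: each problem is scored 0-4 independently by
# every evaluator before the debriefing session.
ratings = {
    "no undo on delete":         [4, 3, 4],
    "inconsistent button order": [2, 2, 1],
    "jargon in error message":   [3, 2, 3],
}

# Average the independent scores, then rank: highest mean severity first.
ranked = sorted(ratings.items(), key=lambda kv: mean(kv[1]), reverse=True)

for problem, scores in ranked:
    print(f"{mean(scores):.2f}  {problem}")
```

Problems averaging 3 or above would be treated as high priority; anything near 0 signals disagreement among the evaluators about whether it is a problem at all.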
Heuristic Evaluation
• How does H.E. differ from User Testing?
– Evaluators have checklists
– Evaluators are not the target users
– Evaluators decide on their own how they want to proceed
– The observer can answer evaluators’ questions about the domain or give hints for using the interface
– Evaluators say what they didn’t like and why; the observer doesn’t interpret evaluators’ actions
Heuristic Evaluation
• What are the shortcomings of H.E.?
– Identifies usability problems without indicating how they are to be fixed
• “Ideas for appropriate redesigns have to appear magically in the heads of designers on the basis of their sheer creative powers.”
– Cannot be expected to address all usability issues when evaluators are not domain experts / actual users
Measuring API Usability
Steven Clarke
• User-centered design approach
– Understanding both your users and the way they work
• Scenario-based design approach
– Ensures the API reflects the tasks that users want to perform
• Use the Cognitive Dimensions framework
Measuring API Usability
• The Cognitive Dimensions framework describes:
– What users expect
– What the API actually provides
• The Cognitive Dimensions framework provides:
– A common vocabulary for developers
– Attention to important aspects of API design
The Dimensions:
– Abstraction level
– Learning style
– Working framework
– Work-step unit
– Progressive evaluation
– Premature commitment
– Penetrability
– API elaboration
– API viscosity
– Consistency
– Role expressiveness
– Domain correspondence
COMMENTS?
Measuring API Usability
• Use personas:
– Profiles describing the stereotypical behavior of three main developer groups (Opportunistic, Pragmatic, Systematic)
• Compare the API evaluation with the profile requirements
COMMENTS?
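One way to picture the comparison step: represent both the persona profile and the API evaluation as ratings along (a subset of) the cognitive dimensions, then look for mismatches. The three-point scale and all the values below are illustrative assumptions, not Clarke's actual instrument; only the dimension names come from the framework.

```python
# Hypothetical ratings on a shared low/medium/high scale.
opportunistic_persona = {        # what this developer group expects
    "abstraction level": "high",
    "premature commitment": "low",
    "penetrability": "medium",
}

api_evaluation = {               # what the API was judged to provide
    "abstraction level": "medium",
    "premature commitment": "low",
    "penetrability": "medium",
}

def mismatches(expected, provided):
    """Dimensions where the API deviates from the persona's expectations."""
    return {dim: (expected[dim], provided[dim])
            for dim in expected if provided.get(dim) != expected[dim]}

print(mismatches(opportunistic_persona, api_evaluation))
# {'abstraction level': ('high', 'medium')}
```

Each mismatch flags a dimension where the API is likely to frustrate that persona; here the opportunistic developer expects a higher level of abstraction than the API provides.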
Methodology Matters: Doing Research in the Behavioral and Social Sciences
Joseph McGrath
Methodology Matters: Doing Research in the Behavioral and Social Sciences
Key points:
• All methods are valuable, but all have limitations/weaknesses
• Offset the weaknesses by using multiple methods
Methodology Matters: Doing Research in the Behavioral and Social Sciences
In conducting research, try to maximize:
• Generalizability
• Precision
• Realism
– You cannot maximize all three simultaneously.
Methodology Matters: Doing Research in the Behavioral and Social Sciences
From http://pages.cpsc.ucalgary.ca/%7Esaul/hci_educ_papers/bgbg95/mcgrath-summary.pdf
So…
• The first two papers focus on computer programs / GUIs
• The third paper presents the whole gamut of methodologies available to study any human behavior
But… what’s missing?
But…
• Where are the statistics?
• Are there objective “right” answers in HCI?
• How do we evaluate other kinds of interfaces?
• Other thoughts on what’s missing?
How do we evaluate…
• “Embodied virtuality” / ubiquitous computing “interfaces”
• (Aura video… http://www.cs.cmu.edu/~aura/)
• Try to pick out one capability presented, and think about how you might evaluate it
Evaluating Aura
• Do we evaluate the whole system at once? Or bit by bit?
• Where / What is the interface?
• Is anyone not a target user?
From http://www.usability.uk.com/images/cartoons/cart5.htm