Evaluation in HCI Angela Kessell Oct. 13, 2005
Transcript
Page 1: Evaluation in HCI Angela Kessell Oct. 13, 2005

Evaluation in HCI

Angela Kessell

Oct. 13, 2005

Page 5:

Evaluation

• Heuristic Evaluation
  – “Discount usability engineering method”
• Measuring API Usability
  – Usability applied to APIs
• Methodology Matters: Doing Research in the Behavioral and Social Sciences
  – Designing, carrying out, and evaluating human subjects studies

Page 8:

Heuristic Evaluation (Jakob Nielsen)

Most usability engineering methods will contribute substantially to the usability of an interface …

…if they are actually used.

Page 11:

Heuristic Evaluation

• What is it? A discount usability engineering method
  - Easy (can be taught in a half-day seminar)
  - Fast (about a day for most evaluations)
  - Cheap (e.g. $(4,000 + 600i), where i is the number of evaluators)
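The cost figure above reads as a simple linear model. A minimal sketch, assuming the slide's $(4,000 + 600i) means a $4,000 fixed cost plus $600 per evaluator i (the function name and that interpretation are mine, not from the slide):

```python
def evaluation_cost(num_evaluators: int) -> int:
    """Estimated cost in dollars: $4,000 fixed + $600 per evaluator."""
    FIXED_COST = 4_000      # planning, briefing, debriefing, write-up
    PER_EVALUATOR = 600     # one evaluator's session time
    return FIXED_COST + PER_EVALUATOR * num_evaluators

# With the recommended 3-5 evaluators:
for i in (3, 4, 5):
    print(i, evaluation_cost(i))  # 3 -> 5800, 4 -> 6400, 5 -> 7000
```

Even the high end is far below a full lab-based usability study, which is the "discount" argument.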

Page 13:

Heuristic Evaluation

• How does it work?
  – Evaluators use a checklist of basic usability heuristics
  – Evaluators go through an interface twice
    • 1st pass: get a feel for the flow and general scope
    • 2nd pass: refer to the checklist of usability heuristics and focus on individual elements
  – The findings of the evaluators are combined and assessed
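The combining step can be sketched as follows. A hypothetical illustration, assuming each evaluator logs free-text problems keyed by the heuristic they violate (the heuristic subset, names, and sample findings are all invented):

```python
from collections import defaultdict

def merge_findings(per_evaluator_findings):
    """Union each evaluator's findings into one problem set per heuristic."""
    combined = defaultdict(set)
    for findings in per_evaluator_findings:
        for heuristic, problems in findings.items():
            combined[heuristic].update(problems)
    return combined

# Two evaluators working independently, with one overlapping finding:
evaluator_a = {"Feedback": {"no progress indicator on save"}}
evaluator_b = {"Feedback": {"no progress indicator on save"},
               "Consistency": {"two different labels for Cancel"}}
merged = merge_findings([evaluator_a, evaluator_b])
```

Duplicates collapse automatically, which matters because different evaluators tend to rediscover the same major problems.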

Page 15:

Heuristic Evaluation: Usability Heuristics (original, unrevised list)

• Simple and natural dialogue
• Speak the users’ language
• Minimize the users’ memory load
• Consistency
• Feedback
• Clearly marked exits
• Shortcuts
• Precise and constructive error messages
• Prevent errors
• Help and documentation

COMMENTS?

Page 16:

Heuristic Evaluation

• One expert won’t do
• Need 3-5 evaluators
• Exact number needed depends on a cost-benefit analysis
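The cost-benefit reasoning behind the 3-5 figure is usually framed with Nielsen and Landauer's curve: the proportion of problems found by i evaluators is 1 - (1 - L)^i, where L is the chance a single evaluator finds a given problem. A sketch, assuming their widely cited average of L ≈ 0.31 (the constant and function name here are illustrative):

```python
def proportion_found(i: int, L: float = 0.31) -> float:
    """Expected fraction of usability problems found by i evaluators."""
    return 1 - (1 - L) ** i

# Diminishing returns: each added evaluator finds fewer new problems.
for i in (1, 3, 5, 10):
    print(i, round(proportion_found(i), 2))  # 0.31, 0.67, 0.84, 0.98
```

Around five evaluators the curve flattens, so extra evaluators cost more than the few new problems they uncover.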

Page 17:

Heuristic Evaluation

• Who are these evaluators?
  – Typically not domain experts / real users
  – No official “usability specialist” certification exists
• Optimal performance requires “double experts”: evaluators with expertise in both usability and the application domain

Page 19:

Heuristic Evaluation

• Debriefing session
  – Conducted in brainstorming mode
  – Evaluators rate the severity of all problems identified
  – Use a 0-4 absolute scale:
    • 0 I don’t agree that this is a problem at all
    • 1 Cosmetic problem only
    • 2 Minor problem – low priority
    • 3 Major problem – high priority
    • 4 Usability catastrophe – imperative to fix

COMMENTS?
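The debriefing ratings lend themselves to simple aggregation. A hedged sketch, assuming each evaluator independently rates every problem on the 0-4 scale and the mean rating sets fix priority (the sample problems and scores are invented):

```python
from statistics import mean

SEVERITY = {0: "not a problem", 1: "cosmetic", 2: "minor",
            3: "major", 4: "usability catastrophe"}

ratings = {  # problem -> one rating per evaluator
    "error message gives no recovery hint": [3, 4, 3],
    "inconsistent button order across dialogs": [1, 2, 2],
}

# Rank problems by mean severity, highest first, to build the fix list.
for problem, scores in sorted(ratings.items(), key=lambda kv: -mean(kv[1])):
    print(f"{mean(scores):.1f}  {problem}")
```

Averaging across evaluators smooths out individual disagreement, which is why ratings are collected independently before the group discussion.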

Page 21:

Heuristic Evaluation

• How does H.E. differ from User Testing?
  – Evaluators have checklists
  – Evaluators are not the target users
  – Evaluators decide on their own how they want to proceed
  – The observer can answer evaluators’ questions about the domain or give hints for using the interface
  – Evaluators say what they didn’t like and why; the observer doesn’t interpret evaluators’ actions

Page 23:

Heuristic Evaluation

• What are the shortcomings of H.E.?
  – Identifies usability problems without indicating how they are to be fixed
    • “Ideas for appropriate redesigns have to appear magically in the heads of designers on the basis of their sheer creative powers.”
  – Cannot expect it to address all usability issues when evaluators are not domain experts / actual users

Page 25:

Measuring API Usability (Steven Clarke)

• User-centered design approach
  – Understanding both your users and the way they work
• Scenario-based design approach
  – Ensures the API reflects the tasks that users want to perform
• Use the Cognitive Dimensions Framework

Page 27:

Measuring API Usability

• The cognitive dimensions framework describes:
  – What users expect
  – What the API actually provides
• The cognitive dimensions framework provides:
  – A common vocabulary for developers
  – Attention to important aspects

The Dimensions:
  – Abstraction level
  – Learning style
  – Working framework
  – Work-step unit
  – Progressive evaluation
  – Premature commitment
  – Penetrability
  – API elaboration
  – API viscosity
  – Consistency
  – Role expressiveness
  – Domain correspondence

COMMENTS?
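The expect-versus-provide contrast can be made concrete per dimension. A hypothetical sketch, assuming each dimension is summarized as a categorical rating for a given developer persona and mismatches flag usability problems (the ratings and the three-dimension subset are invented, not from Clarke's paper):

```python
# Three of the twelve dimensions, abbreviated for the sketch.
dimensions = ["Abstraction level", "Learning style", "API viscosity"]

expected = {"Abstraction level": "high",        # what this persona wants
            "Learning style": "incremental",
            "API viscosity": "low"}
provided = {"Abstraction level": "low",         # what evaluation of the API found
            "Learning style": "incremental",
            "API viscosity": "high"}

# Dimensions where the API fails the persona's expectations.
mismatches = [d for d in dimensions if expected[d] != provided[d]]
print(mismatches)
```

Running the same comparison against each persona profile shows which developer group the API serves poorly.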

Page 29:

Measuring API Usability

• Use personas:
  – Profiles describing the stereotypical behavior of three main developer groups (Opportunistic, Pragmatic, Systematic)
• Compare the API evaluation with the profile requirements

COMMENTS?

Page 30:

Methodology Matters: Doing Research in the Behavioral and Social Sciences

Joseph McGrath

Page 31:

Methodology Matters: Doing Research in the Behavioral and Social Sciences

Key points:

• All methods are valuable, but all have limitations/weaknesses

• Offset the weaknesses by using multiple methods

Page 33:

Methodology Matters: Doing Research in the Behavioral and Social Sciences

In conducting research, try to maximize:

• Generalizability

• Precision

• Realism

– You cannot maximize all three simultaneously.

Page 34:

Methodology Matters: Doing Research in the Behavioral and Social Sciences

From http://pages.cpsc.ucalgary.ca/%7Esaul/hci_educ_papers/bgbg95/mcgrath-summary.pdf

Page 35:

So…

• The first two papers focus on computer programs / GUIs
• The third paper presents the whole gamut of methodologies available for studying any human behavior

Page 36:

But… what’s missing?

Page 37:

But…

• Where are the statistics?

• Are there objective “right” answers in HCI?

• How do we evaluate other kinds of interfaces?

• Other thoughts on what’s missing?

Page 38:

How do we evaluate…

• “Embodied virtuality” / ubiquitous computing “interfaces”

• (Aura video… http://www.cs.cmu.edu/~aura/)

• Try to pick out one capability presented, and think about how you might evaluate it

Page 39:

Evaluating Aura

• Do we evaluate the whole system at once? Or bit by bit?

• Where / What is the interface?

• Is anyone not a target user?

Page 40:

From http://www.usability.uk.com/images/cartoons/cart5.htm

