Evaluating crowdsourcing websites (Why evaluation isn't a party at the end)
Donelle McKinley, PhD candidate, Victoria University of Wellington
Supervisors: Dr Sydney Shep and Dr Brenda Chawner
www.digitalglam.org
National Digital Forum conference, 27 November 2013, Wellington, New Zealand
Heuristics for user interface design
• Heuristics can resemble high-level (conceptual) design principles (Rogers et al., 2011)
• Heuristics serve as both criteria to guide the design process and a basis for evaluation (Cockton et al., 2012; Hartson & Pyla, 2012).
• Heuristic evaluation is inexpensive, informal, and relatively intuitive; people are easily motivated to participate; it requires no advance planning; and it can be used early in the development process (Nielsen & Molich, 1990)
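The pooling step Nielsen and Molich describe — several evaluators each record problems against a shared list of heuristics, and the reports are combined — can be sketched in a few lines. This is a minimal illustration only; the heuristic names, severity ratings, and findings below are hypothetical, not data from the UK-RED evaluation.

```python
# Illustrative sketch of pooling heuristic-evaluation findings.
# Heuristics, evaluators, and findings here are hypothetical examples.
from collections import Counter

HEURISTICS = [
    "Visibility of system status",
    "Match between system and the real world",
    "User control and freedom",
]

# Each finding: (heuristic violated, severity 0-4, evaluator id)
findings = [
    ("Visibility of system status", 3, "evaluator-1"),
    ("User control and freedom", 2, "evaluator-1"),
    ("Visibility of system status", 3, "evaluator-2"),
]

# Pool the reports: count how many problems were logged per heuristic.
by_heuristic = Counter(h for h, _, _ in findings)
for heuristic in HEURISTICS:
    print(f"{heuristic}: {by_heuristic[heuristic]} problem report(s)")
```

Because individual evaluators typically find only a fraction of the problems, aggregating across evaluators like this is what makes the method effective despite its low cost.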
Potential usability problems identified in the heuristic evaluation of the UK-RED task interface
UK-RED task interface problems identified by survey respondents
Requirements for a NZ-RED task interface
1. Minimize user effort
2. Support integration of the task with research processes
3. Enable new visitors and contributors to understand what the task involves quickly and easily
4. Support accurate and controlled data entry
5. Be easy to use for people reasonably confident with the Web
6. Support flexible, structured data entry
7. Support bilingual data entry
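A requirements checklist like the one above can double as structured evaluation data: score the existing interface against each requirement, then tally the results. The sketch below is hypothetical; the statuses are placeholders, not the assessments reported in this talk.

```python
# Hypothetical sketch: scoring an interface against the seven NZ-RED
# requirements. Statuses are illustrative placeholders only.
requirements = {
    "Minimize user effort": "partial",
    "Support integration of the task with research processes": "not met",
    "Enable newcomers to understand the task quickly and easily": "partial",
    "Support accurate and controlled data entry": "met",
    "Be easy to use for people reasonably confident with the Web": "met",
    "Support flexible, structured data entry": "partial",
    "Support bilingual data entry": "not met",
}

# Tally how the interface fares overall.
for status in ("met", "partial", "not met"):
    count = sum(1 for s in requirements.values() if s == status)
    print(f"{status}: {count} of {len(requirements)} requirements")
```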
What did I learn?
• The UK-RED task interface only partially meets four of the seven NZ-RED requirements
• Using an existing project 'template' may not be the most effective way to meet the needs of your volunteers or your project objectives. The only way to determine this is through evaluation.
• Heuristics can be an efficient and effective method of website evaluation
What’s next?
Develop a set of heuristics for non-profit crowdsourcing
Interested?
Email donelle.mckinley@vuw.ac.nz
Follow @donellemckinley
Research updates at www.digitalglam.org
Thanks!