
Growing an Acceptable Computer Human Interaction

Computer Human Interaction Lifecycle

Development Team

Who are the Users

Characteristics of a well-designed CHI

Effective & Efficient Feedback Mechanisms

DVB CIS221

What is Computer Human Interaction (CHI)?

• Focus on creating a tool for a particular User performing a particular Task
  – "The quality of the swordsmith is measured by the longevity of his customers"

• Becomes far more complex with multiple users and multiple tasks
  – It is much harder to design a good pocket knife than a good paring knife

Time and Money

• Good CHI will take considerably more time & money
• Are the results worth it?
  – A tool is only worth its cost if it is actually used
    • "If the user can't use it, it doesn't work"
  – Consider the "indirect costs" of poor interaction
    • Training, interruptions of experts, unhappy customers, mistakes that make employees feel inadequate or humiliated

• Can be difficult to quantify for cost justification, but it is possible using fairly simple user-observation studies
  – AT&T Information operator example
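The cost-justification point can be made concrete with a back-of-the-envelope sketch. All numbers below are made up for illustration; the point is only that a per-task saving of a few seconds, measured in a simple user-observation study, scales across many users and repetitions.

```python
# Hypothetical cost-justification arithmetic (all figures invented):
# a small per-task time saving, multiplied across users, tasks, and
# workdays, can dwarf the extra CHI development cost.

seconds_saved_per_task = 2      # measured in a user-observation study
tasks_per_user_per_day = 300    # e.g., calls handled by an operator
users = 500
workdays_per_year = 250
hourly_cost = 30.0              # fully loaded labor cost, $/hour

hours_saved = (seconds_saved_per_task * tasks_per_user_per_day
               * users * workdays_per_year) / 3600
annual_savings = hours_saved * hourly_cost
print(round(annual_savings))    # prints 625000
```

Even these modest assumptions yield six-figure annual savings, which is the scale of argument needed to justify the extra CHI design time.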

Development Team Roles

• Expert user who ideally knows something about tool development

• Expert tool builder who ideally knows something about the user's task

• Director

• Producer

Computer Human Interaction Life Cycle

"If you really want to screw up, then misunderstand the problem"

Task Analysis → Global Design → Prototype Construction → User Testing & Evaluation

Iterative Process

The further into the process a flaw is discovered, the more time and money will be needed to correct the problem (the cost of a fix grows from $ to $$$ to $$$$$$ across the stages)

Task Analysis: Users

• Who are the Users?
  – Education, background, computer experience (in general and with this particular kind of system), typing skills, emotional level (computer phobic?)
  – Feeling comfortable and in control may be more important than speed for strategic decision-support systems

• What do the users do?
  – Current work habits, both official and real

• In what environment do they operate?
  – Noise, lighting, etc.

• How do the users VISUALIZE the current system's data and information?
  – Ask them to draw pictures!!

Task Analysis: Users

• Most Important: you can only build good tools for users whom you respect
  – Not too hard with $250K radiologists
  – Much harder with secretaries, parts clerks, and catalog sales phone operators
    • Solution: the tool builder periodically mans the phones

Task Analysis: Context

• Related information systems
  – Ideally, the users' family of systems should have a similar "look and feel"

• Hardware, software, cost, time limitations

• Politics

Task Analysis Methods

• Study existing systems
• Conduct semi-structured interviews and focus groups (Marketing 325 will help here!)
• Observe users working
  – Videotape unobtrusively
  – Eye-tracker study – finger pointing is an alternative
  – E.g., do they draw on their forms!?!
• Ideal: the tool builder becomes a user
  – The best systems are developed by users

Task Analysis Feedback Methods

• Task Analysis is iterative – must repeatedly feed the design back to users for critique and improvement
  – Ego-less – critique is a necessary step in developing a good system – nothing personal

• Both formal and informal feedback
• Find the right users
  – Interested and willing – but not too much
    • Avoid the gadget freak who will feature-creep the system to death

• Key questions: accuracy, relevance, completeness
• Must "walk through" example user activities and tasks, not just present the task analysis

Design: Topics

• Direct Manipulation Interface (DMI)
  – Xerox Alto, Mac, Win9x
  – Mouse, desktop metaphor, etc.

• Why is the direct manipulation approach often the most appropriate?

• What are the characteristics of a well-designed DMI?

Why Direct Manipulation

• People access information spatially
  – "that thing there (pointing)"

• With a WELL-DESIGNED DMI
  – Useful in 5 minutes, plus little/no loss in high-performance work

• A poorly-designed DMI is less than worthless
  – Just because it uses a mouse and menus doesn't mean it can't be really bad

DMI Characteristics: Overview

• Mental model
• Consistent/spatial grammar
• Recognition vs. remembering
• Visible state and modes
• Orthogonal state and modes
• Well-organized menus
• Layered complexity

Mental Models

• The users' "understanding" of the system
• Having to mentally construct an appropriate mental model of a complex system is what makes things hard
• Metaphor as mental model
  – The only way to learn a complex system quickly is to build the system analogous to an existing mental model
    • Mac/Alto/Win9x -> desktop
      – A surface with items that can be moved, opened/closed, piled
      – Folders
      – Secretaries learn Mac/Win9x much faster than DOS
  – Visual scheduler example

Mental Models

• Observation (videotape) can find design flaws

• Repeated errors often indicate a difference between the users' mental model and how the actual system operates
  – The problem is with the system – change the system, not the user
  – If the system cannot be changed, then improve documentation/training
  – Users do not understand a system until they have constructed a mental model
    • Demon model of VB vs. the "actual" model

Consistent/Spatial Grammar

• Items (or sets of items) are selected with the mouse – they become the "subjects" of command sentences
• All commands are one-place verbs
  – <selected item> <open | cut | copy | paste>

• cut/paste (two 1-place verbs) vs. move (one 2-place verb)
  – Slower
  – But eliminates a mode, or the need for a sentence with an explicit object

• The approach works better with menus
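The "selection as subject, one-place verbs" grammar can be sketched in a few lines. This is a hypothetical toy model, not any real toolkit's API: the `Desktop` class and its item names are invented for illustration.

```python
# Toy model of the DMI command grammar: the user first selects items
# (the "subject"), then applies a one-place verb to the whole selection.

class Desktop:
    def __init__(self):
        self.items = {"report.txt", "budget.xls"}
        self.selection = set()   # the current "subject"
        self.clipboard = set()

    def select(self, *names):
        self.selection = {n for n in names if n in self.items}

    # Each command is a one-place verb: it takes no object of its own,
    # it simply acts on whatever is currently selected.
    def cut(self):
        self.clipboard = set(self.selection)
        self.items -= self.selection
        self.selection = set()

    def paste(self):
        self.items |= self.clipboard

d = Desktop()
d.select("report.txt")
d.cut()                   # "<selected item> cut"
d.paste()                 # "<selected item> paste"
print(sorted(d.items))    # prints ['budget.xls', 'report.txt']
```

Note how `move` never appears: cut followed by paste (two 1-place verbs) replaces the 2-place sentence "move X to Y", which is slower but keeps every command in the same consistent grammar.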

Metaphor

• REALLY BAD to start by designing the interaction as a metaphor to an existing mental model (e.g., the desktop) and then do something that does not fit
  – Select an object and "cut" to delete it
  – Then what is the trashcan for?

Recognition v. Remembering

• Menus exploit recognition

• Both subjects and verbs can be seen on screen and in menus

• Psych studies: one can recognize far more than can be remembered

Visible/Orthogonal State & Mode

• There is “Mode” (e.g., textbox placement mode in VB), but it is visible in the shape of the cursor

• Infrequently changing state is hidden (e.g., text size and font)

• Orthogonal (independent): one can change the text size without having to wade through a menu to also change the font

Orthogonal Tools (verbs)

• Each tool does a single thing well
• Combinations of tools become very powerful (5 tools -> many combinations)
  – Reduced learning/remembering/implementation
  – At the cost of slower interaction (allow macros)

• Essential that the designer chooses the "right" small set of tools!!

• Less is more – don't add something just because you can
  – KISS
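The combinatorial payoff of a small orthogonal tool set can be sketched directly. The five "verbs" below are invented for illustration; the point is that chaining single-purpose tools yields far more behaviors than the tool count suggests.

```python
# Hypothetical sketch: five tiny orthogonal "verbs", each doing one
# thing well, combine into many behaviors instead of one bespoke tool
# per use case.

tools = {
    "upper":   str.upper,
    "strip":   str.strip,
    "reverse": lambda s: s[::-1],
    "first3":  lambda s: s[:3],
    "quote":   lambda s: f'"{s}"',
}

def apply(text, *names):
    # Chaining is the "sentence": each verb acts on the result of the
    # previous one, so even length-2 chains of 5 tools give 25 behaviors.
    for name in names:
        text = tools[name](text)
    return text

print(apply("  widget  ", "strip", "upper"))            # prints WIDGET
print(apply("  widget  ", "strip", "first3", "quote"))  # prints "wid"
```

The cost, as the slide notes, is slower interaction for common multi-step jobs, which is why macros (canned chains) are the usual remedy.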

Layered Complexity

• Goal: something useful in 5 minutes, plus easy evolution to full performance

• Example: menu commands reduce cognitive load at the expense of slower interaction
  – Solution: "hot key" high-speed keystroke alternatives allow everyone to evolve into an advanced high-speed user

Drawbacks to DMI/CHI

• Framebuffer cost
  – Not really an issue anymore

• Easy to design a nice-looking DMI interaction that stinks

• More complex to construct
  – Less of an issue with VB nowadays

• Important: complex requirements and/or feature creep eventually may overrun the metaphor and kill the system
  – Know your limitations
  – Know and accept the limitations of your metaphor

Evaluating the CHI

• Tools are only "good" or "bad" relative to a user and a task
  – What's "better", a Rolls-Royce or a pickup?

• Goal: how to quickly and cost-effectively evaluate a CHI relative to a user and a task
  – When to evaluate?
  – What to evaluate?
  – Who should evaluate?
  – Evaluation output?

Marketing Parallels

• Product and CHI design have parallels

• Product evaluation and CHI evaluation also have similarities
  – Pay attention in 325

When To Evaluate?

• Early low-cost feedback
  – Deadly to locate task analysis errors during the final product evaluation

What to Evaluate?

• Problem Documentation

• Specification Documentation

• Design Documentation

• Prototypes and Mockups

• Final Product – Post mortem to educate development team

What should not be evaluated?

• Team members
  – Difficult to avoid evaluating both others and yourself
    • A bit of your soul is in everything you create/build
  – Use the Yourdon method: design evaluation, like a code walkthrough, identifies problems
    • The responsible team member is then given responsibility to develop a solution

Who Should Evaluate?

• CHI specialist
• USER
• Another designer/director
• The designer/director
  – The designer/director is ideal if and only if he or she:
    • Has the training
    • Has ego under control
  – Training: have someone evaluate the designer evaluating the product

Evaluation Output

• What users should be selected for evaluation?
• What tasks should be selected for evaluation?
  – Was the user able to complete the task?
    • In how much time?
    • With how much accuracy?
  – Did the user feel in control or feel overwhelmed?
  – Repeated errors
  – "Count expletives"
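These measures reduce naturally to a small summary over observation records. The session data below is entirely made up for illustration; only the measures themselves come from the slide.

```python
# Hypothetical sketch: summarizing structured-observation records into
# the evaluation measures above - completion, time, accuracy, and even
# an expletive count as a crude frustration signal.

# Invented observation records for three users on one task.
sessions = [
    {"user": "A", "completed": True,  "seconds": 95,  "errors": 1, "expletives": 0},
    {"user": "B", "completed": True,  "seconds": 210, "errors": 4, "expletives": 3},
    {"user": "C", "completed": False, "seconds": 300, "errors": 6, "expletives": 5},
]

def summarize(sessions):
    done = [s for s in sessions if s["completed"]]
    return {
        "completion_rate": len(done) / len(sessions),
        "mean_seconds": sum(s["seconds"] for s in done) / len(done),
        "total_errors": sum(s["errors"] for s in sessions),
        "total_expletives": sum(s["expletives"] for s in sessions),
    }

print(summarize(sessions))
```

Mean time is computed only over completed attempts; timing an abandoned task would understate how badly it went, which is why completion rate is reported separately.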

Task Analysis/Design Evaluation Methods

• User Walkthrough
  – Help the user understand how the tool will be used to address particular tasks
• Time-Motion Analysis
  – Forces you to understand how the user will perform the task using the tool design – down to the button-press level
• Intuition

Prototype Evaluation Methods

• Controlled Subject Experiments
  – How fast, how many errors, etc.
  – Requires an expert to set this up
    • IRB – legal – ethical issues

• Surveys and Interviews

Prototype Evaluation Methods

• Structured Observation
  – Plan users, tasks, and what information you need to obtain
  – Write out a script
  – Automate data collection (inside code!)
  – Videotape observer (detects bias!), computer, and user
  – Ideally, the entire process occurs in an isolated room (avoids bias)
  – Anthropology methodology
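The "automate data collection (inside code!)" point can be sketched as a tiny event logger built into the prototype. The class and event names are invented for illustration; the idea is only that the prototype timestamps user events itself, so the observer need not transcribe the session by hand.

```python
# Hypothetical sketch: the prototype logs every user event with a
# timestamp relative to session start, giving the evaluator a precise,
# bias-free record alongside the videotape.

import json
import time

class EventLog:
    def __init__(self, clock=time.monotonic):
        self._clock = clock
        self._t0 = clock()
        self.events = []

    def record(self, kind, detail=""):
        # Each UI handler calls record(); the log captures what the
        # user did and when.
        self.events.append({"t": round(self._clock() - self._t0, 3),
                            "event": kind, "detail": detail})

    def dump(self):
        return json.dumps(self.events)

log = EventLog()
log.record("menu_open", "File")
log.record("error", "clicked disabled Save")
print(log.dump())
```

Logged error events feed directly into the "repeated errors" measure from the evaluation-output slide, and the timestamps give task-completion times for free.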

Prototype Evaluation Methods

• Intuition
  – While the tool builder should strive to develop CHI intuition, it is very often flawed
    • Example: AT&T information operator system
  – The tool builder must discover how to determine, as early as possible:
    • When a design is "good" (relative to user/task)
    • When a design is "bad"
    • And when he/she does not know

Final Product Evaluation

• Postmortem
  – Avoid a marketing disaster
  – Field test the final product before market (alpha, beta)

