User Interfaces Evaluation
DCO10104: User-Centered Design and Testing
Rose C W Chang
Overview
Participatory Design
Methods for Involving the User
Evaluating User Interfaces
– Naturalistic Approach vs. Experimental Approach
Usability Evaluation Methods
– Inspection
– Conceptual Model Extraction
– Direct Observations
– Simple Observation Method
– Think Aloud Method
– Constructive Interaction Method
– Interviews
– Questionnaires and Surveys
– Continuous Evaluation
User-Centered System Design
[Diagram: iterative cycle of Analyze → Design → Build → Test, with test responses feeding back into Analyze]
Participatory Design
[Speech bubble: “The user is just like me”]
Problem
– designers’ intuitions are often wrong
– interviews, surveys, etc. are not precise enough
– the designer cannot know the user sufficiently well to answer all issues that come up during the design
Solution
– designers should have access to representative users
END users, not their managers or union reps!
Participatory Design
Up side
– users are excellent at reacting to suggested system designs
designs must be concrete and visible
– users bring in important “folk” knowledge of work context
knowledge may be otherwise inaccessible to design team
– greater buy-in for the system often results
Down side
– hard to get a good pool of end users
expensive, reluctance ...
– users are not expert designers
don’t expect them to come up with design ideas from scratch
– the user is not always right
don’t expect them to know what they want
Methods for Involving the User
At the very least, talk to users
– surprising how many designers don’t!
Contextual interviews + site visits
– interview users in their workplace, as they are doing their job
– discover user’s culture, requirements, expectations,…
Explain designs
– describe what you’re going to do
– get input at all design stages
all designs subject to revision
– important to have visuals and/or demos
people react far differently to a visible design than to a purely verbal explanation
Process of Usability Evaluation
[Diagram: iterative cycle of design → implementation → evaluation]
Naturalistic Approach
Observe users in their natural setting as they do real work
Problems
– hard to arrange and do
– time consuming
– may not generalize
Experimental Approach
Experimenter controls all environmental factors
– study relations by manipulating independent variables
– observe effect on one or more dependent variables
– nothing else changes (see the sketch below)
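To make this concrete, here is a minimal sketch (not from the slides) of how an experimental comparison might be analyzed: the interface design is the independent variable, task completion time is the dependent variable, and all numbers are invented for illustration.

```python
# Hypothetical experiment: compare task completion times (seconds)
# for two interface designs. Design is the independent variable;
# completion time is the dependent variable. All numbers are invented.
from scipy import stats

design_a = [42.1, 38.5, 45.0, 40.2, 39.8, 44.3, 41.7, 43.0]
design_b = [35.2, 33.8, 37.1, 34.5, 36.0, 32.9, 35.7, 34.1]

t_stat, p_value = stats.ttest_ind(design_a, design_b)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Difference is unlikely to be chance: the designs differ in speed.")
else:
    print("No statistically significant difference detected.")
```

A real study would also control participant assignment and task ordering, which is what “experimenter controls all environmental factors” implies.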
Discount Usability Evaluation
Low cost methods to gather usability problems
– approximate: capture most large and many minor problems
How?
– qualitative:
observe user interactions
gather user explanations and opinions
produces a description, usually in non-numeric terms
anecdotes, transcripts, problem areas, critical incidents…
– quantitative:
count, log, measure something of interest in user actions
speed, error rate, counts of activities, … (see the logging sketch below)
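As an invented illustration of the quantitative side, the sketch below derives task speed and error rate from a timestamped interaction log; the log format and event names are hypothetical.

```python
# Minimal sketch: derive usability metrics from a timestamped event log.
# The log format and event names ("task_start", "error", ...) are
# hypothetical; a real study would log whatever actions matter.
events = [
    (0.0,  "task_start"),
    (3.2,  "click"),
    (5.1,  "error"),      # e.g. invalid input, wrong menu
    (7.8,  "click"),
    (12.4, "task_end"),
]

start = next(t for t, e in events if e == "task_start")
end = next(t for t, e in events if e == "task_end")
actions = [e for _, e in events if e not in ("task_start", "task_end")]
errors = sum(1 for e in actions if e == "error")

print(f"Task time:  {end - start:.1f} s")
print(f"Actions:    {len(actions)}")
print(f"Error rate: {errors / len(actions):.0%}")
```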
Usability Evaluation Methods
Inspection
Extracting the conceptual model
Direct observation
– think-aloud
Query techniques (interviews and questionnaires)
Continuous evaluation (user feedback and field studies)
Inspection
Designer tries the system (or prototype)
– does the system “feel right”?
– benefits
can catch some major problems in early versions
– problems
not reliable, as it is completely subjective
not valid, as the introspector is not a typical user
intuitions and introspection are often wrong
Inspection methods help
– Task Analysis
– Heuristic Evaluation (see the checklist sketch below)
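As a hedged illustration (not from the slides), heuristic-evaluation findings are often recorded against a checklist with severity ratings; the sketch below assumes Nielsen-style heuristic names, and the data structure and example findings are invented.

```python
# Sketch: record heuristic-evaluation findings with severity ratings.
# Heuristic names follow Nielsen's well-known list; the structure and
# example findings are illustrative only.
from dataclasses import dataclass

@dataclass
class Finding:
    heuristic: str   # which heuristic is violated
    location: str    # where in the UI
    severity: int    # 0 = not a problem ... 4 = usability catastrophe
    note: str

findings = [
    Finding("Visibility of system status", "file upload", 3,
            "No progress indicator during long uploads."),
    Finding("Error prevention", "delete dialog", 2,
            "No confirmation before deleting all items."),
]

# Report the most severe problems first so they get fixed first.
for f in sorted(findings, key=lambda f: -f.severity):
    print(f"[{f.severity}] {f.heuristic} @ {f.location}: {f.note}")
```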
Conceptual Model Extraction
How?
– show the user static images of
the prototype or screens during use
– ask the user to explain
the function of each screen element
how they would perform a particular task
What?
– Initial conceptual model
how a person perceives a screen the very first time it is viewed
– Formative conceptual model
how a person perceives a screen after it’s been used for a while
Value?
– good for eliciting people’s understanding before & after use
– poor for examining system exploration and learning
Direct Observations
Simple Observation Method
[Speech bubble: “Hmm, what does this do? I’ll try it… Oops, now what happened?”]
Interviews
Good for pursuing specific issues
– vary questions to suit the context
– probe more deeply on interesting issues as they arise
– good for exploratory studies via open-ended questioning
– often leads to specific constructive suggestions
Problems:
– accounts are subjective
– time consuming
– evaluator can easily bias the interview
– prone to rationalization of events/thoughts by user
user’s reconstruction may be wrong
Questionnaires and Surveys
– preparation “expensive,” but administration cheap
can reach a wide subject group (e.g. mail)
– does not require presence of evaluator
– results can be quantified (see the scoring sketch below)
But
– only as good as the questions asked
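As an invented example of quantifying results, the sketch below averages 1-to-5 Likert-scale ratings per question; the questions and responses are hypothetical.

```python
# Sketch: quantify Likert-scale questionnaire results
# (1 = strongly disagree ... 5 = strongly agree).
# Questions and responses are invented for illustration.
from statistics import mean, stdev

responses = {
    "The system was easy to learn":        [4, 5, 3, 4, 5, 4],
    "I could recover quickly from errors": [2, 3, 2, 1, 3, 2],
    "The terminology was familiar":        [4, 4, 5, 3, 4, 5],
}

for question, ratings in responses.items():
    print(f"{question}: mean {mean(ratings):.1f} "
          f"(sd {stdev(ratings):.1f}, n={len(ratings)})")
```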
Continuous Evaluation
Monitor systems in actual use
– usually late stages of development
i.e. beta releases, delivered system
– fix problems in next release
User feedback via gripe lines
– users can provide feedback to designers while using the system
help desks
bulletin boards
email
built-in gripe facility (see the sketch below)
– best combined with trouble-shooting facility
users always get a response (solution?) to their gripes
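A minimal sketch, assuming a hypothetical log file and fields, of what a built-in gripe facility might record: the user’s comment plus enough context (screen, version, time) for designers to act on it.

```python
# Sketch of a built-in "gripe" facility: capture the user's comment
# plus context so designers can reproduce and fix the problem.
# The file name and fields are hypothetical.
import json
import time

GRIPE_LOG = "gripes.jsonl"

def report_gripe(comment: str, screen: str, version: str) -> None:
    entry = {
        "time": time.strftime("%Y-%m-%dT%H:%M:%S"),
        "screen": screen,       # where the user was when griping
        "version": version,     # which release the gripe applies to
        "comment": comment,
    }
    with open(GRIPE_LOG, "a", encoding="utf-8") as f:
        f.write(json.dumps(entry) + "\n")

report_gripe("Can't find the undo button after a delete.",
             screen="mail/inbox", version="2.3-beta")
```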
Continuous Evaluation
Case/field studies
– careful study of “system usage” at the site
– good for seeing “real life” use
– external observer monitors behavior
– site visits
Revised Design Process
[Diagram: Analyze → Design → Build → Test cycle, with Heuristic Evaluation added at the design stage and Think-aloud Studies producing the test responses]
Readings / Questions
Stone: Chapters 20 & 27