
User Interfaces Evaluation

DCO10104: User-centered Design and Testing

Rose C W Chang

Overview
 Participatory Design
 Methods for Involving the User
 Evaluating User Interface
– Naturalistic Approach vs. Experimental Approach
 Usability Evaluation Methods
– Inspection
– Conceptual Model Extraction
– Direct Observations
– Simple Observation Method
– Think Aloud Method
– Constructive Interaction Method
– Interviews
– Questionnaires and Surveys
– Continuous Evaluation

User-Centered System Design

 Design is based upon a user’s
– abilities and real needs
– context
– work
– tasks
– need for a usable and useful product

Golden rule of interface design: Know The User

Revised Design Process

[Figure: Analyze → Design → Build → Test cycle; facts/requirements feed Analyze, Build is shown with a tool such as Visual Basic, and Test feeds user responses back into the cycle]

Participatory Design
(“The user is just like me”)

 Problem
– intuitions are often wrong
– interviews etc. are not precise
– the designer cannot know the user sufficiently well to answer all issues that come up during the design
 Solution
– designers should have access to representative users
 END users, not their managers or union reps!

Participatory Design

 Users are first-class members of the design process
– active collaborators vs. passive participants
 Users considered subject matter experts
– know all about the work context
 Iterative process
– all design stages subject to revision

Participatory Design

 Up side
– users are excellent at reacting to suggested system designs
 designs must be concrete and visible
– users bring in important “folk” knowledge of work context
 knowledge may be otherwise inaccessible to design team
– greater buy-in for the system often results

 Down side
– hard to get a good pool of end users
 expensive, reluctance ...
– users are not expert designers
 don’t expect them to come up with design ideas from scratch
– the user is not always right
 don’t expect them to know what they want

Methods for Involving the User
 At the very least, talk to users
– surprising how many designers don’t!
 Contextual interviews + site visits
– interview users in their workplace, as they are doing their job
– discover the user’s culture, requirements, expectations, …
 Explain designs
– describe what you’re going to do
– get input at all design stages
 all designs subject to revision
– important to have visuals and/or demos
 people react far differently to a concrete visual or demo than to a verbal explanation alone

Evaluating User Interface

[Figure: design ↔ implementation ↔ evaluation cycle]

 Tied to the usability engineering lifecycle
 Initial design stages
– develop and evaluate initial design ideas with the user
 Iterative design
– does system behavior match the user’s task requirements?
– are there specific problems with the design?
– what solutions work?

Process of Usability Evaluation

Naturalistic Approach

 Observation occurs in a realistic setting
– real life
 Problems
– hard to arrange and do
– time consuming
– may not generalize

Experimental Approach
 Experimenter controls all environmental factors
– study relations by manipulating independent variables
– observe effect on one or more dependent variables
– nothing else changes

Example hypothesis:
“There is no difference in user performance (time and error rate) when selecting an item from a pull-down or a pull-right menu of 4 items.”

[Figure: the same menu items (New, Open, Close, Save) presented as a pull-down menu vs. a pull-right menu]

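Such a hypothesis is tested by manipulating the independent variable (menu style) while measuring the dependent variables (selection time, errors). A minimal analysis sketch, not part of the slides, using invented timing data and SciPy’s independent-samples t-test:

# Hypothetical data: selection times in seconds for each menu condition.
from scipy import stats

pull_down  = [1.21, 1.35, 1.10, 1.42, 1.28, 1.33, 1.19, 1.40]
pull_right = [1.45, 1.52, 1.38, 1.60, 1.47, 1.55, 1.41, 1.58]

# Independent-samples t-test on the dependent variable (selection time).
t, p = stats.ttest_ind(pull_down, pull_right)
print(f"t = {t:.2f}, p = {p:.4f}")

if p < 0.05:
    print("Reject the null hypothesis: menu style affects selection time.")
else:
    print("No significant difference detected between the two menu styles.")
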
Discount Usability Evaluation
 Low cost methods to gather usability problems
– approximate: capture most large and many minor problems
 How?
– qualitative:
 observe user interactions
 gather user explanations and opinions
 produces a description, usually in non-numeric terms
 anecdotes, transcripts, problem areas, critical incidents…
– quantitative:
 count, log, measure something of interest in user actions
 speed, error rate, counts of activities, … (see the logging sketch below)

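A minimal sketch of what such quantitative logging might look like; the class and event names below are illustrative, not from the slides:

import time

class UsabilityLog:
    """Collects timestamped user events so speed and error counts can be reported."""
    def __init__(self):
        self.events = []                     # (timestamp, kind, detail)

    def record(self, kind, detail=""):
        self.events.append((time.time(), kind, detail))

    def summary(self):
        kinds = [kind for _, kind, _ in self.events]
        duration = self.events[-1][0] - self.events[0][0] if len(self.events) > 1 else 0.0
        return {"task_time_s": round(duration, 2),
                "errors": kinds.count("error"),
                "actions": kinds.count("action")}

log = UsabilityLog()
log.record("action", "opened file dialog")
log.record("error", "clicked disabled OK button")
log.record("action", "selected file")
print(log.summary())
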
Usability Evaluation Methods

 Inspection
 Extracting the conceptual model
 Direct observation
– think-aloud
 Query techniques (interviews and questionnaires)
 Continuous evaluation (user feedback and field studies)

Inspection
 Designer tries the system (or prototype)
– does the system “feel right”?
– benefits
 can catch some major problems in early versions
– problems
 not reliable, as it is completely subjective
 not valid, as the introspector is a non-typical user
 intuitions and introspection are often wrong
 Inspection methods help
– Task Analysis
– Heuristic Evaluation

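The slides only name heuristic evaluation; as one common way its findings are recorded, here is a hypothetical sketch (the heuristic names follow Nielsen’s widely used list, while the data structure itself is invented):

from dataclasses import dataclass

@dataclass
class Finding:
    heuristic: str      # which guideline is violated
    location: str       # where in the UI it was observed
    severity: int       # 0 = not a problem ... 4 = usability catastrophe
    note: str

findings = [
    Finding("Visibility of system status", "file upload screen", 3,
            "no progress indicator while uploading"),
    Finding("Error prevention", "delete dialog", 2,
            "no confirmation before deleting all items"),
]

# Report the most severe problems first so they are fixed first.
for f in sorted(findings, key=lambda f: -f.severity):
    print(f"[{f.severity}] {f.heuristic} ({f.location}): {f.note}")
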
Conceptual Model Extraction
 How?
– show the user static images of
 the prototype or screens during use
– ask the user to explain
 the function of each screen element
 how they would perform a particular task
 What?
– Initial conceptual model
 how a person perceives a screen the very first time it is viewed
– Formative conceptual model
 how a person perceives a screen after it has been used for a while
 Value?
– good for eliciting people’s understanding before & after use
– poor for examining system exploration and learning

Direct Observations

 Evaluator observes users interacting with the system
– in lab: user asked to complete a set of pre-determined tasks
– in field: user goes through normal duties
 Value
– excellent at identifying gross design/interface problems
– validity depends on how controlled/contrived the situation is

Simple Observation Method

 User is given the task
 Evaluator just watches the user
 Problem
– does not give insight into the user’s decision process or attitude

Think Aloud Method
(“Hmm, what does this do? I’ll try it… Oops, now what happened?”)

 Users speak their thoughts while doing the task
– what they are trying to do
– why they took an action
– how they interpret what the system did
– gives insight into what the user is thinking
– most widely used evaluation method in industry
 Problems
– may alter the way users do the task
– unnatural (awkward and uncomfortable)
– hard to talk if they are concentrating

Interviews
 Good for pursuing specific issues
– vary questions to suit the context
– probe more deeply on interesting issues as they arise
– good for exploratory studies via open-ended questioning
– often leads to specific constructive suggestions
 Problems:
– accounts are subjective
– time consuming
– evaluator can easily bias the interview
– prone to rationalization of events/thoughts by user
 user’s reconstruction may be wrong

Questionnaires and Surveys

 Questionnaires / Surveys
– preparation “expensive,” but administration cheap
 can reach a wide subject group (e.g. mail)
– does not require presence of evaluator
– results can be quantified (see the sketch below)
 But
– only as good as the questions asked

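A small sketch of how questionnaire results might be quantified; the questions and Likert-scale responses below are invented for illustration:

from statistics import mean

# 1 = strongly disagree ... 5 = strongly agree
responses = {
    "The system was easy to learn":      [5, 4, 4, 3, 5, 4],
    "I could complete my tasks quickly": [3, 2, 4, 3, 3, 2],
    "Error messages helped me recover":  [2, 2, 1, 3, 2, 2],
}

for question, scores in responses.items():
    print(f"{mean(scores):.1f}  {question}  (n={len(scores)})")
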
Continuous Evaluation
 Monitor systems in actual use
– usually late stages of development
 i.e. beta releases, delivered system
– fix problems in next release
 User feedback via gripe lines
– users can provide feedback to designers while using the system
 help desks
 bulletin boards
 email
 built-in gripe facility (see the sketch below)
– best combined with a trouble-shooting facility
 users always get a response (solution?) to their gripes

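A minimal sketch of what a built-in gripe facility could look like; the file-based storage and field names are assumptions, not from the slides (a real system might route gripes to a help desk instead):

import json, time
from pathlib import Path

GRIPE_FILE = Path("gripes.jsonl")   # assumed local log of user complaints

def submit_gripe(screen: str, text: str, user: str = "anonymous") -> str:
    """Record a complaint with enough context for the design team to follow up."""
    entry = {
        "time": time.strftime("%Y-%m-%d %H:%M:%S"),
        "user": user,
        "screen": screen,   # where in the UI the problem occurred
        "text": text,
    }
    with GRIPE_FILE.open("a") as f:
        f.write(json.dumps(entry) + "\n")
    return "Thanks - your feedback has been sent to the design team."

print(submit_gripe("Save dialog", "The Save button is greyed out after renaming a file."))
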
Continuous Evaluation

 Case/field studies
– careful study of “system usage” at the site
– good for seeing “real life” use
– external observer monitors behavior
– site visits

Revised Design Process

[Figure: the same Analyze → Design → Build → Test cycle, now annotated with evaluation methods: Task Analysis / requirements feed Analyze, Heuristic Evaluation is applied to the design/build stages, and Think-aloud Studies gather user responses at Test]

Readings / Questions

Stone: Chapters 20 & 27
